Frustrated security researchers speak out about Apple's bug bounty program
Apple's bug bounty program is frustrating the security research community, with complaints spanning poor communication, confusion about payments, and more.
Credit: Laurenz Heymann/Unsplash
The program, first launched in 2016, invites ethical hackers and security researchers to find flaws in Apple's hardware and software in return for bounties. In 2019, Apple opened the initiative to all researchers.
According to The Washington Post, many researchers believe that Apple's slow response to fixing bugs, delays in payment or communication, and "insular culture" are hurting both the program and Apple device security at large.
For example, a former Apple employee and a current one both said that the Cupertino tech giant has a massive backlog of bugs that it has yet to patch. Luta Security CEO Katie Moussouris, who helped start the Defense Department's bug bounty program, says that frustrates those who find and report issues to the company.
"You have to have a healthy internal bug fixing mechanism before you can attempt to have a healthy bug vulnerability disclosure program," Moussouris said. "What do you expect is going to happen if they report a bug that you already knew about but haven't fixed? Or if they report something that takes you 500 days to fix it?"
Other issues exist in how much Apple pays for bounties, researchers said. For example, Apple's program pays up to $100,000 for vulnerabilities and exploits that allow attackers to gain "unauthorized access to sensitive data."
Earlier in 2021, researcher Cedric Owens discovered a vulnerability that may have allowed attackers to bypass the Mac's security mechanisms. He reported the flaw to Apple, which fixed the bug but paid Owens only $5,000. That's despite the fact that he and other researchers believe it could have allowed access to "sensitive data."
Owens said he will likely continue submitting bugs to Apple, though other researchers might not. That could lead to "more gaping holes in Apple's processes and in their products."
Some researchers believe that Apple is too slow in issuing payments. Sam Curry, who spent months hacking Apple with a team of researchers, found a slew of vulnerabilities in the company's digital infrastructure. Curry and his team eventually received $50,000 in payment.
However, Curry believes that Apple knows its reputation in the security research industry might be shaky.
"I think they're aware of how they're seen in the community, and they're trying to move forward," Curry said.
There's also the issue of Apple's secrecy culture. That stands in direct contrast to hacker culture, which holds the open and free flow of information as one of its core values.
"It's not a surprise they haven't embraced this public security researcher culture until recently, when their hand was forced into launching a bug bounty program," said Jay Kaplan, a founder of crowdsourced security research company Synack.
Because of its reputation, researchers weren't reporting bugs to Apple. Instead, "they were going to security conferences and speaking about it publicly and selling it on the black market," said Kaplan.
At least one iOS engineer, Tian Zhang, said that Apple ignored one of his bug reports and did not issue payment for the vulnerability -- despite the fact that the company fixed it.
"It's a mixed feeling. On one side, as an engineer, you want to make sure the products you're building are safe for other people," Zhang said. "On the other hand, it seems like Apple thinks people reporting bugs are annoying and they want to discourage people from doing so."
Despite the bug bounty program, there also exist thriving gray and black markets for iOS vulnerabilities. Exploits for Apple's platforms can fetch up to $2 million, slightly lower than the $2.5 million for similar Android flaws.
Ivan Krstic, Apple's head of security engineering, characterized the bounty program as a "runaway success." However, when questioned about a case of a security researcher who did not receive a bounty for a flaw he discovered, Krstic said that Apple worked to fix such blunders.
"When we make mistakes, we work hard to correct them quickly, and learn from them to rapidly improve the program," Krstic said.
Read on AppleInsider