How Apple's walled garden iPhone security can help hackers evade scrutiny
Apple has a secure mobile ecosystem because of choices in hardware and software that it has made, but the same systems and policies that keep most hackers out could be dramatically helping those few who can beat it.

Apple locks down its devices to prevent malware, but it's impossible to prevent everything
Apple not only publicly champions privacy, it locks down its devices to protect security -- even if some still say it's not enough. Now researchers are saying that this security actively benefits the small proportion of hackers who are able to defeat it.
"It's a double-edged sword," senior researcher Bill Marczak of cybersecurity firm Citizen Lab said to MIT's Technology Review. "You're going to keep out a lot of the riffraff by making it harder to break iPhones. But the 1% of top hackers are going to find a way in and, once they're inside, the impenetrable fortress of the iPhone protects them."
Marczak says that, for example, none of his team's systems could initially find any evidence of hacking on Al Jazeera journalist Tamer Almisshal's iPhone. Only by looking at the phone's internet traffic were they able to identify a connection to servers owned by Israeli hacking company NSO Group.
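Traffic-based detection of this kind comes down to matching observed connections against known attacker infrastructure. Citizen Lab has not published its tooling, so the sketch below only illustrates the general idea; the domains, log format, and function names are all invented for illustration.

```python
# Illustrative sketch: flag hostnames in a device's traffic log that match
# known indicators of compromise. The indicator domains and the log format
# are made up for this example, not Citizen Lab's actual data.
INDICATORS = {"bad-cdn.example.com", "update-check.example.net"}

def suspicious_lookups(log_lines):
    """Return hostnames matching an indicator exactly or as a subdomain."""
    hits = []
    for line in log_lines:
        hostname = line.strip().split()[-1]  # assume hostname is the last field
        if hostname in INDICATORS or any(
            hostname.endswith("." + d) for d in INDICATORS
        ):
            hits.append(hostname)
    return hits

log = [
    "2020-12-01T10:00:01 query a.example.org",
    "2020-12-01T10:00:02 query bad-cdn.example.com",
]
print(suspicious_lookups(log))  # ['bad-cdn.example.com']
```

Real investigations use curated indicator feeds and full packet captures, but the matching step is essentially this comparison.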
What's more, Apple's ongoing efforts to improve security derailed an investigation Marczak was conducting in 2020. Unspecified updates to iOS reportedly disabled a jailbreaking tool that Citizen Lab was using, preventing it from examining a particular update folder, which Marczak says is where hackers were hiding their code.
"We just kind of threw our hands up," he said. "We can't get anything from this -- there's just no way."
Searching for evidence of malware
According to Technology Review, another security firm works by looking for indirect clues. Trail of Bits security engineer Ryan Storz uses an Apple-approved app called iVerify, which looks for anomalies like unexplained file changes. He refers to it as having a tripwire, meaning you can't observe malware directly, but you can detect it in action. "As we lock these things down," he said, "you reduce the damage of malware and spying."
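iVerify's actual checks are proprietary, but the tripwire idea itself is simple: record a baseline of file hashes, then flag anything that later differs. A minimal sketch of the general technique, with illustrative helper names:

```python
# Toy "tripwire": snapshot file hashes, then detect later modifications.
# This only illustrates the concept; it is not how iVerify works internally.
import hashlib
from pathlib import Path

def baseline(paths):
    """Record a SHA-256 hash of each file's contents."""
    return {p: hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in paths}

def changed_files(paths, base):
    """Return the paths whose contents no longer match the baseline."""
    current = baseline(paths)
    return [p for p in paths if current[p] != base.get(p)]
```

The point of the metaphor is that you never see the intruder, only the disturbance: a file that should never change suddenly has a different hash.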
Security researcher Patrick Wardle has previously described the increased security Apple has brought to the Mac with Apple Silicon, and he told Technology Review that this adds to the overall protection of Apple devices.
"[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction."
"[However, security] tools are completely blind, and adversaries know this," he continued.
Aaron Cockerill of mobile security firm Lookout thinks that, even so, Apple will continue to lock down devices, and so will other manufacturers.
"Android is increasingly locked down," he said. "We expect both Macs and ultimately Windows will increasingly look like the opaque iPhone model."
"We endorse that from a security perspective," he continues, "but it comes with challenges of opacity."
"I personally believe the world is marching toward this," says Ryan Storz. "We are going to a place where only outliers will have computers -- people who need them, like developers."
"The general population will have mobile devices which are already in the walled-garden paradigm," he continues. "That will expand. You'll be an outlier if you're not in the walled garden."
An Apple spokesperson told Technology Review that the company believes it is pursuing the correct balance between usability and security.
Separately, Apple recently responded to Citizen Lab's report of vulnerabilities in iMessage by completely rewriting the app and service to remove the issue. As released in iOS 14, a new sandbox called BlastDoor parses all untrusted data, analyzing it for issues that may affect users.
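BlastDoor itself is a tightly sandboxed Swift service and its internals are Apple's; the toy sketch below only illustrates the underlying principle, which is to parse untrusted bytes in a disposable child process so that a crashing or hanging parser is contained instead of compromising the caller.

```python
# Illustrative sketch of sandboxed parsing: hand untrusted input to a
# short-lived child process. A parser failure kills only the child.
# This is a conceptual analogy, not BlastDoor's actual design.
import json
import subprocess
import sys

def parse_untrusted(data: bytes, timeout: float = 5.0):
    """Parse untrusted JSON in a throwaway child process."""
    # The child reads raw bytes on stdin and prints the parsed value back.
    child_src = (
        "import json, sys;"
        "print(json.dumps(json.loads(sys.stdin.buffer.read())))"
    )
    try:
        proc = subprocess.run(
            [sys.executable, "-c", child_src],
            input=data, capture_output=True, timeout=timeout,
        )
    except subprocess.TimeoutExpired:
        return ("error", "timeout")  # a hung parser is treated as hostile
    if proc.returncode != 0:
        return ("error", "parse failed in sandboxed child")
    return ("ok", json.loads(proc.stdout))
```

The real service adds memory-safe parsing code and OS-level sandbox restrictions on top of this process isolation, but the containment idea is the same.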
Comments
"[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction.”
Are the good guys asking for a backdoor like the government so they can do their poking around?
Ultimately, the good guys were using a sort of back door to look for signs of infiltration but that backdoor got locked so now no one can use it. Good if it keeps a bad guy out, but bad if the bad guy finds another way in and the good guys can't see it any more.
It's just another example of the cat and mouse game of security.
Now, that is obvious from a sociological standpoint -- who monitors the monitors.
But, even from a technological perspective? Are we into the area of non-computable functions? Goedel's incompleteness theorem?
I hope Apple is using cloud based services and AI to analyse and weed out apps for closer inspection.
Trust is good, verification is better…
...lol Apple isn’t going to forgo security updates to protect jailbreak utilities.
Thus not really felling your argument. Apple's model, while not perfect, is quantifiably better.
Here's the headline in case you missed it:
Apple's walled garden iPhone security can help hackers evade scrutiny
Not more than a few months ago, one of the companies selling OS exploits said that it was harder to find quality (valuable) Android exploits as compared to those for iOS, and was now willing to pay much more for Android ones than for the (apparently) more common iOS "backdoors".
EDIT:
"During the last few months, we have observed an increase in the number of iOS exploits, mostly Safari and iMessage chains, being developed and sold by researchers from all around the world," the Zerodium CEO said. "The zero-day market is so flooded by iOS exploits that we've recently started refusing some them.
"On the other hand, Android security is improving with every new release of the OS thanks to the security teams of Google and Samsung, so it became very hard and time-consuming to develop full chains of exploits for Android and it's even harder to develop zero-click exploits not requiring any user interaction," he added.
And:
For a long time, it used to be that Apple's iPhones offered considerably better security than their Android counterparts, with Apple flaunting how uncrackable its phones were, as well as refusing to offer backdoor access into iPhones. Apparently, though, things may have changed.
While Apple still refuses to create backdoors that can be accessed by the government in its phones, that doesn't seem to be able to stop intelligence agencies from gaining access to Apple's iPhones.
A report from Vice has revealed that several companies offer tools for breaking Apple's encryption. One of them is Cellebrite, an Israeli cybersecurity outfit, with its Universal Forensic Extraction Device being favored by the FBI. Going by the report, Cellebrite's tool is powerful enough to break into iPhone models up to—and including—the iPhone X. The tool allows app data, call logs, GPS records, messages, and contacts to be extracted.
Interestingly, though, doing that on Android devices is a lot harder. Cellebrite's UFED tool failed to adequately extract any app data, internet browsing, or GPS records from phones like the Google Pixel 2 and Samsung Galaxy S9. Even weirder, it completely failed at getting anything from the Huawei P20 Pro.
"Right now, we're getting into iPhones," Detective Rex Kiser, a digital forensic examiner for the Fort Worth Police Department, was quoted as saying. "A year ago we couldn't get into iPhones, but we could get into all the Androids. Now we can't get into a lot of the Androids."