How Apple's walled garden iPhone security can help hackers evade scrutiny

Posted in General Discussion
Apple has a secure mobile ecosystem because of hardware and software choices it has made, but the same systems and policies that keep most hackers out could be dramatically helping the few who can beat them.

Apple locks down its devices to prevent malware, but it's impossible to prevent everything


Apple not only publicly champions privacy, it locks down its devices to protect security -- even if some still say it's not enough. Now researchers are saying that this security actively benefits the small proportion of hackers who are able to defeat it.

"It's a double-edged sword," senior researcher Bill Marczak of cybersecurity firm Citizen Lab said to MIT's Technology Review. "You're going to keep out a lot of the riffraff by making it harder to break iPhones. But the 1% of top hackers are going to find a way in and, once they're inside, the impenetrable fortress of the iPhone protects them."

Marczak says that, for example, none of his team's tools could initially find any evidence of hacking on the iPhone of Al Jazeera journalist Tamer Almisshal. Only by examining the phone's internet traffic were they able to identify connections to servers owned by the Israeli hacking company NSO Group.
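The traffic-analysis technique described here can be sketched in miniature: collect the hosts a device connects to, then compare them against a list of known indicator-of-compromise (IOC) servers. The domains below are hypothetical placeholders, not real infrastructure, and this is an illustration of the general idea rather than Citizen Lab's actual tooling.

```python
# A minimal sketch of the traffic-analysis approach described above:
# compare hosts a device connects to against a list of
# indicator-of-compromise (IOC) domains. All domains here are
# hypothetical placeholders.

IOC_DOMAINS = {"bad-example-c2.com", "update-check.example.net"}

def flag_suspicious(hosts):
    """Return observed hosts that match an IOC domain or a subdomain of one."""
    return [
        h for h in hosts
        if h in IOC_DOMAINS or any(h.endswith("." + d) for d in IOC_DOMAINS)
    ]

observed = ["apple.com", "cdn.bad-example-c2.com", "example.org"]
print(flag_suspicious(observed))  # -> ['cdn.bad-example-c2.com']
```

The point of working from traffic rather than the device itself is that network metadata is visible even when the phone's filesystem is not.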

What's more, Apple's ongoing efforts to improve security derailed an investigation Marczak was conducting in 2020. Unspecified updates to iOS reportedly disabled a jailbreaking tool that Citizen Lab was using, preventing it from examining a particular update folder -- which, Marczak says, is where the hackers were hiding their code.

"We just kind of threw our hands up," he said. "We can't get anything from this -- there's just no way."

Searching for evidence of malware

According to Technology Review, another security firm works by looking for indirect clues. Trail of Bits security engineer Ryan Storz uses an Apple-approved app called iVerify, which looks for anomalies like unexplained file changes.

He describes it as a tripwire: you can't observe the malware directly, but you can detect the changes it makes. "As we lock these things down," he said, "you reduce the damage of malware and spying."
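The tripwire idea can be illustrated with a minimal file-integrity sketch: hash a baseline set of files, then flag any file whose hash later changes. This shows the general technique only, not iVerify's actual implementation.

```python
# A minimal sketch of a file-integrity "tripwire": record a baseline of
# file hashes, then flag any file whose contents later differ. This
# illustrates the general technique, not iVerify's implementation.

import hashlib
from pathlib import Path

def snapshot(paths):
    """Map each path to the SHA-256 hash of its contents."""
    return {str(p): hashlib.sha256(Path(p).read_bytes()).hexdigest()
            for p in paths}

def changed_files(baseline, current):
    """Return paths whose hash differs from (or is missing in) the baseline."""
    return [p for p, h in current.items() if baseline.get(p) != h]
```

Run `snapshot` once to establish the baseline, store it somewhere an attacker can't reach, and re-run it periodically; anything returned by `changed_files` has been modified since the baseline was taken.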

Security researcher Patrick Wardle has previously described the increased security Apple has brought to the Mac with Apple Silicon, and he told Technology Review that this adds to the overall protection of Apple devices.

"[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction."

"[However, security] tools are completely blind, and adversaries know this," he continued.

Aaron Corkerill of mobile security firm Lookout thinks that, even so, Apple will continue to lock down its devices, and that other manufacturers will follow.

"Android is increasingly locked down," he said. "We expect both Macs and ultimately Windows will increasingly look like the opaque iPhone model."

"We endorse that from a security perspective," he continued, "but it comes with challenges of opacity."

"I personally believe the world is marching toward this," said Ryan Storz. "We are going to a place where only outliers will have computers -- people who need them, like developers."

"The general population will have mobile devices which are already in the walled-garden paradigm," he continued. "That will expand. You'll be an outlier if you're not in the walled garden."

An Apple spokesperson told Technology Review that the company believes it is pursuing the correct balance between usability and security.

Separately, Apple recently responded to Citizen Lab's report of vulnerabilities in iMessage by completely rewriting the app and service to address the issue. As released in iOS 14, a new sandbox called BlastDoor parses all untrusted data, analyzing it for issues that could affect users.
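The isolation principle behind a sandbox like BlastDoor can be sketched in a few lines: hand untrusted input to a separate worker process, so a crashing or hanging parser can't bring down the caller. This is a rough process-isolation sketch only; Apple's actual sandbox imposes far stricter restrictions on what the parsing process may do.

```python
# A rough sketch of parser isolation: run the parser for untrusted data
# in a child process so that a crash or hang in the parser cannot take
# down the calling application. Illustration only -- a real sandbox such
# as BlastDoor also tightly restricts what the worker process can do.

import json
from multiprocessing import Process, Queue

def _parse_worker(raw, out):
    # All parsing of untrusted bytes happens inside the child process.
    try:
        out.put(("ok", json.loads(raw)))
    except Exception as exc:
        out.put(("error", str(exc)))

def parse_untrusted(raw, timeout=5.0):
    """Parse untrusted JSON in an isolated process; never raises."""
    out = Queue()
    worker = Process(target=_parse_worker, args=(raw, out))
    worker.start()
    worker.join(timeout)
    if worker.is_alive():           # hung parser: kill it, reject the input
        worker.terminate()
        worker.join()
        return ("error", "timeout")
    return out.get()
```

JSON stands in here for any untrusted format; the design point is that the caller only ever sees a clean result tuple, never the parser's failure modes.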

Comments

  • Reply 1 of 12
    lkrupp Posts: 10,557 member
    I read every word of this article trying to understand what it says. The bottom line seems to be that security researchers are now saying that Apple’s drive for security and privacy is actually helping the bad guys because they, the supposed good guys, can’t get into iOS to see what's going on. 

    "[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction.”

    Are the good guys asking for a backdoor like the government so they can do their poking around?
  • Reply 2 of 12
    MplsP Posts: 3,931 member
    lkrupp said:
    I read every word of this article trying to understand what it says. The bottom line seems to be that security researchers are now saying that Apple’s drive for security and privacy is actually helping the bad guys because they, the supposed good guys, can’t get into iOS to see what's going on. 

    "[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction.”

    Are the good guys asking for a backdoor like the government so they can do their poking around?
    I think it can be summed up by the line "it's a double edged sword" 

    Ultimately, the good guys were using a sort of back door to look for signs of infiltration but that backdoor got locked so now no one can use it. Good if it keeps a bad guy out, but bad if the bad guy finds another way in and the good guys can't see it any more.

    It's just another example of the cat and mouse game of security.
    edited March 2021
  • Reply 3 of 12
    crowley Posts: 10,453 member
    lkrupp said:
    I read every word of this article trying to understand what it says. The bottom line seems to be that security researchers are now saying that Apple’s drive for security and privacy is actually helping the bad guys because they, the supposed good guys, can’t get into iOS to see what's going on. 

    "[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction.”

    Are the good guys asking for a backdoor like the government so they can do their poking around?
    They're not asking for anything, it's just an observation.  Hiding the keys and the locks makes it harder for people to break in, but anyone who works out where the hiding place is will be sitting pretty, and since everything is hidden it's very hard to even see them entering or leaving.
  • Reply 4 of 12
    larryjw Posts: 1,031 member
    This suggests an interesting side effect of security. The more secure a system is, the less able we are to detect a security breach. 

    Now, that is obvious from a sociological standpoint -- who monitors the monitors.

    But, even from a technological perspective? Are we into the area of non-computable functions? Goedel's incompleteness theorem?
  • Reply 5 of 12
    dewme Posts: 5,371 member
    Lots of insightful comments above, especially asking the question "who monitors the monitors." This is actually addressing some of the challenges within the domain of information security (InfoSec), which is still under the cybersecurity umbrella but with greater focus on the integrity of what goes on inside the walled garden.

    In my own experience, matters of InfoSec become a very large concern when you have a distributed development organization. To be clear, it's not because anyone who is on the inside is any more or less trustworthy in one location over another, but because the potential for damage in the event of a breach can vary significantly between locations based on external factors, such as some governments insisting on easier access to what's inside the walled garden, or cultural tolerance for bypassing security and privacy measures.

    One would hope that all of the gates into the walled garden are secured with equally strong locks, but the larger and more expansive your development footprint, the more difficult it is to ensure that this is always the case.
    edited March 2021
  • Reply 6 of 12
    avon b7 Posts: 7,691 member
    Methods change over time in response to new habits and technologies.

    I hope Apple is using cloud based services and AI to analyse and weed out apps for closer inspection.


  • Reply 7 of 12
    rcfa Posts: 1,124 member
    been saying that for a long time: a system so closed it can’t be inspected is asking to be abused.

    Trust is good, verification is better…
  • Reply 8 of 12
    StrangeDays Posts: 12,879 member
    What's more, Apple's ongoing efforts to improve security derailed an investigation Marczak was conducting in 2020. Unspecified updates to iOS reportedly disabled a jailbreaking tool that Citizen Lab was using.

    ...lol Apple isn’t going to forgo security updates to protect jailbreak utilities. 
    edited March 2021
  • Reply 9 of 12
    StrangeDays Posts: 12,879 member
    rcfa said:
    been saying that for a long time: a system so closed it can’t be inspected is asking to be abused.

    Trust is good, verification is better…
    Then what do you call Android’s non-walled garden famous for having even more exploits? Governments don’t even need to try to strong arm Google for back doors to Android. 

    Thus not really falling your argument. Apple’s model, while not perfect, is quantifiably better. 
  • Reply 10 of 12
    crowley Posts: 10,453 member
    What's more, Apple's ongoing efforts to improve security derailed an investigation Marczak was conducting in 2020. Unspecified updates to iOS reportedly disabled a jailbreaking tool that Citizen Lab was using.

    ...lol Apple isn’t going to forgo security updates to protect jailbreak utilities. 
    No one said they should.  You're mistaking observation for criticism.
  • Reply 11 of 12
    StrangeDays Posts: 12,879 member
    crowley said:
    What's more, Apple's ongoing efforts to improve security derailed an investigation Marczak was conducting in 2020. Unspecified updates to iOS reportedly disabled a jailbreaking tool that Citizen Lab was using.

    ...lol Apple isn’t going to forgo security updates to protect jailbreak utilities. 
    No one said they should.  You're mistaking observation for criticism.
    No, you're mistaking a criticism for an observation. The only reason it's mentioned is for the very theme of this story, where security people are complaining Apple's walled garden presents problems for them. Thus, it's criticism. That's the context.

    Here's the headline in case you missed it:

    Apple's walled garden iPhone security can help hackers evade scrutiny
    edited March 2021
  • Reply 12 of 12
    gatorguy Posts: 24,213 member
    rcfa said:
    been saying that for a long time: a system so closed it can’t be inspected is asking to be abused.

    Trust is good, verification is better…
    Then what do you call Android’s non-walled garden famous for having even more exploits? Governments don’t even need to try to strong arm Google for back doors to Android. 

    Thus not really falling your argument. Apple’s model, while not perfect, is quantifiably better. 
    Are there really more Android exploits? Specifically what "back doors" are you referring to? Rather than guessing you'd be better served by looking into it.

    Not more than a few months ago one of the companies selling OS exploits said that it was harder to find quality (valuable) Android exploits as compared to those for iOS and was now willing to pay much more for Android ones than (apparently) more common iOS "backdoors". 

    EDIT:

    "During the last few months, we have observed an increase in the number of iOS exploits, mostly Safari and iMessage chains, being developed and sold by researchers from all around the world," the Zerodium CEO said. "The zero-day market is so flooded by iOS exploits that we've recently started refusing some of them.

    "On the other hand, Android security is improving with every new release of the OS thanks to the security teams of Google and Samsung, so it became very hard and time-consuming to develop full chains of exploits for Android and it's even harder to develop zero-click exploits not requiring any user interaction," he added.

    And:

    For a long time, it used to be that Apple's iPhones offered considerably better security than their Android counterparts, with Apple flaunting how uncrackable its phones were, as well as refusing to offer backdoor access into iPhones. Apparently, though, things may have changed.

    While Apple still refuses to create backdoors that can be accessed by the government in its phones, that doesn't seem to be able to stop intelligence agencies from gaining access to Apple's iPhones. 

    A report from Vice has revealed that several companies offer tools for breaking Apple's encryption. One of them is Cellebrite, an Israeli cybersecurity outfit, with its Universal Forensic Extraction Device being favored by the FBI. Going by the report, Cellebrite's tool is powerful enough to break into iPhone models up to—and including—the iPhone X. The tool allows app data, call logs, GPS records, messages, and contacts to be extracted.

    Interestingly, though, doing that on Android devices is a lot harder. Cellebrite's UFED tool failed to adequately extract any app data, internet browsing, or GPS records from phones like the Google Pixel 2 and Samsung Galaxy S9. Even weirder, it completely failed at getting anything from the Huawei P20 Pro.

    “Right now, we’re getting into iPhones, " Detective Rex Kiser, a digital forensic examiner for the Fort Worth Police Department, was quoted as saying. "A year ago we couldn’t get into iPhones, but we could get into all the Androids. Now we can’t get into a lot of the Androids.”

    edited March 2021