'iBoot' leak may stem from low-level Apple engineer with ties to jailbreaking community
This week's publication of the "iBoot" source code for iOS 9 can be traced back to a "low-level" Apple employee who shared it with a small group of jailbreaking friends -- and may not have wanted it to go beyond that circle, a report claimed on Friday.
The person was encouraged to use their inside access to help the friends out, Motherboard said. On top of iBoot, the employee is said to have taken additional code -- which has yet to be widely shared -- and shared all of the material with a group of five people.
"He pulled everything, all sorts of Apple internal tools and whatnot," one friend noted.
Two of the friends said they hadn't planned on the stolen code leaving their group, but that it nevertheless ended up being shared more broadly and hence out of their control.
"I personally never wanted that code to see the light of day. Not out of greed but because of fear of the legal firestorm that would ensue," one person elaborated. "The Apple internal community is really full of curious kids and teens. I knew one day that if those kids got it they'd be dumb enough to push it to GitHub."
They argued that the initial group did its "damnedest" to make sure the code didn't leak until it was already old and less of a threat. Nevertheless, someone shared it with a person outside of the original circle a year after it was stolen, and it began spreading further and further during 2017.
The situation culminated with iBoot's appearance on GitHub. Apple subsequently issued a DMCA takedown, but downplayed the threat, saying that updated iPhones and iPads should be secure.
An anonymous Apple worker told Motherboard the company knew about the iBoot leak before it arrived on GitHub, but wouldn't say when it was discovered.
Comments
You can't just take a company's code (i.e. intellectual property potentially worth billions) and put it out there like that. That is not petty theft and could easily result in 10-20 years behind bars... and that is without counting the fees/fines and other things Apple will try to sue that person for.
Yeah, that is the billion-dollar question. Usually source code access is restricted to the subsystems you work on; only upper-level people would have full access to all the code and code branches. There is more to this story which is not being told.
Powerful NSA hacking tools have been revealed online
https://www.washingtonpost.com/world/national-security/powerful-nsa-hacking-tools-have-been-revealed-online/2016/08/16/bce4f974-63c7-11e6-96c0-37533479f3f5_story.html
The problem is Apple should have noticed their code leaving the building (they have the resources to do so). Basically you take a hash (fingerprint) of important files and monitor whether that hash (file) goes out to an external IP address (leaves the building).
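The fingerprinting idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the function names, the watchlist, and the chunked-read size are my own assumptions, not any real data-loss-prevention product): hash sensitive files once, then hash any outbound file and check it against the watchlist.

```python
import hashlib

def fingerprint(path: str, chunk_size: int = 65536) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks
    so large source trees don't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical watchlist of digests for sensitive files.
WATCHLIST: set[str] = set()

def register(path: str) -> None:
    """Record a sensitive file's fingerprint."""
    WATCHLIST.add(fingerprint(path))

def is_sensitive(path: str) -> bool:
    """True if an outbound file matches a registered fingerprint."""
    return fingerprint(path) in WATCHLIST
```

Note the obvious limitation, which the next comment gets at: an exact-hash check only catches byte-identical copies, so any change to the file (re-encoding, encryption, a single edited line) produces a different digest and slips past.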
The problem is we don’t know how the files left. They weren’t directly uploaded to GitHub. They might have been taken home on a corporate laptop (work from home or consultant scenario) or USB drive (which can be disabled). Remember this was probably done by a proficient programmer... there is always a way. It could have been by screenshots of code, then reconstructed as text/code. You could encrypt the files, then send them by email (bypassing the file scanning).
The point is there is no way to create a perfectly protected environment. You can air-gap (no network connections) the programmers' machines and outlaw electronic devices in the building. But we're talking about software for connected mobile devices... no work would get done.
At some point you have to rely on NDAs and an army of lawyers as a protection/deterrent, but people often don't think about the consequences until it's too late.
Basically things like this are inevitable. The only good thing is it's code from a few generations ago, and an army of security specialists is going to be examining the code in hopes of collecting the $200,000 bug bounties. Hopefully they find most of the bugs before the bad guys do...
Got any, you know, evidence?
I do support the idea of having a tin-foil hat. Someone really is out to get you, your data, your network, your computer (devices), your money, your dignity, your loyalty, your freedom, your worship, etc. etc.