The FBI's iPhone encryption backdoor demand is unsafe and now unwarranted
Congratulations are due to the FBI for breaking Apple's security on a pair of iPhones. Given that success, it's time to stop the federal call for iPhone backdoors.

A Cellebrite extraction tool, used to pull data off a locked device
Here we go again. Apple is in the spotlight, again, for a security stance it is taking to protect its users.
In case you missed it, on Monday the FBI and US Attorney General William Barr issued a statement whining that Apple had given law enforcement no help, even though the bureau admitted earlier in 2020 that Apple handed over iCloud contents nearly immediately.
A few hours later, Apple told its side of the story, stopping just short of directly calling Barr a liar.
"On this and many thousands of other cases, we continue to work around-the-clock with the FBI and other investigators who keep Americans safe and bring criminals to justice. As a proud American company, we consider supporting law enforcement's important work our responsibility," Apple said. "The false claims made about our company are an excuse to weaken encryption and other security measures that protect millions of users and our national security."
This war of words is just the latest artillery exchange in a battle between Apple and the feds that has been raging for five years. In January, the FBI asked Apple to unlock the Pensacola shooter's iPhones, and Barr demanded the same.
In short order, Apple denied the request. Previous demands, like those following the San Bernardino shooting, were made due to a lack of other options available to law enforcement agencies.
The FBI proved in Monday's statement that it has those tools now, and has no need of Apple's help. Tools from Grayshift, Cellebrite, and others have provided ways to extract the contents of a locked iPhone for years.
"We've got the tools to extract data from an iPhone 5 and 7 now," forensic firm Garrett Discovery CEO Andy Garrett said in late 2019. "Everybody does."
For governments and law enforcement agencies, the tools are reasonably priced. The software and hardware suite needed to gain access runs around $15,000, a far cry from the nearly $1 million the FBI was alleged to have spent to access the iPhone 5C at the heart of the San Bernardino investigation.
And yet, here we are again. The latest Barr-penned missive yet again calls for the mandated addition of encryption backdoors, supposedly built in such a way that only law enforcement can gain access to stored data while security is otherwise maintained. Tech companies and critics of the idea counter with the obvious observation that adding any backdoor at all weakens device security overall.
Any encryption backdoor or security flaw will eventually fall into the wrong hands. You don't have to look any further than the exploits that have surfaced lately, wrung from the devices over years of work by researchers -- and by federal agencies like the NSA.
An intentional backdoor won't take nearly that long to discover. The iPhone's front door can be as strong as Apple wants to make it, but if the back porch door is held shut with the digital equivalent of a chopstick, all that work securing the entry won't matter.
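To make the "good guys only" problem concrete, here is a minimal Swift sketch of what a mandated key-escrow backdoor amounts to. It is not any real design, Apple's or anyone else's, and every name in it is invented: the system simply keeps a second copy of your data encrypted under an escrow key, and whoever obtains that key, legitimately or otherwise, can read everything.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only: not how iOS protects data, and every name here
// is hypothetical. The point is that a "lawful access" key is just another
// key that decrypts everything, so whoever obtains it inherits that power.

// Hypothetical escrowed key that "only law enforcement" is supposed to hold.
let escrowKey = SymmetricKey(size: .bits256)

// Encrypt a message once for the user, and once more under the escrow key,
// which is the extra copy a mandated backdoor would require.
func seal(_ message: Data, userKey: SymmetricKey) throws -> (forUser: AES.GCM.SealedBox, forEscrow: AES.GCM.SealedBox) {
    let forUser = try AES.GCM.seal(message, using: userKey)
    let forEscrow = try AES.GCM.seal(message, using: escrowKey)
    return (forUser, forEscrow)
}

let userKey = SymmetricKey(size: .bits256)
let boxes = try! seal(Data("private note".utf8), userKey: userKey)

// Anyone who obtains the escrow key (by theft, leak, or insider abuse)
// reads the message without ever touching the user's key.
let stolenKey = escrowKey
let recovered = try! AES.GCM.open(boxes.forEscrow, using: stolenKey)
print(String(data: recovered, encoding: .utf8)!)   // "private note"
```

The only thing standing between that escrow key and the wrong hands is how well it is guarded, and that is exactly the chopstick on the back porch door.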
Backdoors are still not the answer
Like we've said before, again and again, the cat-and-mouse game between law enforcement and criminals has been played for as long as civilization has existed. And, once again, in the interest of expediency, the government wants to unlock smartphones on demand.
And, again, Apple is telling the feds to forget about it, and is being specific about why they should. As Apple has said before, there is no backdoor limited only to the good guys. If it exists, it will be found and used by the bad guys.
Apple works to keep us safer, and should it accede to these continuing demands, we will all be less safe. Apple's continuing line in the sand benefits us all.
But that line in the sand isn't impassable for law enforcement. The FBI managed to get into these iPhones, and loudly trumpeted that Apple didn't help it break into them.
With Monday's release, the FBI has proven that the Secure Enclave and the rest of Apple's security are not an obstacle for law enforcement, provided it is sufficiently motivated. There may have been room for doubt before, and perhaps a point to be made, but the FBI has now demonstrated in practice that law enforcement has the capability, and that the encryption backdoor argument no longer has a point. Barr just wants to use encryption, and Apple's stance on user protection, as a political talking point.
In his role as the United States Attorney General, he knows full well that the public will be less safe if he gets what he wants. He just doesn't seem to care.

Comments
That doesn't exactly sound like a great argument. Apple should be making phones that are essentially impossible to break into, and the fact that it isn't yet doing so is good for governments and thieves, but not so great for Apple's customers.
Like Sflocal said, there will never, ever be a perfectly secure device. It is not Apple's job to make phone intrusion convenient.
It's not a good thing when something this bad for the people crosses administrations.
Apple provides iCloud data that it has, with a court order. The FBI can't just ask for the data. Providing that iCloud data isn't the same as creating a backdoor that could be exploited by miscreants.
Apple doesn't store passcodes in iCloud. It never will, and hasn't ever done this.
It's in the Frequently Asked Questions, under "How does iCloud Keychain protect my information?". There's also another question there, "Can I make sure my information isn't backed up in iCloud?", if you want to know about that as well.
Either way, even with the iCloud data in hand, neither the FBI nor anyone else would be able to easily decrypt your passcodes. A rough sketch of why is below.
Edit- others have beaten me to this point.
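To make that concrete, here's a minimal Swift sketch of the general idea. It is not Apple's actual iCloud Keychain escrow design, which layers on hardware entanglement and strict guess limits, and every name in it is made up: the synced data is ciphertext under a key derived from the passcode, so the ciphertext alone doesn't hand anyone the contents.

```swift
import CryptoKit
import Foundation

// Minimal sketch of the general idea, not Apple's actual design.
// Everything named here is hypothetical.

// Derive a symmetric key from a passcode plus a random salt.
func derivedKey(passcode: String, salt: Data) -> SymmetricKey {
    HKDF<SHA256>.deriveKey(inputKeyMaterial: SymmetricKey(data: Data(passcode.utf8)),
                           salt: salt,
                           info: Data("demo-keychain".utf8),
                           outputByteCount: 32)
}

let salt = Data((0..<16).map { _ in UInt8.random(in: 0...255) })
let entry = Data("keychain entry synced to the cloud".utf8)

// The "cloud copy" is ciphertext sealed under the passcode-derived key.
let cloudCopy = try! AES.GCM.seal(entry, using: derivedKey(passcode: "314159", salt: salt))

// Holding the ciphertext and guessing the wrong passcode yields an
// authentication failure, not the plaintext.
print((try? AES.GCM.open(cloudCopy, using: derivedKey(passcode: "000000", salt: salt))) == nil)  // true

// Only the correct passcode recovers the entry.
let plain = try! AES.GCM.open(cloudCopy, using: derivedKey(passcode: "314159", salt: salt))
print(String(data: plain, encoding: .utf8)!)
```

Real designs also enforce guess limits in hardware, so a short numeric passcode can't simply be brute-forced offline the way it could be against this toy example.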
Which makes the whole backdoor discussion a side issue (and a symbolic verbal exercise to divert public attention from the core security dilemma...)
The issue at hand is Apple being forced to build an encryption backdoor because some law enforcement agency can't be bothered to get a warrant or do the legwork -- a backdoor that could then be exploited by far more than just law enforcement once its secrets are teased out.
The FBI keeps screaming about how they need these back doors, yet they have tools to get into most any phone they want.
Apple says it will not allow back doors, but it surely must be able to get its hands on these extraction devices, examine them, and close the vulnerabilities they use, yet it hasn't.
I see both sides claiming to be the good guys, claiming to be protecting the public.
All the while winking at each other.
Barr is both. He's shown his true colors: A political animal that will do whatever his bosses want, Rule of Law and the Constitution be d@mned.
If Congress creates a new law that says Apple has to create a back door to encrypted devices, there is nothing Apple can do at that point.
If you think this might not happen, think again: there are lots of people in Congress who agree that law enforcement should have access to people's data. There are precedents on this subject, too; the government has passed laws which give law enforcement access to otherwise private material with a warrant.
Keep in mind, the previous complaints came from the FBI; this time it is the DOJ, and they are making legal arguments.