HomeKit flaw in iOS 11.2 allowed remote access to smart devices, temporary fix already in ...
Apple's software woes continued this week with the publication of a HomeKit flaw that allowed remote access to smart home devices like locks and lights. The company has since issued a temporary patch by disabling remote access to shared users, and plans to permanently plug the hole in a software update next week.
Demonstrated to 9to5Mac by an unnamed source, the HomeKit vulnerability granted unauthorized access to internet-connected devices controlled by Apple's smart home platform.
The process, which was not detailed in today's report, is said to be difficult to reproduce. However, unlike recent Apple software bugs, the HomeKit flaw presents a tangible real-world security threat to users who have smart door locks and garage door openers installed in their homes.
Fortunately, Apple has implemented a temporary fix by disabling remote HomeKit access for shared users.
"The issue affecting HomeKit users running iOS 11.2 has been fixed. The fix temporarily disables remote access to shared users, which will be restored in a software update early next week," Apple said in a statement.
The report claims Apple was made aware of the vulnerability in late October, and says some issues were fixed as part of the recently released iOS 11.2 and watchOS 4.2 updates. Apple patched other holes related to the HomeKit flaw server-side, the report said.
Today's revelations come on the heels of an embarrassing week for Apple software. Last Tuesday, media outlets glommed on to a glaring macOS High Sierra flaw that provided root system administrator access without first requiring a password. Apple pushed out a quick fix, but that patch broke file sharing for some users.
Later in the week, users discovered a date bug in iOS 11.1.2 that threw some devices into a continuous soft reset loop. The issue forced Apple to release iOS 11.2 early in an overnight update on Saturday.
Comments
“Difficult to reproduce” sounds like it’s not something that’s going to cause a lot of problems, unlike the root thing.
Besides, if you don’t install the software then how’re you going to get the fixes?
Is that sentence correct? Because if it is, it seems to say that the bug requires a connection to the user's iCloud account. Or is that ANY iCloud account? Big difference.
Stop with the special pleading logical fallacy. It has enabled the computer industry to be the worst industry for reliability ever. Of all time. Stop excusing shitty software. You just further the problem by normalizing it.
Oh, and because the only products that are worth a damn are products that are shipping and selling, software teams can't labor away indefinitely testing and refining their work until they all grow old together. It has to ship. Product owners and executive sponsors have limited patience. The pressure to ship can be very intense.
But at least software doesn't wear out or degrade over time and use, which are the leading causes of reliability issues with non-software products. This fact alone makes it very difficult to compare software reliability to the reliability of other products. All of software's defects are baked in from the start and only rear their ugly heads when the right combination of enabling conditions arises. Like entering "root" with a blank password. D'oh! So it's not even the software itself that's unreliable, it's the people who build the software who are unreliable. Comparing humans to machines is tough.
I know ... nobody wants to hear that building and testing software is hard, cry cry cry, but it really is. At least for humans. Maybe that will change someday, but for as long as software is being produced by meat-based entities it is going to continue to demonstrate our human fallibility, limitations, and finite cognitive abilities. However, even with meat-based designers, coders, and testers, software can get much better. Maybe not perfect, but better. It's the people who must change, or eventually, get out of the way.
Well, with the macOS bug, the thief could have just walked across the office, unlocked the computer they liked, grabbed the info, and locked it back up again. But, yes, as far as we know Apple has gotten quite lucky. Saying, they aren't as bad as Equifax though isn't especially reassuring.
And, yet, we're going to let this stuff drive thousands of pounds of vehicle around our streets?
Keep trying.
Interesting that you say that, because I was thinking much the same thing. Here’s Apple’s apology:
‘We greatly regret this error and we apologize to all Mac users, both for releasing with this vulnerability and for the concern it has caused. Our customers deserve better. We are auditing our development processes to help prevent this from happening again.’
Note that they didn’t say that they would take steps to prevent this from happening again; they said that they would take steps to help prevent this from happening again. They understand (unlike many people here) that once an app gets beyond outputting “Hello World” it is pretty much guaranteed to have bugs. When something like this happens, you do not panic and start promising stuff that can’t be done. You look at your processes and make improvements. For all we know the quality control process may be fine, and what Apple may decide to do is actually change the way the code is written, opting for graceful failure rather than trying to account for every possible test scenario.
I used to work on aircraft flight systems, an industry where we used to generate five A4 sheets of test documentation for every line of code written. The QA manager used to stroll around the office with a large stick he called his “quality measure”. If an i wasn’t dotted or a t wasn’t crossed, he’d crack that stick down on your desk, pretty darn close to your fingers…
One day, he came out of a meeting with a partner organisation, and he was as white as skimmed milk.
He sat down, put his head in his hands, and told us that the partner’s team leader had said that his team would write their part of the code and guarantee it was a hundred per cent bug free. He said up to that point he’d quite liked the guy…
He reported what the team leader had said, and the partner was removed from the project.