Hackers use radio waves to silently control Apple's Siri, Android's Google Now
A newly spotlighted hack can remotely and silently access the built-in voice controls of an iPhone or Android handset with headphones plugged in, potentially without the user's knowledge.
Researchers from French government agency ANSSI found they were able to control Apple's Siri or Android's Google Now from as far as 16 feet away, according to Wired. The hack uses a radio transmitter to inject commands through a pair of headphones with an integrated microphone plugged into the mobile device, with the headphone cable acting as an antenna.
Headphone cables make decent radio antennas, as evidenced by Apple's use of them to enable FM radio reception on the iPod nano. The team at ANSSI found it could exploit this property to trick an iPhone or Android device into believing that audio commands were coming from the connected microphone.
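In broad strokes, the technique resembles classic amplitude modulation: a voice command rides on a radio carrier, and the phone's audio input circuitry recovers it as if it were microphone audio, which is why no software needs to be installed on the device. The sketch below is a purely illustrative simulation of that idea; the carrier frequency, modulation depth, and sample rate are assumptions chosen to keep the demo small, not the researchers' actual parameters.

```python
# Illustrative simulation only: amplitude-modulating an audio command
# onto a radio carrier. All parameters here are assumptions for the
# demo. A real attack would use a far higher carrier frequency and
# specialized transmitter hardware.
import numpy as np

SIM_RATE = 1_000_000   # 1 MHz simulation sample rate (assumed)
CARRIER_HZ = 100_000   # scaled-down carrier so the arrays stay small

def am_modulate(voice: np.ndarray, voice_rate: int) -> np.ndarray:
    """Upsample a voice waveform and impose it on the carrier."""
    duration = len(voice) / voice_rate
    t = np.arange(0, duration, 1 / SIM_RATE)
    # Linear interpolation brings the audio up to the simulation rate.
    upsampled = np.interp(t, np.arange(len(voice)) / voice_rate, voice)
    # Classic AM: the carrier's amplitude follows the audio envelope.
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    return (1.0 + 0.5 * upsampled) * carrier

# A 440 Hz tone stands in for a spoken "Hey Siri" command.
voice_rate = 16_000
tone = np.sin(2 * np.pi * 440 * np.arange(voice_rate) / voice_rate)
rf_signal = am_modulate(tone, voice_rate)
```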
"Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker's number to turn the phone into an eavesdropping device, send the phone's browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter," Wired explained.
In its most compact form, the attack works from up to about six and a half feet away with equipment that could fit inside a backpack. A more powerful version, operational at up to 16 feet, would require the hardware to be housed in a car or van.
The hack only works on headphone-connected iPhones that have Siri enabled from the lock screen, which is Apple's default setting. It works not only with the new iPhone 6s, which has "Hey Siri" always listening, but also with older devices, by spoofing the button press used to activate Siri from a headset such as Apple's own EarPods.
Of course, anyone who can get their hands on a user's iPhone can access Siri as long as it's enabled from the lock screen. But the ANSSI technique would allow remote, stealthy access to a device, potentially without the owner ever noticing.
Some Android devices do feature voice recognition for Google Now access, which could thwart the attack. Apple has no such functionality built into Siri yet.
Starting with iOS 9, Apple has begun tailoring "Hey Siri" to each individual user, training the personal assistant to recognize its owner's voice when they use the functionality. The new setup process could be a precursor to voice recognition security in future versions of iOS.
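What that security might look like is anyone's guess, but one textbook approach is speaker verification: reduce each utterance to a compact voice fingerprint and only honor commands whose fingerprint is close to the enrolled owner's. The sketch below is a deliberately naive, hypothetical illustration using MFCC features and cosine similarity; Apple's actual "Hey Siri" pipeline is not public, and the file names and threshold here are invented.

```python
# Hypothetical speaker-verification sketch: enroll a user's voice,
# then accept or reject new utterances by fingerprint similarity.
# MFCC averaging is a naive stand-in for whatever Apple actually
# does; that pipeline is not public.
import numpy as np
import librosa  # assumed available: pip install librosa

def voice_fingerprint(wav_path: str) -> np.ndarray:
    """Collapse an utterance into a fixed-length MFCC mean vector."""
    audio, sr = librosa.load(wav_path, sr=16_000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

def is_same_speaker(enrolled: np.ndarray, candidate: np.ndarray,
                    threshold: float = 0.95) -> bool:
    """Cosine similarity between fingerprints; threshold is a guess."""
    cos = np.dot(enrolled, candidate) / (
        np.linalg.norm(enrolled) * np.linalg.norm(candidate))
    return cos >= threshold

# Enrollment: average fingerprints from iOS 9-style training phrases
# (file names are placeholders).
enrolled = np.mean([voice_fingerprint(f"hey_siri_{i}.wav")
                    for i in range(1, 4)], axis=0)

# At command time, ignore utterances that don't match the owner.
if not is_same_speaker(enrolled, voice_fingerprint("incoming.wav")):
    print("Voice mismatch: ignoring command")
```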
Users concerned about such hacks should disable access to Siri from the lock screen. This can be accomplished by opening the iOS Settings application, selecting Touch ID & Passcode, and then scrolling down to toggle off Siri under Allow Access When Locked. There, users can also disable access to the Today screen, Notifications View, Reply With Message, and Wallet, if they so choose.
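Individuals can flip that switch themselves, but organizations managing fleets of iPhones could enforce the same restriction with a configuration profile. The Python sketch below generates a minimal one; the allowAssistantWhileLocked key comes from Apple's documented Restrictions (com.apple.applicationaccess) payload, while the identifiers and display name are placeholders.

```python
# Sketch: generate a minimal .mobileconfig that disables Siri on the
# lock screen for managed devices. allowAssistantWhileLocked is a
# documented Restrictions payload key; the identifiers below are
# placeholders, not real values.
import plistlib
import uuid

restrictions = {
    "PayloadType": "com.apple.applicationaccess",
    "PayloadVersion": 1,
    "PayloadIdentifier": "com.example.restrictions",  # placeholder
    "PayloadUUID": str(uuid.uuid4()),
    "allowAssistantWhileLocked": False,  # no Siri from the lock screen
}

profile = {
    "PayloadType": "Configuration",
    "PayloadVersion": 1,
    "PayloadDisplayName": "Disable Siri on Lock Screen",
    "PayloadIdentifier": "com.example.profile",  # placeholder
    "PayloadUUID": str(uuid.uuid4()),
    "PayloadContent": [restrictions],
}

with open("disable_siri_lockscreen.mobileconfig", "wb") as f:
    plistlib.dump(profile, f)
```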
For further security, users can also return to the root Settings menu, choose Control Center, and disable Access on Lock Screen. This will prevent a stolen iPhone from being placed into Airplane Mode without turning off the device.
As for the hardware side of security, the researchers at ANSSI have reached out to both Apple and Google, recommending that the companies adopt better shielding on their headphone cords, which would make them more difficult for attackers to co-opt as antennas. Future handsets could also include electromagnetic sensors as a form of security.
Apple and Google could also fix the issue through software by allowing users to create custom voice prompts to invoke Siri and Google Now. Like Apple's "Hey Siri," Google's generic "OK Google" query begins a voice search for anyone who speaks it.
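To see why a custom phrase helps, consider that an attacker broadcasting a canned "Hey Siri" recording succeeds precisely because every device listens for the same words. Below is a toy sketch of a user-defined wake phrase, assuming a hypothetical on-device transcribe() step already exists; the phrase, threshold, and function are all invented for illustration.

```python
# Toy sketch of a user-defined wake phrase. transcribe() is a
# placeholder for an on-device speech-to-text engine; because the
# phrase is secret, a generic broadcast of "Hey Siri" would simply
# be ignored.
from difflib import SequenceMatcher

USER_WAKE_PHRASE = "avocado sunrise"  # chosen by the user, not public

def transcribe(audio_chunk: bytes) -> str:
    """Placeholder for an on-device speech-to-text engine."""
    raise NotImplementedError

def should_wake(audio_chunk: bytes, threshold: float = 0.85) -> bool:
    """Wake only when the transcript closely matches the secret phrase."""
    heard = transcribe(audio_chunk).lower().strip()
    ratio = SequenceMatcher(None, heard, USER_WAKE_PHRASE).ratio()
    return ratio >= threshold
```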
Comments
If the iPhone was locked, all you would get is the prompt that "you'll have to unlock your iPhone first" no?
I enjoy reading AppleInsider on a daily basis and actually think it's a must-read if you want to stay on top of the technology instead of being buried by it. Thanks for your efforts to keep us informed.
Seems like quite a stretch, but I guess some dorks need a useless hobby.
Here's a security tip. If you see an acned teenager or a creepy long beard eating doritos and drinking mountain dew at the coffeeshop and they have a pair of cans and a morse code tapper hooked to a giant backpack.... stay 16 feet away. You'd probably have to stay that far away to avoid the smell of his momma's basement.
While watching a recent keynote on my iPhone 6, Siri unexpectedly activated and the streaming video paused. It took me several seconds to realize that the keynote speaker saying, "Hey, Siri", was causing Siri on my phone to activate.
Can't help but wonder just how powerful and precise the radio signal needs to be if it requires a van full of equipment to be effective at 16 ft. And I wonder what kind of feedback is heard over the headphone ends of the cable. If it's inducing a current in the microphone wire it's surely inducing a current in the earphone wire as well.
I'd put this in the category of theoretically possible but not practical for any large-scale or random hacking attempt. Unless you feel you are a specific, named target of the NSA or some other government TLA (three-letter agency), it's probably not really worth worrying about.
Actually the only time I use my phone for music is while traveling - so now I just have to make sure to watch out for vans following me through the airport terminal.
That aside, wouldn't you hear Siri reacting to the rogue commands and take notice?
Oh well, just another reason to keep Siri switched off.
"Users concerned about such hacks should disable access to Siri from the lockscreen."
Not so much. Probably a billion other things to be concerned about before this.
Or switch to wireless headphones, which is probably where Apple is headed in order to make their devices thinner.
This may be a rhetorical question but:
Would the user be able to hear anything in the headphones if they were wearing them during the hack attack?
Of course if you tried to report it happening you'd be passed off as a delusional psychotic I suppose
I would suppose Siri might be trained to recognize this mode of hack and alert the user....
Aide, aide, je suis piraté - s'éloigner du sac à dos!
Translation:
Help, help, I'm being hacked - Move away from the backpack!?
That is funny. Could I text a voice message that says "Hey Siri, delete all my photos" and once the recipient listened to the message Siri would get to work?
Theoretically it is kind of a cool "hack."
Technically I need more info to understand exactly how it works (and what the user with the headphones experiences etc).
I had to disable "Hey, Siri" because it sometimes activates inadvertently during normal conversation when I'm talking to co-workers or on the office phone. The "training" features in iOS 9 reduced, but did not eliminate the problem.
Hey don't talk bad about Doritos. Love 'em.