Apple enables head gesture controls in iOS 7.1
Buried in the Accessibility menu of iOS 7.1 Settings is a new feature that allows disabled users to activate certain features like Siri by moving their head to the left or right.

First spotted by The Loop, the new "head gesture" function is a novel mode of input for those users who have trouble interacting with multitouch displays.
Added to the "Switch Control" options of iOS, found under the "Physical & Motor" subheading in Settings > General > Accessibility, the head movement controls use an iOS device's front-facing camera to track head gestures that invoke system controls. Actions include screen taps, home button presses, volume increases and decreases, activation of Notification Center and Siri, and bringing up the App Switcher.
When turned on, two blue bars appear on the left and right sides of the display. The bars dynamically resize in relation to the perceived head angle. For example, when a user turns their head left, the left bar shrinks until it is completely offscreen, at which point the predefined action is activated.
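The shrinking-bar behavior lends itself to a simple mental model: a tracked head-yaw angle is mapped to a bar size, and the assigned action fires once the bar has fully disappeared. The Swift sketch below is hypothetical and illustrative only, not Apple's implementation; the 30-degree activation angle, the type name and the action labels are all assumptions, and the yaw value is presumed to come from whatever face-tracking data the front camera provides.

import Foundation

// Illustrative sketch of the threshold behavior described above; not Apple's code.
// A face tracker is assumed to report head yaw in degrees
// (negative = turned left, positive = turned right).
struct HeadGestureRecognizer {
    // Yaw angle at which the on-screen bar is fully offscreen and the action fires (assumed value).
    var activationAngle: Double = 30.0

    // Fraction of the bar still visible for a given yaw:
    // 1.0 when facing forward, 0.0 at or beyond the activation angle.
    func barFraction(forYaw yaw: Double) -> Double {
        let progress = min(abs(yaw) / activationAngle, 1.0)
        return 1.0 - progress
    }

    // The triggered action label, if any, once a bar has fully shrunk away.
    func action(forYaw yaw: Double) -> String? {
        guard abs(yaw) >= activationAngle else { return nil }
        return yaw < 0 ? "left head gesture action" : "right head gesture action"
    }
}

// Example: a head turned 32 degrees to the left collapses the left bar
// and triggers whatever action (Siri, Home button, etc.) the user assigned to it.
let recognizer = HeadGestureRecognizer()
print(recognizer.barFraction(forYaw: -32))        // 0.0
print(recognizer.action(forYaw: -32) ?? "none")   // "left head gesture action"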
In addition to system functions, head gestures can also be enabled to control switch scanning, a feature that sequentially highlights items on screen so they can be activated with a screen tap or an adaptive accessory. The accessibility function automatically scrolls through clickable elements, such as menu items, radio buttons and hyperlinks, among others, until the desired item is highlighted by a blue bounding box.
Apple has experimented with alternative iOS input implementations in the past, as seen in patent filings from as far back as 2009.

Comments
I'm doing it.
The head movement, or the job offer that's gone?
Fantastic. This is very useful for accessibility.
better not turn that on.
Hmmm... is this really new in 7.1?
I don't remember seeing it before, speaking for myself.
*bing bing* “Playing What Is Love? by Haddaway.”
Super cool.