Apple Vision Pro gestures may spread to the iPhone, iPad, and Mac

Posted in Future Apple Hardware, edited April 8
Apple Vision Pro introduced new ways of controlling apps through gestures, but it appears that Apple wants to extend that to controlling any device it makes.

Proposed hand movements controlling an iPad

Before the Apple Vision Pro, if you gestured at a computer, it was to be rude. With the Apple Vision Pro, though, Apple introduced a whole collection of gestures for everything from moving windows to resizing documents.

They are some of the finest elements of visionOS on the Apple Vision Pro, and it's remarkable how complete they seem. It's a set of gestures that, once you've been shown them, feel so natural that it's hard to imagine alternatives -- or to believe they are so new.

A newly granted patent suggests that Apple is pretty pleased with these gestures, too. Extending them across more devices is the whole point of "Devices, Methods, And Graphical User Interfaces For Using A Cursor To Interact With Three-Dimensional Environments."

It's far from a new idea. Apple applied for a similar patent back in 2009, but this new filing is bolstered by the success of the gestures on the Apple Vision Pro.

The patent filing begins with illustrative drawings and diagrams, of which 36 relate to the Apple Vision Pro. There are even component breakdowns of the headset, and there are clear signs that Apple is thinking about how the Apple Vision Pro could see a user's gestures and relay them to a Mac or other device being used.

But then there are stick figure drawings showing a user holding a regular iPad, or standing in front of a rather archaic-looking tower computer with a monitor. In neither case is the user wearing a headset.

Apple says this patent is for devices "that provide computer-generated experiences, including, but not limited to, electronic devices that provide virtual reality and mixed reality experiences via a display."

It makes sense that users shouldn't have to wear a headset. Right now, iPads and iPhones have Face ID sensors that scan the whole face, so it's no leap to see them detecting a wave of the hand in front of them.

And Apple is already using the Face ID sensors for more than biometrics and unlocking devices. With iOS 18, these sensors are the way that users can entirely navigate through their iPhones just by looking at them.

Gestures needn't be in the air; they could be done on an extended trackpad



What this new patent concentrates on is the steps needed to correctly recognize a gesture, and also to reject false positives, like someone scratching their nose.

Where there is "a determination that the movement of the respective hand satisfies a first set of criteria," the gesture can, for example, lead a device to change "a position of a cursor based on the movement of the respective hand." If the system determines that a gesture doesn't meet the criteria, it ignores the movement.
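In code terms, the patent describes a gate: a movement either satisfies the criteria and drives the cursor, or it is discarded as a false positive. The sketch below illustrates that idea with entirely hypothetical thresholds and a made-up `HandMovement` type -- the patent does not disclose Apple's actual signals or values.

```python
from dataclasses import dataclass

@dataclass
class HandMovement:
    distance_cm: float      # total travel of the hand
    duration_s: float       # how long the movement took
    fingers_extended: int   # crude hand-pose signal

# Hypothetical thresholds -- illustrative only, not from the patent.
MIN_DISTANCE_CM = 2.0    # ignore tiny jitters, e.g. scratching your nose
MAX_SPEED_CM_S = 150.0   # ignore implausibly fast motion

def meets_criteria(m: HandMovement) -> bool:
    """Return True if the movement satisfies the 'first set of criteria'."""
    if m.distance_cm < MIN_DISTANCE_CM:
        return False
    speed = m.distance_cm / max(m.duration_s, 1e-6)
    if speed > MAX_SPEED_CM_S:
        return False
    return m.fingers_extended >= 1  # require a deliberate hand pose

def handle(m: HandMovement, cursor: list) -> list:
    """Move the cursor only for movements that pass the criteria."""
    if meets_criteria(m):
        cursor[0] += m.distance_cm  # simplistic one-axis mapping
    return cursor  # rejected gestures leave the cursor untouched

print(handle(HandMovement(10.0, 0.5, 2), [0.0, 0.0]))   # deliberate swipe moves cursor
print(handle(HandMovement(0.5, 0.5, 2), [0.0, 0.0]))    # nose-scratch jitter is ignored
```

The interesting engineering is entirely inside `meets_criteria`: a real system would use many more signals (gaze, hand shape over time, depth), but the accept-or-ignore structure is what the patent repeatedly describes.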

Since this is a patent, that's as far as it goes in describing its purpose, because patents are always low on use cases. This one is more repetitive than most, though, as it separately works through each determination of a gesture and the different results, from moving a cursor to moving windows.

But while the patent does not come close to even hinting at this, there would be one obvious outcome of Apple implementing it. The company has always said it won't put a touchscreen into a Mac because it's bad for users ergonomically if they're always reaching up to the screen.

With this idea, users wouldn't have to touch the screen at all; they could just wave to tell a document where to go.

This patent is credited to three inventors. They include Evgenii Krivoruchko, whose previous work for Apple includes patents for using users' attention to control elements in a 3D environment.



Read on AppleInsider

Comments

  • Reply 1 of 6
hmlongco Posts: 621 member
Wish they would speed things up and add swipes for paging through content, going back, etc.

    As is, it sort of reminds me of the original iPhone and Mac interfaces that deliberately left out shortcuts that might "clutter" the interface.
  • Reply 2 of 6
    YES! bring it on!!

    Love my AVP and would enjoy having these gestures in front of a Facetime camera. 
  • Reply 3 of 6
hammeroftruth Posts: 1,381 member
    I can just hear the idiot customers not wanting this feature because of the use of the camera and their paranoia about cameras secretly recording users to send back to Apple to sell your personal info. 🙄

These are the same idiots who don’t read the Apple privacy page when they set up their device. 
  • Reply 4 of 6
tht Posts: 5,894 member
Reminder that for FaceTime video and video apps that use Apple's video frameworks, the system recognizes claps, hand-hearts, thumbs up, and maybe other gestures, and will display the corresponding emoji graphics in your video. 
  • Reply 5 of 6
jvm156 Posts: 76 member
    this would be fantastic for iPad
  • Reply 6 of 6
Eye tracking is already an accessibility feature on Apple devices such as the iPhone and iPad, so it's possible this becomes a mainstream feature, much like the double-pinch gesture on the Apple Watch coming to previous generations.