Future Apple devices may let you select text fields at a glance

Posted in Future Apple Hardware · edited August 2021
Apple's latest gaze-detection research suggests that future users may be able to select text input fields in an app or website with just a look.

Future Apple devices could see when you're looking at an online form


As long as a website is written correctly, browsers like Safari recognize different types of text input fields. It's how Safari knows to offer your surname in one, and your email address in another.
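That recognition relies on semantic hints attached to each field. On the web, that hint is the autocomplete attribute; in native iOS apps, the equivalent is UIKit's textContentType. A minimal sketch of the native version:

```swift
import UIKit

// AutoFill relies on content-type hints. On the web that's the HTML
// autocomplete attribute; in UIKit, it's textContentType.
let surnameField = UITextField()
surnameField.textContentType = .familyName    // offer the user's surname

let emailField = UITextField()
emailField.textContentType = .emailAddress    // offer the email address
emailField.keyboardType = .emailAddress       // matching keyboard layout
```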

In each case, though, you have to click into that field -- or tab from one to the next -- for it to know where you are. In the future, you may not have to: just looking at, say, the First Name field would be enough to place the cursor there, and perhaps even begin entering text for you.
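The application doesn't spell out how a selection would reach an app, but conceptually it amounts to hit-testing a gaze point against the fields on screen. A hypothetical sketch, assuming the gaze point has already been projected into the fields' coordinate space; none of this comes from the patent itself:

```swift
import UIKit

// Hypothetical: select whichever text field the gaze point lands on.
// This only illustrates the idea, not Apple's implementation.
func selectField(at gazePoint: CGPoint, among fields: [UITextField]) {
    for field in fields where field.frame.contains(gazePoint) {
        field.becomeFirstResponder()    // pops the cursor into the field
        break
    }
}
```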

"Selecting a Text Input Field using Eye Gaze," is a newly-revealed patent application that's concerned with devices in both the real world and Apple AR.

The application treats real-world and AR devices practically interchangeably.

"The techniques can be applied to conventional user interfaces on devices such as desktop computers, laptops, tablets, and smartphones," it says. "The techniques are also advantageous for virtual reality, augmented reality, and mixed reality devices and applications..."

However, there are some key differences. Within AR, it would be the headset, such as "Apple Glass," that performs the gaze detection.

It would do so with sensors built into the headset itself, right next to the eyes. For real-world devices, the iPhone or iPad would need to perform gaze tracking from more of a distance.

Detail from the patent showing part of a system for gaze detection in a device


"[In this case] rays are cast (e.g. projected) along the visual axes of the left and right eyes of [the] user, respectively," says Apple, "and are, optionally, used to determine the user's gaze direction and/or gaze depth in what is referred to as ray casting."

It's not clear whether Apple is proposing that devices constantly scan for anyone looking at them. But however and whenever the scan is activated, it looks for specific regions of the eye, and also determines "the user's gaze direction and/or gaze depth."

By looking for "the center of the user's pupil, and/or the center of rotation of the user's eyeball," the system can identify "the visual axis of the user's eye." So it knows when you're looking at it, and it also knows what it is displaying on the screen.
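In vector terms, that visual axis is just the ray from the eyeball's center of rotation through the center of the pupil. A minimal sketch, assuming the sensors have already estimated both points; one such ray per eye would feed the ray casting above:

```swift
import simd

// The visual axis: from the eyeball's center of rotation through the
// center of the pupil.
func visualAxis(pupilCenter: simd_float3, rotationCenter: simd_float3)
    -> (origin: simd_float3, direction: simd_float3) {
    (origin: rotationCenter,
     direction: simd_normalize(pupilCenter - rotationCenter))
}
```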

Apple says that the system should require the user's gaze to rest on a field for some unspecified amount of time before activating. That's just so that when you walk around an Apple Store, you don't get every iPad leaping to fill in order details.
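In code, that amounts to a dwell-time gate. A hypothetical sketch; the half-second threshold is an illustrative guess, since the patent leaves the duration unspecified:

```swift
import Foundation

// Hypothetical dwell-time gate: a gaze only counts as a selection after
// it has rested on the same target for a threshold duration. The 0.5 s
// value is an illustrative guess, not from the patent.
final class DwellDetector {
    private var currentTarget: Int?     // e.g. a field's tag
    private var dwellStart = Date()
    let threshold: TimeInterval = 0.5

    // Call on every gaze sample; returns true once the dwell is long enough.
    func update(target: Int?) -> Bool {
        if target != currentTarget {
            currentTarget = target      // gaze moved; restart the clock
            dwellStart = Date()
            return false
        }
        return target != nil && Date().timeIntervalSince(dwellStart) >= threshold
    }
}
```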

This patent application is credited to three inventors, including Earl M. Olson. His previous work includes determining the location of virtual objects relative to real, physical ones, in an AR environment.


Comments

  • Reply 1 of 2
    Japhey Posts: 1,767 member
    It will be interesting to see how they eventually apply gaze detection tech to Live Text, which is one of my favorite features of iOS 15. 
  • Reply 2 of 2
    Japhey said:
    It will be interesting to see how they eventually apply gaze detection tech to Live Text, which is one of my favorite features of iOS 15. 
    I think it would work with the TrueDepth camera, which already checks whether the user is looking at the display when they want to unlock their device with Face ID.