
Apple exploring 3D frame-of-reference iOS interface based on eye, light location

post #1 of 20
Thread Starter 
Apple has shown interest in creating a unique user interface for iOS, allowing new features like dynamic shadows based on the angle of light hitting an iPhone screen.

Apple's concept was revealed this week in a new patent application discovered by AppleInsider. Entitled "Three Dimensional User Interface Effects on a Display by Using Properties of Motion," it describes a system relying on a number of sensors, including eye tracking with a forward-facing camera, to display a user interface that automatically reacts to the world around it.

The application notes that devices like the iPhone can do many unique things thanks to the plethora of sensors included in them, such as compasses, accelerometers, gyroscopes, GPS and cameras. In addition, face detection can determine when a user is positioned in front of a device.

"However, current systems do not take into account the location and position of the device on which the virtual 3D environment is being rendered," the filing reads, "in addition to the location and position of the user of the device, as well as the physical and lighting properties of the user's environment in order to render a more interesting and visually appealing interactive virtual 3D environment on the device's display."

Apple's proposed invention is a user interface that continuously tracks the movement of a portable device, like an iPhone, as well as the lighting conditions and the position of the user's eyes. Using these elements, Apple could create more realistic three-dimensional depictions of objects on the screen.

In one of the simpler examples presented in the filing, Apple's concept would allow for dynamic shadows that would be moved based on the position of a light source outside the phone. Where the sun is located in the sky, and how the phone is angled relative to the sun, would determine where the shadows would fall on the iPhone's screen.
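As a rough illustration of the geometry involved (this sketch is not from the filing; the function name, units and numbers are hypothetical), the shadow cast by a raised UI element on a flat screen plane lengthens as the light source drops toward the horizon and points directly away from it:

```python
import math

def shadow_offset(elevation_deg, azimuth_deg, element_height):
    """2D shadow offset for a UI element raised element_height above a
    flat screen plane, lit from the given elevation and azimuth. The
    shadow lengthens as the light source drops toward the horizon and
    points directly away from it."""
    length = element_height / math.tan(math.radians(elevation_deg))
    away = math.radians(azimuth_deg + 180)  # opposite the light source
    return (length * math.sin(away), length * math.cos(away))

# Light at 45 degrees elevation, azimuth 135: the shadow is exactly as
# long as the element is tall and points toward azimuth 315.
dx, dy = shadow_offset(45, 135, 10)
print(round(dx, 2), round(dy, 2))  # -7.07 7.07
```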

By tracking a user's eyes, the system would also allow users to move the device around and see the virtual "sides" of a rendered 3D object. Users could also rotate the iPhone and "see behind" something on the screen by changing their frame of reference.
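The "see behind" effect boils down to simple parallax: content at different virtual depths shifts by different amounts as the eye moves. A minimal sketch under a pinhole-viewing assumption (the function and numbers are illustrative, not from the filing):

```python
# Head-tracked parallax sketch: a layer at a virtual depth behind the
# screen plane appears to shift as the viewer's eye moves sideways.
# All distances are in the same units (e.g. millimetres); values are
# hypothetical.

def parallax_shift(eye_offset, eye_distance, layer_depth):
    """On-screen shift of a layer sitting layer_depth behind the screen,
    for an eye that is eye_offset to the side at eye_distance from the
    screen. Derived from similar triangles along the line of sight."""
    return eye_offset * layer_depth / (eye_distance + layer_depth)

# Eye 50 mm to the right, 300 mm from the screen, layer 100 mm "deep":
print(parallax_shift(50, 300, 100))  # 12.5
```

Deeper layers shift more, which is what gives the illusion of looking around a foreground object.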




In another example, the 3D operating system would have a recessed "bento box" that could be advantageous for modular interfaces. The user could rotate an iPhone to look into a number of "cubby holes" included in the virtual bento box.

"It would also be then possible, via the use of a front-facing camera, to have visual 'spotlight' effects follow the user's gaze, i.e., by having the spotlight effect 'shine' on the place in the display that the user is currently looking into," the filing reads.

The filing also takes into account the effect this operating system could have on the battery life of a portable iOS-based device. To prevent over-use of an iPhone's graphics processing unit, the system could be turned on or off quickly with a simple gesture.




By performing a gesture like a "wave," quickly turning the iPhone, the 3D user interface could be "unfrozen" and enabled. Performing the gesture again would have the display "slowly transition back to a standard orientation."

The filing also notes that the 3D user interface could be accomplished on desktop machines, even though such computers are not portable. The same effects could be achieved with a forward-facing camera that tracks a user's head and eye movement.

The filing, made public this week by the U.S. Patent and Trademark Office, was originally filed in August of 2010. It is credited to inventors Mark Zimmer, Geoff Stahl, David Hayward, and Frank Doepke.

[ View article on AppleInsider ]
post #2 of 20
Sounds useless to me. Meh.
post #3 of 20
While the light thing sounds cool, the eye tracking has been done before, iirc.
post #4 of 20
I've thought something like this would be natural for iOS since shaders first came to the iPhone. Although full 3D might be interesting, they could get part of the way there with normal maps. A normal map encodes the angle light reflects off a surface. They could be defined without redoing an existing interface. This would allow for dynamic gloss lines and beveled surfaces. Movable shadows could use layers. That would also be simple to incorporate in existing interfaces.
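For anyone unfamiliar, a normal map drives per-pixel lighting roughly like this: each texel's brightness is the clamped dot product of its stored normal with the light direction (the standard Lambert diffuse term; the texel values below are made up for illustration):

```python
# Lambertian shading from a normal map: per-texel brightness is the
# clamped dot product of the surface normal and the light direction.
import math

def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def diffuse(normal, light_dir):
    """Clamped Lambert term for one texel of the normal map."""
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

# A flat texel (normal straight out of the screen) vs. a beveled edge,
# with the light straight overhead:
flat = diffuse((0, 0, 1), (0, 0, 1))       # 1.0 -- fully lit
bevel = diffuse((0.5, 0, 0.5), (0, 0, 1))  # ~0.707 -- dimmer
print(round(flat, 3), round(bevel, 3))
```

Feed the device's sensed light direction into `light_dir` each frame and the gloss lines and bevels move without touching the underlying artwork.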

A normal map could also be used to determine surface shape if this tech is ever available:
http://www.patentlyapple.com/patentl...interface.html
post #5 of 20
Reminds me of the Johnny Chung Lee Wii Hack with head tracking
post #6 of 20
If I am not looking at my iOS device, does it cease to exist?

Or is this more of a quantum physics thing - where observation of the system affects the state of the system?

Or maybe if you don't look at the device often enough, Siri will say, "Excuse me! My Touch Screen is down HERE!"
post #7 of 20
Quote:
Originally Posted by lilgto64 View Post

Or maybe if you don't look at the device often enough, Siri will say, "Excuse me! My Touch Screen is down HERE!"

Great, now we will get this in real life! (As if our cell phones weren't distracting enough?)
Go Linux, Choose a Flavor!
"I aim to misbehave"
post #8 of 20
Quote:
Originally Posted by mausz View Post

Reminds me of the Johnny Chung Lee Wii Hack with head tracking

I'm not a gamer, so despite my love of new tech I could very easily have missed this tech coming to market. It's been over 5 years since that video was uploaded; is this a reality yet? All the tech already existed, and you don't need new TVs or consoles for Lee's 3D to work.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

post #9 of 20
Quote:
Originally Posted by SolipsismX View Post

I'm not a gamer, so despite my love of new tech I could very easily have missed this tech coming to market. It's been over 5 years since that video was uploaded; is this a reality yet? All the tech already existed, and you don't need new TVs or consoles for Lee's 3D to work.

On a phone it's more feasible... it's more of a gimmick, but so is a transition effect... such a thing would improve the UX, and rightfully so.

As far as gaming goes having to move around constantly to fully appreciate it makes no sense IMO, and for multiplayer games it's useless.

It definitely makes sense in a smartphone or a tablet but not so much elsewhere...yet.
post #10 of 20
Quote:
Originally Posted by SolipsismX View Post

I'm not a gamer, so despite my love of new tech I could very easily have missed this tech coming to market. It's been over 5 years since that video was uploaded; is this a reality yet? All the tech already existed, and you don't need new TVs or consoles for Lee's 3D to work.

You're absolutely right. This has existed for 5 years, and was really simple to implement. I did get to see it in real life myself and it's really stunning.

After these projects he worked for MS (Kinect) and is now working at Google, according to his blog.
post #11 of 20
Quote:
Originally Posted by AbsoluteDesignz View Post

On a phone it's more feasible... it's more of a gimmick, but so is a transition effect... such a thing would improve the UX, and rightfully so.

As far as gaming goes having to move around constantly to fully appreciate it makes no sense IMO, and for multiplayer games it's useless.

It definitely makes sense in a smartphone or a tablet but not so much elsewhere...yet.

I'm not sure I follow. On small screens it seems pointless to me. On a big screen, for games designed specifically for it, it seems great. Imagine an HD projector lighting up an entire wall. I might become a "hardcore" gamer after all if that existed.

Also, I don't want to wear headgear to use my iPhone or iPad. Now, this patent with the camera following your eyes does mean you lose the headgear, but it's still a small display, and it still seems to be missing one aspect of this tech... the distance your eyes are from what you're viewing. My trig is rusty, but I think for this to work you need to know not only the location of the eyes as they move back and forth but also their distance from the display. That tells me there would most likely be IR or RF pushed from the handset and then recorded to gauge your distance, but that seems like a battery waster and likely not as accurate as the IR goggles sitting on your temples as shown in Lee's video.


Quote:
Originally Posted by mausz View Post

You're absolutely right. This has existed for 5 years, and was really simple to implement. I did get to see it in real life myself and it's really stunning.

After these projects he worked for MS (Kinect) and is now working at Google, according to his blog.

The video is impressive with just him holding the kinect bar under the camera.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

post #12 of 20
Think beyond gaming, folks.
In 10 years, AR interfaces without this type of realism will look like MacDraw 1.0 quality to us.
I love the idea of using GPS, compass and time to know exactly where the sun is to render accurate shadows.
post #13 of 20
Quote:
Originally Posted by SolipsismX View Post

I'm not sure I follow. On small screens it seems pointless to me. On a big screen, for games designed specifically for it, it seems great. Imagine an HD projector lighting up an entire wall. I might become a "hardcore" gamer after all if that existed.

Also, I don't want to wear headgear to use my iPhone or iPad. Now, this patent with the camera following your eyes does mean you lose the headgear, but it's still a small display, and it still seems to be missing one aspect of this tech... the distance your eyes are from what you're viewing. My trig is rusty, but I think for this to work you need to know not only the location of the eyes as they move back and forth but also their distance from the display. That tells me there would most likely be IR or RF pushed from the handset and then recorded to gauge your distance, but that seems like a battery waster and likely not as accurate as the IR goggles sitting on your temples as shown in Lee's video.




The video is impressive with just him holding the kinect bar under the camera.

touché

for single player games this would be beyond epic...being able to actually peek around corners and whatnot...

hmm, I guess we'll just have to see how this tech advances.
post #14 of 20
Quote:
Originally Posted by SolipsismX View Post

Also, I don't want to wear headgear to use my iPhone or iPad. Now, this patent with the camera following your eyes does mean you lose the headgear, but it's still a small display, and it still seems to be missing one aspect of this tech... the distance your eyes are from what you're viewing. My trig is rusty, but I think for this to work you need to know not only the location of the eyes as they move back and forth but also their distance from the display. That tells me there would most likely be IR or RF pushed from the handset and then recorded to gauge your distance, but that seems like a battery waster and likely not as accurate as the IR goggles sitting on your temples as shown in Lee's video.

Couldn't it figure out your distance from the screen by tracking the distance between your eyes? I haven't seen Lee's video for a long time, but I think the premise was the distance between the two lights in the Wii sensor is a constant, so measuring the distance to the screen becomes possible by knowing that separation. Similarly, you could calibrate the machine once per user by having them hold the phone close enough to have their eyes fall within two circles, then track their distance in multiples based on how far apart the eyes look to the camera... right?
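Something like that, yes. Under a pinhole-camera assumption the pixel separation between the eyes falls off inversely with distance, so one calibration shot at a known distance fixes the constant (a rough sketch; the function names and numbers are illustrative):

```python
# Distance from the camera via interocular separation, pinhole model:
# pixel separation is inversely proportional to distance, so a single
# calibration frame at a known distance fixes the proportionality
# constant. Numbers below are hypothetical.

def calibrate(known_distance, pixel_separation):
    """Constant k = distance * pixels, from one calibration frame."""
    return known_distance * pixel_separation

def estimate_distance(k, pixel_separation):
    """Later frames: distance = k / measured pixel separation."""
    return k / pixel_separation

# Calibrate at 300 mm where the eyes span 120 px; later the eyes span
# only 60 px, so the user has moved to twice the distance:
k = calibrate(300, 120)
print(estimate_distance(k, 60))  # 600.0
```

No IR or RF emitter needed, just the front camera, though accuracy degrades as the face gets small in the frame.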
post #15 of 20
Quote:
Originally Posted by SolipsismX View Post

I'm not a gamer, so despite my love of new tech I could very easily have missed this tech coming to market. It's been over 5 years since that video was uploaded; is this a reality yet? All the tech already existed, and you don't need new TVs or consoles for Lee's 3D to work.

There's a working implementation for Android:

https://market.android.com/details?i...headtracking3d
post #16 of 20
Quote:
Originally Posted by DrDoppio View Post

There's a working implementation for Android:

https://market.android.com/details?i...headtracking3d

There's a basic working application of this on iOS NOW.

Check out "i3D".
post #17 of 20
... and that is why you troll AppleInsider instead of creating cool stuff...
post #18 of 20
Quote:
Originally Posted by SolipsismX View Post

The video is impressive with just him holding the kinect bar under the camera.

Wii sensor bar.

I saw those homebrew head-tracking videos when they first came out, and I can't believe that still no major manufacturer has followed up on this. It seems like the head-tracking effect would work wonders for on-rails shooters: having to duck and weave IRL to dodge virtual projectiles, bullet-time style.
post #19 of 20
Quote:
Originally Posted by lilgto64 View Post

If I am not looking at my iOS device, does it cease to exist?

Or is this more of a quantum physics thing - where observation of the system affects the state of the system?

Or maybe if you don't look at the device often enough, Siri will say, "Excuse me! My Touch Screen is down HERE!"

Just don't look at it that way.
post #20 of 20
Quote:
Originally Posted by Haggar View Post

Just don't look at it that way.

so is that the official statement: You're looking at it wrong?