Holographic elements could give Apple AR headset an immersive experience

Posted in General Discussion, edited November 2019
Apple's rumored augmented reality headset may use a holographic combiner to project light onto the wearer's eyes, while an eye-tracking system that monitors the user's gaze could provide more information about the environment based on where they are actively looking.

Apple's current AR offering revolves around ARKit in iOS devices


There have been many rumors about Apple's supposed AR headset or smart glasses, along with a number of regulatory filings indicating the company is exploring the possibilities of the device. While there are already many VR headsets and AR devices on the market, it seems Apple is still keen to iron out design issues to offer what could be a superior version once released.

The patent application for "Display Device," published by the US Patent and Trademark Office on Thursday, describes a complex system for displaying an image for the user in an AR headset. Rather than using a display showing a composite view of the environment and virtual content, Apple proposes the use of a "reflective holographic combiner" to serve the same purpose, reflecting light for the AR elements while at the same time passing through environmental light.

The holographic combiner is used with a light engine to project light into specific points of the combiner to create the representation of the scene. The light engine can consist of multiple image-projection systems, including laser diodes, LEDs, and other sources used as part of foveal and peripheral projectors.

An example of how the holographic combiner and different projectors could be used in an AR headset


The foveal projectors are intended to provide high-detail images for the foveal region of the retina, while the peripheral projectors can provide a lower-resolution image for areas of the retina unable to discern detail at the same level. The light engine can also include optical waveguides with holographic and diffractive gratings to generate beams of light at specific angles to scanning mirrors, which then direct the light into other waveguides feeding the holographic combiner.
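As a rough sketch of how that division of labour works in software terms, the logic below picks a render resolution based on how far content sits from the tracked gaze direction. The angular threshold and scale factors are illustrative assumptions, not values from the filing.

```python
# Minimal sketch of the foveated idea, with assumed numbers: content near the
# tracked gaze direction is rendered at full resolution (the foveal
# projector's job), everything else at a coarser peripheral resolution.
FOVEAL_RADIUS_DEG = 5.0    # assumed angular radius served by the foveal projector
FOVEAL_SCALE = 1.0         # full resolution inside the foveal region
PERIPHERAL_SCALE = 0.25    # quarter linear resolution elsewhere

def resolution_scale(eccentricity_deg: float) -> float:
    """Render-resolution scale for content at a given angular offset
    from the user's current gaze direction."""
    return FOVEAL_SCALE if eccentricity_deg <= FOVEAL_RADIUS_DEG else PERIPHERAL_SCALE

for angle in (0.0, 3.0, 10.0, 40.0):
    print(f"{angle:5.1f} deg from gaze -> resolution scale {resolution_scale(angle)}")
```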

The key reason for the patent application's development is to eliminate "accommodation-convergence mismatch problems," where the virtual object is overlaid on the environment without taking into account the level of depth of field that the user expects it to have. This visual error can cause eyestrain, headaches, and nausea in some cases.
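To make the mismatch concrete, here is a back-of-the-envelope calculation with assumed numbers (a 63mm interpupillary distance, a virtual object at 0.5m, and a 2m fixed focal plane, none taken from the filing): the eyes converge for the nearby virtual object while the display's optics ask them to focus further away.

```python
import math

IPD_M = 0.063            # assumed interpupillary distance in metres
virtual_object_m = 0.5   # apparent depth of the AR content
focal_plane_m = 2.0      # assumed fixed focal distance of a conventional display

def vergence_angle_deg(distance_m: float) -> float:
    """Angle between the two eyes' lines of sight when fixating at distance_m."""
    return math.degrees(2.0 * math.atan((IPD_M / 2.0) / distance_m))

# Accommodation is usually expressed in dioptres (1 / distance in metres);
# the conflict is the gap between where the eyes converge and where they focus.
mismatch_dioptres = abs(1.0 / virtual_object_m - 1.0 / focal_plane_m)

print(f"vergence angle at 0.5 m: {vergence_angle_deg(virtual_object_m):.2f} deg")
print(f"vergence angle at 2.0 m: {vergence_angle_deg(focal_plane_m):.2f} deg")
print(f"accommodation-convergence mismatch: {mismatch_dioptres:.1f} dioptres")
```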

By including projectors for foveal and peripheral areas of the user's vision, in tandem with an eye-tracking system to determine the gaze, Apple hopes to tackle the problem once and for all.

Apple also advises that using the holographic combiner eliminates the need for extra moving parts or mechanical elements to compensate for the eye changing position, or for changing the optical power during a scan, a development which simplifies the system architecture. It is also suggested the holographic combiner can be implemented as a relatively flat lens, rather than the curved reflective mirror used in direct retina projector systems.

A second patent application filing attempts to cover the gaze element, in a document titled "Image Enhancement Devices with Gaze Tracking." Rather than dealing with the hardware side, the filing largely covers the intention of making an AR scene fit the real-world environment being viewed.

The system chiefly consists of a camera to capture the world, as well as a gaze tracking system, and a display. The gaze tracking system is configured to determine the point-of-gaze information, while control circuitry is configured to display a variety of content based on that data.

A simplified example of a gaze tracking system using light sources and cameras


This content can include "magnified supplemental content," such as a zoomed-in image of the real world, a computer-generated graphic, and other elements. The external camera is used to identify areas of the user's real-world view that could be points of interest, and could have extra information added for the user's consumption, such as face and object identification.

This could lead to features such as using optical character recognition to read text in a scene, then providing the user with a more readable version, potentially translated into a different language.
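A heavily simplified sketch of how the control circuitry might choose what to overlay is shown below; the detectors are stubbed out and every name here is a hypothetical illustration, not something taken from the filing.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeSample:
    x: float   # normalised display coordinates, 0..1
    y: float

# Stubbed-out detectors standing in for real computer-vision components.
def detect_text(frame, gaze: GazeSample) -> bool:
    return False

def detect_face(frame, gaze: GazeSample) -> bool:
    return False

def choose_overlay(frame, gaze: GazeSample) -> Optional[str]:
    """Decide what supplemental content to render at the point of gaze."""
    if detect_text(frame, gaze):
        # OCR the region, then present a magnified, possibly translated version.
        return "magnified, translated text"
    if detect_face(frame, gaze):
        return "identified face with extra information"
    # Default: a plain zoomed-in view of whatever the user is looking at.
    return "magnified supplemental content"

print(choose_overlay(frame=None, gaze=GazeSample(0.5, 0.5)))
```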

While the point-of-gaze could help inform what data to provide to the user, the filing also proposes the use of a "point-of-gaze gesture" to trigger such elements, along with a "dwell time" that brings up data if the user's gaze lingers on a specific point. Blinking could also be used, such as repeated blinks in quick succession enabling or disabling a zoom function.
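Those gestures could be detected with logic along the lines of the sketch below; the dwell threshold, gaze-stability tolerance, and double-blink window are all assumed values for illustration, not figures from the patent application.

```python
DWELL_THRESHOLD_S = 1.0       # assumed time the gaze must linger to trigger content
GAZE_TOLERANCE = 0.02         # assumed movement (normalised coords) still counted as lingering
DOUBLE_BLINK_WINDOW_S = 0.5   # assumed window for two blinks to toggle zoom

class GazeGestureDetector:
    def __init__(self):
        self.dwell_start = None
        self.last_gaze = None
        self.last_blink = None
        self.zoom_enabled = False

    def on_gaze(self, x, y, now):
        """Return True when the gaze has lingered at roughly one point long enough."""
        if (self.last_gaze is not None
                and abs(x - self.last_gaze[0]) <= GAZE_TOLERANCE
                and abs(y - self.last_gaze[1]) <= GAZE_TOLERANCE):
            if now - self.dwell_start >= DWELL_THRESHOLD_S:
                self.dwell_start = now   # re-arm so the gesture fires once per dwell
                return True
        else:
            self.dwell_start = now       # gaze moved: restart the dwell timer
            self.last_gaze = (x, y)
        return False

    def on_blink(self, now):
        """Toggle the zoom function on two blinks in quick succession."""
        if self.last_blink is not None and now - self.last_blink <= DOUBLE_BLINK_WINDOW_S:
            self.zoom_enabled = not self.zoom_enabled
        self.last_blink = now
        return self.zoom_enabled

detector = GazeGestureDetector()
print(detector.on_gaze(0.50, 0.50, now=0.0))   # False: dwell timer just started
print(detector.on_gaze(0.51, 0.50, now=1.2))   # True: gaze lingered past the threshold
print(detector.on_blink(now=2.0), detector.on_blink(now=2.3))  # second blink toggles zoom on
```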

Apple files numerous patent applications on a weekly basis, but while the existence of a patent points to areas of interest for the iPhone maker, it doesn't necessarily guarantee the concepts will make their way into a future product or service.

A foveated display was raised as an idea in a June filing, combining high-detail images where the user is actively looking with lower-resolution versions serving the periphery. Apple proposed the system could help provide higher refresh rates by reducing the amount of data the headset display needs to update.
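As a back-of-the-envelope illustration of that saving, with an assumed per-eye panel resolution and foveal region size, none of which come from the filing:

```python
full_w, full_h = 3840, 2160     # assumed per-eye panel resolution
foveal_w, foveal_h = 960, 960   # assumed high-detail window tracking the gaze
peripheral_scale = 0.25         # rest of the frame at quarter linear resolution

full_pixels = full_w * full_h
foveal_pixels = foveal_w * foveal_h
peripheral_pixels = (full_pixels - foveal_pixels) * peripheral_scale ** 2
rendered = foveal_pixels + peripheral_pixels

print(f"full frame: {full_pixels:,} px")
print(f"foveated:   {int(rendered):,} px ({rendered / full_pixels:.0%} of the data)")
```

With these assumed numbers the foveated frame needs roughly a sixth of the pixel data of a full-resolution update, which is where the refresh-rate headroom would come from.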

Gaze tracking can also help VR systems become more accurate, such as in an August filing suggesting head-mounted displays could use it to reduce the "drift issue," continually correcting the display if the headset moves differently from the user's eyes during motion.
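The filing doesn't spell out the correction itself, but one plausible shape for it, sketched below with made-up numbers, is to compare where the eye tracker currently places the eyes relative to the display against the position recorded at calibration, then shift the rendered image to compensate.

```python
from dataclasses import dataclass

@dataclass
class EyePose:
    x_mm: float   # eye position relative to the display, horizontal
    y_mm: float   # vertical

CALIBRATED = EyePose(0.0, 0.0)   # reference position captured at calibration
MM_PER_PIXEL = 0.05              # assumed mapping from physical shift to display pixels

def drift_correction(measured: EyePose):
    """Pixel offset to apply to the rendered frame when the headset has
    shifted relative to the eyes."""
    dx = (measured.x_mm - CALIBRATED.x_mm) / MM_PER_PIXEL
    dy = (measured.y_mm - CALIBRATED.y_mm) / MM_PER_PIXEL
    return round(dx), round(dy)

# Headset slipped 0.4 mm to the right and 0.2 mm up relative to the eyes.
print(drift_correction(EyePose(0.4, -0.2)))   # -> (8, -4)
```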

Outside of rumors and filings, Apple has recently included references to augmented reality glasses in internal betas of iOS 13, the Gold Master release, and a beta of Xcode, effectively confirming such a device is in development.

Comments

  • Reply 1 of 11
    The possibilities of this tech are massive. It’s good to see apple doing what they do better than anyone - focus on the person using the device, and not just the gee-whiz hardware ‘look what we can do’ approach to devices. Google glass was a true crash-and-burn, not even up to beta standards. If apple solves the issue of how augmented glasses will benefit the user, expect it to be the next iPhone or Watch. 
  • Reply 2 of 11
    While possibly not unimportant in the long-term, personally I’d rather Apple delivered on the unspoken promise of their “Project Titan” R&D before a glasses/goggles system which may never be considered an acceptable accessory.
  • Reply 3 of 11
    Rayz2016 Posts: 6,957 member
    While possibly not unimportant in the long-term, personally I’d rather Apple delivered on the unspoken promise of their “Project Titan” R&D before a glasses/goggles system which may never be considered an acceptable accessory.
    So you want them to deliver on something they’ve never said was a product they were going to deliver?

    Sounds reasonable 🙄
  • Reply 4 of 11
    gatorguy Posts: 24,211 member
    Rayz2016 said:
    While possibly not unimportant in the long-term, personally I’d rather Apple delivered on the unspoken promise of their “Project Titan” R&D before a glasses/goggles system which may never be considered an acceptable accessory.
    So you want them to deliver on something they’ve never said was a product they were going to deliver?

    Sounds reasonable 🙄
    Remember when I wrote about foveated displays a year or so ago? So here you are already aware of the tech!
    edited September 2019
  • Reply 5 of 11
    flaneur Posts: 4,526 member
    The possibilities of this tech are massive. It’s good to see apple doing what they do better than anyone - focus on the person using the device, and not just the gee-whiz hardware ‘look what we can do’ approach to devices. Google glass was a true crash-and-burn, not even up to beta standards. If apple solves the issue of how augmented glasses will benefit the user, expect it to be the next iPhone or Watch. 
    Exactly, good post. 
  • Reply 6 of 11
    gatorguy Posts: 24,211 member
    The possibilities of this tech are massive. It’s good to see apple doing what they do better than anyone - focus on the person using the device, and not just the gee-whiz hardware ‘look what we can do’ approach to devices. Google glass was a true crash-and-burn, not even up to beta standards. If apple solves the issue of how augmented glasses will benefit the user, expect it to be the next iPhone or Watch. 
    Google Glass (consumer version, enterprise is still in use and shipping) crashed and burned primarily due to invented privacy issues revolving around the camera. Google disallowed all face recognition apps, but that didn't prevent contrary FUD, and always displayed a red LED whenever the camera was active, yet posters claimed there was no way to know. Again far too much FUD there for Google to overcome. They were unable to control the message, something we all here know Apple is excellent at.

    IMO Robert Scoble and his shower post was the beginning of the "OMG a CAMERA!!" movement and subsequent banning of Google Glass by a high-profile but small handful of restaurants and it went downhill from there. 

    So today we all seem to be more at ease now with the cameras around us, no longer automatically fearing them in our smart devices. Video doorbells, home security, SnapChat and Instagram...
    The timing was wrong in 2012 when Google Glass was first revealed. With 7 years of improvements in hardware and technology, and 7 years for people to digest the fact that cameras can be helpful, useful and fun and not just intrusive, now might be an OK time to revisit a consumer version. If Apple decides to release one of their own I'd expect Google to watch the winds and if blowing the right way quickly release a revised consumer Glass. 
    edited September 2019
  • Reply 7 of 11
    flaneur Posts: 4,526 member

    While possibly not unimportant in the long-term, personally I’d rather Apple delivered on the unspoken promise of their “Project Titan” R&D before a glasses/goggles system which may never be considered an acceptable accessory.
    “May God us keep
    From single vision and Newton’s sleep.”

    Lines meant for those who live in their left hemispheres, from William Blake in 1802. Apple’s mission to develop this new stereo visual technology, which is uniquely suited to the resources and talents of the company, will change the nature of human perception itself.

    Since the invention of 2D media such as the phonetic alphabet and the printed book or page, not to mention 2D cinema and video screens, we’ve been unknowingly crippled by technologies that train our visual systems not to see in depth.

    Wearable 3D vision amplifiers will be much more important than transportation, except now that I think of it, Apple’s version of the automobile will also be an exercise in depth vision and augmented realities.


  • Reply 8 of 11
    flaneur Posts: 4,526 member
    gatorguy said:
    The possibilities of this tech are massive. It’s good to see apple doing what they do better than anyone - focus on the person using the device, and not just the gee-whiz hardware ‘look what we can do’ approach to devices. Google glass was a true crash-and-burn, not even up to beta standards. If apple solves the issue of how augmented glasses will benefit the user, expect it to be the next iPhone or Watch. 
    Google Glass (consumer version, enterprise is still in use and shipping) crashed and burned primarily due to invented privacy issues revolving around the camera. Google disallowed all face recognition apps, but that didn't prevent contrary FUD, and also always displayed a red LED whenever the camera was active. Again far too much FUD there for Google to overcome. 

    For whatever reason we all seem to be more at ease now with the camera's around us. The timing was wrong in 2012 when Google Glass was first revealed. With 7 years of improvements in hardware and technology, and 7 years to digest the fact that camera's can be helpful and not just intrusive, now might be an OK time to revisit a consumer version. If Apple decides to release a version of their own I'd expect Google to test the winds and if blowing the right way quickly release a revised consumer Glass. 
    Glass was also hated because it was creepy and droid looking. The primary flaw was that it was monocular and a violation of face symmetry, which we are hard-wired to fear and loathe. E.g., the pirate’s eye patch and the villain’s monocle. 

    This was all subliminal, and something which designers and engineers at Apple will see as aesthetic laws that must not be transgressed.

    Again, that is. They blew it with the hockey puck mouse, which had no bilateral symmetry.
    edited September 2019
  • Reply 9 of 11
    Rayz2016 said:
    While possibly not unimportant in the long-term, personally I’d rather Apple delivered on the unspoken promise of their “Project Titan” R&D before a glasses/goggles system which may never be considered an acceptable accessory.
    So you want them to deliver on something they’ve never said was a product they were going to deliver?

    Sounds reasonable 🙄
    Agree. Also, you can’t just focus on one single product these days — especially if you’re the size of Apple. Competitors will soon copy any of your hero products, and make cheaper versions of them (although inferior quality). And patenting is a really weak way of protecting your IP nowadays as well. Only solution is continuous innovation. And I think Apple is doing quite good in this area — actually better than in the Jobs era.
  • Reply 10 of 11
    gatorguy said:
    The possibilities of this tech are massive. It’s good to see apple doing what they do better than anyone - focus on the person using the device, and not just the gee-whiz hardware ‘look what we can do’ approach to devices. Google glass was a true crash-and-burn, not even up to beta standards. If apple solves the issue of how augmented glasses will benefit the user, expect it to be the next iPhone or Watch. 
    Google Glass (consumer version, enterprise is still in use and shipping) crashed and burned primarily due to invented privacy issues revolving around the camera. Google disallowed all face recognition apps, but that didn't prevent contrary FUD, and always displayed a red LED whenever the camera was active yet posters claiming there was no way to know. Again far too much FUD there for Google to overcome. They were unable to control the message, something we all here know Apple is excellent at.

    IMO Robert Scoble and his shower post was the beginning of the "OMG a CAMERA!!" movement and subsequent banning of Google Glass by a high-profile but small handful of restaurants and it went downhill from there. 

    So today we all seem to be more at ease now with the camera's around us, no longer automatically fearing them in our smart devices.  Video doorbells, home security, SnapChat and Instagram...
    The timing was wrong in 2012 when Google Glass was first revealed. With 7 years of improvements in hardware and technology, and 7 years for people to digest the fact that camera's can be helpful, useful and fun and not just intrusive, now might be an OK time to revisit a consumer version. If Apple decides to release one of their own I'd expect Google to watch the winds and if blowing the right way quickly release a revised consumer Glass. 
    I don’t think you’re right this time. Anyone producing a good enough set of goggles, with or without camera, would make a splash at any time. Consumers don’t care if others around them like their cameras or not.
  • Reply 11 of 11
    gatorguy Posts: 24,211 member
    gatorguy said:
    The possibilities of this tech are massive. It’s good to see apple doing what they do better than anyone - focus on the person using the device, and not just the gee-whiz hardware ‘look what we can do’ approach to devices. Google glass was a true crash-and-burn, not even up to beta standards. If apple solves the issue of how augmented glasses will benefit the user, expect it to be the next iPhone or Watch. 
    Google Glass (consumer version, enterprise is still in use and shipping) crashed and burned primarily due to invented privacy issues revolving around the camera. Google disallowed all face recognition apps, but that didn't prevent contrary FUD, and always displayed a red LED whenever the camera was active yet posters claiming there was no way to know. Again far too much FUD there for Google to overcome. They were unable to control the message, something we all here know Apple is excellent at.

    IMO Robert Scoble and his shower post was the beginning of the "OMG a CAMERA!!" movement and subsequent banning of Google Glass by a high-profile but small handful of restaurants and it went downhill from there. 

    So today we all seem to be more at ease now with the camera's around us, no longer automatically fearing them in our smart devices.  Video doorbells, home security, SnapChat and Instagram...
    The timing was wrong in 2012 when Google Glass was first revealed. With 7 years of improvements in hardware and technology, and 7 years for people to digest the fact that camera's can be helpful, useful and fun and not just intrusive, now might be an OK time to revisit a consumer version. If Apple decides to release one of their own I'd expect Google to watch the winds and if blowing the right way quickly release a revised consumer Glass. 
    I don’t think you’re right this time. Anyone producing a good enough set of goggles, with or without camera, would make a splash at any time. Consumers don’t care if others around them like their cameras or not.
    Just 7 years ago there was a lot of excitement and anticipation surrounding Google Glass. People were imagining what a breakthru they could be, how they'd put them to use in their own lives. Gaming, travel, AR and VR, enterprise and office and really useful stuff like automotive repair, plumbing or electrical installations, construction, and doctor/hospital visits. There were potentials and plans and...  then suddenly it morphed into being called a Glasshole for using them. They have a camera. O.M.G.! It's watching me!

    That's probably mostly changed now, and definitely would if Apple also got in the game. Like magic it would all be OK. Put a million of them out there and now it's the cool thing. A million buyers of an expensive headset ain't gonna fail to push back against FUD.  So it has a camera. What me worry? Who cares.
    edited September 2019