'Apple Glass' could project AR directly onto a wearer's retina

in Future Apple Hardware
Apple is researching how "Apple Glass," or other future Apple AR devices, could skip tiny screens altogether, and instead use micro projectors to beam the images straight onto the wearer's retina.

Future Apple AR or VR systems could project straight into a user's eyes


Apple may soon have an entirely different meaning for its term "retina display." Rather than a screen whose resolution is so good our eyes can't distinguish individual pixels, there may not be a screen at all.

"Direct retinal projector," is a newly-granted patent, that claims this projecting right into the eyes could be best for AR. Specifically, it could prevent certain ways that watching AR or VR on a headset can cause headaches and sickness.

"Virtual reality (VR) allows users to experience and/or interact with an immersive artificial environment, such that the user feels as if they were physically in that environment," says the patent. "For example, virtual reality systems may display stereoscopic scenes to users in order to create an illusion of depth, and a computer may adjust the scene content in real-time to provide the illusion of the user moving within the scene."

We know all of this, but Apple wants to set the stage for how typical AR/VR systems work, and why there are problems. Then it wants to solve those problems.

"When the user views images through a virtual reality system, the user may thus feel as if they are moving within the scenes from a first-person point of view," it continues. "However, conventional virtual reality and augmented reality systems may suffer from accommodation-convergence mismatch problems that cause eyestrain, headaches, and/or nausea.:

"Accommodation-convergence mismatch arises when a VR or AR system effectively confuses the brain of a user," says Apple, "by generating scene content that does not match the depth expected by the brain based on the stereo convergence of the two eyes of the user."

You're wearing a headset and no matter how light Apple manages to make it, you're still conscious that you have a screen right in front of your face. Two screens, in fact, and right in front of your eyes.

So what you're watching, the AR or VR experience, might be showing you a panoramic virtual vista, perhaps with someone's avatar in the far distance, walking toward you. The AR/VR experience is telling your eyes to focus in that far distance, but what's being displayed is still actually, physically, exactly as close to your eyes as it ever was.

"For example, in a stereoscopic system the images displayed to the user may trick the eye(s) into focusing at a far distance while an image is physically being displayed at a closer distance," continues the patent. "In other words, the eyes may be attempting to focus on a different image plane or focal depth compared to the focal depth of the projected image, thereby leading to eyestrain and/or increasing mental stress."

Beyond the fact that this is obviously, as the patent says, "undesirable," there is a further issue: there is still a limit to how long a user can comfortably wear an AR/VR headset.

Part of that is of course down to the size and weight of the headset, but it is also to do with these "accommodation-convergence mismatch problems." Apple says this can detract from a wearer's "endurance levels (i.e. tolerance) of virtual reality or augmented reality environments."

Detail from the patent showing one method of using mirrors to position AR projection


Projecting images into the wearer's eyes, by comparison, is a lot more like the way light usually reaches our eyes as we look around. There are bound to be questions about the strength of the projection and the brightness of the light source, but this patent doesn't cover those.

Rather, once it has made the case for using projection at all, the majority of the patent is spent on systems and methods for aiming correctly. The projection has to be precise, not just in shining into the eyes generally, but in delivering the right parts of the image to the right points.

Consequently, much of the patent about projecting into the eyes is also about what can be read back from those eyes. Specifically, Apple is once again investigating gaze tracking technology.
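In very loose terms, the loop the patent implies is: read the gaze, then steer the projection to follow it. The sketch below is purely a hypothetical illustration of that idea; the type names, the fixed gain, and the mirror interface are all invented here, not taken from the patent.

```swift
import Foundation

// Hypothetical sketch of a gaze-in, mirror-angles-out aiming loop. The patent
// describes steerable mirrors and gaze tracking, but none of these names,
// values, or mappings come from it.

struct GazeSample {
    var yawDegrees: Double    // horizontal gaze angle, 0 = straight ahead
    var pitchDegrees: Double  // vertical gaze angle
}

struct MirrorCommand {
    var tiltXDegrees: Double
    var tiltYDegrees: Double
}

/// Maps a gaze estimate to mirror tilts so the projected raster stays centred
/// on the pupil. A real system would fold in per-user calibration (eye relief,
/// pupil offset) instead of the fixed gain assumed here.
func aim(at gaze: GazeSample, gain: Double = 0.5) -> MirrorCommand {
    MirrorCommand(tiltXDegrees: gaze.yawDegrees * gain,
                  tiltYDegrees: gaze.pitchDegrees * gain)
}

let command = aim(at: GazeSample(yawDegrees: 4.0, pitchDegrees: -2.0))
print("Tilt mirrors to (\(command.tiltXDegrees), \(command.tiltYDegrees)) degrees")
```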

This new patent is credited to Richard J. Topliss, James H. Foster, and Alexander Shpunt. Topliss's previous work for Apple includes a patent for using AR to improve the Find My app.


Comments

  • Reply 1 of 15
...Hypnosis Capitalism...? 😮

  • Reply 2 of 15
GeorgeBMac Posts: 11,421 member
    That sounds like a cross between science fiction and Halloween Horror.

    So why stop at the retina?  Cut out the middleman:  just send it straight down the optic nerve into the brain?
    Or is Musk already doing that?
  • Reply 3 of 15
tjwolf Posts: 424 member
I wonder how this would work for people wearing contacts. Like, how would the apparatus know how much to correct for the refraction from those lenses? With AR glasses, I could imagine a calibration mechanism on either side of the glasses, but that's obviously not possible with contacts.
  • Reply 4 of 15
22july2013 Posts: 3,571 member
    This reminds me of a sign a grocery clerk had near his scanner:
    "Do not look into laser beam with remaining eye."


  • Reply 5 of 15
dewme Posts: 5,362 member
    tjwolf said:
I wonder how this would work for people wearing contacts. Like, how would the apparatus know how much to correct for the refraction from those lenses? With AR glasses, I could imagine a calibration mechanism on either side of the glasses, but that's obviously not possible with contacts.
    I’d imagine there are many other scenarios that must be addressed as well including people who’ve had cataract surgery with various implanted artificial lens types. I suppose they’ll have to have some sort of calibration process to tune the projectors to individual circumstances. 

    As someone who’s had multiple retinal surgeries I’m not too keen on having a device projecting images directly on to my retina until they’ve worked out all of the kinks. I won’t be signing up to be a beta tester. 

    I’m actually not really sold on the whole AR/VR thing yet. I do hope they come up with some very practical uses of the technology including ways that enhance situational awareness for people with vision limitations and impairments and enhanced mobility. If it’s just a gaming thing or form of escapism I really don’t care where or when Apple releases a product using the technology. 
  • Reply 6 of 15
hexclock Posts: 1,252 member
    dewme said:
    tjwolf said:
I wonder how this would work for people wearing contacts. Like, how would the apparatus know how much to correct for the refraction from those lenses? With AR glasses, I could imagine a calibration mechanism on either side of the glasses, but that's obviously not possible with contacts.
    I’d imagine there are many other scenarios that must be addressed as well including people who’ve had cataract surgery with various implanted artificial lens types. I suppose they’ll have to have some sort of calibration process to tune the projectors to individual circumstances. 

    As someone who’s had multiple retinal surgeries I’m not too keen on having a device projecting images directly on to my retina until they’ve worked out all of the kinks. I won’t be signing up to be a beta tester. 

    I’m actually not really sold on the whole AR/VR thing yet. I do hope they come up with some very practical uses of the technology including ways that enhance situational awareness for people with vision limitations and impairments and enhanced mobility. If it’s just a gaming thing or form of escapism I really don’t care where or when Apple releases a product using the technology. 
    I doubt it will work for everyone. There is great variety in eyes, and brains. For example, not too long ago I rode Mission Space at Disney along with some family. Everyone felt no effects after the ride but me. I was nauseous for about 45 minutes. I couldn’t turn my head without getting dizzy. 
    I don’t wear glasses or contacts, but for some reason that ride messed me up good, and spinning rides and roller coasters don’t bother me at all. 
    Maybe some clever software can adjust for things like that. 
  • Reply 7 of 15
DAalseth Posts: 2,783 member
    It’s an interesting technology I heard about the concept a few years ago. I found it vaguely disquieting then, and tbh I still do. Much like earphones have the potential to do damage if they are turned up too loud, I’m thinking this could damage the very delicate retina if the projector was too bright. Not necessarily a rational fear, but I’d not want to be the first person to use this, or even be in the first few years. 

  • Reply 8 of 15
Japhey Posts: 1,767 member
    DAalseth said:
    It’s an interesting technology I heard about the concept a few years ago. I found it vaguely disquieting then, and tbh I still do. Much like earphones have the potential to do damage if they are turned up too loud, I’m thinking this could damage the very delicate retina if the projector was too bright. Not necessarily a rational fear, but I’d not want to be the first person to use this, or even be in the first few years. 

    Exactly. This was my first thought as well. How do they even test this out? Who in their right mind would ever volunteer to be the guinea pig for such mad science? 
  • Reply 9 of 15
Marvin Posts: 15,322 moderator
    Japhey said:
    DAalseth said:
    It’s an interesting technology I heard about the concept a few years ago. I found it vaguely disquieting then, and tbh I still do. Much like earphones have the potential to do damage if they are turned up too loud, I’m thinking this could damage the very delicate retina if the projector was too bright. Not necessarily a rational fear, but I’d not want to be the first person to use this, or even be in the first few years. 
    Exactly. This was my first thought as well. How do they even test this out? Who in their right mind would ever volunteer to be the guinea pig for such mad science? 
    Their employees test these out:

    https://bgr.com/tech/apple-vr-glasses-rumors-injury/

    "According to a workplace Health and Safety report obtained by Gizmodo, Apple is working on some kind of eye-mounted system that involves using lasers. We only know this because in two instances, Apple employees were left injured after testing the prototype."

    Direct projection into the eye will allow for higher quality images than projection onto lenses in front of the eyes but they will have to make the brightness levels safe.
  • Reply 10 of 15
    Japhey said:
    DAalseth said:
    It’s an interesting technology I heard about the concept a few years ago. I found it vaguely disquieting then, and tbh I still do. Much like earphones have the potential to do damage if they are turned up too loud, I’m thinking this could damage the very delicate retina if the projector was too bright. Not necessarily a rational fear, but I’d not want to be the first person to use this, or even be in the first few years. 

    Exactly. This was my first thought as well. How do they even test this out? Who in their right mind would ever volunteer to be the guinea pig for such mad science? 
    Well, they promised that the lasers were set to stun...
  • Reply 11 of 15
    Fascinating R&D which highlights why AR is orders of magnitude more complex than VR!

It seems as if Apple is investigating the means to render the augmenting visual cues transparent to the brain—that is—deliver visual cues indistinguishable from the objective reality to the brain!
    In AR, you Augment the “objective” reality with visual cues. However, the brain functions—in essence—as an anti-augmenting system by dismissing the vast majority of “reality”. Most find such statements counter-intuitive and ridiculous, which is a testament to how effective the brain is at generating the illusion we call reality 😳

If Apple succeeds, the user experience will be trivial and unimpressive: zero fatigue, zero tolerance issues 🤭
    Thus, the brain sees the augmenting visual cues as part of the objective reality—thereby providing extremely powerful tools to “the good guys” and equally scary scenarios in the hands of “the bad guys”. For example: a HUD you never interpret as a HUD, with data points in your field of view along traffic signs, trees, pedestrians. This will likely reduce risk of traffic accidents, as you no longer must alternate your focus between the HUD and the surroundings.
    For the scary scenarios, have a look at the movie “Anon”, available at Netflix 😱

    The visual system is able to shift focus plane by manipulating the curvature of the lens—independent of gaze/position of the eyes. The above next-generation HUD must dynamically adjust the augmenting visual cues to both gaze (trivial) and focus plane (non-trivial, to say the least 👀). Luckily, shifting focus plane is very slow 😉

  • Reply 12 of 15
GeorgeBMac Posts: 11,421 member
    Marvin said:
    Japhey said:
    DAalseth said:
    It’s an interesting technology I heard about the concept a few years ago. I found it vaguely disquieting then, and tbh I still do. Much like earphones have the potential to do damage if they are turned up too loud, I’m thinking this could damage the very delicate retina if the projector was too bright. Not necessarily a rational fear, but I’d not want to be the first person to use this, or even be in the first few years. 
    Exactly. This was my first thought as well. How do they even test this out? Who in their right mind would ever volunteer to be the guinea pig for such mad science? 
    Their employees test these out:

    https://bgr.com/tech/apple-vr-glasses-rumors-injury/

    "According to a workplace Health and Safety report obtained by Gizmodo, Apple is working on some kind of eye-mounted system that involves using lasers. We only know this because in two instances, Apple employees were left injured after testing the prototype."

    Direct projection into the eye will allow for higher quality images than projection onto lenses in front of the eyes but they will have to make the brightness levels safe.

    There is a physical advantage to it:   When projected in front of the eye, the lens & eye would have to adjust to focus on it and thus take focus away from the main area of attention (say a driver approaching an intersection).   Aside from being distracting, that could be draining for a driver.

    By putting the image directly on the retina, it avoids that (re)focusing.
But then, it seems, it would introduce a new wrinkle: how does the beam get through the lens without being bent -- which is the purpose of the lens (which bends light by variable amounts to suit the object of attention).

    This seems more practical in a person with post cataract surgery since most of those lenses are static and don't adjust for focus.
  • Reply 13 of 15
applguy Posts: 235 member
    Must be why Apple has worked hard and received patents for eye tracking technology.
  • Reply 14 of 15
am8449 Posts: 392 member
    I imagine the projectors needed to make this happen would need pixels on the microscopic level. That would be mind boggling if that were possible!
  • Reply 15 of 15
    There is a physical advantage to it:   When projected in front of the eye, the lens & eye would have to adjust to focus on it and thus take focus away from the main area of attention (say a driver approaching an intersection).   Aside from being distracting, that could be draining for a driver.

    By putting the image directly on the retina, it avoids that (re)focusing.
But then, it seems, it would introduce a new wrinkle: how does the beam get through the lens without being bent -- which is the purpose of the lens (which bends light by variable amounts to suit the object of attention).

    This seems more practical in a person with post cataract surgery since most of those lenses are static and don't adjust for focus.
Having owned a few cars with a HUD since 2004, most of them have had a bit of adjustment for the perceived distance of the display. The two that were fixed-distance were perceived to be at the front of the car; the adjustable ones have allowed for moving the perceived distance up to about 20’ from the front windshield. This allows you to see the information without refocusing from the traffic or whatever else is in front of you. F1 drivers have had these since roughly 1995 and, of course, combat pilots have had HUDs for much longer. 

If it’s being patented by a public company, my guess is that it has already been in use in the military for 10+ years, that is, if it works.