Apple has been working on AR glasses for years, and iOS 13 is only the latest sign

Posted in General Discussion, edited May 2020
Rumors have circulated about an Apple-produced augmented reality headset or smart glasses for quite a few years, but the iPhone maker has yet to officially confirm the hardware is in development. The additions made to ARKit in iOS 13, as well as a plethora of patents and other reports, indicate Apple is closer to revealing its work than ever before.




During the WWDC 2019 keynote, Apple spent time revealing some of the newest features it is bringing to ARKit 3, the company's framework that allows developers to add augmented reality functionality to their apps. The additions arriving as part of iOS 13 will bring Apple's AR efforts to a new level, closer to the ideal form of the technology that users anticipate.

For the moment, however, AR in the Apple ecosystem continues to be limited to iPhones and iPads, which are less than ideal when it comes to the user experience. A version of AR using a headset, giving the effect without requiring people to hold up and look through a limited window on a mobile device, could be the perfected form of the technology, and Apple edges closer to creating it with every improvement to the platform.

ARKit 3 in iOS 13

The keynote highlighted some elements that have so far been missing from the AR experience made possible by earlier ARKit versions. ARKit 3 introduces many changes that make the experience more immersive and enable developers to offer even more functionality in their apps.

A key component of ARKit 3 is people occlusion, where the iOS device detects a person standing in the middle of an AR-enhanced scene. Rather than flatly covering the camera view with digital objects, ARKit 3 can estimate how far the person is from the camera, and if they are closer than a digital object theoretically would be, alter the overlay so the object appears behind the person.

The technique also enables the possibility of green screen-style effects, where a subject viewed through an app could appear to walk through a fully digital environment, without needing a chroma key background.
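In code, people occlusion is switched on through a session configuration option. A minimal sketch in Swift, assuming iOS 13 and an A12-or-later device (the commented-out `session` is a hypothetical `ARSession` owned by the app's view):

```swift
import ARKit

// Minimal sketch: enabling depth-aware people occlusion in ARKit 3.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // With depth information, virtual content is hidden behind people
    // who are closer to the camera than the object would be.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
// session.run(configuration) // `session` would be the app's ARSession.
```

Using `.personSegmentation` instead of `.personSegmentationWithDepth` gives the green screen-style effect, compositing people in front of all virtual content regardless of distance.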

A view of an AR scene that occludes digital objects to make them appear behind the subject


Apple also added motion capture capabilities, where the movements of a subject could be analyzed and interpreted in real time in the app. Dividing the subject down to a simple skeleton, joints and bones can be determined and monitored for changes, with those movements able to trigger animations or to be recorded for custom movements of characters.
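A sketch of how this looks in ARKit 3's body-tracking API, assuming iOS 13 and a supported device; the delegate method shown would belong to the app's `ARSessionDelegate`:

```swift
import ARKit

// Minimal sketch: ARKit 3 motion capture via body tracking.
let configuration = ARBodyTrackingConfiguration()
// session.run(configuration) // `session` would be the app's ARSession.

// In the ARSessionDelegate callback, the tracked skeleton's joints
// can be read each time the body anchor updates.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let bodyAnchor as ARBodyAnchor in anchors {
        let skeleton = bodyAnchor.skeleton
        // Model-space transform of a named joint, e.g. the right hand,
        // which could drive a character rig or be recorded.
        if let rightHand = skeleton.modelTransform(for: .rightHand) {
            _ = rightHand
        }
    }
}
```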

The tracking changes also impact ARKit Face Tracking, which already takes advantage of the TrueDepth camera on modern iPhone and iPad Pro models to analyze the user's face, such as for Memoji and Snapchat. In ARKit 3, the face tracking has been expanded to allow it to work with up to three faces at the same time.
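Enabling multi-face tracking is a one-line configuration change; a sketch assuming iOS 13 on a TrueDepth-equipped device:

```swift
import ARKit

// Minimal sketch: tracking up to three faces at once in ARKit 3.
let configuration = ARFaceTrackingConfiguration()
// supportedNumberOfTrackedFaces reports the device's ceiling,
// which is 3 on TrueDepth devices under iOS 13.
configuration.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
// session.run(configuration) // `session` would be the app's ARSession.
```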





Another key component is the ability for users to collaborate with each other within the same AR environment. By letting people see the same items and environment in a single session, ARKit 3 can support multiplayer games and collaborative work between multiple people, adding a social element to AR experiences.
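Collaboration works by having each device's session emit `ARSession.CollaborationData` that the app relays over its own network layer. A sketch assuming iOS 13, with MultipeerConnectivity as the (app-chosen) transport:

```swift
import ARKit
import MultipeerConnectivity

// Minimal sketch: a collaborative ARKit 3 session.
let configuration = ARWorldTrackingConfiguration()
configuration.isCollaborationEnabled = true
// session.run(configuration) // `session` would be the app's ARSession.

// ARSessionDelegate callback: forward collaboration data to peers.
func session(_ session: ARSession,
             didOutputCollaborationData data: ARSession.CollaborationData) {
    if let encoded = try? NSKeyedArchiver.archivedData(
            withRootObject: data, requiringSecureCoding: true) {
        // mcSession.send(encoded, toPeers: peers, with: .reliable)
        // — `mcSession` and `peers` are app-specific transport objects.
        _ = encoded
    }
}
// Data received from a peer is applied with session.update(with:).
```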

Other changes introduced in ARKit 3 include the ability to track the user's face and the world using the front and rear cameras simultaneously, detection of up to 100 images at a time, automatic estimation of physical image sizes, more robust 3D object detection, and improved plane detection via machine learning.
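Two of those smaller additions sketched in Swift, assuming iOS 13 and a device supporting simultaneous front and rear camera use; "Gallery" is an assumed asset-catalog resource group name:

```swift
import ARKit

// Minimal sketch: simultaneous face/world tracking plus image detection.
let configuration = ARWorldTrackingConfiguration()

// Track the user's face with the front camera while the rear
// camera tracks the world.
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}

// ARKit 3 can detect up to 100 reference images in a scene;
// "Gallery" is a hypothetical asset-catalog group.
if let references = ARReferenceImage.referenceImages(
        inGroupNamed: "Gallery", bundle: nil) {
    configuration.detectionImages = references
}
// session.run(configuration) // `session` would be the app's ARSession.
```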

The ongoing work by Apple

Aside from ARKit announcements, patents are a second source of evidence for Apple's AR glasses work.

While not a guarantee Apple intends to use the ideas it files with the US Patent and Trademark Office in future products and services, the filings do at least indicate previous areas of interest for Apple's research and development efforts.

An image from an Apple patent filing showing cameras inside and outside a headset


A key element of the AR headset is the display, as fast response times are essential for creating a perfect illusion of combined virtual and real scenery. Patent applications from December 2016 and January 2017 titled "Displays with Multiple Scanning Modes" describe how a display's circuitry could update only partial sections of the display on each refresh rather than the entire panel, minimizing the possibility of artifacts appearing in high-resolution, high-refresh-rate screens.

Updating only part of a screen also offers less work for a display to perform each refresh, allowing for shorter refresh periods and a faster overall screen update rate. Faster updates may also help cut down on motion sickness in some users, where the display doesn't update fast enough to match physical movements of a user's head.

The ability to render the world at high speed can also help prevent nausea and motion sickness, as suggested in the 2017 filing for a "Predictive, Foveated Virtual Reality System." Consisting of a headset with a multiple-camera setup, the hardware would capture the world at both low and high resolutions, with the high-resolution sections shown where the user's eyes are looking on the display while lower-resolution imagery fills the periphery, speeding up processing of the scene.

A 2016 filing for a "Display System Having World and User Sensors" uses cameras outside and inside the headset to track the environment along with the user's eyes, expressions, and hand gestures, enabling a form of augmented reality imagery to be created and interacted with.

A patent filing image showing the use of a 'hot mirror' for eye tracking


Eye tracking has also featured heavily in patent filings, including one from May 2019 for determining ocular dominance, and others from 2018 suggesting ways to actually perform eye tracking as close to the user's head as possible. By bringing the tracking equipment nearer the face and using items like a "hot mirror" to reflect infrared light, there is less weight positioned further away from the user's head, resulting in a more comfortable user experience.

On a more practical level, Apple in 2019 also explored how AR glasses could fit securely using various adjustment systems, including motor-cinched bands and inflatable sections to prevent the headset from moving once worn. Thermal regulation was also determined to be an important element, cooling not only the hardware inside the headset but also the user's face.

Apple has also considered how to create a form of smart glasses that, instead of using a separate display, simply places the user's iPhone in front of their eyes. A more advanced form of the simple smartphone-based VR headset enclosures, Apple's 2017 proposal for a "Head-mounted Display Apparatus for Retaining a Portable Electronic Device with Display" slides an iPhone in from one side, and includes a camera, external buttons, built-in earphones in the arms, and a remote control.

A patent filing image of smart glasses holding an iPhone


For interaction, Apple has spent time working out how to use force-sensing gloves with VR and AR systems. By using conductive strands and embedded sensors, gloves and other garments could provide force-sensing capabilities that could trigger commands, read gestures, and even count the number of fingers in contact with a physical object.

On the application side, Apple's patent filings suggest AR could be used for ride-hailing services, with smart glasses allowing customers to find their ride and vice versa. For navigation, one 2019 patent describes how points of interest could be displayed in a real-world view via AR, an approach that could also be extended to show the position of items in a room rather than notable geographic features.

How a 'points of interest' navigation app could appear in smart glasses

Reports and Speculation

The idea of Apple creating some sort of smart glasses or AR headset has circulated for some time, and could result in the release of hardware in the next few years. Loup Ventures analyst Gene Munster suggested in 2018 that Apple could release "Apple Glasses" in late 2021, possibly selling as many as 10 million units in the first year of release at a cost of around $1,300 per unit.

Ming-Chi Kuo, another long-term Apple commentator, has more recently proposed the headset could make an appearance midway through 2020, with manufacturing plausibly commencing at the end of 2019.

As for how the supposed glasses could appear, Kuo suggests they could act more as a display with the iPhone performing the bulk of the processing. WiGig, a 60-gigahertz wireless networking system that could provide cableless connectivity, has also been tipped for the device, as well as 8K-resolution eyepieces, and the supposed codename "T288."

A headset produced by eye-tracking acquisition SensoMotoric Instruments


Headset development surfaced in a leaked safety report in 2017, in which an incident requiring "medical treatment beyond first aid" was logged for a person testing a prototype device at one of Apple's offices. The injury specifically related to eye pain, suggesting the employee was testing something vision-related, potentially a headset of some form.

Apple has also reportedly spoken to technology suppliers about acquiring components for AR headsets at CES in 2019, with incognito Apple employees visiting stands of suppliers of waveguide hardware.

There have also been acquisitions of companies that are closely related to AR and headset production. In 2017 Apple picked up eye tracking firm SensoMotoric Instruments, while in August 2018 it bought Akonia Holographics, a startup focused on the development of specialized lenses for AR headsets.

Comments

  • Reply 1 of 9
tjwolf Posts: 424, member
    A lot of VR related patents described in an article ostensibly about AR.

    Between Munster's and Ming-Chi Kuo's predictions on what Apple will deliver, I'd put my money on the latter.  Munster doesn't appear to understand the technology and its limitations if he thinks Apple will release $1,300 glasses which, presumably, will cost that much because they'll be filled to the gills with tech.  But there's no room for a lot of technology in glasses that are to be worn all day.  I think that was one of the traps Google Glass fell into: it tried to fit a complete system into the temples & hinge of the glasses.  The resulting glasses were still too heavy - and its batteries only lasted a couple hours at most.

    Kuo, like me, seems to think that most of the processing will happen on the phone (where ARKit and ARKit-enabled apps already reside).  The glasses themselves will simply replicate the sensors/cameras the iPhone already uses for AR, add a display for AR images sent to them by the phone, and use a minimal CPU, memory, and networking (e.g. Apple's own W2 chip).  It should then be possible to have small batteries that let these AR glasses last all day.
  • Reply 2 of 9
    racerhomie3 Posts: 1,264, member
    I am super excited about this!!
  • Reply 3 of 9
    Folio Posts: 698, member
    By 2021, gamers subscribing to Arcade will likely be among the first movers to AR glasses. Cook and crew in either world seen as visionary.
  • Reply 4 of 9
    SpamSandwich Posts: 33,407, member
    Everyone wearing such glasses will be able to live in what appears to be a beautiful, well-kept mansion (even if they actually live in a cockroach infested dump).
  • Reply 5 of 9
    hentaiboy Posts: 1,252, member
    Everyone wearing such glasses will be able to live in what appears to be a beautiful, well-kept mansion (even if they actually live in a cockroach infested dump).
    Mansions can have cockroaches too you know.
  • Reply 6 of 9
    SpamSandwich Posts: 33,407, member
    hentaiboy said:
    Everyone wearing such glasses will be able to live in what appears to be a beautiful, well-kept mansion (even if they actually live in a cockroach infested dump).
    Mansions can have cockroaches too you know.
    Not the virtual ones. ;)
  • Reply 7 of 9
    frantisek Posts: 756, member
    One can see the future. As mentioned, use with maps, but also AppleCar, disabled people, and in future with AppleWatch only. And of course all the places it is used already. Important is how Apple will manage interaction with the onscreen environment and user interface items. Whether by eye tracking, voice, scanning hands, or some smart gloves for more mature solutions, or whether the Watch becomes a small trackpad as well. The last thing is availability. Apple needs enough iPhones capable of feeding the glasses. I have no idea whether any current iPhone can do it. Or whether it will only be needed for simple information display, or every iPhone with ARKit 3 will be able to do it but something new will be needed for real AR interaction. If someone has better knowledge, please share.
  • Reply 8 of 9
    tjwolf said:

    Kuo, like me, seems to think that most of the processing will happen on the phone (where ARKit and ARKit enabled apps already reside).  
    Exactly -- Apple Watch strategy.

    1. Maintains the need for Apple's flagship product;

    2. Gets "good" to market instead of waiting for "great" (powerful on-device processors, standalone batteries) to be ready;

    3. Leverages the entirety of Apple's integrated ecosystem (handoff, etc.) as a competitive advantage against which Android will struggle to fight; and

    4. Simplifies the design requirements of software / UI running on the glasses
    edited June 2019
  • Reply 9 of 9
    daven Posts: 696, member
    hentaiboy said:
    Everyone wearing such glasses will be able to live in what appears to be a beautiful, well-kept mansion (even if they actually live in a cockroach infested dump).
    Mansions can have cockroaches too you know.
    Not the virtual ones. ;)
    I beg to disagree. By definition, virtual mansions are built using computer programs and all computer programs have bugs. Therefore, virtual mansions will have bugs including cockroaches. ;)