Apple's AR glasses arriving in 2020, iPhone will do most of the work

Comments

  • Reply 21 of 47
    crowley Posts: 10,453 member
    tjwolf said:
    crowley said:
    That seems like a hell of a lot of data that you're shunting over a wireless connection, and in a situation where any lag will destroy the experience.  Colour me sceptical, firstly of the report and then of the system.
    You must be thinking of VR, not AR. Why do you think it's a "hell of a lot of data"? The only thing that needs to be transmitted is the augmentation. In other words, the display is already in front of you - it's reality. The glasses just have to show whatever augmentation the iPhone decides on based on what the glasses' camera sees. What the camera sees can be transmitted to the iPhone easily - Apple has been using AirPlay to transmit video for years.
    If it's really augmenting reality then it's not just video, it'll be real-time depth mapping. And possibly interaction with the augmentation as well, which could be simple, or could be complicated.

    I think I'm winding back, maybe it's not so much data, but I'm still suspicious that processing off-device may end up introducing more lag than is acceptable in this kind of device if Apple are trying to do anything in excess of the most basic AR.
  • Reply 22 of 47
    AppleExposed Posts: 1,805 unconfirmed, member
    Goodness, needing an iPhone for processing just screams Samsung. I think these dolt analysts saw the ugly Samsung phone strapped to people's faces and limited their creativity to that. Does sammy even support Gear VR anymore?

    Is there any reason the A14 chip embedded into the Glasses wouldn't be able to handle the processing alone?
  • Reply 23 of 47
    cgWerks Posts: 2,952 member
    entropys said:
    Killer apps or it would be a waste of time.
    That's the thing. I can certainly see a few vertical-market applications for such technology, but the tech industry seems to be looking at this like 'the next big thing.' I just don't see that being the case, but I suppose dystopian fiction and movies like Idiocracy seem to be coming true at an ever more alarming rate, so maybe I'm the one who is out of touch. :(
    mobird
  • Reply 24 of 47
    22july2013 Posts: 3,550 member
    iGlasses.
    AppleExposed
  • Reply 25 of 47
    entropys said:
    Killer apps or it would be a waste of time.
    Can’t wait for an AR spreadsheet app!!!
  • Reply 26 of 47
    wigby Posts: 692 member
    And if (a big IF) the AR Goggles have a sensor to scan hands for interacting with the 3D AR objects, then this is what Apple should have released in the very first place! Apple's idea of holding up an iOS device for AR use is asinine and I have never, I mean NEVER, seen anyone locally hold up an iPhone or iPad just for that. The only exception in my experience would be the Ingress game, which is AR, a bit older than Pokemon Go, and doesn't require holding up a phone in front of an object or location.

    Another concern I have is that the AR Goggles will most likely need to be recharged, which makes them the 5th device with a rechargeable battery (iPhone, iPad, Watch, AirPods and now this one) running on Bluetooth.

    I suspect the AR Goggles will probably go for close to $300-400 alone when and IF they release it in 2020, depending on the market situation. 
    Apple chose to use the phone as the platform and display because the alternatives were Google Glass and HoloLens. One was a commercial failure and the other is way out of everyone's price range except for enterprise. In these past few years, Apple has learned a lot about AR precisely because they went with the iPhone and iPad as platforms. Using hands freely for gesturing and other things was always their mission, but the technology just wasn't there a few years ago, and it arguably still isn't, based on the Magic Leap reviews I've seen.

    Recharging any discrete device is just something we have all come to accept and deal with. It is much better than the alternative of a physical tether, and besides, taking off your glasses and putting them on an AirPower charging pad sounds like a simple way to deal with charging and make a lot of extra money for Apple.

    $300-$400 is Apple Watch pricing. This is a new category with much more going on in the way of sensors, battery tech, wireless connectivity, cameras and display. It will not be sold for less than $699. It might not be great 1st or 2nd gen hardware or software, but it will sell millions and temporarily push Apple into the forefront of AR. The only question is whether developers and general consumers will embrace Apple's AR in the same way they have for iPhone, iPad and Apple Watch.
    tjwolf fastasleep
  • Reply 27 of 47
    tjwolf Posts: 424 member
    crowley said:
    tjwolf said:
    crowley said:
    That seems like a hell of a lot of data that you're shunting over a wireless connection, and in a situation where any lag will destroy the experience.  Colour me sceptical, firstly of the report and then of the system.
    You must be thinking of VR, not AR. Why do you think it's a "hell of a lot of data"? The only thing that needs to be transmitted is the augmentation. In other words, the display is already in front of you - it's reality. The glasses just have to show whatever augmentation the iPhone decides on based on what the glasses' camera sees. What the camera sees can be transmitted to the iPhone easily - Apple has been using AirPlay to transmit video for years.
    If it's really augmenting reality then it's not just video, it'll be real-time depth mapping. And possibly interaction with the augmentation as well, which could be simple, or could be complicated.

    I think I'm winding back, maybe it's not so much data, but I'm still suspicious that processing off-device may end up introducing more lag than is acceptable in this kind of device if Apple are trying to do anything in excess of the most basic AR.
    I think "the most basic AR" is exactly what Apple is aiming for - and, if done right, it'll sell like hotcakes IMHO. I'm not saying interaction with AR objects isn't desirable - or eventually achievable - but I think there's plenty of usefulness in AR glasses that look like real glasses, whose battery lasts all day, and which let you view notifications, emails, texts, and - very importantly - run ARKit-enabled applications. I don't know ARKit in detail, but if it also allows for interaction with AR objects, then Apple will probably provide equivalents for that as well.

    Once Apple has established the market, it will go about adding functionality while making sure battery life and looks don't suffer. Think about it - that's how they've grown all their devices. E.g. the initial Watch didn't have cellular or GPS, wasn't waterproof, and didn't have an ECG. But the initial version had enough features to be an unmitigated success (despite what some media sources tried to paint it as).
    fastasleep
  • Reply 28 of 47
    tjwolf Posts: 424 member

    wigby said:
    And if (a big IF) the AR Goggles have a sensor to scan hands for interacting with the 3D AR objects, then this is what Apple should have released in the very first place! Apple's idea of holding up an iOS device for AR use is asinine and I have never, I mean NEVER, seen anyone locally hold up an iPhone or iPad just for that. The only exception in my experience would be the Ingress game, which is AR, a bit older than Pokemon Go, and doesn't require holding up a phone in front of an object or location.

    Another concern I have is that the AR Goggles will most likely need to be recharged, which makes them the 5th device with a rechargeable battery (iPhone, iPad, Watch, AirPods and now this one) running on Bluetooth.

    I suspect the AR Goggles will probably go for close to $300-400 alone when and IF they release it in 2020, depending on the market situation. 
    Apple chose to use the phone as the platform and display because the alternatives were Google Glass and HoloLens. One was a commercial failure and the other is way out of everyone's price range except for enterprise. In these past few years, Apple has learned a lot about AR precisely because they went with the iPhone and iPad as platforms. Using hands freely for gesturing and other things was always their mission, but the technology just wasn't there a few years ago, and it arguably still isn't, based on the Magic Leap reviews I've seen.

    Recharging any discrete device is just something we have all come to accept and deal with. It is much better than the alternative of a physical tether, and besides, taking off your glasses and putting them on an AirPower charging pad sounds like a simple way to deal with charging and make a lot of extra money for Apple.

    $300-$400 is Apple Watch pricing. This is a new category with much more going on in the way of sensors, battery tech, wireless connectivity, cameras and display. It will not be sold for less than $699. It might not be great 1st or 2nd gen hardware or software, but it will sell millions and temporarily push Apple into the forefront of AR. The only question is whether developers and general consumers will embrace Apple's AR in the same way they have for iPhone, iPad and Apple Watch.

    Agree with everything you've said, except pricing. I think it'll be priced in the same range as the Apple Watch. Besides the argument that people probably won't accept an "accessory" costing nearly as much as their phone, I'd add that I don't necessarily buy the argument that it has a lot more going on - parts wise - than the Apple Watch. Yes, it needs sensors similar to those on the iPhone itself (camera, possibly depth sensors). But the watch has a display, wireless connectivity, and battery tech. In addition, the watch has GPS, an option for cellular, an ECG, a heart-rate monitor, a processor powerful enough to run apps, a speaker, a microphone, and enough memory for quite a few songs. So, actually, the glasses will likely have a lot less going on than the watch in terms of pure functionality.
    fastasleep
  • Reply 29 of 47
    DAalseth Posts: 2,783 member
    iPhone will do most of the work
    Just like the first gen AppleWatch. 
    edited March 2019
  • Reply 30 of 47
    macplusplus Posts: 2,112 member
    I imagine a very advanced and accurate depth mapping system on otherwise ordinary glasses, with the Neural Engine's advanced object recognition capabilities. When you point to a car, that part of the car may explode to reveal the inside as a wireframe drawing, for example. So we may also assume hand gesture recognition to interact with the "augmented" reality. Not a nauseating nK VR that redraws what the eye already sees, but an intelligent system that recognizes it.

    These patents may give an idea:
    https://www.patentlyapple.com/.services/blog/6a0120a5580826970c0120a5580ebf970c/search?filter.q=depth+mapping

    Also don't miss their home page; there is news more interesting than Kuo's divinations:
    https://www.patentlyapple.com/patently-apple/

    edited March 2019
  • Reply 31 of 47
    Eric_WVGG Posts: 965 member
    I wish this site ran gambling pools

    $100 says no fucking way, no AR glasses until 2025 at earliest, likely much later
  • Reply 32 of 47
    boltsfan17 Posts: 2,294 member
    Maybe I'm wrong, but I don't see the point of AR glasses for consumer use. Walking around with a personal HUD giving directions is something I personally wouldn't use. The only fields where I see AR having a big impact are medicine (surgeries) and military use.
    tomjunior39
  • Reply 33 of 47
    gatorguy said:
    esummers said:
    entropys said:
    Killer apps or it would be a waste of time.
    If the 8K-per-eye rumors are true, then there will be. That is considered the holy grail of AR/VR: that resolution is near retina. At a minimum it could give you a movie-theater-size screen anywhere. Current headsets are toys mainly because resolution prevents them from replacing existing tech.
    20 MP per eye, 90-120 fps, foveated rendering, paired with new Sharp LCD development...
    It's coming. 
    I’m glad you mentioned “foveated rendering.” People need to understand how important this is to being able to drive a 20MP+ display with less processing. It's a lot less work for the GPU if it only needs to render at the display's native resolution in the area the eye is focused on (eye tracking obviously required).
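    For anyone wanting to put rough numbers on why that matters, here's a back-of-envelope sketch. Every figure below is an illustrative assumption (canvas size, foveal window, downsample factors), not a spec from any real headset:

```python
# Back-of-envelope: how much GPU work foveated rendering can save.
# All figures below are illustrative assumptions, not real headset specs.

def megapixels(w, h):
    """Pixel count of a w x h canvas, in megapixels."""
    return w * h / 1e6

# Naive: render the full per-eye canvas at native resolution.
full_w, full_h = 7680, 4320          # "8K per eye" (assumed)
naive_mp = megapixels(full_w, full_h)

# Foveated: full resolution only in a small eye-tracked foveal window;
# the periphery is rendered at a fraction of native resolution and
# upscaled. A foveal window covering ~1/5 of the canvas per axis and a
# periphery at 1/4 resolution per axis are both assumptions.
fovea_mp = megapixels(full_w // 5, full_h // 5)
periphery_mp = naive_mp / 16

foveated_mp = fovea_mp + periphery_mp
print(f"naive:    {naive_mp:.1f} MP/eye/frame")
print(f"foveated: {foveated_mp:.1f} MP/eye/frame")
print(f"savings:  {1 - foveated_mp / naive_mp:.0%}")   # → savings: 90%
```

    Under these assumptions roughly 90% of the per-frame pixel work disappears, which is why eye tracking reads as a prerequisite for driving displays this dense.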
  • Reply 34 of 47
    abolish Posts: 14 member
    What's crazy about this rumor is the camera power requirements. Unlike predecessors like Google Glass, AR glasses will require a camera that's on during use and transmitting a live feed to the phone the whole time. Immediately there's a challenge with latency and connection quality. Moreover, running a camera and a high-bandwidth transmitter requires a relatively high amount of power. How big a battery can they fit into glasses while keeping them thin and lightweight?
    cgWerks
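    To put rough numbers on the bandwidth side of that concern, a quick sketch (the resolution, frame rate, and bits-per-pixel figures are illustrative guesses, not known specs for any device):

```python
# Rough bandwidth estimate for streaming a glasses-mounted camera feed
# to a phone. All parameters are illustrative guesses, not known specs.

def stream_mbps(width, height, fps, bits_per_pixel):
    """Bandwidth of a video stream in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# Uncompressed 1080p @ 60 fps, 24-bit color:
raw = stream_mbps(1920, 1080, 60, 24)           # ~2,986 Mbps

# With H.264-class compression (~0.1 bits/pixel is a common ballpark
# for acceptable quality at this kind of resolution):
compressed = stream_mbps(1920, 1080, 60, 0.1)   # ~12 Mbps

print(f"uncompressed: {raw:,.0f} Mbps")
print(f"compressed:   {compressed:.0f} Mbps")
```

    The compressed bitrate itself looks manageable over modern Wi-Fi; the harder part, as the post says, is that the camera, a hardware encoder, and the radio all draw power continuously, and compression adds latency of its own.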
  • Reply 35 of 47
    mcdave Posts: 1,927 member
    crowley said:
    That seems like a hell of a lot of data that you're shunting over a wireless connection, and in a situation where any lag will destroy the experience.  Colour me sceptical, firstly of the report and then of the system.
    For full-scene generation at 2x 8K, yes. But many applications could use locally rendered objects, at least up to the complexity of an Apple Watch display. The other thing to bear in mind is that the 8K is for the full field of view; the eye-tracking patents are almost certainly there to render only a small portion of that canvas at full resolution, with peripheral vision at progressively lower resolution. You’re probably only looking at 1080p stream rates.
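    A quick sanity check on that "1080p stream rates" intuition, using made-up but plausible numbers (foveal window size and downsample factor are assumptions):

```python
# Sanity check: a foveated stream (small full-res window plus a heavily
# downsampled periphery) carries a pixel count in 1080p territory.
# All figures are illustrative assumptions, not real specs.

fovea = 1440 * 1440                      # full-res foveal window (assumed)
periphery = (7680 // 8) * (4320 // 8)    # 8K canvas downsampled 8x per axis
foveated_pixels = fovea + periphery

full_1080p = 1920 * 1080
print(foveated_pixels / full_1080p)      # → 1.25, i.e. a ~1080p-class payload
```

    So even before video compression, the per-frame payload under these assumptions sits within a small factor of a 1080p frame.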
  • Reply 36 of 47
    gatorguy Posts: 24,084 member
    mcdave said:
    crowley said:
    That seems like a hell of a lot of data that you're shunting over a wireless connection, and in a situation where any lag will destroy the experience.  Colour me sceptical, firstly of the report and then of the system.
    For full-scene generation at 2x 8K, yes. But many applications could use locally rendered objects, at least up to the complexity of an Apple Watch display. The other thing to bear in mind is that the 8K is for the full field of view; the eye-tracking patents are almost certainly there to render only a small portion of that canvas at full resolution, with peripheral vision at progressively lower resolution. You’re probably only looking at 1080p stream rates.
    I've mentioned foveated rendering, the "peripheral vision thing" you brought up, a few times in the past couple of years. I don't think most folks have paid attention, even those who've indicated an interest in learning about VR/AR hardware and software. Here's another chance to be better informed.
    https://ai.googleblog.com/2017/12/introducing-new-foveation-pipeline-for.html
    edited March 2019
  • Reply 37 of 47
    flaneur Posts: 4,526 member
    abolish said:
    What's crazy about this rumor is the camera power requirements. Unlike predecessors like Google Glass, AR glasses will require a camera that's on during use and transmitting a live feed to the phone the whole time. Immediately there's a challenge with latency and connection quality. Moreover, running a camera and a high-bandwidth transmitter requires a relatively high amount of power. How big a battery can they fit into glasses while keeping them thin and lightweight?
    Not one camera — two, for stereo, spaced at least eye-distance apart.

    gatorguy said:
    mcdave said:
    crowley said:
    That seems like a hell of a lot of data that you're shunting over a wireless connection, and in a situation where any lag will destroy the experience.  Colour me sceptical, firstly of the report and then of the system.
    For full-scene generation at 2x 8K, yes. But many applications could use locally rendered objects, at least up to the complexity of an Apple Watch display. The other thing to bear in mind is that the 8K is for the full field of view; the eye-tracking patents are almost certainly there to render only a small portion of that canvas at full resolution, with peripheral vision at progressively lower resolution. You’re probably only looking at 1080p stream rates.
    I've mentioned foveated rendering, the "peripheral vision thing" you brought up, a few times in the past couple of years. I don't think most folks have paid attention, even those who've indicated an interest in learning about VR/AR hardware and software. Here's another chance to be better informed.
    https://ai.googleblog.com/2017/12/introducing-new-foveation-pipeline-for.html
    Thanks for the link. Most of us interested in this technology have digested the concept of foveated rendering, and I for one want to keep up with it.

    In case some haven’t grokked the need for one 8K screen per eye, I’d mention that the screens will be small and thus under relatively high magnification, so high pixel density will be needed to prevent a “screen door” effect. You mentioned in a post above that Sharp is working on some kind of LCD tech that might be relevant to these hypothetical 8K screens. Do you have a link for that? Different from microLED?
    edited March 2019
  • Reply 38 of 47
    gatorguy Posts: 24,084 member
    flaneur said:
    abolish said:
    What's crazy about this rumor is the camera power requirements. Unlike predecessors like Google Glass, AR glasses will require a camera that's on during use and transmitting a live feed to the phone the whole time. Immediately there's a challenge with latency and connection quality. Moreover, running a camera and a high-bandwidth transmitter requires a relatively high amount of power. How big a battery can they fit into glasses while keeping them thin and lightweight?
    Not one camera — two, for stereo, spaced at least eye-distance apart.

    gatorguy said:
    mcdave said:
    crowley said:
    That seems like a hell of a lot of data that you're shunting over a wireless connection, and in a situation where any lag will destroy the experience.  Colour me sceptical, firstly of the report and then of the system.
    For full-scene generation at 2x 8K, yes. But many applications could use locally rendered objects, at least up to the complexity of an Apple Watch display. The other thing to bear in mind is that the 8K is for the full field of view; the eye-tracking patents are almost certainly there to render only a small portion of that canvas at full resolution, with peripheral vision at progressively lower resolution. You’re probably only looking at 1080p stream rates.
    I've mentioned foveated rendering, the "peripheral vision thing" you brought up, a few times in the past couple of years. I don't think most folks have paid attention, even those who've indicated an interest in learning about VR/AR hardware and software. Here's another chance to be better informed.
    https://ai.googleblog.com/2017/12/introducing-new-foveation-pipeline-for.html
    Thanks for the link. Most of us interested in this technology have digested the concept of foveated rendering, and I for one want to keep up with it.

    In case some haven’t grokked the need for one 8K screen per eye, I’d mention that the screens will be small and thus under relatively high magnification, so high pixel density will be needed to prevent a “screen door” effect. You mentioned in a post above that Sharp is working on some kind of LCD tech that might be relevant to these hypothetical 8K screens. Do you have a link for that? Different from microLED?
    https://www.vrfocus.com/2017/06/google-and-sharp-partner-to-develop-vr-displays/
    For now and in the near term they're still planning for OLED.
  • Reply 39 of 47
    mobird Posts: 749 member
    If an individual wears corrective lenses to begin with, whether it is a pair of glasses or contacts, how would that affect the experience?

    This whole AR/VR adventure/direction on Apple's part is unsettling, along with the News subscriptions and original content production. IMHO.
  • Reply 40 of 47
    mcdave Posts: 1,927 member
    I'd wear AR glasses if they could somehow solve the tinting issue - they need to go from clear to tinted without looking completely goofy. The other issue is that now that so many people have Apple Watches, no matter how customised the bands are, they all look so generic. That's not ideal, but it's also not the end of the world on a watch; glasses are even more obvious. I would hope that Apple realise this and can create multiple styles with the same underlying tech. I assume that's easier than building a round watch vs a square one.
    A monochrome LCD backplane for an alpha channel (or everything will look ghostly). Sadly, that ‘killer app’ will probably be instant sunglasses.