Apple AR headset codenamed 'T288' said to run new 'rOS' operating system, launch as soon a...


Comments

  • Reply 61 of 66
    Seems pretty clear to me how it's going to work: a dot projector, an IR sensor, other sensors, no camera, no ability to record with the IR sensor (as sendmcjak described), and a microdot LED over the lens that's calibrated at first-time use to determine its distance from the cornea/fovea.

    I think this will eventually be incredibly popular. Most people will never want to take them off. Immediate contextual info on the fly, fun AR applications, and, possibly most important, the ability to browse Facebook* while at a meeting, a presentation, in a boring conversation, etc.

    *Only being slightly facetious.
  • Reply 62 of 66
    calicali Posts: 3,494member
    Soli said:
    cali said:
    "Unlike other headsets that require a smartphone or other devices to power them, Bloomberg claims that Apple's device will have its own display and processor."

    I told people this long ago on here and they couldn’t wrap their head around it. They kept saying “it will need an iPhone to work”.

    WHY?

    Apple isn’t Samsung or Google. They don’t make mechanical, clunky, awkward technology. There’s absolutely no reason to strap an iPhone to your face when Apple owns technology like the A11/S3/W2 chips.

    I agree AppleInsider needs to stop showing Google Glass. It makes it seem like Apple is trying to copy that nerdwear.

    I also gave the same possible solution months ago for the “creepiness”: just don’t allow the glasses to record. There’s little reason to use glasses to record when you have an awesome camera in your pocket. Really, why do you need to record from your glasses? I can see third parties offering an obvious attachment to make it possible. Obvious as in, others would know you have a camera on your glasses.

    The other possibility, as another commenter said, is that the glasses become popular and no one cares about the recording feature. But I stick by my solution and believe Apple will leave out recording, at least for the first gen or until the glasses become more popular.
    Um… the Apple Watch has "its own display and processor" yet "needs an iPhone to work." These are obviously not mutually exclusive, and I'd bet any money that the next wearable from Apple will also need an auxiliary device the way the Apple Watch does, because it's unreasonable to do everything you need to do on that small display.
    But in 2020? We're already seeing the Watch become more and more independent.

    I believe Apple can shrink the A11 processor, which is made for ARKit, and the TrueDepth camera into the bridge of the glasses, no external devices needed. If you have to strap an iPhone to your face, or connect the glasses to your Watch or iPhone, they've failed.
  • Reply 63 of 66
    SoliSoli Posts: 10,035member
    cali said:
    But in 2020? We're already seeing the Watch become more and more independent.

    I believe Apple can shrink the A11 processor which is made for ARKit and the true depth camera into the bridge of glasses, no external devices needed. If you have to strap an iPhone to your face or connect it to your Watch or iPhone, they've failed.
    I don't see that happening by then, if ever. There are simply too many settings that just wouldn't work with that UI, unless you can think of very revolutionary ways to make it happen. Processor speed isn't the factor here—it's the user interface.
  • Reply 64 of 66
    calicali Posts: 3,494member
    wizard69 said:
    I think you guys are missing some obvious things here.   

    I don't ever see the continuous wear of AR glasses being socially acceptable. Even if no camera is involved, people will be put off by your preoccupation with the AR data. I liken this to laptop usage: if someone stops by to chat while you are working, it is considered polite to take your focus off the screen, and closing the screen can be seen as giving the person 100% of your attention.

    Rather, for AR glasses I see marketing success if they are sold as a tool, just like a laptop. You put on the glasses when the activity warrants their use. There are literally thousands, probably millions, of cases where this would be advantageous. Imagine a technician working on a very complex machine, a jet engine for example; an AR/VR solution could handle things like parts lookup, recall notices, calibration or tolerance values, and assembly or disassembly procedures in a very fluid manner.

    I don't see AR glasses being successful for general daily wear. At least at this point I don't see a useful app on the horizon. What I see them being useful for is the multitude of uses as a tool: everything from an interior designer demoing solutions for a client to AR glasses replacing HUDs on aircraft.

    I liken the acceptance of AR glasses to the acceptance of an electrician walking around with his tool belt strapped to his waist. The tool belt is completely accepted at the job site, effectively being a professional requirement. That same tool belt might not be accepted outside of the work zone. Some of that is social, some practical (they get in the way), but the point is that people separate their tools from their lives outside of work. For AR/VR glasses to be accepted, they will need to be seen as tools and handled as such. That means daily wear by Joe Q. Public will be shunned. That still means millions sold.


    Well, eventually it will mean millions sold. The problem is the tech to pull this off. Frankly, I see computational and storage issues as big factors here. Storage becomes a problem because network latency will be extremely frustrating to users, especially when massive data sets have to be digested. Even if coupled to an iPhone, you will still have issues with current tech. Likewise, very good processing power is needed to process all of that data and present it to the user. Basically, you need better than A11 performance.

    Topping everything off is the delivery of viable apps. I know they will come eventually, but you need to manage consumers' expectations. In any event, Apple will need a couple of compelling apps to spark imaginations.
    There's a solution to everything. I don't believe it will be always-on AR, as that would be annoying as f*** and drain the battery. I believe there will be a gesture, like tapping the side of the glasses, to activate AR. This way we'll live as normal people. Lost? Tap, and an overlay of the world appears until you tap again. What song is playing? Tap to reveal info. What bug is that?! Tap to open info and Wikipedia, etc. This saves battery and makes the glasses practical and addictive.

    A year ago I suggested Apple's ARKit could eventually identify objects such as plants in nature. This will help with exploring, and I see no reason why it shouldn't be utilized in glasses.


    Bet your a** Apple is working to double the dots and range of their dot projector. Twingate and all that crap is going away. Also expect more precise Animoji, faster and more accurate Face ID, greater working distance, object recognition, etc. This is the groundwork for something like AR gaming for TV and AR/VR glasses.

    There's actually a video on YouTube where you can see the dots working, and the improvement potential is obvious. When you see the dots, it's blatant that this is a first-gen product.

    Soli said:
    I don't see that happening by then, if ever. There are simply too many settings that just wouldn't work with that UI, unless you can think of very revolutionary ways to make it happen. Processor speed isn't the factor here—it's the user interface.

    I see what you're saying, but if Apple is putting in as much work as (or more than) they did with iPhone and Watch, then I believe a standalone product is possible. If they can invent something like 3D Touch, where a second layer lies behind the glasses (for AR), then the UI is fine. Remember, Apple may have been working on this for years already.

    Question: do you think these are nice?



    Look at all that room for the TrueDepth notch, a mini A11, 64GB with rOS, LTE, dual batteries (inside the hinges), even speakers at the hinge tips for Siri/music/podcasts, etc.
    Now underlay an AR display on the lenses and you have magic. Sprinkle on some Jony Ive and you have beautiful glasses with adaptive lighting for sunlight, etc.
  • Reply 65 of 66
    brucemcbrucemc Posts: 1,541member
    Late to the thread but to add my perspective:
    => 2020 is the earliest such a product could come to market (3 years away, leaving 2 years to determine whether they can get the tech to work in the right form factor).  I would not be surprised if it is a bit later.  But think of Apple Pencil and how they have stuffed computing into that small cylinder, plus their silicon experience, and I think 2020 (a stretch) is doable for a gen 1.

    => It would require an iPhone to work, similar to how the AW requires it (data at first, settings, software management, many use cases).  You might be able to get some simple use cases with glasses only.  Perhaps it would work with Apple Watch for an additional set of use cases.

    => Features at first will be limited, but will "just work".  Critics will call it lacklustre, but it will be the only AR glasses to sell at a high price and in volume.

    => Perhaps, like what the AW was in its first couple generations (not necessarily how Apple initially promoted it, but what it was), it will be "glasses that do more".  Glasses with smarts.  Apple Watch is a watch that does more.  It is not a computer for the wrist.

    => Apple will target the "general public", though of course the use cases will be specific, and it will be early adopter types that buy a gen 1.  I don't see Apple targeting industrial as a lead industry.

    => The cameras would not be able to record photo/video (not in gen 1).  They would want people to use the iPhone for that, as it would be many years ahead of what could be put into glasses.  The focus will be on AR purposes only (identify objects/scenes, read text, etc.).

    => My thinking on some use cases that would appeal to the broader public:
    - Viewer for AR apps initiated from the iPhone.  Rather than holding up your phone screen, you see through the glasses, and positioning is based on the glasses.  Pretty compelling right there.
    - Translation.  Look at text on a sign, page, etc., and have it automatically translated into your language.
    - Object/location identification and overlay of information (restaurant reviews, business information, object info).
  • Reply 66 of 66
    icoco3icoco3 Posts: 1,474member
    I have my glasses on now...Oakley frames and some progressive lenses.  

    I want to wear my regular looking frames.
    I want regular looking lenses.
    When I look at something the lenses automatically adjust the focus in a split second.
    Heads-up display when desired.
    Identification of what I am looking at, when desired.
    Notification of incoming calls and messages.
    Heads-up display for directions.
    Gaming and all that comes with it.

    All in regular-looking glasses.  I just send them my prescription, or order right at the eye care place.

    Since they're prescription, I can buy them with my HSA account!!
    edited November 2017