Apple AR headset codenamed 'T288' said to run new 'rOS' operating system, launch as soon a...

Comments

  • Reply 21 of 66
    tshapi Posts: 370 member
    Soli said:

    2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
    Is this actually true? Couldn't many of the things that are shown using AR be done without a camera? Granted, on an iPhone/iPad the camera is required to put, say, my kitchen table on the screen. But if I were wearing glasses, couldn't the AR information just be projected onto the lens that I'm already looking through?

    How much of what we currently view as AR relies on the camera for reasons other than showing what we can already see with our own eyes?  It seems to me that the other sensors are doing most of the AR work.  The camera is just there to supply an image.  Right?  So, couldn’t the camera potentially be eliminated while the other sensors still do their jobs in the same fashion?
    It will still require a phone. The Watch required a nearby phone for cellular. This will most likely be the same until they conquer battery issues.
    tjwolf watto_cobra
  • Reply 22 of 66
    peteo Posts: 402 member
    macxpress said:
    Yeah, I'm not sure how Apple is going to make glasses successful, but if anyone can, it's Apple. Snapchat just announced a massive loss on their stupid glasses, with tons of unsold glasses. We'll see. Maybe by 2020 different technology will be here and consumers will be more willing to wear them.
    Snapchat's glasses were not AR glasses. AR (which can turn into VR) glasses will replace all your screens and a lot of other real-life things.
    It's an amazing, world-changing tech that has some time to go before it is mass-market ready, but it's coming, no doubt about it, and it's great to hear that Cook knows this.

    I have the HTC Vive and it's amazing. Is it mass-market ready? Nope, but using it you know that this is where everything is heading. It's so compelling, and you can see how transformative this will be, just like the internet and smartphones, but this will be building on top of those and will be bigger.
    edited November 2017
  • Reply 23 of 66
    nhughes Posts: 770 editor
    Soli said:

    2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
    I've been thinking about this a lot lately, at least in terms of a wearable camera on the Apple Watch. Using the Series 3 with LTE, without your iPhone nearby, is a great experience except for one glaring omission: the lack of a camera. And yet, having a discreet camera on your wrist is indeed creepy.

    There is also the problem of which way the camera faces. Toward you, for FaceTime/selfies? Or away from you, to take (probably low-quality) pics? Or do they do two cameras, or some sort of clunky physical rotating camera?

    I wonder if the solution to both of these problems is the same: one camera that faces the user (embedded in the screen, if they could ever do that). This allows for FaceTime and selfies and the like, and then taking pictures of the world around you would require some sort of broad gesture (i.e., telling the watch to take a picture, but the image isn't actually snapped until you rotate your wrist outward in a rather obvious motion, then turn your wrist back to view the image). I wonder if having a broad gesture, combined with some sort of visual cue (a light flash or whatever), would turn down the "creep" factor, letting people around you know that you are snapping pictures.

    Perhaps the solution for this with augmented reality glasses would be something similar — a process that requires the user to press a button on the device itself, so it is clear to those around you that you are interacting with your glasses, accompanied perhaps by a flash of light or something that says "hey, picture time." It's not perfect, but it's arguably more obvious than someone on the train acting like they are using their phone for other purposes while they are actually snapping pics.

    Cameras in everything, including our glasses, are seemingly inevitable. And younger generations won't care as much. But we're still in some sort of uncomfortable limbo — which is precisely why the Apple TV Siri remote requires you to hold a button to invoke Siri. I suspect Apple will continue to go slow and tread lightly. Perhaps the first Apple AR glasses won't even allow the user to capture pictures. The first iPad didn't have a camera.
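    For what it's worth, the gesture-gated capture described above is easy to prototype today with CoreMotion. The sketch below is purely illustrative, assuming a hypothetical GestureGatedShutter class, an arbitrary quarter-turn threshold, and a stubbed-out capture step; nothing here is a shipping Apple feature beyond CoreMotion itself.

    ```swift
    import CoreMotion

    // Hypothetical sketch of the "broad gesture" idea above. The shutter is armed by an
    // explicit command, but the picture is only taken after a large, obvious outward
    // wrist rotation. Class name, threshold, and the capture() stub are assumptions.
    final class GestureGatedShutter {
        private let motion = CMMotionManager()
        private var armed = false
        private var startRoll: Double?

        /// Call when the user explicitly asks the watch to take a picture.
        func arm() {
            guard motion.isDeviceMotionAvailable else { return }
            armed = true
            startRoll = nil
            motion.deviceMotionUpdateInterval = 1.0 / 50.0
            motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
                guard let self = self, self.armed, let data = data else { return }
                let roll = data.attitude.roll
                if self.startRoll == nil { self.startRoll = roll }

                // Require roughly a quarter turn of the wrist before capturing,
                // so the motion is unmistakable to people nearby.
                if let start = self.startRoll, abs(roll - start) > .pi / 2 {
                    self.capture()
                }
            }
        }

        private func capture() {
            armed = false
            motion.stopDeviceMotionUpdates()
            // A real device would fire the (hypothetical) camera here and show a
            // visible flash or light so bystanders know a photo was taken.
            print("Shutter fired after a deliberate wrist rotation")
        }
    }
    ```

    The point is just that arming the shutter and then requiring a large, visible wrist motion before anything is captured is technically trivial; the hard part is the social convention around it.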
    watto_cobra
  • Reply 24 of 66
    robjn Posts: 283 member
    Soli said:
    Soli said:

    2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
    Is this actually true? Couldn't many of the things that are shown using AR be done without a camera? Granted, on an iPhone/iPad the camera is required to put, say, my kitchen table on the screen. But if I were wearing glasses, couldn't the AR information just be projected onto the lens that I'm already looking through?

    How much of what we currently view as AR relies on the camera for reasons other than showing what we can already see with our own eyes?  It seems to me that the other sensors are doing most of the AR work.  The camera is just there to supply an image.  Right?  So, couldn’t the camera potentially be eliminated while the other sensors still do their jobs in the same fashion?
    I'd still call an IR camera a camera, but I like where you're going with this. It wouldn't be able to take video like your iPhone can but it could use GPS and accelerometer data (for orientation) along with IR data to project images properly.
    I’m sure it would need more than IR image data to understand the environment and project images usefully.  It surely needs a visible light spectrum camera, probably multiple cameras.

    Also, it is worth noting that the technology in the front-facing TrueDepth camera only works over short distances. It cannot be used to scan a room without much more power. If something the size and weight of a phone does not have the battery power to do this, do you think you are likely to see a more powerful TrueDepth camera on a pair of lightweight glasses?
    edited November 2017 patchythepirate watto_cobra
  • Reply 25 of 66
    foljs Posts: 390 member
    I also think it is a major generational thing. Younger kids won't have much of an issue, IMO, with cameras embedded in a device like this.
    Younger kids won't be doing stuff they don't want their friends/parents to know about? They won't be having extra-marital affairs? They won't have bad hair days?
  • Reply 26 of 66
    robjn Posts: 283 member
    Tim Cook already said publicly that AR glasses are not happening any time soon - the technology required to do it well does not yet exist. I expect Apple would want to implement it with strong privacy, in a way that users are not able to record and developers are not able to take image data to their own servers.

    Tim's comments confirm that Apple is definitely experimenting with AR glasses - but what they have now is nowhere near good enough to ship as a product. I imagine it works, but it is big, heavy, and ugly, and the technology to miniaturize it is not yet available. Implementing it in a way that ensures privacy is probably another technological hurdle to overcome.

    If AR glasses were coming "as soon as 2020," Apple would at this point have almost finished them. Tim's comment asserts that this is definitely not the case - far from it!

    The media needs to push back against these reports and present them with deep skepticism that gives more weight to what Apple has publicly stated. Otherwise, these reports stir up unrealistic expectations that Apple has already publicly tried to dampen.

    It is likely that, given Apple's stated position on AR glasses, competitors are inclined to seed these rumors in order to put Apple under pressure with a new cycle of inflated expectations and claims of failure designed to disappoint consumers.

    So, AppleInsider - DON'T BE PLAYED.
    watto_cobra
  • Reply 27 of 66
    macxpress Posts: 5,808 member
    peteo said:
    macxpress said:
    Yeah, I'm not sure how Apple is going to make glasses successful, but if anyone can, it's Apple. Snapchat just announced a massive loss on their stupid glasses, with tons of unsold glasses. We'll see. Maybe by 2020 different technology will be here and consumers will be more willing to wear them.
    Snapchat's glasses were not AR glasses. AR (which can turn into VR) glasses will replace all your screens and a lot of other real-life things.
    It's an amazing, world-changing tech that has some time to go before it is mass-market ready, but it's coming, no doubt about it, and it's great to hear that Cook knows this.

    I have the HTC Vive and it's amazing. Is it mass-market ready? Nope, but using it you know that this is where everything is heading. It's so compelling, and you can see how transformative this will be, just like the internet and smartphones, but this will be building on top of those and will be bigger.
    I know Snapchat's glasses are not AR glasses... I was just saying in general that, as of right now, the general public doesn't seem to want to wear specialized glasses, especially ones with cameras on them. Not only does this freak others out, it's also not accepted in certain places.

    As @ihatescreennames said, I don't really see the point of those. Now, if Apple came out with some type of AR glasses, they should be able to bring them to the masses and be successful. If there's one thing Apple is pretty good at, it's bringing technology to market that, sure, may have already been out there, but they're able to make it a success. We've seen it with fingerprint readers (Touch ID), smartphones, smartwatches, Face ID, etc., etc. - all of which were out there before Apple released its version, but Apple's was much more of a success than anyone else's attempts.
  • Reply 28 of 66
    nhughes Posts: 770 editor
    robjn said:
    Tim Cook already said publicly that AR glasses are not happening any time soon - the technology required to do it well does not yet exist. I expect Apple would want to implement it with strong privacy, in a way that users are not able to record and developers are not able to take image data to their own servers.

    Tim's comments confirm that Apple is definitely experimenting with AR glasses - but what they have now is nowhere near good enough to ship as a product. I imagine it works, but it is big, heavy, and ugly, and the technology to miniaturize it is not yet available. Implementing it in a way that ensures privacy is probably another technological hurdle to overcome.

    If AR glasses were coming "as soon as 2020," Apple would at this point have almost finished them. Tim's comment asserts that this is definitely not the case - far from it!

    The media needs to push back against these reports and present them with deep skepticism that gives more weight to what Apple has publicly stated. Otherwise, these reports stir up unrealistic expectations that Apple has already publicly tried to dampen.

    It is likely that, given Apple's stated position on AR glasses, competitors are inclined to seed these rumors in order to put Apple under pressure with a new cycle of inflated expectations and claims of failure designed to disappoint consumers.

    So, AppleInsider - DON'T BE PLAYED.
    Fall of 2020 is three years away — a lifetime in tech development. If you think the product would be "almost done" by now for a 2020 (presumably later in the year) release, you're playing yourself.
    cali brucemc
  • Reply 29 of 66
    nhughes Posts: 770 editor

    foljs said:
    I also think it is a major generational thing. Younger kids won't have much of an issue, IMO, with cameras embedded in a device like this.
    Younger kids won't be doing stuff they don't want their friends/parents to know about? They won't be having extra-marital affairs? They won't have bad hair days?
    Won't someone think of the adulterers?
    ihatescreennames watto_cobra
  • Reply 30 of 66
    Soli Posts: 10,035 member
    nhughes said:
    Soli said:

    2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
    I've been thinking about this a lot lately, at least in terms of a wearable camera on the Apple Watch. Using the Series 3 with LTE, without your iPhone nearby, is a great experience except for one glaring omission: the lack of a camera. And yet, having a discreet camera on your wrist is indeed creepy.

    There is also the problem of which way the camera faces. Toward you, for FaceTime/selfies? Or away from you, to take (probably low-quality) pics? Or do they do two cameras, or some sort of clunky physical rotating camera?

    I wonder if the solution to both of these problems is the same: one camera that faces the user (embedded in the screen, if they could ever do that). This allows for FaceTime and selfies and the like, and then taking pictures of the world around you would require some sort of broad gesture (i.e., telling the watch to take a picture, but the image isn't actually snapped until you rotate your wrist outward in a rather obvious motion, then turn your wrist back to view the image). I wonder if having a broad gesture, combined with some sort of visual cue (a light flash or whatever), would turn down the "creep" factor, letting people around you know that you are snapping pictures.

    Perhaps the solution for this with augmented reality glasses would be something similar — a process that requires the user to press a button on the device itself, so it is clear to those around you that you are interacting with your glasses, accompanied perhaps by a flash of light or something that says "hey, picture time." It's not perfect, but it's arguably more obvious than someone on the train acting like they are using their phone for other purposes while they are actually snapping pics.

    Cameras in everything, including our glasses, are seemingly inevitable. And younger generations won't care as much. But we're still in some sort of uncomfortable limbo — which is precisely why the Apple TV Siri remote requires you to hold a button to invoke Siri. I suspect Apple will continue to go slow and tread lightly. Perhaps the first Apple AR glasses won't even allow the user to capture pictures. The first iPad didn't have a camera.
    If we're talking about a FaceTime camera in the glasses, I'm not sure that's possible due to distance. If we're talking about in the Watch, then perhaps it could be placed within the display once we move to micro-LED or its successor where there's a lot of extra space between the bright LEDs for other types of "pixels" to reside. Building it into the display may also remove much of the stigma over having a glossy, curved lens pointing at someone.
  • Reply 31 of 66
    Blunt Posts: 224 member
    Breaking: A new report from a source very close to Apple claims that most of these reports are nothing more than clickbait.
    watto_cobra
  • Reply 32 of 66
    robjn said:
    Soli said:
    Soli said:

    2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
    Is this actually true? Couldn't many of the things that are shown using AR be done without a camera? Granted, on an iPhone/iPad the camera is required to put, say, my kitchen table on the screen. But if I were wearing glasses, couldn't the AR information just be projected onto the lens that I'm already looking through?

    How much of what we currently view as AR relies on the camera for reasons other than showing what we can already see with our own eyes?  It seems to me that the other sensors are doing most of the AR work.  The camera is just there to supply an image.  Right?  So, couldn’t the camera potentially be eliminated while the other sensors still do their jobs in the same fashion?
    I'd still call an IR camera a camera, but I like where you're going with this. It wouldn't be able to take video like your iPhone can but it could use GPS and accelerometer data (for orientation) along with IR data to project images properly.
    I’m sure it would need more than IR image data to understand the environment and project images usefully.  It surely needs a visible light spectrum camera, probably multiple cameras.
    Why are you sure? As I mentioned earlier, the SkyView app places constellations (and other celestial bodies, satellites, etc.) over a live view coming from the camera. But it still works just the same if the camera is covered, I'm inside, or the phone is even pointing at the floor, so I can see where things are on the other side of the planet. The camera is helpful but not required, in this case. If the phone (or glasses) can get GPS location data, accelerometer info, orientation via compass, etc., it can project information accurately without using a camera.
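    For illustration, here is a minimal sketch of that camera-free approach: using only the compass-referenced attitude from the motion sensors to decide where on a lens (or screen) to draw a marker for a fixed direction. It relies on the real CoreMotion and CoreLocation frameworks, but the class, the 60-degree field of view, and the sign conventions are assumptions for the sketch, not how SkyView or any Apple product actually works.

    ```swift
    import CoreLocation
    import CoreMotion

    // Illustrative sketch only: decide where on a lens/screen to draw a label for a
    // fixed compass direction (due east, on the horizon) using nothing but the motion
    // sensors and compass. No camera frames are involved.
    final class CameraFreeOverlay {
        private let motion = CMMotionManager()
        private let location = CLLocationManager()

        // Assumed horizontal field of view of the display, in radians (60 degrees).
        private let fieldOfView = Double.pi / 3

        func start() {
            location.requestWhenInUseAuthorization()   // heading/location need permission

            guard motion.isDeviceMotionAvailable else { return }
            motion.deviceMotionUpdateInterval = 1.0 / 60.0
            // xTrueNorthZVertical yields attitude relative to true north and gravity,
            // which is enough to orient an overlay without any camera input.
            motion.startDeviceMotionUpdates(using: .xTrueNorthZVertical,
                                            to: .main) { [weak self] data, _ in
                guard let self = self, let data = data else { return }

                let targetAzimuth = Double.pi / 2          // due east
                let heading = -data.attitude.yaw           // sign convention assumed
                let pitch = data.attitude.pitch            // tilt above/below horizon

                // Angular offsets between the user's gaze and the target.
                var dAz = targetAzimuth - heading
                dAz = atan2(sin(dAz), cos(dAz))            // wrap to [-pi, pi]
                let dAlt = 0.0 - pitch                     // target sits on the horizon

                // Map the angles to a normalized screen position in -1...1.
                let x = dAz / (self.fieldOfView / 2)
                let y = dAlt / (self.fieldOfView / 2)
                if abs(x) <= 1, abs(y) <= 1 {
                    print("Draw 'East' marker at normalized (\(x), \(y))")
                }
            }
        }
    }
    ```

    A visible-light camera only becomes necessary when the overlay has to lock onto specific objects or surfaces in the scene; for purely direction-based information, orientation and location sensors are enough.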
    cali
  • Reply 33 of 66
    Soli Posts: 10,035 member
    Since Face ID can already read 50 muscles of the face, and since Apple is big on Accessibility features in their devices, smart glasses from Apple could potentially help, say, people with autism understand the emotions of people in front of them, and definitely help husbands understand just how upset their wives are with them at any given moment.
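    As a rough illustration of what today's hardware already exposes: ARKit's face tracking on the TrueDepth camera reports roughly 50 blend-shape coefficients per frame, and a toy "mood" heuristic can be layered on top of them. The sketch below is exactly that, a toy; the scoring and thresholds are made up and are not an Apple feature.

    ```swift
    import ARKit

    // Rough sketch: derive a crude "mood" score from a few of the ~50 blend-shape
    // coefficients ARKit's face tracking reports. Not a real emotion model.
    final class MoodEstimator: NSObject, ARSessionDelegate {
        private let session = ARSession()

        func start() {
            guard ARFaceTrackingConfiguration.isSupported else { return }
            session.delegate = self
            session.run(ARFaceTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
            let shapes = face.blendShapes   // [BlendShapeLocation: NSNumber], values 0...1

            let smile = ((shapes[.mouthSmileLeft]?.doubleValue ?? 0) +
                         (shapes[.mouthSmileRight]?.doubleValue ?? 0)) / 2
            let frown = ((shapes[.browDownLeft]?.doubleValue ?? 0) +
                         (shapes[.browDownRight]?.doubleValue ?? 0)) / 2

            // Naive heuristic: smiling raises the score, furrowed brows lower it.
            let score = smile - frown
            let mood = score > 0.2 ? "pleased" : (score < -0.2 ? "upset" : "neutral")
            print("Estimated mood: \(mood)")
        }
    }
    ```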
    ihatescreennames StrangeDays patchythepirate icoco3
  • Reply 34 of 66
    nhughes said:
    Soli said:

    2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
    I've been thinking about this a lot lately, at least in terms of a wearable camera on the Apple Watch. Using the Series 3 with LTE, without your iPhone nearby, is a great experience except for one glaring omission: the lack of a camera. And yet, having a discreet camera on your wrist is indeed creepy.

    There is also the problem of which way the camera faces. Toward you, for FaceTime/selfies? Or away from you, to take (probably low-quality) pics? Or do they do two cameras, or some sort of clunky physical rotating camera?

    I wonder if the solution to both of these problems is the same: one camera that faces the user (embedded in the screen, if they could ever do that). This allows for FaceTime and selfies and the like, and then taking pictures of the world around you would require some sort of broad gesture (i.e., telling the watch to take a picture, but the image isn't actually snapped until you rotate your wrist outward in a rather obvious motion, then turn your wrist back to view the image). I wonder if having a broad gesture, combined with some sort of visual cue (a light flash or whatever), would turn down the "creep" factor, letting people around you know that you are snapping pictures.

    Perhaps the solution for this with augmented reality glasses would be something similar — a process that requires the user to press a button on the device itself, so it is clear to those around you that you are interacting with your glasses, accompanied perhaps by a flash of light or something that says "hey, picture time." It's not perfect, but it's arguably more obvious than someone on the train acting like they are using their phone for other purposes while they are actually snapping pics.

    Cameras in everything, including our glasses, are seemingly inevitable. And younger generations won't care as much. But we're still in some sort of uncomfortable limbo — which is precisely why the Apple TV Siri remote requires you to hold a button to invoke Siri. I suspect Apple will continue to go slow and tread lightly. Perhaps the first Apple AR glasses won't even allow the user to capture pictures. The first iPad didn't have a camera.
    I'm still not sold on a camera in Apple Watch. I'm not saying it won't happen, but when I try to envision it I can't help but see something that is awkward to use.

    As you mentioned, the first iPad didn't have a camera. And despite past attempts by Apple to show that many people use their iPads as cameras, I still don't see it happen very often, especially compared to people using cell phones to take pictures.

    In my view it would probably make more sense for Apple (or any company, really) to release glasses without cameras at first. I think Google would have been more successful if Glass hadn't come with a camera. Maybe that's the way to go at the start, to get people to accept the new technology without getting the backlash from the "you're trying to take my photo in a public restroom" crowd and all the noise that creates. And as I mentioned earlier, a camera may not be a requirement for AR.
  • Reply 35 of 66
    tjwolf Posts: 424 member
    "Unlike other headsets that require a smartphone or other devices to power them" - I didn't have Google Glass, but I vaguely remember that it could operate standalone. That was actually part of its downfall (aside from the camera fiasco) - the short battery life due to all the onboard tech! So I completely disagree with the Bloomberg article - I think Apple's AR glasses will be a companion product, just like the Apple Watch was/is. I coincidentally just posted my thoughts on this this morning: http://landofwolf.blogspot.com

    Soli
  • Reply 36 of 66
    nhughes Posts: 770 editor
    Soli said:
    nhughes said:
    Soli said:

    2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
    I've been thinking about this a lot lately, at least in terms of a wearable camera on the Apple Watch. Using the Series 3 with LTE, without your iPhone nearby, is a great experience except for one glaring omission: the lack of a camera. And yet, having a discreet camera on your wrist is indeed creepy.

    There is also the problem of which way the camera faces. Toward you, for FaceTime/selfies? Or away from you, to take (probably low-quality) pics? Or do they do two cameras, or some sort of clunky physical rotating camera?

    I wonder if the solution to both of these problems is the same: one camera that faces the user (embedded in the screen, if they could ever do that). This allows for FaceTime and selfies and the like, and then taking pictures of the world around you would require some sort of broad gesture (i.e., telling the watch to take a picture, but the image isn't actually snapped until you rotate your wrist outward in a rather obvious motion, then turn your wrist back to view the image). I wonder if having a broad gesture, combined with some sort of visual cue (a light flash or whatever), would turn down the "creep" factor, letting people around you know that you are snapping pictures.

    Perhaps the solution for this with augmented reality glasses would be something similar — a process that requires the user to press a button on the device itself, so it is clear to those around you that you are interacting with your glasses, accompanied perhaps by a flash of light or something that says "hey, picture time." It's not perfect, but it's arguably more obvious than someone on the train acting like they are using their phone for other purposes while they are actually snapping pics.

    Cameras in everything, including our glasses, are seemingly inevitable. And younger generations won't care as much. But we're still in some sort of uncomfortable limbo — which is precisely why the Apple TV Siri remote requires you to hold a button to invoke Siri. I suspect Apple will continue to go slow and tread lightly. Perhaps the first Apple AR glasses won't even allow the user to capture pictures. The first iPad didn't have a camera.
    If we're talking about a FaceTime camera in the glasses, I'm not sure that's possible due to distance. If we're talking about in the Watch, then perhaps it could be placed within the display once we move to micro-LED or its successor where there's a lot of extra space between the bright LEDs for other types of "pixels" to reside. Building it into the display may also remove much of the stigma over having a glossy, curved lens pointing at someone.
    I don't think there would be a FaceTime camera in the glasses, no. But it would make sense in the watch (even if the battery life would be abysmal — just a simple phone call has only an hour of battery life). And one camera in the watch for both FaceTime and "outward" facing pics would be aesthetically pleasing, simple to use, and actually help cut down on the "creep" factor. I guess society would just have to get used to the idea of someone turning their wrist at you to snap a pic. Seems like a rather obvious motion to me, though. Less discreet than using your phone.
  • Reply 37 of 66
    Soli said:
    Since Face ID can already read 50 muscles of the face, and since Apple is big on Accessibility features in their devices, smart glasses from Apple could potentially help, say, people with autism understand the emotions of people in front of them, and definitely help husbands understand just how upset their wives are with them at any given moment.
    I'd get them just for that. Really, it doesn't need to be limited to 'level of upset'. "Mood" in general, and then at different levels, would be fantastic.

    Maybe Apple can implement some sort of ML now so my wife's phone could alert me before I get home if I should delay my arrival or hurry up because "good things" are waiting for me.  Maybe, glean some information from the type of texts she's been sending, her tone when making Siri requests, what websites she's been visiting, etc, to gain some understanding of her mindset at the moment.  Sounds great!
    edited November 2017
  • Reply 38 of 66
    jbdragon Posts: 2,311 member
    Soli said:
    1) Considering what they've done with the BT, WiFi, and now cellular in the Apple Watch, I think it's plausible that Apple can come up with smart glasses that can do to that industry what the Apple Watch is doing to the watch industry. I'd love to see this directly affect Luxottica, but in terms of wearability this is much more complex, in both fashion and sizes.

    2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?


    Google Glass didn't use the camera for AR. It didn't do AR. It was just a typical HUD (heads-up display) running Android. HUDs have been around for a long time. Hell, you play games that have a type of HUD.

    The camera was there to take pictures and record video. What people had a problem with was not knowing if they were being recorded or not. Maybe a small light that comes on when it's in record mode? Really, though, there's ZERO expectation of privacy when out in public. That means as long as I'm someplace public, say a sidewalk or some type of federal building, I'm protected by the Constitution and can record anything I can see, as long as it's in a publicly accessible area. That includes walking into a post office and recording, even though they'll generally tell you that you can't, which is a lie, or they're clueless and think you can't record a federal/government building.

    A private business can set policies so there's no recording inside. But they can't stop you from recording on the sidewalk outside their building. The simple fact is you're being recorded on cameras everywhere you go, maybe as soon as you walk out your front door, by a neighbor's security camera. Someone can be on the sidewalk recording the front of your house and it's LEGAL. They can't come up to your window and record inside; for one thing, that would be trespassing. If you don't want to be recorded, don't leave the inside of wherever you live.
  • Reply 39 of 66
    nhughes Posts: 770 editor
    tjwolf said:
    "Unlike other headsets that require a smartphone or other devices to power them" - I didn't have Google Glass, but I vaguely remember that it could operate standalone. That was actually part of its downfall (aside from the camera fiasco) - the short battery life due to all the onboard tech! So I completely disagree with the Bloomberg article - I think Apple's AR glasses will be a companion product, just like the Apple Watch was/is. I coincidentally just posted my thoughts on this this morning: http://landofwolf.blogspot.com

    Glass could operate on its own in the sense that it had its own display and processor, but it was still dependent on a connected phone for data and such (and I would suspect Apple's first-gen model might operate the same way). However, Glass, as a consumer product, is dead, and almost all current consumer head-mounted displays (both AR and VR) require either a smartphone for display and processing, or a full-fledged computer for processing.
  • Reply 40 of 66
    nhughes Posts: 770 editor
    jbdragon said:
    Soli said:
    1) Considering what they've done with the BT, WiFi, and now cellular in the Apple Watch, I think it's plausible that Apple can come up with smart glasses that can do to that industry what the Apple Watch is doing to the watch industry. I'd love to see this directly affect Luxottica, but in terms of wearability this is much more complex, in both fashion and sizes.

    2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?


    Google Glass didn't use the camera for AR. It didn't do AR. It was just a typical HUD (heads-up display) running Android. HUDs have been around for a long time. Hell, you play games that have a type of HUD.
    This is a good point. It would be generous to call Google Glass even "AR lite." It was basically a HUD.

    Which raises a lot of questions about how well Apple (or anyone) will be able to make a lightweight, fashionable, "true" AR headset with acceptable battery life. The processing power required for AR is not exactly small, and that processing drains the battery. And nobody wants to wear a giant battery on their head.