2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
I've been thinking about this a lot lately, at least in terms of a wearable camera on the Apple Watch. Using the Series 3 with LTE, without your iPhone nearby, is a great experience except for one glaring omission: the lack of a camera. And yet, having a discreet camera on your wrist is indeed creepy.
There is also the problem of which way the camera faces. Toward you, for FaceTime and selfies? Or away from you, to take (probably low-quality) pics? Or do they do two cameras, or some sort of clunky physical rotating camera?
I wonder if the solution to both of these problems is the same: one camera that faces the user (embedded in the screen, if they could ever do that). This allows for FaceTime and selfies and the like, and taking pictures of the world around you would then require some sort of broad gesture (i.e., telling the watch to take a picture, but the image isn't actually snapped until you rotate your wrist outward in a rather obvious motion, then turn your wrist back to view the image). I wonder if a broad gesture like that, combined with some sort of visual cue (a light flash or whatever), would turn down the "creep" factor by letting people around you know that you are snapping pictures.
Perhaps the solution for this with augmented reality glasses would be something similar — a process that requires the user to press a button on the device itself, so it is clear to those around you that you are interacting with your glasses, accompanied perhaps by a flash of light or something that says "hey, picture time." It's not perfect, but it's arguably more obvious than someone on the train acting like they are using their phone for other purposes while they are actually snapping pics.
Cameras in everything, including our glasses, are seemingly inevitable. And younger generations won't care as much. But we're still in some sort of uncomfortable limbo — which is precisely why the Apple TV Siri remote requires you to hold a button to invoke Siri. I suspect Apple will continue to go slow and tread lightly. Perhaps the first Apple AR glasses won't even allow the user to capture pictures. The first iPad didn't have a camera.
I'm still not sold on a camera in Apple Watch. I'm not saying it won't happen, but when I try to envision it I can't help but see something that is awkward to use.
As you mentioned, the first iPad didn't have a camera. And despite past attempts by Apple to show that many people use their iPads as cameras, I still don't see it happening very often, especially compared to people using cell phones to take pictures.
In my view it would probably make more sense for Apple (or any company, really) to release glasses without cameras at first. I think Google would have been more successful if Glass didn't come with a camera. Maybe that's the way to go at the start, to get people to accept the new technology without getting the backlash from the "you're trying to take my photo in a public restroom" crowd and all the noise that creates. And as I mentioned earlier, a camera may not be a requirement for AR.
I'm not sold on it either, but it is the one glaring thing that I miss when I leave my phone behind. While I think its inclusion in a future watch is inevitable, how to integrate it without ruining the aesthetic of the device or making it "creepy" is a major challenge for Apple.
I would disagree with you on people using their iPad as a camera. My parents travel a lot and own a compact digital camera and an iPhone 7, but they recently told me they take many (most?) of their travel pics on an iPad (2017 9.7" model I bought them). While I find that to be rather silly and unnecessary, don't underestimate the convenience of a big screen. Anecdotally, in my own travels and just seeing tourists around NYC, I have also seen many people using iPads for photography. In fact, I once saw a helicopter tour company that banned use of iPads for pictures in flight — because the giant tablet blocked the view for others on the heli.
If we're talking about a FaceTime camera in the glasses, I'm not sure that's possible due to distance. If we're talking about in the Watch, then perhaps it could be placed within the display once we move to micro-LED or its successor where there's a lot of extra space between the bright LEDs for other types of "pixels" to reside. Building it into the display may also remove much of the stigma over having a glossy, curved lens pointing at someone.
I don't think there would be a FaceTime camera in the glasses, no. But it would make sense in the watch (even if the battery life would be abysmal: a simple phone call already drains the Watch in about an hour). And one camera in the watch for both FaceTime and "outward"-facing pics would be aesthetically pleasing, simple to use, and would actually help cut down on the "creep" factor. I guess society would just have to get used to the idea of someone turning their wrist at you to snap a pic. Seems like a rather obvious motion to me, though, and less discreet than using your phone.
Societies are collectively made up of idiots (and I obviously include myself as part of many societies). There are people on these forums who are "concerned" about the Amazon Echo and HomePod listening for a keyword without understanding that recording is a different task, or that they've been using devices in their homes and on their person that have microphones which could be recording and sending data to servers without their knowledge, not to mention camera data, keystrokes, general app use, and screenshots.
It took far too many Mac OS versions before Apple even made apps ask for access to the data in Contacts (née Address Book), which for me holds plenty of data that could be used to steal my identity. For example, to make accessing a credit card account easier, I've set up that company's phone number in Contacts with a semicolon (the "wait" character), followed by my entire 16-digit card number, so I can press a button to have it sent as touch tones for authentication without having to pull out my wallet and type the number in manually. Before Apple added that prompt, I'm not sure I really considered that my Address Book, and pretty much everything else under my user account ('~/') and Library ('~/Library/'), was readable by random apps I might install.
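For anyone curious about the trick above, the dial string is just the number, a semicolon, then the digits to send later. A minimal sketch (the phone number and card number below are made-up placeholders, not real values):

```python
# In Apple's Contacts, ";" means "wait": dialing stops there until you tap
# a button, at which point the remaining digits are sent as DTMF tones.
# ("," would instead insert a short automatic pause.)

def make_dial_string(support_number: str, account_digits: str) -> str:
    """Compose a Contacts-style dial string: number, wait, then tones."""
    return f"{support_number};{account_digits}"

# Placeholder values for illustration only (4111... is the standard test card number).
entry = make_dial_string("18005550100", "4111111111111111")
print(entry)  # 18005550100;4111111111111111
```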
Not to get all tinfoil hat up in here, but some of that paranoia is justified. Who watches the Watchmen?
1) Considering what they've done with the BT, WiFi, and now cellular in the Apple Watch, I think it's plausible that Apple can come up with smart glasses that do to the eyewear industry what the Apple Watch is doing to the watch industry. I'd love to see this directly affect Luxottica, but wearability is much more complex here, in terms of both fashion and sizing.
2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
Google Glass didn't use the camera for AR. It didn't do AR. It was just a typical HUD (heads-up display) running Android. HUDs have been around for a long time. Hell, you play games that have a type of HUD.
That misses the point. If Google Glass had been exactly what it was but with AR as an additional feature, it still wouldn't have been acceptable. It still has the creep factor of a camera pointed at the subject. A camera that operates outside the visible spectrum* would likely not carry the same stigma, but even that might be too much. We've even seen some crazy theoretical work on ambient light sensors being used for imaging, which could be brought into the conversation as an argument for why even an IR camera could be an issue.
* Yes, I'm aware that some people can see some frequencies of IR, but it's still not considered part of the visible spectrum.
“Unlike other headsets that require a smartphone or other devices to power them, Bloomberg claims that Apple's device will have its own display and processor.”
I told people this long ago on here and they couldn’t wrap their head around it. They kept saying “it will need an iPhone to work”. WHY?
Apple isn’t Samsung or Google. They don’t make mechanical, clunky, awkward technology. There’s absolutely no reason to strap an iPhone to your face when Apple owns technology like the A11/S3/W2 chips.
I agree AppleInsider needs to stop showing Google Glass. It makes it seem like Apple is trying to copy that nerdwear.
I also gave the same possible solution months ago for the “creepiness”: just don’t allow the glasses to record. There’s little reason to use glasses to record when you have an awesome camera in your pocket. I mean really, why do you need to record from your glasses? I can see third parties offering an obvious attachment to make it possible. Obvious as in, others would know you have a camera on your glasses.
The other possibility, as another commenter said, is that the glasses become popular and no one cares about the recording feature. But I stick by my solution and believe Apple will leave out recording, at least for the first gen or until the glasses become more popular.
Um… the Apple Watch has "its own display and processor" yet "needs an iPhone to work." These are obviously not mutually exclusive, and I'd bet any money that the next wearable from Apple will also need an auxiliary device the way the Apple Watch does, because it's unreasonable to do everything you need to do on that small display.
Google Glass didn't use the Camera for AR. It didn't do AR. It was just a typical HUD. (Heads Up Display) running Android. Been around for a long time. Hell, you play games that have a type of HUD.
This is a good point. It would be generous to call Google Glass even "AR lite." It was basically a HUD.
Which raises a lot of questions on how well Apple (or anyone) will be able to make a lightweight, fashionable, "true" AR headset with acceptable battery life. The processing power required for AR is not exactly small, and that requires battery consumption. And nobody wants to wear a giant battery on their head.
A battery breakthrough may be one of the technologies that Tim Cook is referring to that isn't ready yet. Another is probably microLED.
I can imagine that another would be miniaturization of the cameras. There would have to be two of them facing forward, roughly at interpupillary distance (~65mm), for stereo "scene acquisition." That's how the glasses would know (by tracking your eyes) where exactly in the 3D point field you are looking, so the AR engine could apply its overlay of data (which might well be auditory instead of visual). @Soli's idea that the out-facing cameras could be embedded in the microLED screens is interesting, and I think Apple has patents for this. It would mean the parallax could be adjustable across the span of the screen/lens, so accurate stereo could be acquired at closer and farther distances than we can normally perceive depth. That would be a trip.
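The stereo geometry behind that ~65mm figure is simple triangulation. A rough sketch (the focal length and disparity values are illustrative assumptions, not specs of any real device):

```python
# Classic pinhole stereo: depth Z = f * B / d, where f is the focal length
# (in pixels), B the baseline between the two cameras, and d the disparity
# (how far the same point shifts between the left and right images).

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

BASELINE_M = 0.065  # ~65mm, the interpupillary distance mentioned above
FOCAL_PX = 1000.0   # assumed focal length in pixels

# An object 2m away would shift f*B/Z = 1000 * 0.065 / 2 = 32.5 px
# between the two camera images:
print(depth_from_disparity(FOCAL_PX, BASELINE_M, 32.5))  # 2.0 (metres)
```

Note that a wider effective baseline (the adjustable-parallax idea) produces larger disparities for distant objects, which is exactly what would extend the usable depth range.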
The whole technology will be a trip. It's way too early, I think, to worry about creep factors. Apple is far more aware than other companies of what sort of experience their devices will provide, since that is their major focus, and they test with the psychology of privacy and good manners in mind, to the extent that they're said to be notoriously behind in context-aware AI vis-à-vis Google, Amazon, and Facebook. Maybe they'll even make it impossible to record video, or maybe the entire glasses frame will light up when recording, or some such.
edit: I see @Cali also thinks that video recording may not be enabled, at least at first. Thinking about this, it does seem a shame to waste all the 3D capability that stereo AR glasses will provide and still remain stuck with our flat screens for recorded media. Maybe we'll let the benefits outweigh the privacy concerns. This kind of change in outlook is hard to predict, as we've seen through the entire evolution of the pocket computer.
I think Apple's glasses will contain just enough hardware/software to act as a display driver for the iPhone (perhaps using AirPlay-like tech?) and as a collector of AR sensor/camera data to be sent back to the iPhone. Not sure if Bluetooth LE has enough bandwidth for both, but if so, that could be the data channel. With no onboard CPU (at least not significant CPU), memory, wifi, etc. to drive, I would think that Apple could stuff enough battery into the glasses to make it last all day.
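On the Bluetooth LE bandwidth question, some back-of-envelope arithmetic (the throughput and bitrate figures below are ballpark assumptions, not measurements) suggests BLE alone would struggle with the video side:

```python
# Rough numbers: BLE 5's 2M PHY tops out around 1.4 Mbit/s of actual
# application throughput, while even a modest compressed 720p30 video
# stream (H.264-class) typically needs a few Mbit/s.

BLE_APP_THROUGHPUT_MBPS = 1.4  # assumed usable BLE application throughput
VIDEO_720P30_MBPS = 3.0        # assumed compressed camera-stream bitrate

headroom = BLE_APP_THROUGHPUT_MBPS - VIDEO_720P30_MBPS
print(f"headroom: {headroom:.1f} Mbit/s")  # headroom: -1.6 Mbit/s (not enough)
```

Sparse sensor data (IMU, eye tracking, depth points) could plausibly fit in a BLE channel, but streaming camera frames back to the phone would more likely need a Wi-Fi-class link, which fits the AirPlay-like idea.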
ihatescreennames said: So, couldn’t the camera potentially be eliminated while the other sensors still do their jobs in the same fashion?
For many applications the answer would be yes. Admittedly, it would limit what the glasses could do, and the techno-nerds would criticize Apple for once again 'crippling' a product by leaving out a feature that 'everyone else already has.' But it won't matter.
You’re still going to need some kind of sensor that can detect objects and read signs, even if it’s not a camera per se. Let’s not forget the blind. Apple has made the iPhone quite accessible for a blind person to use. Imagine a cool pair of shades that can provide audio feedback on where a person is and what objects (people, cars, buildings) are nearby.
A battery breakthrough may be one of the technologies that Tim Cook is referring to that isn't ready yet. Another is probably microLED.
Unlike with the Watch, where the bands can be very versatile in design and (superficial) function and are generally very malleable, glasses may offer more space for a larger overall battery relative to the other electronics because of the temple pieces (the arms of the glasses), which Apple could make to a standard design or designs that house the battery. They may need a new hinge system for transferring power (and data), but I think that's an easier technological hurdle than creating flexible Watch bands that contain consumer electronics.
2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
I think Google got around that with their latest version of Glass, which has a red LED that lights whenever the camera is active, like a security camera. I think even better might be a two-color LED, say amber and the bright blue nowadays that's hard to miss, which pulses between the two colors to attract your attention, with maybe a constant amber glow around the sides so people know the camera is on even when it's not aimed at them at the moment.
2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
It’s easy to solve. The camera won’t be able to record. At all. It will be marketed as a reality device, as in “only for right now,” and once people have embraced it the embargo will be lifted.
I think that's sellable, but if that includes streaming video then there's always the possibility that someone could record, so isolating it so that the iPhone simply pushes overlay data to your lens(es) could work.
2) You can't have AR without a camera, but as we saw with Google Glass a camera just makes it creepy. Can Apple overcome that stigma?
One possibility could be that they use a device that doesn't take actual photos but instead object data points. Something similar to Apple's fingerprint sensor where it reads data points instead of an image of your fingerprint.
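The Touch ID analogy above is about storing an irreversible representation rather than the raw image. A toy sketch of that idea (the function and values are hypothetical, purely to illustrate "data points, not photos"):

```python
# Keep only a one-way digest of a captured frame; the pixels themselves
# are never stored, so no photo can be reconstructed from what's kept.
import hashlib

def frame_signature(pixels: bytes) -> str:
    """Return an irreversible digest of the frame data."""
    return hashlib.sha256(pixels).hexdigest()

sig = frame_signature(b"\x00\x01\x02placeholder-pixel-data")
print(len(sig))  # 64 hex characters, regardless of frame size
```

A real AR system would keep structured geometry (feature points, depth maps) rather than a hash, but the privacy property is the same: what's retained can't be turned back into a picture of a person.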
I don't disagree that an AR visor could work without a forward-looking camera, but the lack thereof would impact the AR Visor's capabilities. I suspect Apple's will have a camera.
I assert that the "creepiness factor" of a forward-looking camera is linked to the assumption that said camera can and will be used to record. So far, this has been true. In fact, Google touted Glass' ability to record and share photos and videos as essentially its best feature, and made the camera super overt; both decisions were probably mistakes. Google might be forgiven, however, because Glass wasn't an AR Visor.
I suspect that, IF ...
The AR Visor's camera(s) is (are) seamlessly blended into the design (and, therefore, "hidden" or covert), and ...
There is 100% trust that said camera(s) cannot record ...
... THEN the creepiness factor could be effectively reduced to zero.
If Apple can make an AR Visor that looks like a normal pair of glasses then 95% of people won't even begin to think of the "creepy recording issue". If Apple can guarantee that the AR Visor's camera(s) cannot record, then it'll likely address 4% of the remaining 5%. There will always be the 1% who are skeptical and fearful ... see TouchID / FaceID launches.
I think you guys are missing some obvious things here.
I don't ever see the continuous wear of AR glasses being socially acceptable. Even if no camera is involved, people will be put off by your preoccupation with the AR data. I kind of liken this to laptop usage: if someone stops by to chat while you are working, it is considered polite to take your focus off the screen, and closing the lid can be seen as giving the person 100% of your attention.
Rather, I see AR glasses finding marketing success if they are sold as a tool, just like a laptop. You put on the glasses when the activity warrants their use. There are literally thousands, probably millions, of cases where this would be advantageous. Imagine a technician working on a very complex machine, a jet engine for example: an AR/VR solution could handle things like parts lookup, recall notices, calibration or tolerance values, and assembly or disassembly procedures in a very fluid manner.
I don't see AR glasses being successful for general daily wear. At least at this point I don't see a useful app on the horizon. What I see them being useful for is the multitude of uses as a tool, everything from an interior designer demoing solutions for a client to AR glasses replacing HUDs on aircraft.
I liken the acceptance of AR glasses to the acceptance of an electrician walking around with his tool belt strapped to his waist. The tool belt is completely accepted at the job site, effectively being a professional requirement. That same tool belt might not be accepted outside the work zone. Some of that is social, some practical (they get in the way), but the point is that people separate their tools from their lives outside of work. For AR/VR glasses to be accepted, they will need to be seen as tools and handled as such. That means daily wear by Joe Q. Public will be shunned. That still means millions sold.
Well, eventually it will mean millions sold. The problem is the tech to pull this off. Frankly, I see computational and storage issues as a big factor here. Storage becomes a problem because network latency will be extremely frustrating to users, especially when massive data sets have to be digested. Even coupled to an iPhone, you will still have issues with current tech. Likewise, very good processing power is needed to process all of that data and present it to the user. Basically you need better-than-A11 performance.
Topping everything off is the delivery of viable apps. I know they will come eventually, but you need to manage consumers' expectations. In any event, Apple will need a couple of compelling apps to spark imaginations.
Comments
I would disagree with you on people using their iPad as a camera. My parents travel a lot and own a compact digital camera and an iPhone 7, but they recently told me they take many (most?) of their travel pics on an iPad (2017 9.7" model I bought them). While I find that to be rather silly and unnecessary, don't underestimate the convenience of a big screen. Anecdotally, in my own travels and just seeing tourists around NYC, I have also seen many people using iPads for photography. In fact, I once saw a helicopter tour company that banned use of iPads for pictures in flight — because the giant tablet blocked the view for others on the heli.
It wasn't until too many Mac OS versions ago that Apple even made it so apps would have to ask for access to the data in Contacts nee Address Book, of which I have plenty of data that could be used to steal my identity. For example, to make access to a CC easier I've set up that company's phone number in Contacts with a semicolon to Wait, and then my entire 16 digit card number so I can press a button to have it automatically sent as tones for authentication without having to pull out my wallet and type the number in manually. Before they did that I'm not sure I really considered that my Address Book and pretty much everything else under my User Account '~/' and Library '~/Library/' was readable by random apps I may install.
https://gizmodo.com/how-facebook-figures-out-everyone-youve-ever-met-1819822691
* Yes, I'm aware that some people can see some frequencies of IR, but it's still not considered the visual spectrum.
I told people this long ago on here and they couldn’t wrap their head around it. They kept saying “it will need an iPhone to work”.
WHY?
Apple isn’t Samsung or Google. They don’t make mechanical, clunky, awkward technology. There’s absolutely no reason to strap an iPhone to your face when Apple owns technology like the A11/S3/W2 chips.
I agree AppleInsider needs to stop showing Goog Glass. Makes it seem like Apple is trying to copy that nerdwear.
I also gave the same possible solution months ago for the “creepiness”: just don’t allow the glasses to record. There’s little reason to use glasses to record when you have an awesome camera in your pocket. I mean really, why do you need to record from your glasses? I can see third parties offering an obvious attachment to make it possible. Obvious as in, others would know you have a camera on your glasses.
The other possibility, as another commenter said, is that the glasses become popular and no one cares about the recording feature. But I stick by my solution and believe Apple will leave out recording, at least for the first gen or until they become more popular.
I can imagine that another would be miniaturization of the cameras. There would have to be two of them facing forward and roughly at interpupillary distance, ~65mm, for stereo "scene acquisition," which would be the method by which the glasses would know (by tracking your eyes) where exactly in the 3D point-field matrix you are looking, so the AR engine could apply its overlay of data (which might well be auditory instead of visual). @Soli's idea that the out-facing cameras could be embedded in the microLED screens is interesting, and Apple has patents for this, I think. It would mean that the parallax could be adjustable across the span of the screen/lens, so accurate stereo could be acquired at closer and farther distances than we can normally see depth. That would be a trip.
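The stereo "scene acquisition" described above comes down to classic triangulation: with two forward cameras at a known baseline, depth falls out of the pixel disparity between the two images. A minimal sketch, assuming an illustrative ~65 mm baseline and a made-up focal length (neither is an Apple spec):

```python
# Rough sketch of the stereo-triangulation math behind a two-camera rig.
# Baseline and focal length are illustrative assumptions, not Apple specs.

def depth_from_disparity_m(baseline_mm: float, focal_px: float, disparity_px: float) -> float:
    """Pinhole stereo: depth = focal * baseline / disparity (converted to meters)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px / 1000

BASELINE_MM = 65.0   # roughly interpupillary distance
FOCAL_PX = 1000.0    # hypothetical camera focal length in pixels

# An object whose image shifts 10 px between the two views is ~6.5 m away:
print(depth_from_disparity_m(BASELINE_MM, FOCAL_PX, 10.0))   # 6.5
# A 130 px shift puts it at arm's length:
print(depth_from_disparity_m(BASELINE_MM, FOCAL_PX, 130.0))  # 0.5
```

Note how disparity shrinks quickly with distance, which is why an adjustable or wider effective baseline (cameras spread across the span of the lens) would extend the usable depth range.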
The whole technology will be a trip. It's way too early to worry, I think, about creep factors. Apple is far more aware than other companies of what sort of experience their devices will provide their customers, since that is their major focus and they test with the psychology of privacy and good manners in mind — to the extent that they're said to be notoriously behind in context-aware AI, vis-à-vis Google, Amazon or Facebook. Maybe they'll even make it impossible to record video, or maybe the entire glasses frame will light up when recording, or some such.
edit: I see @Cali also thinks that video recording will maybe not be enabled, at least at first. Thinking about this, it does seem a shame to waste all the 3D capability that stereo AR glasses will provide and then still remain stuck with our flat screens for recorded media. Maybe we'll let the benefits outweigh the privacy concerns. This kind of change in outlook is hard to predict, as we've seen through the entire evolution of the pocket computer.
I think Apple's glasses will contain just enough hardware/software to act as a display driver for the iPhone (perhaps using AirPlay-like tech?) and as a collector of AR sensor/camera data to be sent back to the iPhone. Not sure if Bluetooth LE has enough bandwidth for both, but if so, that could be the data channel. With no onboard CPU (at least not significant CPU), memory, wifi, etc. to drive, I would think that Apple could stuff enough battery into the glasses to make it last all day.
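The Bluetooth LE bandwidth question above is easy to sanity-check with back-of-envelope arithmetic. A sketch under stated assumptions (the resolution, frame rate, and BLE throughput figures are all rough illustrative numbers, not measurements):

```python
# Back-of-envelope check on streaming raw display frames over Bluetooth LE.
# All figures below are rough assumptions for illustration, not measurements.

def raw_video_mbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed video bit rate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1_000_000

BLE_PRACTICAL_MBPS = 1.4  # rough real-world throughput ceiling for BLE

# Even a modest 640x400 stream at 60 fps dwarfs that ceiling:
needed = raw_video_mbps(640, 400, 60)
print(f"{needed:.0f} Mb/s needed vs ~{BLE_PRACTICAL_MBPS} Mb/s available")
```

So uncompressed video over BLE looks out of reach by a couple of orders of magnitude; either heavy compression or a faster Wi-Fi-class link (as AirPlay uses) would seem necessary for the display-driver model.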
You’re still going to need some kind of sensor that can detect objects and read signs, even if it’s not a camera per se. Let’s not forget the blind. Apple has made the iPhone quite accessible for a blind person to use. Imagine a cool pair of shades that can provide audio feedback on where a person is and what objects (people, cars, buildings) are nearby.
I assert that the "creepiness factor" of a forward-looking camera is linked to the assumption that said camera can / will be used to record. So far, this has been true -- in fact, Google touted Glass' ability to record and share photos / videos as essentially the best feature & made the camera super overt ... both decisions were probably mistaken. Google might be forgiven, however, because Glass wasn't an AR Visor.
I suspect that, IF ...
... THEN the creepiness factor could be effectively reduced to zero.
If Apple can make an AR Visor that looks like a normal pair of glasses then 95% of people won't even begin to think of the "creepy recording issue". If Apple can guarantee that the AR Visor's camera(s) cannot record, then it'll likely address 4% of the remaining 5%. There will always be the 1% who are skeptical and fearful ... see TouchID / FaceID launches.
https://www.bloomberg.com/news/articles/2017-11-08/apple-is-said-to-ramp-up-work-on-augmented-reality-headset
With a special chip, operating system, and display, these are no longer mere “glasses”; this is a true VR solution.
I don't ever see the continuous wear of AR glasses being socially acceptable. Even if no camera is involved, people will be put off by your preoccupation with the AR data. I kinda liken this to laptop usage, where if someone stops by to chat while you are working, it is considered polite to take your focus off the screen; closing the screen can be seen as giving the person 100% of your attention.
Rather, for AR glasses I see marketing success if they are sold as a tool, just like a laptop. Thus you put on the glasses when the activity warrants their use. There are literally thousands, probably millions, of cases where this would be advantageous. Imagine a technician working on a very complex machine, a jet engine for example: an AR/VR solution could handle things like parts lookup, recall notices, calibration or tolerance values, and assembly or disassembly procedures in a very fluid manner.
I don't see AR glasses being successful for general daily wear. At least at this point I don't see a useful app on the horizon. What I see them being useful for is the multitude of uses as a tool, everything from an interior designer demoing solutions for a client to AR glasses replacing HUDs on aircraft.
I liken the acceptance of AR glasses to the acceptance of an electrician walking around with his tool belt strapped to his waist. The tool belt is completely accepted at the job site, effectively being a professional requirement. That same tool belt might not be accepted outside of the work zone. Some of that is social, some practical (they get in the way), but the point is that people separate their tools from their lives outside of work. For AR/VR glasses to be accepted, they will need to be seen as tools and handled as such. That means daily wear by Joe Q. Public will be shunned. That still means millions sold.
Well, eventually it will mean millions sold. The problem is the tech to pull this off. Frankly, I see computational and storage issues as the big factors here. Storage becomes a problem because network latency will be extremely frustrating to users, especially when massive data sets have to be digested. Even if coupled to an iPhone, you will still have issues with current tech. Likewise, very good processing power is needed to process all of that data and present it to the user; basically you need better than A11 performance.
Topping everything off is the delivery of viable apps. I know they will come eventually, but you need to manage consumers' expectations. In any event, Apple will need a couple of compelling apps to spark imaginations.
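The latency concern in the comment above can be made concrete with simple frame-budget arithmetic. A minimal sketch, assuming an illustrative 60 fps display and a 50 ms cellular round trip (both numbers are hypothetical, not measurements):

```python
# How many display frames elapse while waiting on a network round trip.
# The fps and latency figures are illustrative assumptions, not measurements.

def frames_elapsed(round_trip_ms: float, fps: int = 60) -> float:
    frame_budget_ms = 1000 / fps          # ~16.7 ms per frame at 60 fps
    return round_trip_ms / frame_budget_ms

# A 50 ms cellular round trip spans about three whole frames, so AR
# overlay data would have to be cached or computed on-device rather
# than fetched live for every frame:
print(round(frames_elapsed(50), 1))
```

This is why on-device horsepower (better than A11, as the comment argues) matters more than raw network speed: per-frame work simply cannot wait on the network.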