mattinoz

About

Username: mattinoz
Joined:
Visits: 377
Last Active:
Roles: member
Points: 3,445
Badges: 1
Posts: 2,691
  • Apple still has a lot of secret apps for Vision Pro in the works

    What, no augmented calculator?
    It could not only detect a calculator on your desk and read the result from the screen, but also be used with other calculating items: scale rule, slide rule, abacus....
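    Reading the result off a physical calculator's display is mostly solved tech already. A rough Swift sketch of the idea using Apple's Vision text recognizer; the function and the digit filter are my own invention, not anything Apple has announced:

```swift
import Foundation
import Vision

// Hypothetical sketch: pull the numeric readout off a calculator's
// display from a camera frame using Vision's text recognizer.
func readCalculatorDisplay(from image: CGImage,
                           completion: @escaping (String?) -> Void) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion(nil)
            return
        }
        // Keep only strings that look like a numeric readout, e.g. "3445.7".
        let readout = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .first { $0.range(of: #"^-?\d+([.,]\d+)?$"#,
                              options: .regularExpression) != nil }
        completion(readout)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

    A slide rule or abacus would be harder: there is no text to read, so you'd be estimating the positions of sliders or beads instead.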
  • Why Apple Vision Pro has a chance at being the future of work

    avon b7 said:
    mattinoz said:
    avon b7 said:
    mattinoz said:
    avon b7 said:
    chutzpah said:
    avon b7 said:
    tmay said:
    avon b7 said:
    danox said:
    avon b7 said:
    zomp said:
    Apple always holds secrets so that we and, more importantly, the competition can only guess at Apple's roadmap for Vision Pro. Apple is a hardware company, and it's up to the software creators to imagine new ways to use the device. So rather than Apple promoting something and having folks say "How are you going to do that?", they allow us to imagine the possibilities so that we can create new things. Hence "Think Different". Apple will always continue to modify the hardware as the needs arise. At the moment Vision Pro is fantastic, but they have no idea what we will dream up and what more needs to be added in hardware and software updates. That's what makes Apple so amazing! They leave it up to us to create the future of their devices.
    The roadmap is the same for everyone. 

    Everyone knows where everyone wants to go. It's how they get there and at what cost that is more important. 

    Moving the screen towards you for a more immersive experience is the most basic goal. Interacting with a 3D-like environment is part of that. Then the audio/visual experience itself (resolution/quality etc). The computer experience. Interaction with the external environment. 

    Size, battery life, 'speed' etc. 

    It's not exactly a new field. 


    But it is a new Apple ecosystem, and in time there will be daily announcements from Apple developers highlighting the fact that they have ported their software over to Apple VisionOS, and that drumbeat tsunami will get louder and louder as we get closer to Apple's Vision Pro release date. The R1 SoC is also new; what are its full specs and capabilities?

    The competition, if there is any, won't have anything like that over the next six months: a long, slow, steady drumbeat of a rising army of VisionOS developers.

    https://developer.apple.com/news/?id=h3qjwosp

    However, none of that changes the facts.

    Everyone is moving on the same roadmap and with the same end goals.

    That industry roadmap was there years before Apple even announced the Vision Pro. Now Apple is officially on it. 

    None of us may be big Zuckerberg fans but what he said the other day wasn't really off the mark. 

    If you re-watch the presentation, how much was truly 'new'? Or not already planned? 

    If anything, the true upshot was that it tempered people's expectations, which is no bad thing. Perhaps some people were simply getting ahead of themselves. 

    The R1 is a dedicated chip for specific tasks. Those kinds of specific chips are all over the place.

    Here’s one for RF processing which could very well end up in an XR device at some point:

    https://www.gizchina.com/2023/03/06/honor-c1-rf-chipset-launched-sets-a-new-benchmark/

    If there are no general purpose chipsets up to a particular task then companies tend to bake their own. They might be for in-house use like those from Honor, Huawei, Google, Apple etc, or made available on the open market like those from Qualcomm, Broadcom, Mediatek, Sony etc. 

    You mention the full specs of the R1 but what are the full specs of the Vision Pro? 

    With most tech announcements, what is not said (or done) is just as important as what is said and done. 

    We know there is no cellular option, but it was surprising, to me at least, that they kept most of the presentation in the AR realm and not so much the VR realm. No one knows how many of the announced features actually work, because those who had hands-on access were not allowed to use them. 

    Now, for a non-production unit, that is the order of the day, especially as those units would be running early software implementations. But the reality is that no one got to try out some of the tentpole features. 

    Ecosystems are just ecosystems.

    They serve a purpose and there is a lot to say on that subject but not really relevant here. 

    The Vision Pro will just slip into the Apple ecosystem. But then again, why wouldn't it?

    How well? No one knows yet. 

    It is true that developers are an important component of many ecosystems and of course the Vision Pro was announced now precisely to be able to widen and hone the developer support. Marketing was another factor. That was absolutely necessary. 

    My take is that the package as a whole looks great. The finesse. It all comes at a price but that has to be understood. Let those with the disposable income and the will to be early adopters iron out the wrinkles. 

    The roadmap, though. There's not much new there. 

    Oh boy; Honor makes a new RF chipset that is slightly faster than Apple's iPhone RF chipset, and no one cares except avon b7, who is beyond excited.

    Meanwhile, Apple, again, "stuns" competition with custom R1 sensor processor, on top of the M2 processor.

    Competitors: "Everything is nominal", "Look at our user base and marketshare", "Look at our affordable prices", "Look at our roadmap", "Ecosystems are just ecosystems"; state Apple is doing nothing that they haven't already explored, all while they quietly watch any future profits shrink to nothing.
    You miss the point as only you can. 

    Why did Honor even develop the chip if slight gains were the order of the day? 

    You have deliberately ignored the point which is actually worse than just missing it!

    The point was that if you can't get the results you want with off-the-shelf solutions (even slightly modified ones) you bake your own. 

    It's what Honor did. 
    It's what Apple did. 
    It's what Huawei did. 

    It's what lots of companies do!

    Absolutely nothing out of the ordinary.

    Right? 
    I don't believe any other major VR headsets* have used eye tracking as a primary mode of interaction. They've all used hand-operated controllers as the primary. Likewise focusing on being a consumer-level productivity device**. And the approach to passthrough is so far ahead it makes others look like they haven't been trying at all.

    It is very much out of the ordinary.



    * Google Cardboard is a slight exception, though it was so much less ambitious that it barely counts.
    ** HoloLens was always targeted at specialist productivity, not mass appeal.
    The reason active eye tracking is not commonplace in VR headsets is simply because controllers are cheap and extremely functional.

    Eye trackers are very 'old' technology and have been up for inclusion on headsets for a while. They are simply more expensive to implement actively (as opposed to passively, which is available on some consumer VR headsets) because, once you remove the controllers, you need to add another way (gestures, for example) to resolve the same problems. 

    That isn't technology-related per se. It's more of a cost consideration. I've been working in UX for a few years now and eye trackers are essential.

    As mentioned above, gesture recognition is another area brought on through the absence of controllers but again, gesture interaction brings cost considerations. Gesture recognition is also a well worn technology. 

    If you reduce the importance of cost in your consumer-focused product, the door opens to other options. That is what Apple has chosen to do with the initial release, although I'm sure an 'SE' version is being planned, and IMO sans the bells and whistles. 

    Zuckerberg and many other companies have spoken about these aspects for some time. There have been lots of concept devices, prototypes and whatnot. The problem is price and mass consumer appeal for a device that will not get anywhere near the usage time of a phone. 

    We have known where everyone wants to go for a very long while and it's the same place Apple wants to go. 

    I believe Xiaomi announced something at MWC and even used the word 'spatial' in computing terms. Quite logical when you consider that VR is spatial by definition. 
    Controllers are not "extremely functional": they make me work how they want to work and limit me to what they want to do. They are limiting in function. Still, they are cheap and easy, and basically just lazy. 


    Controllers do exactly the same thing as gestures and from exactly the same place.

    Placement and action. 

    Your comment doesn't make any sense. 

    Gestures make you work how they want to work and limit you in the same way as controllers do. 


    The pain points of controllers are basically the batteries and sometimes the breakdown in communication with the host device. 

    The pain points of gestures are you need line of sight with the sensors doing the gesture interpretation and the accuracy of the interpretation itself. 

    I imagine (I haven't really thought about it) that to avoid false positives with accidental hand fidgeting, a mechanism to 'wake' the interpretation system might be needed. 
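    Something like this rough sketch is the kind of gate I mean. Everything in it (names, thresholds) is invented for illustration, not any real headset API:

```swift
import Foundation

// Sketch of a 'wake' gate for gesture interpretation: ignore hand
// motion until a deliberate pose has been held, then accept gestures
// for a short window. Names and thresholds are hypothetical.
struct GestureGate {
    private enum State {
        case asleep
        case listening(until: Date)
    }

    private var state: State = .asleep
    private var wakePoseStart: Date?

    /// How long the wake pose must be held before we start listening.
    let wakeHold: TimeInterval = 0.4
    /// How long we keep listening after the last accepted gesture.
    let listenWindow: TimeInterval = 3.0

    /// Feed in the current frame; returns a gesture only when awake.
    mutating func process(poseIsWakePose: Bool,
                          candidateGesture: String?,
                          now: Date = Date()) -> String? {
        switch state {
        case .asleep:
            // Require a held pose, which filters out idle fidgeting.
            if poseIsWakePose {
                let start = wakePoseStart ?? now
                wakePoseStart = start
                if now.timeIntervalSince(start) >= wakeHold {
                    state = .listening(until: now.addingTimeInterval(listenWindow))
                }
            } else {
                wakePoseStart = nil
            }
            return nil

        case .listening(let until):
            if now > until {
                state = .asleep
                wakePoseStart = nil
                return nil
            }
            if let gesture = candidateGesture {
                // Each accepted gesture extends the listening window.
                state = .listening(until: now.addingTimeInterval(listenWindow))
                return gesture
            }
            return nil
        }
    }
}
```

    The point is only that deliberate intent is separated from fidgeting before any gesture is acted on.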



    Controllers may have their uses, but if you need to do anything beyond them, the device is limited. 
    Like the Pencil with the iPhone, or more so the iPad: it's useful, but if it were required to use the iPhone or the iPad, both devices would have been duds. 

    Same here: if the gestures are well thought out, you will learn by discovery for the most part, and developers aren't bound by the buttons built into the device. They can extend gestures at will, even add complementary controllers if that would suit. 

    Imagine X-Plane with cardboard cutouts of the cockpit controls beyond the basic ones like throttle and yoke. Now you're leveraging muscle memory without needing to build $1,000 worth of replicas. There are systems that work now with controllers, but it just isn't the same.

    Rinse and repeat across hundreds of uses, and the price difference swings to the more capable device.
    I think you're mixing things up a bit here and getting ahead of yourself. 

    The gestures that Apple demoed were limited and replicated the current controllers on the market today. The eye tracker handles placement. 

    That means in terms of functionality the different proposals are identical and each have their pain points. 

    So, my reading is that the eye tracker places the focus and then you use gestures to click, drag etc

    Is there any gesture on the announced product that cannot be achieved using interface elements and a controller? 

    Swipe, pinch etc? 
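    For what it's worth, that reading matches how the standard gestures appear to reach SwiftUI code on visionOS: looking at a view targets it, and the pinch fires the gesture. A minimal sketch; the view content is my own, not from the presentation:

```swift
import SwiftUI

// Minimal sketch of the gaze-targets, pinch-activates model on
// visionOS: standard SwiftUI gestures are driven by eyes and hands.
struct PlacementDemo: View {
    @State private var offset: CGSize = .zero

    var body: some View {
        Text("Look at me, then pinch")
            .padding()
            .hoverEffect()        // highlights while the user's gaze rests on it
            .offset(offset)
            .onTapGesture {       // fires on a pinch while being looked at
                print("pinched while looking")
            }
            .gesture(
                DragGesture()     // pinch-and-move becomes a drag
                    .onChanged { offset = $0.translation }
            )
    }
}
```

    So yes: placement comes from the eyes, action from the hands, much as with a controller's pointer and buttons.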

    If I am reading you correctly, you are suggesting app developers will be able to invent their own app gestures and have the system interpret them. 

    That would mean users learning different gestures for different apps. Not a system wide gesture collection. 

    AFAIK that isn't on the table with what Apple has announced.

    Apologies in advance if I have misread what you are saying. 
    The advice they have given developers is don't go crazy and keep it real: don't use hands-up gestures unless they build on real-world muscle memory, and the down-facing cameras are there to avoid fatigue. 

    They showed a pen gesture being used in Notes, Freeform and PDF markup for writing and annotation. 

    I've not seen any others. 
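    That said, fully custom gestures do look possible through raw hand tracking: with the user's permission an app can read joint positions from ARKit and interpret whatever pose it likes. A rough sketch of the idea; the 1.5 cm pinch threshold is invented:

```swift
import ARKit
import simd

// Sketch of an app-defined gesture on visionOS: read hand joints via
// ARKit (requires user permission) and interpret a pose ourselves.
func watchForCustomPinch() async throws {
    let session = ARKitSession()
    let hands = HandTrackingProvider()
    try await session.run([hands])

    for await update in hands.anchorUpdates {
        guard let skeleton = update.anchor.handSkeleton else { continue }

        let thumb = skeleton.joint(.thumbTip)
        let index = skeleton.joint(.indexFingerTip)
        guard thumb.isTracked, index.isTracked else { continue }

        // Joint transforms are relative to the hand anchor; the
        // fourth column of each matrix is the joint's position.
        let t = thumb.anchorFromJointTransform.columns.3
        let i = index.anchorFromJointTransform.columns.3
        let distance = simd_distance(SIMD3(t.x, t.y, t.z),
                                     SIMD3(i.x, i.y, i.z))

        if distance < 0.015 {   // ~1.5 cm apart: call it a pinch
            print("custom pinch on \(update.anchor.chirality) hand")
        }
    }
}
```

    Whether app-specific gesture vocabularies built this way catch on is another question, but the raw data is there.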


  • Apple Vision Pro developer kits will be made available eventually, and to a select crowd

    entropys said:
    Maybe this will convince major CAD design software makers other than Autodesk to support Macs, and by extension the Vision Pro, for AR imagery.

    Another fantastic opportunity is remote healthcare. A community nurse could wear this and directly consult with a speech pathologist back in the office, for example. The Royal Flying Doctor Service integrated with Starlink would have an immediate demand for this.
    The problem is the same as with games: they rely too much on internal custom scripting, and even more so for pro apps like CAD, on user-generated custom scripting that is basically unshippable under App Store rules. So no access to anything but the Mac itself. 

    Apple needs to work out how to sandbox, or be comfortable with, those environments in the security model for CAD/BIM and other workflows (and fun flows) to be on these devices. 
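    One route that already fits the rules is Apple's own JavaScriptCore: interpreted scripts run through it are permitted on the App Store. A rough sketch of the kind of narrow bridge I mean; the bridge itself is hypothetical, not any CAD vendor's API:

```swift
import JavaScriptCore

// Sketch: run a user-written script in a JavaScriptCore sandbox,
// exposing only a whitelisted bridge instead of native plugin access.
func runUserScript(_ source: String) -> String? {
    guard let context = JSContext() else { return nil }

    // The script sees only what we hand it; no file system, no network.
    let log: @convention(block) (String) -> Void = { message in
        print("script:", message)
    }
    context.setObject(log, forKeyedSubscript: "log" as NSString)

    let result = context.evaluateScript(source)
    return result?.toString()
}

// Usage: runUserScript("log('hello'); 6 * 7") prints and returns "42".
```

    Whether that is fast and expressive enough for heavyweight CAD/BIM scripting is exactly the open question.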

    I'm really surprised there was no word this year on a move. 
  • Apple's iPad is propping up a collapsing tablet market

    danox said:
    elijahg said:
    iPads have gotten a bit too expensive. 

    Apple likes to charge premium pricing, and now iOS developers think everyone wants to pay them annually for their work. I have less and less reason to purchase an iPad anymore. I'm struggling to find reasons to continue to invest in iPadOS. An Mx-based 12" MacBook would potentially obliterate any need I have for an iPad. 

    It's been a nice run but Tablets "feel" like a market that has passed its zenith. 
    I think Apple missed the boat with the iPad. It just hasn't evolved in the way the iPhone has, and the OS restrictions mean it can't really be used by people in environments that don't fit the narrow scope of use cases Apple imagines. In other words, outside of media consumption, it's pretty useless.

    Web development? Nope. App development? Not really. Network engineering? Nope. Control of industrial machinery? Not really. Building non-iPad software? Nope. Essentially anything outside of Apple's iPad ecosystem is a no-no. Yes, great, you can use FCP on it now. No one in their right mind actually would, however, when a Mac can do it with 10% of the effort required to navigate the iPad UI. When using an iPad I feel like I have to put in the same effort to get the UI to do what I want as I do in Windows, whereas everything is effortless on the Mac.

    Aside from that, the multitasking UI is still unbelievably clunky. It needs real, overlapping windows rather than full-screen everything and constant clumsy gestures to navigate around.
    Sounds like you need a Microsoft Surface computer to solve your problems; Apple devices are not for everyone.
    Sorry, but why shouldn't the iPad, especially the Pro models, live up to the potential of the device? Why even make Pro models if this attitude is a reasonable position? 

    That said, I think a big overhaul of iPadOS is due that will change the abilities of the Pro, if not all iPads going forward. I mean, they will need it for xrOS anyway, and the iPad is the perfect mass platform for the sandboxed but general-purpose computing platform that eventually replaces macOS. Hopefully 2023 will be the year of the "app-specific compiler extension"; then the gloves really come off. 