Vision Pro will turn any surface into a display with touch control


Developers using Apple Vision Pro have learned they will be able to create controls and displays, and have them appear to be on any surface in the user's room.

Vision Pro can make any surface appear to be a control, or an app display



Apple has been working on Vision Pro for a long time, and along the way it has applied for countless patents related to the device -- even when it wasn't obvious what the company's plans were. Now one intriguing patent application from 2020 has been revealed to be part of Vision Pro, in a way that will help developers.

Developer Steve Troughton-Smith has discovered that it's possible to pick a surface within the headset's field of view, and then place any app so that it appears to actually be on that surface.
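Under the hood, this appears to lean on the plane detection that RealityKit already exposes to developers. As a rough, unofficial sketch -- the white panel below is just a placeholder for real app content, not Apple's implementation -- a visionOS app could anchor content to a detected table surface like this:

```swift
import SwiftUI
import RealityKit

// A minimal sketch: anchor a tappable panel to the first table-like
// horizontal surface RealityKit detects. The plain white plane is a
// stand-in for real app content.
struct SurfaceAppView: View {
    var body: some View {
        RealityView { content in
            let surface = AnchorEntity(.plane(.horizontal,
                                              classification: .table,
                                              minimumBounds: [0.3, 0.2]))

            let panel = ModelEntity(
                mesh: .generatePlane(width: 0.3, depth: 0.2),
                materials: [SimpleMaterial(color: .white, isMetallic: false)])
            panel.components.set(InputTargetComponent())    // accept gestures
            panel.generateCollisionShapes(recursive: true)  // needed for hit testing

            surface.addChild(panel)
            content.add(surface)
        }
    }
}
```

Attaching an InputTargetComponent and collision shapes is what lets the anchored content respond to the system's gaze-and-pinch or direct-touch gestures.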



"A natural way for humans to interact with (real) objects is to touch them with their hands," said Apple's 2020 patent application. "[Screens] that allow detecting and localizing touches on their surface are commonly known as touch screens and are nowadays common part of, e.g., smartphones and tablet computers."

Troughton-Smith used an Apple Music app for his experimentation, but it could be any app and seemingly any controls. So while Apple's virtual keyboard for the Vision Pro is not practical for long typing sessions, a user's desk could be turned into a keyboard.

That would still make for a stunted typing experience, as there would be no travel on the keys. But there are already keyboard projectors that use lasers to display a QWERTY layout directly onto a flat surface.



Comments

  • Reply 1 of 8
    danox Posts: 3,242 member
    A long and steady drumbeat from developers over the next 6 to 9 months: priceless. Taking advantage of a whole new Apple ecosystem…
    edited June 2023
  • Reply 2 of 8
    tht Posts: 5,605 member
    Waiting to see if typing with 10 fingers will be possible, both with the "in-air" keyboard seen in the screenshots and videos, and on a virtual keyboard on a flat desktop surface. Ten-finger typing is low on the priority list though, as I assume Apple thinks people will just use a real keyboard.

    Wondering how dexterous you can be with the hand tracking. Ten-finger typing is one example. Another is, say, can you work a virtual Rubik's cube with your hands?
  • Reply 3 of 8
    tht said:
    Waiting to see if typing with 10 fingers will be possible, both with the "in-air" keyboard seen in the screenshots and videos, and on a virtual keyboard on a flat desktop surface. Ten-finger typing is low on the priority list though, as I assume Apple thinks people will just use a real keyboard.

    Wondering how dexterous you can be with the hand tracking. Ten-finger typing is one example. Another is, say, can you work a virtual Rubik's cube with your hands?
    I suspect that the hand tracking won't be incredibly precise, but that won't be an issue if you are looking at the object, because the eye tracking is reportedly phenomenal. We won't know for sure until devices ship.
  • Reply 4 of 8
    williamh Posts: 1,041 member

    So while Apple's virtual keyboard for the Vision Pro is not practical for long typing sessions, a user's desk could be turned into a keyboard. That would still make for a stunted typing experience, as there would be no travel on the keys. But there are already keyboard projectors that use lasers to display a QWERTY layout directly onto a flat surface.

    I think you COULD have key travel by overlaying a virtual keyboard on a real keyboard.  It wouldn't matter if the keyboard was plugged in or not.  You could type on the real keyboard and the Vision Pro would read the virtual one occupying the same space.
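    Sketching that idea: everything below is hypothetical -- the VirtualKey type, the 5 mm press threshold, and the assumption that the virtual keyboard's pose has already been registered to the physical one -- but it shows the basic hit test of a tracked fingertip against virtual key bounds.

    ```swift
    import CoreGraphics
    import simd

    // Hypothetical sketch of the overlay idea: virtual keys are laid out in
    // the physical keyboard's local coordinate space, and a tracked fingertip
    // "presses" a key when it comes within a few millimetres of the key plane.
    struct VirtualKey {
        let character: Character
        let bounds: CGRect  // key footprint on the keyboard's XZ plane, in metres
    }

    func pressedKey(fingertipWorld: SIMD3<Float>,
                    keyboardFromWorld: simd_float4x4,  // pose from registering the real keyboard
                    keys: [VirtualKey],
                    pressDepth: Float = 0.005) -> Character? {
        // Move the fingertip into the keyboard's local space.
        let p = keyboardFromWorld * SIMD4<Float>(fingertipWorld.x,
                                                 fingertipWorld.y,
                                                 fingertipWorld.z, 1)
        // A press is a fingertip within ~5 mm of the key surface.
        guard abs(p.y) < pressDepth else { return nil }
        let point = CGPoint(x: CGFloat(p.x), y: CGFloat(p.z))
        return keys.first { $0.bounds.contains(point) }?.character
    }
    ```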
  • Reply 5 of 8
    williamh said:

    So while Apple's virtual keyboard for the Vision Pro is not practical for long typing sessions, a user's desk could be turned into a keyboard. That would still make for a stunted typing experience, as there would be no travel on the keys. But there are already keyboard projectors that use lasers to display a QWERTY layout directly onto a flat surface.

    I think you COULD have key travel by overlaying a virtual keyboard on a real keyboard.  It wouldn't matter if the keyboard was plugged in or not.  You could type on the real keyboard and the Vision Pro would read the virtual one occupying the same space.
    I won't lie, that seems like it could be an AWESOME alternative.
  • Reply 6 of 8
    williamlondon Posts: 1,398 member
    williamh said:

    So while Apple's virtual keyboard for the Vision Pro is not practical for long typing sessions, a user's desk could be turned into a keyboard. That would still make for a stunted typing experience, as there would be no travel on the keys. But there are already keyboard projectors that use lasers to display a QWERTY layout directly onto a flat surface.

    I think you COULD have key travel by overlaying a virtual keyboard on a real keyboard.  It wouldn't matter if the keyboard was plugged in or not.  You could type on the real keyboard and the Vision Pro would read the virtual one occupying the same space.
    That's a really good idea, very clever of you!
  • Reply 7 of 8
    chutzpah Posts: 392 member
    williamh said:

    So while Apple's virtual keyboard for the Vision Pro is not practical for long typing sessions, a user's desk could be turned into a keyboard. That would still make for a stunted typing experience, as there would be no travel on the keys. But there are already keyboard projectors that use lasers to display a QWERTY layout directly onto a flat surface.

    I think you COULD have key travel by overlaying a virtual keyboard on a real keyboard.  It wouldn't matter if the keyboard was plugged in or not.  You could type on the real keyboard and the Vision Pro would read the virtual one occupying the same space.
    I won't lie, that seems like it could be an AWESOME alternative.
    But the Vision Pro supports physical keyboards. What's the point of overlaying a Vision Pro virtual keyboard onto a physical keyboard, over and above just connecting the Vision Pro to the keyboard (presumably with Bluetooth)?
  • Reply 8 of 8
    tht Posts: 5,605 member
    tht said:
    Waiting to see if typing with 10 fingers will be possible, both with the "in-air" keyboard seen in the screenshots and videos, and on a virtual keyboard on a flat desktop surface. Ten-finger typing is low on the priority list though, as I assume Apple thinks people will just use a real keyboard.

    Wondering how dexterous you can be with the hand tracking. Ten-finger typing is one example. Another is, say, can you work a virtual Rubik's cube with your hands?
    I suspect that the hand tracking won't be incredibly precise, but that won't be an issue if you are looking at the object, because the eye tracking is reportedly phenomenal. We won't know for sure until devices ship.
    The hand tracking looks phenomenal too. They didn't illustrate it in the ad copy or the videos, but all the UI elements are directly touchable. You can use your fingers to directly touch the UI to swipe-scroll, tap-select, etc. For the virtual keyboard, they even animate a "mechanical" actuation while pressing down on a key. So, it looks accurate to low single-digit millimeters when the user is touching UI elements.
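    That direct-touch behavior lines up with the RealityKit gesture APIs in the visionOS SDK. A minimal sketch, assuming the entities were set up with input targets and collision shapes elsewhere:

    ```swift
    import SwiftUI
    import RealityKit

    // Minimal sketch: a spatial tap aimed at any entity that has an
    // InputTargetComponent and collision shapes (configured elsewhere).
    struct TouchableView: View {
        var body: some View {
            RealityView { content in
                // ... add entities with InputTargetComponent + collision shapes ...
            }
            .gesture(
                SpatialTapGesture()
                    .targetedToAnyEntity()
                    .onEnded { value in
                        print("Tapped entity: \(value.entity.name)")
                    }
            )
        }
    }
    ```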

    The trick with a virtual Rubik's cube is doing it with 10 fingers simultaneously. Well, maybe a max of 6? The current SDK says a max of 2 fingers simultaneously, so there's work to do on this front, though I don't know if that is just the visionOS default, and developers can add more on their own.

    The trick for the software with a Rubik's cube is to know when the back fingers - farthest from the cameras and often behind UI elements and your other fingers - are millimeters away from the UI element. If the tracking is good enough, it will know that your fingers are touching virtual surfaces, know that the user is grabbing the UI element in 3 dimensions, and the software can perform a continuous response to the user's action.

    What Apple has shown with the hand tracking has been rather Spartan. They have only shown the most fundamental way to navigate the environment: just a pinch and swipe. Your hands are gigantic control devices with a lot of degrees of freedom. E.g., there are people wondering how you can play games without controllers. The answer is hand gestures. One hand could be forward-back-right-left, including velocity and acceleration if the developer desires, the other hand could be firing, inventory selection, etc., and your head and eyes are for aiming. A sketch of that follows.
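    A rough sketch of what that could look like with the visionOS hand-tracking API - the pinch threshold and the wrist-displacement movement scheme are guesses, not anything Apple has shown:

    ```swift
    import ARKit
    import simd

    // Sketch: turn tracked hands into game inputs. Left-wrist displacement from
    // a calibrated rest position becomes a movement vector; a right-hand
    // thumb-to-index pinch becomes the "fire" button. Thresholds are guesses.
    func runHandInput(restingLeftWrist: SIMD3<Float>) async throws {
        let session = ARKitSession()
        let hands = HandTrackingProvider()
        try await session.run([hands])

        for await update in hands.anchorUpdates {
            let anchor = update.anchor
            guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }

            // World-space position of a named joint.
            func worldPosition(of joint: HandSkeleton.JointName) -> SIMD3<Float> {
                let m = anchor.originFromAnchorTransform
                      * skeleton.joint(joint).anchorFromJointTransform
                return SIMD3<Float>(m.columns.3.x, m.columns.3.y, m.columns.3.z)
            }

            if anchor.chirality == .left {
                // Displacement from the rest pose drives movement.
                let move = worldPosition(of: .wrist) - restingLeftWrist
                print("move vector: \(move)")
            } else {
                // A pinch closer than ~2 cm counts as pressing "fire".
                let pinch = simd_distance(worldPosition(of: .thumbTip),
                                          worldPosition(of: .indexFingerTip))
                if pinch < 0.02 { print("fire!") }
            }
        }
    }
    ```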

    Heck, I've heard VR journos ask how people can play Beat Saber without controllers. These are effing VR journalists?! Just go grab a couple of sticks and start swinging. The object tracking will do the rest. With the right colored sticks, the cameras and software will know where the sticks are in 3D space. For this, you probably really need 120 Hz though. On the double-plus side, if the sticks are properly weighted, players will actually know what it's like to swing swords. Probably a bad idea though, as it could get dangerous and wouldn't make an approachable game for the masses.