Apple Vision Pro imminent, with launch rumored at end of January
The latest rumors sourced from Apple's supply chain point to January 26 for the Apple Vision Pro launch, which lines up with other historically reliable sources.
When Apple revealed Apple Vision Pro, it promised an "early 2024" release window, which could realistically mean anytime before June. However, evidence has been mounting for a January to February launch window, with the latest suggesting the former.
Bloomberg's Mark Gurman shared that January 26 is an Apple Vision Pro launch date "floating around" people familiar with the matter in China. This came as a response to a MacRumors story covering a sketchy source suggesting the date.
The original iPad was released on a Saturday. Jan 26 is indeed the date floating around the last few days among people in China who claim to be connected to Apple. Like I said, units will be ready by end of January with a retail launch by February. It's imminent.
-- Mark Gurman (@markgurman)
Gurman's response lines up with what AppleInsider has heard from independent sources. Our understanding is that Apple Retail is preparing for the initial launch through January with training and in-store units.
Apple has been clear about how it intends to roll out Apple Vision Pro in retail. Customers will need to physically come to an Apple Store to be fitted for a unit.
We expect the in-store experience will serve as a way to educate the user for initial setup. Many will likely be fitted in store and sent home with a receipt to await delivery.
The limiting factor for in-store pickup will likely be the prescription lenses required for anyone who wears glasses. Apple Stores are expected to have a small stock of standard prescriptions available.
Apple Vision Pro costs $3,499, but Apple hasn't shared how that price accounts for accessories like a custom visor or prescription lenses. Apple will likely reveal the launch date soon to give customers time to plan their finances and a visit to the Apple Store.
Rumor Score: Likely
Comments
Call it what it is: a paid beta program. The Vision Pro will be far from a finished product, but Apple needs to see how people will actually use it, shake out the bugs, and get the fanboys running their mouths about it. Apple also needs to work out how it will sell the device to the masses in the future, especially in places that aren't close to Apple Stores. Apple needs developers to get on board and start writing apps for the Vision Pro, and they need hardware to test with. This will also generate some funding, because the R&D expense on this thing must be outrageous. If Apple called this an Early Adopters program or something similar, it wouldn't sell as many units, but putting it out as a 1.0 will get people with deep pockets to hop on board.
My guess is the actual Vision Pro version ready for the masses will come out in about three years.
Big thing for me is being able to use it for keyboard-input-dominated workflows. Hopefully Terminal.app is part of the software load. I wonder how they are going to do the cursor. Does every spatial view have a cursor? Or is there one cursor that traverses all the spatial views?
Sounds like copy and paste diatribes from previous Apple releases but here we are... a $3 trillion dollar company.
The idea of strapping a Mac/iPad to my face isn’t appealing personally, even though it can block out the environment and provide a faux 3D look with parallax. I like being able to look away from the screen often, unless I’m heavily involved in a game, movie, or design project. I don’t care to wear the battery either.
2024 is much closer to 1984 now.
https://www.uploadvr.com/apple-vision-pro-gesture-controls/
If it's possible to connect a mouse, it might show up like it does on iPad, as a floating circle ( https://www.youtube.com/watch?v=j10KF1rQx3Q ).
When a Mac is connected, I expect it will show a cursor on the Mac display panel as normal, because it shows the Mac framebuffer. The Mac could probably be controlled by eye tracking too; the Mac system would just need a new input handler for it.
For text-based workflows, there needs to be a way to select text. This is a fine-grained operation where a user could select a letter, a set of letters, a word, a sentence fragment, paragraphs, or whole tomes that require scrolling.
If it is hand and eye tracking, then it would basically be the slow touch UI method that Apple has. Look at the word, pinch, selection handles appear, look at one selection handle, pinch, move, and repeat with the other.
You could use two hands. Have a cursor button that activates a text insertion point, place the insertion point with your eyes, then pinch and slide for selection. The second hand could be used to scroll to text outside the view.
The simplest thing is just to use a trackpad or mouse. Input focus could follow the eyes (same as focus-follows-mouse) or could follow selection of a view. macOS uses focus-follows-selection, or focus-follows-click, whatever you call it.
Apple will have to define input gestures beyond just a pinch as well. Like, what if you need to scroll a lot? Is that a pinch with circular motion? Once something is selected, what's the gesture for popup menu options for that selected object?