Apple Intelligence may be coming to Vision Pro -- but not soon

Posted in Apple Vision Pro, edited June 30

The roll-out of Apple Intelligence in the fall is passing over the Apple Vision Pro -- but a report on Sunday claims that it could arrive on the existing hardware as soon as 2025.

Apple Vision Pro: hardhat optional

Apple Intelligence is arriving in testing in the next few weeks. It will ship for everybody with a compatible device in the fall, with iPadOS 18, macOS Sequoia, and iOS 18.

Not on that list at present is the Apple Vision Pro. But a new report on Sunday from Bloomberg discusses the possibility of the feature set arriving on Apple's headset.

The headset's specs as they stand can support Apple Intelligence. After all, it has an M2 processor and 16GB of RAM, more than what's required of a Mac for the feature.

The Apple Vision Pro's visionOS is also a variant of iPadOS, which is another point in favor of the technology coming to the headset.

A potential roadblock cited in the report is the user experience, with the claim that adapting the features to a mixed reality environment may be a challenge.

Still, the report rings true. Apple is positioning the Apple Vision Pro as a productivity device, and the Apple Intelligence features are a natural fit for the platform.

What's more questionable is the notion that it might tax Apple's cloud computing infrastructure, which is alluded to and dismissed in the report. Compared to the number of Macs that can use the tech, and the volume of iPhone 15 Pro and iPhone 16 models that will be in circulation when it may arrive, Apple Vision Pro quantities are a drop in the bucket.

Apple Vision Pro sales numbers are several orders of magnitude lower than those of the other hardware that will run Apple Intelligence.

Even so, many Apple Intelligence features aren't going to arrive until 2025 anyway, and they'll be limited to US English at first.

Rumor Score: Likely

Read on AppleInsider


Comments

  • Reply 1 of 5
    I’m more curious about HomePods. So now they’re the dumbest device, and the ones that tend to answer first when asking Siri a question. 
    williamlondon
  • Reply 2 of 5
    Looking forward to it. I can see why there is an extended timeline for AVP. The requirements for spatial computing will be quite different in many ways. 
  • Reply 3 of 5
    9secondkox2 Posts: 2,889 member
    Looking forward to it. I can see why there is an extended timeline for AVP. The requirements for spatial computing will be quite different in many ways. 
    I don’t know about that. The hardware and OS take care of that. It’s handled already. 

    I think it’s more a matter of Apple figuring out how to sell a V2 device if they put all the eggs in V1. 

    The one difficulty I can see is drawing an image in the air. Obviously there is no surface to stop the user from pushing too far out or too close in for the hardware to recognize where the pencil - or your finger - is supposed to register. I think it may take some kind of virtual paper fixed on the x axis and haptic feedback within a pencil device or the headset itself to register the feedback that you’ve connected with “paper.”
    edited June 30
  • Reply 4 of 5
    chasm Posts: 3,403 member
    I’m more curious about HomePods. So now they’re the dumbest device, and the ones that tend to answer first when asking Siri a question. 
    As there's no way to retrofit a new processor into the existing HomePods, the only way they're going to get Apple Intelligence is for them to be reprogrammed to "borrow a cup" of iPhone processing power to do the actual work. Conceivably you could do that, but then there's the question of the advanced Siri. So in short, your OG HomePods become dumb speakers for your iPhone and Apple TV, or Apple brings out new HomePods and you replace your existing ones.

    The only alternative to this would be for Apple to offer a "brain transplant" option where, for let's say two-thirds the price of all-new HomePods, your existing HomePods are sent in and then sent back with an A17/A18 chip, more RAM, and an improved motherboard with improved Siri.

    My HomePods have been marvellous in the audio department, which is their most important duty, so I've been a happy camper for the most part (I've noticed that HomePods need more precisely-worded requests than, for example, my later iPhones). If I have to buy new HomePods to get all the benefits of Apple Intelligence, Siri, and another six to ten years of great sound, I'm willing to do that.
  • Reply 5 of 5
    danox Posts: 3,079 member
    The next generation of the Apple Vision Pro, with an M4 or M5 and R2 SoC, will have Apple Intelligence at the start, and the first generation will get it as a software upgrade...