Apple's AI rollout leaves Siri behind & long-time fans are asking questions
Apple software chief Craig Federighi confirmed Monday that promised artificial intelligence upgrades to Siri have been delayed, saying the company needs more time to meet quality standards before launching the features in iOS 26.

In a WWDC 2025 interview with YouTuber iJustine, Federighi said the new Siri features were working internally but didn't perform well enough to ship.
Apple originally previewed major Siri upgrades at WWDC 2024, including more natural conversations, richer contextual understanding, and the ability to perform multi-step actions. At the time, many of those features were expected to ship in 2025.
Apple now says it's targeting a rollout sometime in 2026, though no firm release date has been shared. Federighi reiterated Apple's plan to eventually deliver everything it previewed and added that additional improvements are on the way.
Siri delays reflect Apple's measured approach to AI
While Siri's new capabilities won't arrive until 2026, the company says it's taking its time to ensure reliability and privacy. Most of Apple's artificial intelligence tools are designed to run on-device by default, largely avoiding the cloud-based processing common in competing models.
Developers can now access Apple's foundation models through a new API, giving them the ability to integrate natural language features into their apps without relying on external services.
Apple's deliberate pace stands in contrast to rapid AI rollouts from Google, Microsoft, and OpenAI. With Gemini and Copilot already deeply integrated into competing platforms, expectations for Siri have only grown.
Some industry analysts have called Apple's 2025 keynote cautious, noting the absence of high-profile breakthroughs. Still, the company appears to be betting on long-term trust.
By delaying Siri until the features meet its quality threshold, Apple avoids shipping half-baked capabilities that could erode confidence in its assistant. The company's message is clear: better to deliver late than deliver something unreliable.
Apple skips The Talk Show for first time in a decade
It's not unusual for Apple to talk to iJustine. What is unusual is that, for the first time since 2015, Apple executives didn't appear on The Talk Show with John Gruber during WWDC week.
The absence marked the end of a ten-year tradition that gave developers and Apple enthusiasts a rare look at candid conversations with senior leadership.
Gruber, who runs Daring Fireball and has hosted Apple figures like Federighi and Greg Joswiak annually, noted the change. He didn't speculate on the reason.
The shift may signal a broader change in Apple's media strategy, moving away from its long-standing independent outlets toward more controlled messaging or mass-market platforms. Or, if Occam's Razor holds and the simplest explanation is the right one, Gruber hurt Apple's feelings.
We're no strangers to getting shut out by Apple because we've been critical, nor are our friends at 9to5Mac or MacRumors.
Comments
Apple recently published the paper "The Illusion of Thinking," arguing that neither LRMs nor LLMs reason accurately when tasks get very complicated.
Apple has a decent team of researchers. I still believe Apple has valuable user data, since it works with major hospitals to analyze health data from the Apple Watch.
WWDC 2025 was an underwhelming event. The most exciting part was the developer session on the Foundation Models framework.
Meet the Foundation Models framework - WWDC25 - Videos - Apple Developer
If I'm right, Apple is brilliant and they're on the completely correct course with this. Basically, you use the local model for 90% of queries (most of which will not be user queries, they will be dead-simple tool queries!), and then you have a per-user private VM running a big LLM in the user's iCloud account which the local LLM can reach out to whenever it needs.

This keeps the user's data nice and secure. If OpenAI gets breached, Apple will not be affected. And even if a particular user's iCloud is hacked, all other iCloud accounts will still be secure. So this is a way stronger security model, and now you can actually train the iCloud LLM on the user's data directly, including photos, notes, meeting invites, etc. The resulting data-blob will be a honeypot for hackers, and hackers are going to do everything in the universe to break in and get it. So you really do need a very high level of security.

Once the iCloud LLM is trained, it will be far more powerful than anything OpenAI can offer, because OpenAI cannot give you per-user customization with strong security guarantees. Apple will have both.
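The hybrid setup described above can be sketched in a few lines. This is purely illustrative pseudocode-style Python, not Apple's actual API: every name here (`Query`, `on_device_answer`, `private_cloud_answer`, `route`) is hypothetical, and the "models" are stubs standing in for a small local model and a per-user cloud model.

```python
# Hypothetical sketch of the hybrid routing the comment describes:
# a small on-device model handles most queries and escalates to a
# per-user private cloud model only when needed. All names are
# illustrative stand-ins, not Apple's actual API.

from dataclasses import dataclass

@dataclass
class Query:
    text: str
    needs_personal_context: bool = False  # e.g. "summarize my meetings"

def on_device_answer(q: Query):
    """Tiny local model: handles dead-simple tool-style queries only."""
    simple = {"set a timer": "timer set", "what time is it": "3:00 pm"}
    return simple.get(q.text)  # None when the local model can't answer

def private_cloud_answer(q: Query) -> str:
    """Stand-in for the per-user VM running a larger model."""
    return f"[cloud model reply to: {q.text}]"

def route(q: Query) -> str:
    # Prefer the local model; escalate when it can't answer or when
    # the query needs the user's personal data held in the cloud VM.
    if not q.needs_personal_context:
        local = on_device_answer(q)
        if local is not None:
            return local
    return private_cloud_answer(q)
```

The point of the design is in `route`: the escalation decision is made on-device, so personal data only leaves the phone for the user's own VM, never for a shared third-party model.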
Instead of promoting that, Apple just announced some jokes and UX/UI redesigns.
Get a decent product guy. Tim is the wrong one. Greg is just a marketing guy who has no clue anyway. Eddy Cue and Phil Schiller are just silent.
Bring a decent Product Guy.
https://stratechery.com/2025/apple-retreats/
https://machinelearning.apple.com/research/introducing-apple-foundation-models
GPT-4 is reportedly 1.76 trillion parameters. GPT-4o mini is reportedly 8 billion:
https://deepnewz.com/ai-modeling/microsoft-paper-reveals-gpt-4o-mini-size-8b-parameters-gpt-4-1-76t-claude-3-5-a05ca5f5
If they can get something like GPT-4o mini to run locally, that would be an improvement. iPhones would likely need 16 GB of RAM.
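The RAM estimate can be sanity-checked with back-of-envelope math. These figures are rough assumptions, not measurements: actual footprints depend on quantization scheme, KV cache, and runtime overhead.

```python
# Rough memory math for running an ~8B-parameter model on-device.
# bytes_per_param: 2.0 for fp16 weights, ~0.5 for 4-bit quantization.

def model_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GiB (weights only, no KV cache)."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

fp16_gb = model_memory_gb(8, 2.0)  # ~14.9 GiB: won't fit on today's phones
int4_gb = model_memory_gb(8, 0.5)  # ~3.7 GiB: plausible on a 16 GB device
```

Weights alone at 4-bit leave room for the OS and apps on a 16 GB phone; at fp16 the weights by themselves would exhaust it, which is why the 16 GB figure assumes aggressive quantization.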
They want agent-like capability, with Siri able to do tasks (see 1:44 in the video) so someone can ask the device to do everything.
It needs a few things: more RAM, better reliability from smaller models, good training data and process, and good QA testing.
It feels like a good default setup would be normal Siri for local tasks, but instead of searching the web when it has difficulty, it sends the query to GPT-4o on the server and has local Siri interpret the reply.
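That fallback flow, as a minimal sketch: a confidence-gated local model with a local post-processing pass over the server's reply. Everything here is hypothetical (the function names, the confidence threshold, and the stub models), shown only to make the two-stage flow concrete.

```python
# Sketch of the proposed fallback: local Siri answers when confident,
# otherwise the query goes to a server model and the local side
# interprets the reply. All names and thresholds are illustrative.

def local_model(query: str):
    """Returns (answer, confidence) from a small on-device model."""
    known = {"turn on wifi": ("wifi on", 0.99)}
    return known.get(query, ("", 0.1))

def server_model(query: str) -> str:
    """Stand-in for a large server-side model such as GPT-4o."""
    return f"server answer for '{query}'"

def local_interpret(raw: str) -> str:
    """Local pass over the server reply, e.g. to trim or rephrase it."""
    return raw.strip().capitalize()

def answer(query: str, threshold: float = 0.8) -> str:
    text, confidence = local_model(query)
    if confidence >= threshold:
        return text          # handled entirely on-device
    return local_interpret(server_model(query))  # server fallback
```

Keeping the interpretation step local is what distinguishes this from simply forwarding queries to a chatbot: the device stays in control of how the reply is presented and acted on.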
I want to read the story that says Tim Cook dragged the Siri team into a room and tore into them for a decade of weak leadership, for failing to push Siri forward, and for failing to keep abreast of the fast changes happening with LLMs. Apple's usual approach of taking years to perfect a concept will leave it in the dust if it sticks with the same old development strategy in the current environment. And don't get me started on the Apple Car team, the one that burned through billions and delivered nothing. No product. No plan. Just hype.
If those two disasters don’t spell executive failure, what does?
I want to see executives who are a lot less rockstar and hair (I'm looking at you, Craig) and a lot more "look at this amazing new product with crazy new capabilities." I'm bored with the updated set of tinker toys, now with a new glass effect.
I am glad to see WWDC contained next to no promises of features "coming soon". That lazy approach to engineering and marketing deserves to stay in the vaporware land that is Windows and Honeywell's residential building services group.
Ditching the Siri brand whenever Apple finishes work on its next-gen voice assistant is actually a pretty decent idea. Siri is a valueless brand name now, other than being a "known" name, but what it's known for is not anything you'd want associated with a product.
Why wasn't it ready long ago?
Why the management changes?
They've had years to do it and were even confident enough that they 'promised' it at last year's event.
One possible answer is that they were late out of the stalls and left wanting once the other horses were off. They were forced to chance things by promoting something they 'hoped' would be ready at a later date, just to buy some more breathing space, but it all fell through.