Hands on: Siri starts to get better thanks to Apple Intelligence

Posted in iOS

The first iOS 18.1 beta has given us our first look at Apple Intelligence, and with it, a wholly refreshed Siri. Even though it's early, let's take a look at how Siri is getting even better - and what features are still coming.

Two hands gesturing to an iPhone with the getting started screen for Apple Intelligence
Apple Intelligence is now here



Based on the folks we've talked to, Apple's long-running voice assistant Siri is the most anticipated upgrade set to arrive with Apple Intelligence. It's long past time for Apple to upgrade Siri into a viable alternative to other virtual assistants.

Put charitably, Siri before the Apple Intelligence upgrade -- coming in some version of iOS 18 -- is pretty bad.

We've updated our iPhone 15 Pro Max -- one of only two supported iPhone models -- to the iOS 18.1 beta, the delivery vehicle for Apple's big AI push. The update is likely to arrive for everyone by this October.

It's important to note that while Siri has a new look, as of August 2, 2024, many of the features aren't here yet. That makes it really hard to judge, this early, how good the new AI-powered Siri will be at full power.



Regardless, let's take a look at how Siri has gotten better and what we're still waiting for.

Apple Intelligence: Siri's new look



Siri has a whole new look. Gone is the small glassy orb and in its stead is a colorful animation emanating from the edges of your phone.

An iPhone with the new Siri animation glowing from the edges of the screen
Siri is looking good with Apple Intelligence



The animation appears on iPhone, iPad, and even when using CarPlay. The light reacts to your voice, moving like an on-screen visualizer.

If you invoke Siri using the side button, the animation sprawls out from the top-right corner near the button, while if you use your voice or the keyboard, it comes in from the edges.

Speaking of the keyboard, that too is new. Double-tap the bottom of your phone's screen and a keyboard appears so you can type your questions to Siri.

You can now type to Siri which opens a keyboard from the bottom of the screen
Double tap the bottom of your phone to type to Siri



It's easy to move between typed and spoken queries. Just tap the microphone button at the bottom-right of the keyboard to switch back to speaking if that's easier in the moment.

Apple Intelligence: A better understanding



Aside from a fresh coat of paint, Siri has improved awareness. It retains context between queries, even when you stumble mid-request.

"Can you check the weather tomorrow, actually, no, check it for Tuesday in Cupertino" will return the correct answer while asking on the legacy version of Siri gives a range of weather info.

Similarly, if you say "set a timer for three hours, wait I meant set an alarm for three," the old Siri somehow set a timer for about five hours, while the new Siri correctly set an alarm for 3 p.m.

Aside from handling stutters, context is vastly improved. You can ask follow-up questions that require an understanding of the prior question and its answer.

For example, you can't ask "old" Siri "check when the next Cleveland Guardians game is" and then follow it up with "that looks good, add that to my calendar."

Two iPhones getting the season record for the Cleveland Guardians
The new Siri interface looks better for many searches



The soon-to-be-outdated Siri gives you the Guardians' schedule and then creates a blank calendar appointment without any information attached. The new Siri knew exactly what to do.

Siri trying to add a Guardians game to the calendar even though there are conflicting events
We can ask follow up questions with the updated Siri, like adding a game to our schedule after asking when the game was



It works for anything that has follow-up context. In the home, you can ask Siri to turn off the bedroom lights and follow it up by saying "and the living room ones too."

Compound questions still elude Apple's voice assistant, though. If you try to combine two requests into one, it's only able to answer one of them.

Sending a text via Siri
We wish this worked better with the upgraded Siri



"Text Faith and ask what she wants me to make for dinner, then set a reminder to start cooking at 5," gave us a single text message with the contents "what do you want me to make for dinner, then set a reminder to start cooking at 5."

Maybe this will get better as the beta progresses.

Apple Intelligence: Product knowledge

AppleInsider readers should appreciate the next new feature coming to Siri by way of Apple Intelligence. It will be able to answer questions about Apple products, helping curb those late-night support calls we're all familiar with.

Siri giving results with Apple Intelligence on how to schedule a message to send later
Siri can answer many questions about your phone



You can ask things like "how do I send a message later?" or "how do I fix a photo when someone is blinking?" We tested a variety of these and got hit-or-miss answers so far.

Two phones on a white table, with the left showing web results for blinking in photos and the right showing Apple Intelligence with detailed instructions on how to fix it
Old Siri just gave us search results for blinking in photos



When it worked, it was great. It gave step-by-step instructions for whatever we asked, including the two questions outlined above.

Asking something like "How do I log my omeprazole for the day" threw it for a loop. This isn't far off from "how do I log medicine on my iPhone," which gave good results.

Medicine databases are huge and readily available, and Apple Intelligence theoretically should understand enough context to realize omeprazole is a medicine and work out what we were asking for. Average users may not know how to phrase a question to elicit the correct response, so its ability to parse here needs to improve.

Again, this is an initial beta so we're hopeful that these things do improve by the full launch. Apple does have a new feedback option when using Siri to help provide that necessary guidance.

Apple Intelligence: More of the same -- for now



The new keyboard and new design have caused us to use Siri more over the past few days. But the more we use it, the more it's clear that much of Siri is still the same.

Two searches for the prime minister of Malaysia, comparing the old versus new interfaces on two different iPhones on a white table
Many results, while they look better, still just show web views or open a browser



A lot of answers still take you to the web or inevitably require opening a browser window. Even though all of this makes Siri easier to use, it still doesn't really know any more than it did before.

It's clear right now that Siri is still Siri. And, right now, it is missing a lot of functionality that other virtual assistants bring to bear.

The old Siri interface on an iPhone, a glowing orb at the bottom of the screen
The old Siri orb is gone



The good news is most of those features are coming. Other than answering questions about your phone, few of the new features for Siri have been released.

In future updates, Siri will gain onscreen awareness, letting it take actions and respond based on what you're looking at. It'll also be able to understand personal context, pulling from your calendar, messages, emails, and more to become more personal.

Plus, it'll be able to use App Intents across a dozen categories of third-party apps to perform actions on your behalf. These actions apply to browsers, eReaders, photo and camera apps, and more.
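For developers, the plumbing here is Apple's existing App Intents framework: apps that describe their actions as intents are the ones Siri will be able to drive. As a rough sketch of what that looks like, here's a minimal intent in Swift -- the type name, parameter, and dialog are hypothetical examples for illustration, not anything from Apple's Apple Intelligence documentation or a shipping app.

```swift
import AppIntents

// Hypothetical example: a reading-list action an eReader-style app might expose.
// Siri and Shortcuts can discover and run intents declared this way.
struct OpenReadingListIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Reading List"
    static var description = IntentDescription("Opens a named reading list in the app.")

    // Bring the app to the foreground when the intent runs.
    static var openAppWhenRun: Bool = true

    // A parameter Siri can fill in from the user's request.
    @Parameter(title: "List Name")
    var listName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would navigate to the requested list here.
        return .result(dialog: "Opening \(listName)…")
    }
}
```

The takeaway for now is that apps already exposing their actions through App Intents are the ones best positioned to show up once the smarter Siri starts taking requests like these.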

An iPhone with a popup asking to use ChatGPT for a question
ChatGPT is coming to Siri later this year



Not to mention the hotly anticipated integration with ChatGPT. That, and other LLMs, will add a wealth of knowledge that's currently absent.

Together all of these are poised to deliver a vastly improved Siri. Only time will tell if it will ultimately be enough.

Apple Intelligence and the upgraded Siri are set to arrive with iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 this fall.



Read on AppleInsider


Comments

  • Reply 1 of 30
    jas99 Posts: 161 member
    One step at a time. Get it right.
  • Reply 2 of 30
    I wonder, since Siri will run on device, if you kept asking about omeprazole, would it be able to recognize it and save that request for future reference?
  • Reply 3 of 30
    Maybe it is too late… but we must recognize that the pre-AI Siri was more of a Voice User Interface than an ‘assistant.’ With your keyboard and mouse you search and receive web info! (Web browsers only remember you thru tracking ads!)

    Now… or in a near future… AI-Siri will become a real ‘assistant.’
    The best part is that it will preserve your privacy… like a real personal assistant. When your assistant does not know the answer… he/she/it will go to the library or the encyclopedia… well… to ChatGPT by now…
  • Reply 4 of 30
    I've noticed that Siri has been getting better in recognizing my requests.  However, with the latest security update I'm now being asked to unlock my iPhone before I can use Siri.  Siri is now getting turned off in settings as it is unusable to me, especially if I'm driving and hope to use it for handsfree functions.  Nice one Apple...
  • Reply 5 of 30
    hypoluxa Posts: 698 member
    As a previous commenter said, get it right out of the gate. I can wait.
  • Reply 6 of 30
    maltz Posts: 486 member
    I've been deep in the rabbit hole of self-hosted AI the last couple of weeks, and was late to the AI party in general, and I've been *blown away* by how powerful AI can be with so (relatively) little hardware.  A phone might be pushing it, but with Apple Silicon and SoCs with bespoke AI instructions... who knows?  Besides, Siri doesn't need a ton of general knowledge, she just needs to know everything about how to work with the device she's on.  Though I'm kind of surprised and disappointed that it tripped over "Text Faith and ask what she wants me to make for dinner, then set a reminder to start cooking at 5." From my experience, even small AIs should be able to handle that.  I know that Apple Intelligence is still very pre-release - I just really don't want a repeat of what happened with "let's roll our own" Apple Maps, which is still kinda garbage in the one way that most matters: map accuracy.

    Still, I'm incredibly excited about an on-device, privacy-centric AI assistant.  That's how it should be.
    edited August 2
  • Reply 7 of 30
    elijahg Posts: 2,814 member
    Since all this apparently needs an iPhone 15, what is going to happen to the Watches and HomePods out there? They all stick with dumb Siri? Apple's never going to put an A16 in a Watch nor a HomePod, so are they going to stick with dumb Siri forever?
  • Reply 8 of 30
    elijahg said:
    Since all this apparently needs an iPhone 15, what is going to happen to the Watches and HomePods out there? They all stick with dumb Siri? Apple's never going to put an A16 in a Watch nor a HomePod, so are they going to stick with dumb Siri forever?
    I would say so.
  • Reply 9 of 30
    Xed Posts: 2,777 member
    Maybe it is too late… but we must recognize that the pre-AI Siri was more of a Voice User Interface than an ‘assistant.’ With your keyboard and mouse you search and receive web info! (Web browsers only remember you thru tracking ads!)

    Now… or in a near future… AI-Siri will become a real ‘assistant.’
    The best part is that it will preserve your privacy… like a real personal assistant. When your assistant does not know the answer… he/she/it will go to the library or the encyclopedia… well… to ChatGPT by now…
    Siri was using AI even before Apple bought it. It is a project developed by the SRI International Artificial Intelligence Center. Simply because the needle of what we expect from AI in the Real has moved considerably doesn't mean that AI wasn't being utilized.
  • Reply 10 of 30
    Xed Posts: 2,777 member
    amystic1 said:
    I've noticed that Siri has been getting better in recognizing my requests.  However, with the latest security update I'm now being asked to unlock my iPhone before I can use Siri.  Siri is now getting turned off in settings as it is unusable to me, especially if I'm driving and hope to use it for handsfree functions.  Nice one Apple...

    Have you enabled "Hey, Siri" in Settings and is it enabled when Locked? Also, there is an Accessibility setting for using "Hey, Siri" when the iPhone is face down.

    PS: I personally don’t like to have features accessible from the Lock Screen. In fact, I hate that phone numbers and emails show up on the  Lock Screen after a restart. It doesn’t connect with Contacts, but I feel that a full phone number reveals more info than simply a name like Bob being displayed. To rectify that I lock the SIM.
    edited August 2
  • Reply 11 of 30
    elijahg Posts: 2,814 member
    Xed said:
    Maybe it is too late… but we must recognize that the pre-AI Siri was more of a Voice User Interface than an ‘assistant.’ With your keyboard and mouse you search and receive web info! (Web browsers only remember you thru tracking ads!)

    Now… or in a near future… AI-Siri will become a real ‘assistant.’
    The best part is that it will preserve your privacy… like a real personal assistant. When your assistant does not know the answer… he/she/it will go to the library or the encyclopedia… well… to ChatGPT by now…
    Siri was using AI even before Apple bought it. It is a project developed by the SRI International Artificial Intelligence Center. Simply because the needle of what we expect from AI in the Real has moved considerably doesn't mean that AI wasn't being utilized.
    Recent versions of Siri don't use AI in any sense. It has got worse over time because it has had extra actions added with zero upgrades to its language processing. It simply picks out keywords from what you say, likely weights them for importance, and works out which of its predefined actions has the most hits for the words it deemed important. And seems to add in some entropy to guarantee asking the exact same thing every day will occasionally result in it interpreting wrong. And if it's on a HomePod, it often tries and fails to contact the attached phone to do the actual work, so fails anyway. Its "AI" is way too basic to be labelled that. Even the most novice coder could improve on its interpretation engine.

    Siri doesn't understand what you say at all - that is really one of the key definitions of something being AI. You can miss out words and it still works even if those words are critical to getting the action right. In fact, it works better if you say the absolute minimum as there's less for it to judge wrongly. "Siri, music" for example.

    It used to use Wolfram Alpha if it didn't understand, but Apple apparently abandoned that way back in about 2015. You could ask complex things like "how many buses does a whale weigh" and it'd give you a correct answer. Now it's just a glorified web search for anything it needs "intelligence" for - which is even worse for small or no-screen devices. "I can show you that on your iPhone"... great.
  • Reply 12 of 30
    Amusing that the lede parrots Apple with its “even better” followed by all the ways Siri sucks or/and blows. 
  • Reply 13 of 30
    Really, who cares how Siri looks now? The most prevalent use case for a voice assistant is no screens at all, think HomePod or car.

    And that's where the biggest downside is. The sheer amount of "I found some results and can show them to you on your iPhone" or "I cannot answer this while you are driving" is mind boggling for something that should help users do stuff without physical interaction with their devices. I'm as big an Apple fanboy as the next guy, but I hate how Apple has been so slow on anything Siri related—especially the consistency and seamlessness across devices, which has been Apple's hallmark.

    We are clearly moving in the direction of voice interaction (here's looking at you, Picard) and if Apple spent half the time on developing Siri properly that it does on super-important stuff like phones that are 0.00001 inch thinner than last year's or teaching their employees ridiculous hand gestures for presentations, they'd be scorching the competition and making all of us much happier.
  • Reply 14 of 30
    Xed Posts: 2,777 member
    elijahg said:
    Xed said:
    Maybe it is too late… but we must recognize that the pre-AI Siri was more of a Voice User Interface than an ‘assistant.’ With your keyboard and mouse you search and receive web info! (Web browsers only remember you thru tracking ads!)

    Now… or in a near futue… AI-Siri will become a real ‘assistant.’
    The best part is that it will preserve your privacy… like a real personal assistant. When your assistant does not know the answer… he/she/it will go to the library or the encyclopedia… well… to ChatGPT by now…
    Siri was using AI even before Apple bought it. It is a project developed by the SRI International Artificial Intelligence Center. Simply because the needle of what we expect from AI in the Real has moved considerably doesn't mean that AI wasn't being utilized.
    Recent versions of Siri don't use AI in any sense. It has got worse over time because it has had extra actions added with zero upgrades to its language processing. It simply picks out keywords from what you say, likely weights them for importance, and works out which of its predefined actions has the most hits for the words it deemed important. And seems to add in some entropy to guarantee asking the exact same thing every day will occasionally result in it interpreting wrong. And if it's on a HomePod, it often tries and fails to contact the attached phone to do the actual work, so fails anyway. Its "AI" is way too basic to be labelled that. Even the most novice coder could improve on its interpretation engine.

    Siri doesn't understand what you say at all - that is really one of the key definitions of something being AI. You can miss out words and it still works even if those words are critical to getting the action right. In fact, it works better if you say the absolute minimum as there's less for it to judge wrongly. "Siri, music" for example.

    It used to use Wolfram Alpha if it didn't understand, but Apple apparently abandoned that way back in about 2015. You could ask complex things like "how many buses does a whale weigh" and it'd give you a correct answer. Now it's just a glorified web search for anything it needs "intelligence" for - which is even worse for small or no-screen devices. "I can show you that on your iPhone"... great.
    It's still AI. It clearly uses machine learning to cater the results to the user, which you seem to admit. 
  • Reply 15 of 30
    dewme Posts: 5,634 member
    Xed said:

    PS: I personally don’t like to have features accessible from the Lock Screen. In fact, I hate that phone numbers and emails show up on the  Lock Screen after a restart. It doesn’t connect with Contacts, but I feel that a full phone number reveals more info than simply a name like Bob being displayed. To rectify that I lock the SIM.
    These behaviors can be changed in Settings/Notifications for Phone and Mail, under their respective Lock Screen options.

  • Reply 16 of 30
    Xed Posts: 2,777 member
    dewme said:
    Xed said:

    PS: I personally don’t like to have features accessible from the Lock Screen. In fact, I hate that phone numbers and emails show up on the  Lock Screen after a restart. It doesn’t connect with Contacts, but I feel that a full phone number reveals more info than simply a name like Bob being displayed. To rectify that I lock the SIM.
    These behaviors can be changed in Settings/Notifications for Phone and Mail, under their respective Lock Screen options.

    I only see them as an all-or-nothing option for the Lock Screen, not as an option to keep all identifiable notification data off the Lock Screen of a device that has been booted but not authenticated with the passcode.
  • Reply 17 of 30
    elijahg Posts: 2,814 member
    Xed said:
    elijahg said:
    Xed said:
    Maybe it is too late… but we must recognize that the pre-AI Siri was more of a Voice User Interface than an ‘assistant.’ With your keyboard and mouse you search and receive web info! (Web browsers only remember you thru tracking ads!)

    Now… or in a near futue… AI-Siri will become a real ‘assistant.’
    The best part is that it will preserve your privacy… like a real personal assistant. When your assistant does not know the answer… he/she/it will go to the library or the encyclopedia… well… to ChatGPT by now…
    Siri was using AI even before Apple bought it. It is a project developed by the SRI International Artificial Intelligence Center. Simply because the needle of what we expect from AI in the Real has moved considerably doesn't mean that AI wasn't being utilized.
    Recent versions of Siri don't use AI in any sense. It has got worse over time because it has had extra actions added with zero upgrades to its language processing. It simply picks out keywords from what you say, likely weights them for importance, and works out which of its predefined actions has the most hits for the words it deemed important. And seems to add in some entropy to guarantee asking the exact same thing every day will occasionally result in it interpreting wrong. And if it's on a HomePod, it often tries and fails to contact the attached phone to do the actual work, so fails anyway. Its "AI" is way too basic to be labelled that. Even the most novice coder could improve on its interpretation engine.

    Siri doesn't understand what you say at all - that is really one of the key definitions of something being AI. You can miss out words and it still works even if those words are critical to getting the action right. In fact, it works better if you say the absolute minimum as there's less for it to judge wrongly. "Siri, music" for example.

    It used to use Wolfram Alpha if it didn't understand, but Apple apparently abandoned that way back in about 2015. You could ask complex things like "how many buses does a whale weigh" and it'd give you a correct answer. Now it's just a glorified web search for anything it needs "intelligence" for - which is even worse for small or no-screen devices. "I can show you that on your iPhone"... great.
    It's still AI. It clearly uses machine learning to cater the results to the user, which you seem to admit. 
    What? It clearly doesn’t. If it did, asking it to “add tomato sauce to my shopping list” wouldn’t keep randomly choosing between adding “tomato” and “sauce” separately vs adding “tomato sauce” as one item since I’ve told it to correct that 500 times. 

    A list of commands isn’t AI. AI by definition is supposed to learn, AI abilities aren't preprogrammed as they are with Siri. Otherwise you could call displaying a letter onscreen as someone types AI, because it is a list of preprogrammed actions.  
  • Reply 18 of 30
    Xed Posts: 2,777 member
    elijahg said:
    Xed said:
    elijahg said:
    Xed said:
    Maybe it is too late… but we must recognize that the pre-AI Siri was more of a Voice User Interface than an ‘assistant.’ With your keyboard and mouse you search and receive web info! (Web browsers only remember you thru tracking ads!)

    Now… or in a near futue… AI-Siri will become a real ‘assistant.’
    The best part is that it will preserve your privacy… like a real personal assistant. When your assistant does not know the answer… he/she/it will go to the library or the encyclopedia… well… to ChatGPT by now…
    Siri was using AI even before Apple bought it. It is a project developed by the SRI International Artificial Intelligence Center. Simply because the needle of what we expect from AI in the Real has moved considerably doesn't mean that AI wasn't being utilized.
    Recent versions of Siri don't use AI in any sense. It has got worse over time because it has had extra actions added with zero upgrades to its language processing. It simply picks out keywords from what you say, likely weights them for importance, and works out which of its predefined actions has the most hits for the words it deemed important. And seems to add in some entropy to guarantee asking the exact same thing every day will occasionally result in it interpreting wrong. And if it's on a HomePod, it often tries and fails to contact the attached phone to do the actual work, so fails anyway. Its "AI" is way too basic to be labelled that. Even the most novice coder could improve on its interpretation engine.

    Siri doesn't understand what you say at all - that is really one of the key definitions of something being AI. You can miss out words and it still works even if those words are critical to getting the action right. In fact, it works better if you say the absolute minimum as there's less for it to judge wrongly. "Siri, music" for example.

    It used to use Wolfram Alpha if it didn't understand, but Apple apparently abandoned that way back in about 2015. You could ask complex things like "how many buses does a whale weigh" and it'd give you a correct answer. Now it's just a glorified web search for anything it needs "intelligence" for - which is even worse for small or no-screen devices. "I can show you that on your iPhone"... great.
    It's still AI. It clearly uses machine learning to cater the results to the user, which you seem to admit. 
    What? It clearly doesn’t. If it did, asking it to “add tomato sauce to my shopping list” wouldn’t keep randomly choosing between adding “tomato” and “sauce” separately vs adding “tomato sauce” as one item since I’ve told it to correct that 500 times. 

    A list of commands isn’t AI. AI by definition is supposed to learn, AI abilities aren't preprogrammed as they are with Siri. Otherwise you could call displaying a letter onscreen as someone types AI, because it is a list of preprogrammed actions.  
    The definition of AI isn't "everything works like in some near future sci-fi movie I watched last night."
  • Reply 19 of 30
    rezwits Posts: 895 member
    All I know is my HomeKit requests, to turn things on and off, are like lightning!
  • Reply 20 of 30
    elijahg Posts: 2,814 member
    Xed said:
    elijahg said:
    Xed said:
    elijahg said:
    Xed said:
    Maybe it is too late… but we must recognize that the pre-AI Siri was more of a Voice User Interface than an ‘assistant.’ With your keyboard and mouse you search and receive web info! (Web browsers only remember you thru tracking ads!)

    Now… or in a near futue… AI-Siri will become a real ‘assistant.’
    The best part is that it will preserve your privacy… like a real personal assistant. When your assistant does not know the answer… he/she/it will go to the library or the encyclopedia… well… to ChatGPT by now…
    Siri was using AI even before Apple bought it. It is a project developed by the SRI International Artificial Intelligence Center. Simply because the needle of what we expect from AI in the Real has moved considerably doesn't mean that AI wasn't being utilized.
    Recent versions of Siri don't use AI in any sense. It has got worse over time because it has had extra actions added with zero upgrades to its language processing. It simply picks out keywords from what you say, likely weights them for importance, and works out which of its predefined actions has the most hits for the words it deemed important. And seems to add in some entropy to guarantee asking the exact same thing every day will occasionally result in it interpreting wrong. And if it's on a HomePod, it often tries and fails to contact the attached phone to do the actual work, so fails anyway. Its "AI" is way too basic to be labelled that. Even the most novice coder could improve on its interpretation engine.

    Siri doesn't understand what you say at all - that is really one of the key definitions of something being AI. You can miss out words and it still works even if those words are critical to getting the action right. In fact, it works better if you say the absolute minimum as there's less for it to judge wrongly. "Siri, music" for example.

    It used to use Wolfram Alpha if it didn't understand, but Apple apparently abandoned that way back in about 2015. You could ask complex things like "how many buses does a whale weigh" and it'd give you a correct answer. Now it's just a glorified web search for anything it needs "intelligence" for - which is even worse for small or no-screen devices. "I can show you that on your iPhone"... great.
    It's still AI. It clearly uses machine learning to cater the results to the user, which you seem to admit. 
    What? It clearly doesn’t. If it did, asking it to “add tomato sauce to my shopping list” wouldn’t keep randomly choosing between adding “tomato” and “sauce” separately vs adding “tomato sauce” as one item since I’ve told it to correct that 500 times. 

    A list of commands isn’t AI. AI by definition is supposed to learn, AI abilities aren't preprogrammed as they are with Siri. Otherwise you could call displaying a letter onscreen as someone types AI, because it is a list of preprogrammed actions.  
    The definition of AI isn't "everything works like in some near future sci-fi movie I watched last night."
    Nor is "Siri add 5 minutes to my 20 minute timer" "Ok, your timer is set to 5 minutes"