Rumor: Siri to get iMessage integration, adapt to user habits in 'iOS 11'
Apple is planning a slate of Siri updates for "iOS 11," which should include features like more adaptive AI and integration with iMessage and iCloud, according to a rumor stemming from an Israeli site.
A mockup of Siri integration with the iOS Messages app.
The voice assistant should be able to learn a person's usage habits and offer different actions based on context, The Verifier claimed. The lack of contextual awareness has been one of the chief criticisms of Siri, especially in comparison with Google Assistant and Samsung's Bixby.
Google's AI, for instance, can not only pull data from the Web and a variety of services but accept follow-up questions that avoid the need to spell out a full command. Asking "who is the President of the U.S.," for instance, can be followed by "how old is he."
The iMessage integration will reportedly let Siri act based on conversations. Talking about eating sushi might prompt it to suggest restaurants, offer to book a reservation, or even arrange ridesharing. These claims may be at least partially based on recent patent filings.
iCloud integration will allegedly be used to "identify the connections" between devices, and offer relevant actions on each. Apple is in fact said to be planning deeper use of Siri in tvOS and watchOS, for example expanding the number of commands an Apple TV understands.
Apple should show off its new technology at the Worldwide Developers Conference in June when "iOS 11" will be announced, The Verifier said, cautioning that the company could postpone Siri features until later releases. Changes will also reportedly make their way into the next version of macOS.
The Verifier is a relatively unknown site without an established track record. Recently it claimed that "iOS 11" will also support group video calls in FaceTime, finally catching up with rivals like Google and Microsoft.
Comments
How is that any different than the example of Google's AI that is in the article? (Has anyone used Bixby?)
I recently saw a video (I thought it was posted here, but it may have been DF) of someone using an Android phone and an iPhone, asking both the same questions and follow-ups (occasionally, when appropriate), and basically finding that the two are on par with each other.
I see a lot of "Siri sucks" articles and comments but I think the reality is much different.
Then I tried "find a bar near me" and was presented with a list of 15. I said "call one," and the same list was presented with "which one would you like to call?"
Speaking at 100 miles per hour, in English with a French accent, I said one of the variants here:
give me an italian restaurant near here
give me a thai restaurant near here
give me a greek restaurant near here
give me a french restaurant near here
In each case, Siri found the nearest ones in less than a second, told me the name, showed the location on the map, the rating, and how much it costs (moderate, expensive, etc.),
then asked me if I wanted directions or to call them.
I said: give me the directions,
and it started doing it.
Tried the same thing with "give me the nearest X restaurant," with the same results.
Not sure what you're doing, but it is literally impossible for me to get this request to fail (and I've tried all the variants).
From my own perspective, voice recognition is much, much better now than it was four years ago.
This was done with no noise around, holding the phone about 6 inches from my mouth.
I'll try it in a noisy place to see if it decreases efficiency (it probably will).
1) "Who is the president of the United States?" Received Wolfram Alpha info, same as above.
2) "How old is he?" Siri replies that Donald Trump is 70 years old and shows info from Wikipedia.
3) "Does he have kids?" Siri replies "The children of Donald Trump are Eric Trump, Ivanka Trump and 3 others" and again lists info from Wolfram Alpha.
Did you even try this before posting? It takes less than a minute to verify.
What sometimes may be forgotten is that competitors mainly work with English. Siri works with many more languages than they do, which means it has to "understand" syntax, semantics, grammar, and context for all of them.
I think Siri is not so bad considering this.
Really curious as to how they'll expand it while keeping privacy.
Also really hope Siri learns, say, named timers, or better still, a series of timers.