Developers can begin work on an app intent system that will make Siri smarter in 2025
Apple's latest betas for iOS 18.2 and its sibling operating system updates support developer testing of the new app intent system that will ultimately make Siri more contextually aware in a later release.
App Intents will let developers pass onscreen data to Apple Intelligence
Users who excitedly updated to the public release of iOS 18.1 for access to Apple Intelligence were disappointed to discover that Siri hadn't changed much beyond a new glow animation. Several of the features shown off during WWDC still aren't available, but developers can get started on support today.
According to developer documentation released by Apple, developers can test the app intent system for making onscreen content available to Siri and Apple Intelligence. Put simply, this means users will be able to send whatever is active on screen to Apple Intelligence for parsing.
Even this beta implementation is a limited version of what was promised for Siri. So no, you won't yet be able to ask when your mother is arriving at the airport based on data from Messages and Mail, but it is a step in that direction.
Here is a short list of examples provided by Apple:
- Browser: A person might ask Siri questions about the web page.
- Document reader: A person might ask Siri to summarize a document's conclusions.
- File management: A person might ask Siri to summarize file content.
- Mail: A person might ask Siri to provide a summary of an email.
- Photos: A person might ask Siri about things to do with an object in a photo.
- Presentations: A person might ask Siri to suggest a creative title for a presentation.
- Spreadsheets: A person might ask Siri to give an overview of the spreadsheet's data.
- Word processor: A person might ask Siri to suggest additional content for a text document.
Users can already send content to ChatGPT through Siri and ask for actions on that content. For example, the above unordered list was generated by ChatGPT via a Siri request made on a screenshot of Apple's documentation.
However, the new app intent system will let users take a more direct approach. For example, it is already possible to share an entire web page's contents with Apple Intelligence in iOS 18.2 beta 2.
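On the developer side, adopting one of these onscreen-content cases builds on the existing App Intents framework: the app describes its content as an app entity the system can look up, and exposes the underlying text through a Transferable representation so Siri and Apple Intelligence can read it. The sketch below is a rough illustration of that shape for the document-reader example; the type names (DocumentEntity, DocumentQuery, DocumentStore) are hypothetical, and the exact onscreen-content hooks in the iOS 18.2 beta may differ from what is shown here.

```swift
import AppIntents
import CoreTransferable
import Foundation

// Hypothetical entity describing the document currently on screen.
// DocumentEntity, DocumentQuery, and DocumentStore are illustrative
// names, not part of Apple's API.
struct DocumentEntity: AppEntity, Transferable {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Document"
    static var defaultQuery = DocumentQuery()

    var id: UUID
    var title: String
    var fullText: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    // The Transferable representation is what lets the system pull the
    // document's text out of the app when a person asks about it.
    static var transferRepresentation: some TransferRepresentation {
        ProxyRepresentation(exporting: { $0.fullText })
    }
}

// Query the system uses to turn entity identifiers back into entities.
struct DocumentQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [DocumentEntity] {
        identifiers.compactMap { DocumentStore.document(for: $0) }
    }
}

// Stand-in for the app's own document lookup; a real app would consult
// its document model here.
enum DocumentStore {
    static func document(for id: UUID) -> DocumentEntity? { nil }
}
```

Apple also offers assistant schemas for domains such as browsers, mail, documents, and photos, which constrain how these intents and entities are shaped so Siri can interpret them consistently; adopting them is largely a matter of annotating intent and entity types like the ones above and testing against the new betas.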
It is possible users will begin seeing these actions in third-party apps soon, but according to rumors, the full app intent system won't launch until at least iOS 18.4. That release is expected in the spring and should breathe new life into Siri by combining users' on-device information with generative technologies.
Comments
Before and after AI, Siri is just as dumb as it's always been:
[Entry condition: my library has 7 albums by the group "James", one of which is "James: The Best Of". Talking to Siri via CarPlay]
Me: Siri, play all James shuffled.
Siri: Playing "Best of James shuffled"
Me: Siri, play all James Albums, shuffled.
Siri: Playing "Best of James shuffled"
Me: Play all the albums of the group James
Siri: Playing "Best of James shuffled.
Me...ARRRRRGGGG MF*&##@
The only improvement I have seen since the introduction of Apple "Intelligence" is that Siri is now a bit more context-aware - i.e. you can ask some follow-up questions and Siri will [sometimes] know what you are talking about. E.g.: ask "What is the population of France?" and Siri answers; then ask "What is its landmass?" and it gives the landmass of France.