Craig Federighi says Siri won't become sentient, but it'll get better
Apple head of software Craig Federighi was asked about Siri and Apple Intelligence, and while he couldn't discuss the future, he was sure Siri was never going to become some kind of sentient pal.
Apple Intelligence launches with iOS 18.1. Image credit: Apple
Apple Intelligence will launch alongside iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1 before the end of October. Instead of falling into the hype cycle of its competitors, Apple seemingly took a more conservative approach by offering private and secure tools users can use every day.
Apple's senior vice president of software engineering Craig Federighi sat down with the Wall Street Journal's Joanna Stern to discuss Apple's entry into AI and how it differs from the competition. One example he cited was that Siri is already helping users open their garage door or send a text, while ChatGPT isn't meant for those use cases.
At one point, Federighi noted that Siri would continue to improve over time, but Apple isn't targeting a "sentient pal." This differs greatly from other companies like OpenAI, Microsoft, and Google, which seem convinced they may create sentience -- referred to as Artificial General Intelligence (AGI) -- at any time.
Instead, Apple wants users to get useful text summaries or use Clean Up to remove objects without fundamentally changing an image. Other companies are happy to generate entire essays or turn a memory into a fabrication, both things Apple seems disinterested in doing.
Privacy isn't a concern with Apple Intelligence, but when Stern asked why other companies aren't approaching AI this way, Federighi's answer was simple. Not only is Apple's on-device system and Cloud Compute difficult to build and execute, the concept runs counter to competitors that want to suck up as much data as possible for training.
When asked about the delay in releasing features for Apple Intelligence, Federighi called AI a "big lift" and said "we want to get it right," as opposed to putting "something out there and have it be sort of a mess."
His answer echoes Apple CEO Tim Cook's, who said Apple wasn't first to AI, but it would be the best.
Apple sees Apple Intelligence as a decades-long arc that needs to be done responsibly. That arc begins with iOS 18.1 and the other operating systems on or around October 28.
Comments
but Apple is farming that out to ChatGPT. It’s out of their control. I’d really rather Apple handled that as well.
As I mentioned elsewhere, all consumer-facing AI features right now really shouldn't be considered more than an alpha or early beta state. Some AI tasks that people do in 2024 and 2025 won't be done in 2026. There will be new tasks that arrive. Some will come and go. But for sure there is no reason for anyone to consider today's consumer-facing AI features as mature technology.
We see too much of that online in various discussions -- not just here at AppleInsider -- where people are judging AI as though it were fully featured and not expected to change.
That is shortsighted at best, naïve at worst. It would be like judging the viability of streaming music based on early 64kbps streams (long before people had smartphones).
I can actually get it to play the correct Led Zeppelin (I-IV) or Yes (Yes, The Yes Album) album now. Before, it was impossible.
Consumers will use what they have at their disposal just like always.
Honesty isn't exclusive to Apple, and Craig offered up a ridiculous line here:
"One example he cited was that Siri is already helping users open their garage door or send a text, while ChatGPT isn't meant for those use cases."
This kind of utterly useless statement should be whacked out of existence by the interviewer.
Why didn't Stern just take him to task with that one?
Or this:
"Not only is Apple's on-device system and Cloud Compute difficult to build and execute"
The question 'and....?' should have been asked here. Apple is not the only company doing this. On-device compute has been around for years, and for very clear reasons. That will increase over time as hardware options increase. The cloud compute part is just a soundbite too. Stern should have been all over him on that.
This summer I realised my sister-in-law was having 'conversations' with AI, and the context and follow-up questions and answers were spot on most of the time.
While it is never going to be perfect (your personal mileage may vary) and user reflection is a must, the actual usefulness was off the scale in terms of context and accuracy. The same obviously cannot be said for Siri.
And it will only get better. Being sentient or not is irrelevant as long as the overall experience is worthwhile.

I often pose the question to people whether they'd be OK if the government put cameras and microphones all over their homes. The answer is universally no. Yet they're quite happy to let tech companies do it via their phones, smart speakers, etc., because they really don't understand that it's EXACTLY the same thing. And that's because it was done underhandedly.
Apple is the only company with the integrity to explain, in detail, how their system works and how it protects privacy. And Apple wouldn't even have built it in the first place if all the tech investors and marketing hype hadn't convinced consumers that they "need" it. Yes, we all need tech companies monitoring everything we do just so that we can win bar room bets faster. Biggest eye roll ever.
I absolutely understand that there are some great things about PRACTICAL applications of AI in areas like health care. But I'm very cautious about its use in people's personal lives. For example, the way it's being used to manipulate people on social media (e.g. large numbers of bots convincing people of a certain opinion that benefits private interests). And even the kind of information it gives back on topics with different viewpoints, all of which may have merit, but some of which may go against the personal views of the investors behind the technology providing that information.
Have you asked why Apple, the company with a "privacy first" policy, is doing business with Google, a company with "privacy issues"? I think Apple doing business with a privacy-focused search engine would be better and easier than customers changing to another engine, don't you think?