Apple plans more AI in iOS, improved Siri in $1B technology push
Apple has embarked on a billion-dollar push to develop generative AI, as the iPhone maker attempts to bring more AI to the next versions of its operating systems.
Following a wave of technological progress pushed forward by ChatGPT and its rivals, Apple needs to be seen to catch up with the rest of the market. Apple CEO Tim Cook confirmed in September that the company is "of course" working on generative AI, but did not say what was on the way.
According to Sunday's "Power On" newsletter for Bloomberg, Apple executives were "caught off guard" by the influx of AI, and the company has been working to make up for lost time since late 2022.
"There's a lot of anxiety about this and it's considered a pretty big miss internally," a source told the report.
Along with creating an internal chatbot called "Apple GPT," the company is keen to work out how to add the technology to its products.
The AI push is headed up by SVPs John Giannandrea and Craig Federighi, in charge of AI and software respectively. Services chief Eddy Cue is also apparently on board with the project, which is set to cost Apple about $1 billion per year.
Giannandrea is managing the development of the technology behind a new AI system, as well as revamping Siri to use it. It is thought that a new and improved Siri using the technology could arrive as soon as 2024.
The software engineering group controlled by Federighi will be working to add AI to the next edition of iOS. This will apparently involve features that use Apple's large language model, and could improve how Siri fields questions and how Messages auto-completes sentences.
There is also exploration in adding generative AI to development tools such as Xcode. Doing so could help developers create apps for Apple's platforms more quickly.
Services will also work to add AI wherever it can in its various apps, such as auto-generated Apple Music playlists. This could also include helping users write in Pages or create slides in Keynote.
Apple is also trialling using generative AI for internal customer service apps under AppleCare.
As development continues, there is apparently some debate over whether to continue pushing for on-device processing or if using cloud-based LLMs could help. The former is more privacy-focused, but the latter could allow for more advanced features to be developed.
Comments
Intelligence is defined as "the ability to acquire and apply knowledge and skills." Now explain to us how you define so-called "actual intelligence".
https://www.youtube.com/watch?v=x7qPAY9JqE4
https://www.youtube.com/watch?v=5dhuxRF2c_w Take a look at the cooling system required to run the 4090 card. I don't think Apple is looking to go down that path, a path they briefly went down with IBM, which didn't work out too well.
They can have all the innovation in the world behind closed doors, but silence increasingly equals falling behind in a market that is extremely open source, with open discourse about innovations.
In my view it is a huge miss at Apple to be so behind in shipping strong AI features in their products and OS. Some executives should really be replaced as they have become complacent with a string of years of high profits and margins. Craig should have ridden the AI wave but dropped the ball. They are getting the old bank disease.
As an Apple fanboi, I find it hugely disappointing that Apple is not ground zero for some of the seismic shifts in the industry on this topic. Acquiring Siri a decade or so ago was a great move, and then they completely let it die on the vine.
Right now it seems like the elite at Apple can be found in the chip department and the rest are muddling through bailed out by industry leading silicon.
Multimodal models are now busy blowing the barn doors off what is possible and Apple is busy polishing the edges of the as-is.
Being caught "off guard" sounds really strange to me, since I would assume that the state of the art in this field is presented and discussed among scientists and experts, as in any other field of research. Hm.
Of course, one could argue that "actual intelligence" is nothing but "really, really advanced statistics embedded in a bio-chemical system." I am not sure to what extent one can draw a sharp line between the two. I think that the term "knowledge" means more than "data," and the magic word here might be "context." As long as ChatGPT and friends model people with six fingers (as I have seen), there is apparently a way to go…
Also, looking at the examples of things that Apple intends to use AI for makes me think of faster horses. Apple will have to do far better, and think far different(ly), than this if they intend to stay ahead of OpenAI + Jony Ive.
…just saying
…and the clock’s ticking
Since Apple won't ship something that they don't believe fully integrates with all their other software and services as it should, I don't see them shipping anything super soon related to Siri unless they've been pulling in other software teams to help.
And when they do, they won't take the approach of throwing everything up on a wall to see what sticks. They'll be very calculated to ensure it has a positive benefit for the end user. This is my biggest issue currently. I'm in software, and feel like I follow AI developments pretty closely. I still have yet to see something put together in a way that is actually going to transform the way I work. The closest is GitHub Copilot or other coding services.
But Siri's slow to absent progress ultimately taught them that features have to be useful and make a difference; and, it has to be something that Apple can make a difference with. After the initial coolness of having a voice interface wore off, I think they learned that there just wasn't a lot of utility to it. It was just a feature, not a revolution. So, they really became hesitant to enter markets or do things that didn't make a difference to users afterward. The features needed to be good and useful to users in the long run. They ultimately have had a few not so useful features, but they are not chasing everything either.
It's a long game, and this is not the first time, the second time, or the fifth time that Apple has been criticized for struggling with some feature or being behind with some new service. Amazon's Alexa has now gone through the entire cycle, from people saying Echo devices were the next great thing, to getting tens of millions of units into homes, to laying off large parts of the division in 2023. Google chased, and now has a moribund lineup of voice-interface hardware. No one actually made money, hence the layoffs.
There was a cycle in the 2015 to 2018 time frame of pundits saying Apple was struggling with artificial intelligence or machine learning. Google was showing them up with their ML photo processing. Not much talk about it now, as this is basically at a detente between Google and Apple. Apple basically skipped the entire 2014 to 2022 phase of VR headset hardware. Google, Facebook and MS all entered the market, but this market basically contracted in 2022, nobody made any money, and the industry is hoping the Vision Pro will resuscitate it. An entire cycle of nobody making money in the headset market. Heck, we were supposed to have self-driving cars by now, based on the hype in 2017, and it may not happen until 2027, and in limited forms at that.
So, with these LLMs, how does it affect Apple, and why does Apple need to offer said service? If MS, Google, whoever, has their LLM service available on Apple platforms, how does it impact Apple? How does it impact Apple if they have their own LLM? Then the first question is what do LLMs do for users? Lots of hype over how LLMs reply in natural language, but its usefulness really depends on what it does for people. So, what does it ultimately do for people?