Apple isn't behind on AI, it's looking ahead to the future of smartphones

Posted:
in iPhone

A new research paper shows that Apple has practical solutions to technical AI issues that other firms appear to be ignoring: specifically, how to run massive large language models on lower-memory devices like the iPhone.




Despite claims that Apple is behind the industry on generative AI, the company has now twice revealed that it is continuing its longer-term planning rather than racing to release a ChatGPT clone. The first sign was a research paper proposing an AI system called HUGS, which generates digital avatars of humans.

Now, as spotted by VentureBeat, a second research paper proposes solutions for deploying enormous large language models (LLMs) on devices with limited RAM, such as iPhones.

The new paper is called "LLM in a flash: Efficient Large Language Model Inference with Limited Memory." Apple says that it "tackles the challenge of efficiently running LLMs that exceed the available DRAM capacity by storing the model parameters on flash memory but bringing them on demand to DRAM."

So the whole LLM still needs to be stored on-device, but it can be worked with in RAM by treating flash memory as a kind of virtual memory, not dissimilar to how macOS handles memory-intensive tasks.
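The paper doesn't publish its loader, but the general idea of paging parameters from flash into DRAM on demand can be sketched with a memory-mapped file. This is a minimal illustration, not Apple's implementation; the file name, sizes, and layout below are purely illustrative:

```python
import numpy as np
import tempfile, os

# Write a "model" of weights to disk, standing in for flash storage.
rows, cols = 1024, 256
path = os.path.join(tempfile.mkdtemp(), "weights.bin")
np.random.default_rng(0).standard_normal((rows, cols)).astype(np.float32).tofile(path)

# Map the file without loading it: the OS pages data into DRAM on demand,
# much like virtual memory, so resident memory stays far below model size.
weights = np.memmap(path, dtype=np.float32, mode="r", shape=(rows, cols))

# Only the rows we actually touch are brought into DRAM.
active = [3, 41, 977]
chunk = np.asarray(weights[active])  # copies just these rows into RAM
print(chunk.shape)  # (3, 256)
```

Because the operating system's page cache keeps recently touched pages resident, repeated reads of the same rows are cheap, which is the property the paper's techniques are designed to exploit.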

"Within this flash memory-informed framework, we introduce two principal techniques," says the research paper. "First, 'windowing' strategically reduces data transfer by reusing previously activated neurons... and second, 'row-column bundling,' tailored to the sequential data access strengths of flash memory, increases the size of data chunks read from flash memory."

What this ultimately means is that LLMs far larger than a device's available memory can still be deployed on it. It means that Apple can leverage AI features across more devices, and therefore in more ways.
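As a rough, hypothetical illustration of the "windowing" idea described above (keeping recently activated neuron rows resident in DRAM so that overlapping activations don't trigger fresh flash reads), here is a toy cache. All names and the eviction policy are assumptions for the sketch, not the paper's actual mechanism:

```python
from collections import OrderedDict

class NeuronWindowCache:
    """Toy sketch of 'windowing': keep the most recently activated
    neuron rows in DRAM and only fetch the ones that are missing."""

    def __init__(self, fetch_row, window_size):
        self.fetch_row = fetch_row       # reads one row from "flash"
        self.window_size = window_size   # DRAM budget, in rows
        self.cache = OrderedDict()       # row id -> row data
        self.flash_reads = 0

    def get(self, row_id):
        if row_id in self.cache:         # reuse a previously activated neuron
            self.cache.move_to_end(row_id)
        else:
            self.cache[row_id] = self.fetch_row(row_id)
            self.flash_reads += 1
            if len(self.cache) > self.window_size:
                self.cache.popitem(last=False)  # evict the stalest row
        return self.cache[row_id]

# Simulated flash: each "row" is just its id; count how often we hit flash.
cache = NeuronWindowCache(fetch_row=lambda i: i, window_size=4)
for rid in [1, 2, 3, 1, 2, 3, 4, 1]:     # overlapping activations
    cache.get(rid)
print(cache.flash_reads)  # 4 reads instead of 8: overlap served from DRAM
```

The second technique, row-column bundling, is complementary: it groups data that is used together into one contiguous chunk so each flash read is larger and sequential, which flash memory handles far better than many small random reads.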

Detail from the research paper showing faster reading of LLMs from flash memory



"The practical outcomes of our research are noteworthy," claims the research paper. "We have demonstrated the ability to run LLMs up to twice the size of available DRAM, achieving an acceleration in inference speed by 4-5x compared to traditional loading methods in CPU, and 20-25x in GPU."

"This breakthrough is particularly crucial for deploying advanced LLMs in resource-limited environments," it continues, "thereby expanding their applicability and accessibility."

Apple has made this research public, as it did with the HUGS paper. So instead of being behind, it is actually working to improve AI capabilities for the whole industry.

This fits with the view of analysts who, given Apple's large user base, believe the firm will benefit the most as AI goes further mainstream.



Read on AppleInsider


Comments

  • Reply 1 of 33
    elijahg Posts: 2,760
    This article carefully steps around the fact that this improvement will make zero difference to Apple's most visible and most deficient use of AI: Siri.

    Siri is the primary and by far most visible use of "AI" at Apple. People quite rightly have a problem with Siri being thick as a brick. Siri has numerous problems which are not going to be solved by improving the creation of useless avatars for a niche device no one yet owns outside of the developer space. This research specifically does nothing to help Siri for several reasons: 
    • Siri doesn't use contemporary "AI" like ChatGPT; Siri's "intelligence" comes from predefined queries in a basic lookup table, which is why it breaks if you don't ask it something in the right way.
    • Very few Siri queries are processed on-device. 
    • LLMs need huge datasets unavailable on-device so efficiency of on-device processing is mostly irrelevant. 
    This article does nothing to convince anyone that Apple is not far, far behind the AI curve. Making a clockwork timepiece 10% more efficient is irrelevant when everyone else has long since moved to battery. Apple needs exactly what this article claims they don't: a clone of ChatGPT. Siri was supposed to be conversational 10 years ago when it was introduced on the iPhone 4S. It has arguably got worse since then.

    However, Apple's AI in other areas seems pretty good; Photos is pretty good at recognising objects/people/things/scenes, for example, and that's on-device. But claiming that Apple isn't behind the AI curve when their most visible use of AI is a disaster is pure fanboyism.
    edited December 2023
  • Reply 2 of 33
    avon b7 Posts: 7,703
    I think it's fair to say that Apple is behind in this area. 

    Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. 

    It is also recruiting for specific roles in AI. So far, most of the talk has been only that, talk. 

    Talking about ML, as they made a point of doing, is stating the obvious here. Who isn't using ML? 

    In this case of LLMs on resource strapped devices, again, some manufacturers are already using them. 

    A Pangu LLM underpins Huawei's Celia voice assistant on its latest phones. 

    I believe Xiaomi is also using LLMs on some of its phones too (although I don't know in which areas). 

    The notion of trying to do more with less is an industry constant. Research never stops in that area and in particular routers have been a persistent research target, being ridiculously low on spare memory and CPU power. I remember, many years ago, doing some external work for the Early Bird project and the entire goal was how to do efficient, real time detection of worm signatures on data streams without impacting the performance of the router. 

    Now, AI is key to real-time detection of threats in network traffic and storage (ransomware in the case of storage, which is another resource strapped area). 

    LLMs have to be run according to needs. In some cases there will be zero issues with carrying out tasks in the Cloud or at the Edge. In other cases/scenarios you might want them running more locally. Maybe even in your Earbuds (for voice recognition or Bone Voice ID purposes etc). 

    Or in your TV or even better across multiple devices at the same time. Resource pooling. 

    edited December 2023
  • Reply 3 of 33
    I use Google quite a bit to look up facts and figures. But as much as I'd love to have a conversational AI at which I could throw random questions and get answers, I simply don't trust most of today's generative LLMs to give me accurate answers to the questions posed.

    The more conversational the AI, the more likely it is to hallucinate and give me answers that seem perfectly accurate... when in fact they're anything but.

    If I were Apple, that's a problem I'd want to solve.
  • Reply 4 of 33
    At best you can say that Apple is behind but it is attempting to catch up by looking to the future. But if it fails to do that, then it will remain behind. Way behind. 
  • Reply 5 of 33
    avon b7 said:
    I think it's fair to say that Apple is behind in this area. 

    Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. 

    It is also recruiting for specific roles in AI. So far, most of the talk has been only that, talk. 

    Talking about ML as they made a point of doing, is stating the obvious here. Who isn't using ML? 

    In this case of LLMs on resource strapped devices, again, some manufacturers are already using them. 

    A Pangu LLM underpins Huawei's Celia voice assistant on its latest phones. 

    I believe Xiaomi is also using LLMs on some of its phones too (although I don't know in which areas). 

    The notion of trying to do more with less is an industry constant. Research never stops in that area and in particular routers have been a persistent research target, being ridiculously low on spare memory and CPU power. I remember, many years ago, doing some external work for the Early Bird project and the entire goal was how to do efficient, real time detection of worm signatures on data streams without impacting the performance of the router. 

    Now, AI is key to real-time detection of threats in network traffic and storage (ransomware in the case of storage, which is another resource strapped area). 

    LLMs have to be run according to needs. In some cases there will be zero issues with carrying out tasks in the Cloud or at the Edge. In other cases/scenarios you might want them running more locally. Maybe even in your Earbuds (for voice recognition or Bone Voice ID purposes etc). 

    Or in your TV or even better across multiple devices at the same time. Resource pooling. 

    "Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. "

    This does not prove that Apple is behind. Was Apple behind when it introduced the iPod years after other digital music players? Or the iPhone? Or the iPad? Or the Apple Watch? All this proves is that Apple is not first; being behind implies that other players are better than them in this technology, which hasn't been proven yet since we're still in the early stages with respect to AI/ML.
  • Reply 6 of 33
    I don't think Apple is interested in, or should be interested in, competing with ChatGPT or Google Bard or any other LLMs.  It's a race nobody can win, and it is not aligned with Apple's core mission.  It seems likely to me that Apple will apply Machine Learning, aka Artificial Intelligence, in places where it can add direct value to Apple products.  Search is not one of those places and, as we have seen, is a magnet for antitrust legislation.  Analysis of heart rhythms, blood pressure or blood glucose detection, improved crash detection, speech recognition, and facial identification are a few examples of product features that might improve by application of new machine learning technology.  No doubt there are more applications that align with Apple's product mission.

    As a user, I am not particularly invested in Siri, and I don't need it to write Python code or translate Plato's Republic into another language.  Honestly, if it can tell me what the temperature in Minneapolis is, I'm happy enough.  Siri is not meant to be a key competitive element, as far as I can tell.
  • Reply 7 of 33
    avon b7 Posts: 7,703
    avon b7 said:
    I think it's fair to say that Apple is behind in this area. 

    Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. 

    It is also recruiting for specific roles in AI. So far, most of the talk has been only that, talk. 

    Talking about ML as they made a point of doing, is stating the obvious here. Who isn't using ML? 

    In this case of LLMs on resource strapped devices, again, some manufacturers are already using them. 

    A Pangu LLM underpins Huawei's Celia voice assistant on its latest phones. 

    I believe Xiaomi is also using LLMs on some of its phones too (although I don't know in which areas). 

    The notion of trying to do more with less is an industry constant. Research never stops in that area and in particular routers have been a persistent research target, being ridiculously low on spare memory and CPU power. I remember, many years ago, doing some external work for the Early Bird project and the entire goal was how to do efficient, real time detection of worm signatures on data streams without impacting the performance of the router. 

    Now, AI is key to real-time detection of threats in network traffic and storage (ransomware in the case of storage, which is another resource strapped area). 

    LLMs have to be run according to needs. In some cases there will be zero issues with carrying out tasks in the Cloud or at the Edge. In other cases/scenarios you might want them running more locally. Maybe even in your Earbuds (for voice recognition or Bone Voice ID purposes etc). 

    Or in your TV or even better across multiple devices at the same time. Resource pooling. 

    "Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. "

    This does not prove that Apple is behind.  Was Apple behind when it introduced the iPod years after other digital music players? or the iPhone? or the iPad? or the Apple Watch.  All this proves is that Apple is not first but being behind implies that other players are better than them in this technology which hasn't been proven yet since we're still in the early stages with respect to AI / ML.
    We are definitely in the early stages but a tremendous amount has already been done - and not only in research, which of course will never stop. It's been surprisingly productive too. 

    End users of literally all kinds will be using the practical implementations of everything that reaches the market and that covers almost everything you can think of. 

    In generative AI (the hot topic of this year) Apple has yet to deliver. It's not doing anything in ML that anyone else isn't. Over the last six years I could even argue that it hasn't pulled its weight in that field. 

    The road maps are clear and a lot of projects are open source.

    Apple recently making a couple of things open source was nothing new. That's been going on for years now. 

    LLMs, as mentioned in this article, are another point. They are already being used on some phones.

    With everybody pushing practical solutions out the door already, the real problem isn't really the AI itself, but how to 'control' it (ethically, morally, legally...).

    That's been known for years too but it is a little like electric scooters. The issues they bring arrive far sooner than we can legislate for them. 

    'Objectively' means looking at the state of play in userland and seeing what's on offer.

    It's difficult not to see Apple as behind here and it's very difficult to argue otherwise.

    No one doubts that things are cooking in Cupertino but they are cooking everywhere. 
  • Reply 8 of 33
    nubus Posts: 388
    avon b7 said:
    I think it's fair to say that Apple is behind in this area. 

    Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. 

    So very true. Apple is behind on AI. Research papers are nice, but as Steve Jobs put it: real artists ship. Tim Cook recently said Apple is investing in generative AI, so clearly Apple is behind on that as other companies are shipping. If Tim Cook can say it... then why not AppleInsider?

    Siri is a mess, Apple can't match Copilot for code development, the iOS keyboard suggestions are nowhere near ChatGPT, and organizing files locally and on servers has been a manual chore for soon 40 years on the Mac with no AI to support us. Apple is doing very well on photos, and a lot of chip space is reserved for the Neural Engine = we pay for ML.
  • Reply 9 of 33
    elijahg Posts: 2,760
    avon b7 said:
    I think it's fair to say that Apple is behind in this area. 

    Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. 

    It is also recruiting for specific roles in AI. So far, most of the talk has been only that, talk. 

    Talking about ML as they made a point of doing, is stating the obvious here. Who isn't using ML? 

    In this case of LLMs on resource strapped devices, again, some manufacturers are already using them. 

    A Pangu LLM underpins Huawei's Celia voice assistant on its latest phones. 

    I believe Xiaomi is also using LLMs on some of its phones too (although I don't know in which areas). 

    The notion of trying to do more with less is an industry constant. Research never stops in that area and in particular routers have been a persistent research target, being ridiculously low on spare memory and CPU power. I remember, many years ago, doing some external work for the Early Bird project and the entire goal was how to do efficient, real time detection of worm signatures on data streams without impacting the performance of the router. 

    Now, AI is key to real-time detection of threats in network traffic and storage (ransomware in the case of storage, which is another resource strapped area). 

    LLMs have to be run according to needs. In some cases there will be zero issues with carrying out tasks in the Cloud or at the Edge. In other cases/scenarios you might want them running more locally. Maybe even in your Earbuds (for voice recognition or Bone Voice ID purposes etc). 

    Or in your TV or even better across multiple devices at the same time. Resource pooling. 

    "Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. "

    This does not prove that Apple is behind.  Was Apple behind when it introduced the iPod years after other digital music players? or the iPhone? or the iPad? or the Apple Watch.  All this proves is that Apple is not first but being behind implies that other players are better than them in this technology which hasn't been proven yet since we're still in the early stages with respect to AI / ML.
    The iPod was a massive advance on other MP3 players of the time. So was the iPhone, and so was the iPad. Siri was pretty good at the time of introduction, but everyone else has surpassed it since development seems to have essentially stopped in 2014. ChatGPT is as far ahead of the competition as the iPhone was in 2007. Siri is to ChatGPT as Blackberry was to iPhone in 2007.
  • Reply 10 of 33
    Hope we see Apple leverage this tech to help users much more efficiently create automations.
  • Reply 11 of 33
    People are funny. Apple has been implementing AI for a good while now. People forget all about Apple's Machine Learning advancements - not just in software, but HARDWARE. Apple has been baking it in for years and years now. Machine Learning = AI.

    What they haven't been is irresponsible and unscrupulous.

    To borrow Jobs' words from years ago, current AI trends are "a big bag of hurt." It's a steaming mess. Everything from false information to biased political influence, stolen artwork, criminally tainted datasets, legal issues, plagiarism, and untrustworthiness is eerily similar to the early days of the mass-available internet. 

    Apple has been the one company using AI for a long time already, but in immediately helpful ways to enhance life. They won't be the ones rushing things. Of course, with new use cases, such as the ChatGPT style, Apple will add this in such a way as to be nearly invisible.

    USER: "Hey Siri, draw me a picture of Mickey Mouse going to church in Studio Ghibli style."
    SIRI: "OK, I have created an image of Mickey Mouse at church."

    Simple.

    It will just integrate into your normal routine, such as definition suggestions and related info when you type, a more informed Siri, etc.  Not just a thing unto itself, in order to say "look at how cool I am." Instead, it will "just work" and Apple will be the trusted, reliable, always advancing source for all your tech needs they have been. And in a few years, some new thing will be the dominator of headlines and the cycle will have repeated itself. 

    edited December 2023
  • Reply 12 of 33
    elijahg said:
    avon b7 said:
    I think it's fair to say that Apple is behind in this area. 

    Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. 

    It is also recruiting for specific roles in AI. So far, most of the talk has been only that, talk. 

    Talking about ML as they made a point of doing, is stating the obvious here. Who isn't using ML? 

    In this case of LLMs on resource strapped devices, again, some manufacturers are already using them. 

    A Pangu LLM underpins Huawei's Celia voice assistant on its latest phones. 

    I believe Xiaomi is also using LLMs on some of its phones too (although I don't know in which areas). 

    The notion of trying to do more with less is an industry constant. Research never stops in that area and in particular routers have been a persistent research target, being ridiculously low on spare memory and CPU power. I remember, many years ago, doing some external work for the Early Bird project and the entire goal was how to do efficient, real time detection of worm signatures on data streams without impacting the performance of the router. 

    Now, AI is key to real-time detection of threats in network traffic and storage (ransomware in the case of storage, which is another resource strapped area). 

    LLMs have to be run according to needs. In some cases there will be zero issues with carrying out tasks in the Cloud or at the Edge. In other cases/scenarios you might want them running more locally. Maybe even in your Earbuds (for voice recognition or Bone Voice ID purposes etc). 

    Or in your TV or even better across multiple devices at the same time. Resource pooling. 

    "Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. "

    This does not prove that Apple is behind.  Was Apple behind when it introduced the iPod years after other digital music players? or the iPhone? or the iPad? or the Apple Watch.  All this proves is that Apple is not first but being behind implies that other players are better than them in this technology which hasn't been proven yet since we're still in the early stages with respect to AI / ML.
    The iPod was a massive advance on other MP3 players of the time. So was the iPhone, and so was the iPad. Siri was pretty good at the time of introduction, but everyone else has surpassed it since development seems to have essentially stopped in 2014. ChatGPT is as far ahead of the competition as the iPhone was in 2007. Siri is to ChatGPT as Blackberry was to iPhone in 2007.
    "Siri is to ChatGPT as Blackberry was to iPhone in 2007."

    Not the same thing even though they both utilize machine learning technology.  Wait till Apple introduces their take on an LLM and then we'll see if ChatGPT is far ahead.
  • Reply 13 of 33
    elijahg Posts: 2,760
    elijahg said:
    avon b7 said:
    I think it's fair to say that Apple is behind in this area. 

    Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. 

    It is also recruiting for specific roles in AI. So far, most of the talk has been only that, talk. 

    Talking about ML as they made a point of doing, is stating the obvious here. Who isn't using ML? 

    In this case of LLMs on resource strapped devices, again, some manufacturers are already using them. 

    A Pangu LLM underpins Huawei's Celia voice assistant on its latest phones. 

    I believe Xiaomi is also using LLMs on some of its phones too (although I don't know in which areas). 

    The notion of trying to do more with less is an industry constant. Research never stops in that area and in particular routers have been a persistent research target, being ridiculously low on spare memory and CPU power. I remember, many years ago, doing some external work for the Early Bird project and the entire goal was how to do efficient, real time detection of worm signatures on data streams without impacting the performance of the router. 

    Now, AI is key to real-time detection of threats in network traffic and storage (ransomware in the case of storage, which is another resource strapped area). 

    LLMs have to be run according to needs. In some cases there will be zero issues with carrying out tasks in the Cloud or at the Edge. In other cases/scenarios you might want them running more locally. Maybe even in your Earbuds (for voice recognition or Bone Voice ID purposes etc). 

    Or in your TV or even better across multiple devices at the same time. Resource pooling. 

    "Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. "

    This does not prove that Apple is behind.  Was Apple behind when it introduced the iPod years after other digital music players? or the iPhone? or the iPad? or the Apple Watch.  All this proves is that Apple is not first but being behind implies that other players are better than them in this technology which hasn't been proven yet since we're still in the early stages with respect to AI / ML.
    The iPod was a massive advance on other MP3 players of the time. So was the iPhone, and so was the iPad. Siri was pretty good at the time of introduction, but everyone else has surpassed it since development seems to have essentially stopped in 2014. ChatGPT is as far ahead of the competition as the iPhone was in 2007. Siri is to ChatGPT as Blackberry was to iPhone in 2007.
    "Siri is to ChatGPT as Blackberry was to iPhone in 2007."

    Not the same thing even though they both utilize machine learning technology.  Wait till Apple introduces their take on an LLM and then we'll see if ChatGPT is far ahead.
    Well, it is. Steve said he wanted Siri to be conversational and intelligent. ChatGPT is conversational and appears intelligent; Siri is thick and easily bamboozled by the most basic things. It doesn't "understand" the intention behind your words; it only takes them at a very basic level, by their first meaning in the dictionary. Like a toddler. But I was more referring to how the gulf between Siri and ChatGPT is similar to that between Blackberry and the iPhone back in 2007.
  • Reply 14 of 33
    danvm Posts: 1,409
    People are funny. apple has been implementing AI for a good while now. People forget all about Apple's Machine Learning advancements - not just in software, but HARDWARE. Apple has been baking it in for years and years now. Machine Learning = AI.
    Have you considered why people forget about Apple's ML advancements?  If you ask me, it's because they are behind OpenAI, Microsoft and Google.  
    Apple has been the one company using AI for a long time already, but in immediately helpful ways to enhance life. They won't be the ones rushing things. Of course, with new use cases, such as the ChatGPT style, Apple will add this in such a way as to be nearly invisible.

    USER: "Hey Siri, draw me a picture of Mickey Mouse going to church in Studio Ghibli style."
    SIRI: "OK, I have created an image of Mickey Mouse at church."

    Simple.

    It will just integrate into your normal routine, such as definition suggestions and related info when you type, a more informed Siri, etc.  Not just a thing unto itself, in order to say "look at how cool I am." Instead, it will "just work" and Apple will be the trusted, reliable, always advancing source for all your tech needs they have been. And in a few years, some new thing will be the dominator of headlines and the cycle will have repeated itself. 
    The example you gave can be done today with OpenAI's DALL-E and Microsoft Designer. Users don't have to wait for Apple for this.  Also, MS Office with Copilot and Google Workspace with Duet have AI in two of the most popular apps and services used in school and business, while Apple has done nothing with their apps. And considering the state of Apple services, I see the competition ahead of Apple for the next few years.  
  • Reply 15 of 33
    avon b7 Posts: 7,703
    People are funny. Apple has been implementing AI for a good while now. People forget all about Apple's Machine Learning advancements - not just in software, but HARDWARE. Apple has been baking it in for years and years now. Machine Learning = AI.

    What they haven't been is irresponsible and unscrupulous.

    To borrow Jobs' words from years ago, current AI trends are "a big bag of hurt." It's a steaming mess. Everything from false information to biased political influence, stolen artwork, criminally tainted datasets, legal issues, plagiarism, and untrustworthiness is eerily similar to the early days of the mass-available internet. 

    Apple has been the one company using AI for a long time already, but in immediately helpful ways to enhance life. They won't be the ones rushing things. Of course, with new use cases, such as the ChatGPT style, Apple will add this in such a way as to be nearly invisible.

    USER: "Hey Siri, draw me a picture of Mickey Mouse going to church in Studio Ghibli style."
    SIRI: "OK, I have created an image of Mickey Mouse at church."

    Simple.

    It will just integrate into your normal routine, such as definition suggestions and related info when you type, a more informed Siri, etc.  Not just a thing unto itself, in order to say "look at how cool I am." Instead, it will "just work" and Apple will be the trusted, reliable, always advancing source for all your tech needs they have been. And in a few years, some new thing will be the dominator of headlines and the cycle will have repeated itself. 

    "They won't be the ones rushing things. Of course with new. use cases, such as the Chat GPT style, Apple will add this in such a way as to be nearly invisible."

    So you agree they are behind?

    You just think they aren't rushing things. 

    Apple's ML uses aren't any different to anybody else's and like I said, I'd venture they were pretty far behind competitors over the last six years. 

    Apple hasn't been 'the one' company to use AI to immediately enhance our lives. 

    Just about everybody using AI has used it productively. 

    AI today is definitely not a 'steaming mess'.

    There are plenty of issues but also plenty of amazing success stories, too:

    https://www.huaweicloud.com/intl/en-us/news/20230707180809498.html

    https://fintechmagazine.com/tech-ai/huawei-cloud-and-pangu-ai-model-reshaping-finance-industry

    If you feed your data in the correct way, all the pain points you mention (which are very valid criticisms BTW) vanish. At least with the exception of the other issues I mentioned. 

    And even with those pain points there is still a lot of benefit. Huge benefit. So much so that Apple has gone out of its way to make it clear that they consider it to be of vital importance going forward and are working on it. Not much to show for it though and that's why it's not unreasonable to see them as a bit behind here while others are shipping products across industry and consumer space.

    Microsoft, Google, Amazon, the Chinese players. There's a huge amount going on and it's not behind closed doors. 
    edited December 2023
  • Reply 16 of 33
    Me to Siri: “Play My Way by Elvis Presley”

    Siri: “Okay, here’s Elvis Presley” as she plays the same wrong song every single time. 

    I’d be thrilled if Siri could just do the simple tasks it used to do just fine. 
  • Reply 17 of 33
    zeus423 said:
    Me to Siri: “Play My Way by Elvis Presley”

    Siri: “Okay, here’s Elvis Presley” as she plays the same wrong song every single time. 

    I’d be thrilled if Siri could just do the simple tasks it used to do just fine. 
    Something about Siri that has baffled me for years is how people can ask it the same thing and get different results. Every time I see an issue like yours on the forum I try it myself. So far I've had the expected response every time I ask, trying to use the same wording. I find it very strange that this happens, year after year. What could it be that makes Siri reply one thing to one user and another thing to a different user asking the same thing?


    williamlondonzeus423watto_cobraroundaboutnow
  • Reply 18 of 33
    avon b7 said:
    I think it's fair to say that Apple is behind in this area. 

    Objectively, this year has been about ChatGPT style usage and Apple hasn't brought anything to market while others have. 

    It is also recruiting for specific roles in AI. So far, most of the talk has been only that, talk. 

    Talking about ML as they made a point of doing, is stating the obvious here. Who isn't using ML? 

    In this case of LLMs on resource strapped devices, again, some manufacturers are already using them. 

    A Pangu LLM underpins Huawei's Celia voice assistant on its latest phones. 

    I believe Xiaomi is also using LLMs on some of its phones too (although I don't know in which areas). 

    The notion of trying to do more with less is an industry constant. Research never stops in that area and in particular routers have been a persistent research target, being ridiculously low on spare memory and CPU power. I remember, many years ago, doing some external work for the Early Bird project and the entire goal was how to do efficient, real time detection of worm signatures on data streams without impacting the performance of the router. 

    Now, AI is key to real-time detection of threats in network traffic and storage (ransomware in the case of storage, which is another resource strapped area). 

    LLMs have to be run according to needs. In some cases there will be zero issues with carrying out tasks in the Cloud or at the Edge. In other cases/scenarios you might want them running more locally. Maybe even in your Earbuds (for voice recognition or Bone Voice ID purposes etc). 

    Or in your TV or even better across multiple devices at the same time. Resource pooling. 

    Yea this author really wants to prove Apple isn’t behind on AI, for some reason. That’s like his fourth article on this exact topic
    williamlondonelijahg
  • Reply 19 of 33
    mattinoz Posts: 2,323 member
    elijahg said:
    This article carefully steps around the fact that this improvement will make zero difference to Apple's most visible and most deficient use of AI: Siri.

    Siri is the primary and by far most visible use of "AI" at Apple. People quite rightly have a problem with Siri being thick as a brick. Siri has numerous problems which are not going to be solved by improving the creation of useless avatars for a niche device no one yet owns outside of the developer space. This research specifically does nothing to help Siri for several reasons: 
    • Siri doesn't use contemporary "AI" like ChatGPT; Siri's "intelligence" comes from predefined queries in a basic lookup table - which is why it breaks if you don't ask it something in exactly the right way.
    • Very few Siri queries are processed on-device. 
    • LLMs need huge datasets unavailable on-device so efficiency of on-device processing is mostly irrelevant. 
    This article does nothing to convince anyone that Apple is not far, far behind the AI curve. Making a clockwork timepiece 10% more efficient is irrelevant when everyone else has long since moved to battery. Apple needs exactly what this article claims they don't: a clone of ChatGPT. Siri was supposed to be conversational 10 years ago when it was introduced on the iPhone 4S. It has arguably got worse since then.

    However, Apple's AI in other areas seems pretty good, photos is pretty good at recognising objects/people/things/scenes for example; and that's on-device. But claiming that Apple isn't behind the AI curve when their most visible use of AI is a disaster is pure fanboyism.
    Does the article step around the obvious speculation, or simply not engage in it?

    Siri could reduce the look-up table by having an LLM (or a tight, small LM) on the device translate from the user's language to "SiriCode". Developers wouldn't need to localise at all; they'd just define intents for what their apps can do in "SiriCode" (much like what happens with Shortcuts).

    It could actually be advantageous for them: they could ship a localised LM suited to a much more limited range of speech to start with, but offer a far broader range of downloadable models per language or regional dialect, even common language combos, by allowing users to nominate all the languages they speak. Then share a retrained or annotated version between the user's devices that picks up on each user's quirks in speech. 

    Smaller, on-device ML lets Apple be more personal. I'm not surprised they're looking at these sorts of developments, just as the article clearly shows they are. 
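    To make that idea concrete, here's a toy sketch of the "SiriCode" notion. Everything here is hypothetical (the names `tiny_lm`, `handle`, and the intent codes are made up), and a trivial pattern-matcher stands in for the on-device language model; the point is just that apps would register handlers against canonical intent codes rather than against localised phrasings:

    ```python
    # Hypothetical sketch: an on-device LM normalises free-form speech into a
    # small canonical intent language ("SiriCode"); apps register handlers
    # against intent codes, never against phrasings in any particular language.

    def tiny_lm(utterance: str) -> dict:
        """Stand-in for an on-device LM. A real model would handle paraphrase,
        dialect, and language; this stub just pattern-matches two examples."""
        text = utterance.lower()
        if "play" in text and " by " in text:
            song, artist = text.split("play", 1)[1].split(" by ", 1)
            return {"intent": "media.play", "song": song.strip(), "artist": artist.strip()}
        if "add" in text and "shopping list" in text:
            item = text.split("add", 1)[1].split("to my shopping list")[0].strip()
            return {"intent": "list.add", "list": "shopping", "item": item}
        return {"intent": "unknown", "raw": utterance}

    # Developers declare what their app can do, keyed by intent code only.
    handlers = {
        "media.play": lambda i: f"Playing {i['song']} by {i['artist']}",
        "list.add": lambda i: f"Added '{i['item']}' to your {i['list']} list",
    }

    def handle(utterance: str) -> str:
        intent = tiny_lm(utterance)
        return handlers.get(intent["intent"], lambda i: "Sorry, I didn't get that")(intent)
    ```

    Swapping in a better model (or a dialect-specific downloadable one) improves recognition without any app changing its registered intents, which is the appeal of the scheme.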
    watto_cobra
  • Reply 20 of 33
    elijahg Posts: 2,760 member
    zeus423 said:
    Me to Siri: “Play My Way by Elvis Presley”

    Siri: “Okay, here’s Elvis Presley” as she plays the same wrong song every single time. 

    I’d be thrilled if Siri could just do the simple tasks it used to do just fine. 
    Something about Siri that has baffled me for years is how people can ask it the same thing and get different results. Every time I see an issue like yours on the forum I try it myself. So far I’ve had the expected response every time, trying to use the same wording. I find it very strange that this happens, year after year. What could it be that makes Siri reply one thing to one user and another thing to a different user asking the same thing?


    I've had Siri respond differently even when asking the same thing. For example one that often trips it up is saying "Add tomato sauce to my shopping list". Sometimes I get "tomato" and "sauce" added, sometimes I get "tomato sauce". Maybe the inflection is different but I doubt it.
    byronl