danvm

About

Username
danvm
Joined
Visits
212
Last Active
Roles
member
Points
1,862
Badges
0
Posts
1,507
  • Apple keeps pushing AI industry forward with more open-source models

    tmay said:
    avon b7 said:
    chasm said:
    Is Apple's model designed to be small enough to run natively on an iPhone? Are all the other models you mentioned small enough to run on an iPhone? (It's a sincere question; I don't know the answer. It sounds like you are comparing Apples to Oranges.)
    Max_Troll isn't a good source of accurate information, so let me answer that one for you: Apple's models are designed to be as much on-device as possible, but of course the whole of human experience and knowledge isn't going to fit, so some tasks will be quickly handed to Apple's Private Cloud Compute online if needed.

    For queries that involve specialized knowledge (like medical or legal advice, as examples, or other specialty areas), Apple will offer to send your query privately to OpenAI for an answer, but that requires your permission each time it's needed. OpenAI doesn't get to collect your data or train itself on your queries.

    For more about how this all works, I'd suggest watching the "Apple Intelligence" video on Apple's YouTube channel rather than the entire keynote.

    All that said, this article is about the LLMs and datasets Apple is offering the wider AI community, not the specific models it will use in any of its own products. Apple has figured out how to create models that are modest in size AND use less computing power than the offerings from other companies, so it is offering those compression techniques and computing algorithms to the wider community.
    That's a fair stab at an answer, but claims in the article like this one are off the mark:

    "Apple's ability to create incredibly compact yet powerful AI models is unequaled in the industry."

    There is a huge amount of research going into tiny LLMs, open source or not, and it seems new advances come out every week. Many are tailored to specific domains or languages. Apple is unlikely to challenge native Chinese models for the Chinese language. And what about Arabic? 

    When I say 'huge' I mean it's very difficult to know exactly what might appear and keep track of it all. 

    Earbuds can make great use of NLP and everything related to voice biometrics, bone conduction, audio processing, etc. Little more is needed there. 

    Then this:

    "In debuting both Apple Intelligence and Private Cloud Compute at its WWDC conference in June, the company silenced critics who had claimed that Apple was behind the industry on artificial intelligence applications in its devices"

    The critics were simply pointing out reality. No equivalent shipping product from Apple was available. That remains the case today and will remain the case until something actually ships: sometime late this year on a small range of models, and well into next year for the rest of what was announced.

    To all intents and purposes 'Apple Intelligence' was more akin to a placeholder at WWDC to generate buzz and let people know what's coming.

    That's fine but we still have to wait to see what eventually comes out of the pipe and how it performs. 

    The more the better IMO but the 'industry' isn't slacking and is actually shipping. 
    To date, AAPL has been rewarded nicely vs MSFT, so the market looks very favorably on Apple's AI implementation plans.

    Perhaps those "critics" were entirely wrong?

    MS CoPilot, as an example, has subsequently received quite a bit of "well earned" bad press by early users.

    Perhaps the rush to deliver has consequences?

    More to the point, critics totally ignored the fact that Apple has something on the order of 1.5 billion iPhone users, most of whom will upgrade in the future to more powerful AI hardware that will, in essence, allow increasingly larger models.

    Calling this AI "race" at the starting gate, as you are wont to do, isn't a determining factor in Apple's ultimate AI success. 
    I could be wrong, but the only bad press MS had was with Windows 11 Recall, not with Copilot.  On the contrary, I have seen Copilot working, and while it's not perfect, it's very good, especially with the MS Office integration. 
  • Apple may want to monetize advanced Apple Intelligence features in the future

    blastdoor said:
    danvm said:
    blastdoor said:
    I think it is highly likely that Apple will charge for premium apple intelligence services. If nothing else, they could charge developers or other pro users for training models on ACDC. 

    Many people think Apple can’t possibly compete with Nvidia but it’s actually Nvidia that can’t possibly compete with Apple. Nvidia just looks impressive because their main competitors have been AMD and Intel. Apple has the ability to build and optimize a full stack solution like nobody else. When Apple deploys those capabilities in the model training space they will blow away the competition. People will be delighted to pay Apple to train models on ACDC. 

    Another reason I think Apple will charge for AI services is that it’s part of the strategy to become less dependent on China. The more of Apple’s revenue that comes from services, the smaller the disruption if Xi does something stupid. Making their products last as long as possible also helps with that strategy. 
    After reading the article, I believe Apple will be competing with Microsoft Copilot and Google Gemini rather than with Nvidia. Nvidia's AI business is centered on data centers rather than end users. I assume Apple will concentrate on their customer base, primarily iOS/iPadOS users, while Microsoft will be the first option in the business and enterprise sector, and Google will target Android users.
    The competition I’m referring to is Apple silicon + software vs Nvidia silicon plus CUDA. 

    If Apple relies on their own hardware and software rather than Nvidia’s, that’s competition. If Apple takes business away from Nvidia’s customers, that’s also competition. 

    It’s similar to how the M series of chips competes with Intel. The competition isn’t direct, but it’s there and it’s significant. 
    I believe that Nvidia's competition will come from its current major customers: Microsoft, Amazon, and Google. All three are developing their own AI chips (Maia/Cobalt, Graviton, and Axion), which could replace Nvidia in these companies' data centers.

    Maybe Apple could compete indirectly with Nvidia, but I don't think it would have the same impact as the other companies.
  • Craig Federighi ignited Apple's AI efforts after using Microsoft's Copilot

    danox said:
    danvm said:
    tmay said:
    danvm said:
    tmay said:
    nubus said:
    Competent management would have also invested whatever it takes to have their own in-house LLM to power Siri and have their own AI datacenter infrastructure ready to go by now. 
    And this “AI data center infrastructure” you speak of — what would this power, and what income would this generate? 
    With Apple paying OpenAI - 49% owned by MS and hosted on Azure - we see Apple making Microsoft stronger. Actively funding your competitors, having no control over core tech, and fumbling around copying products from Zuckerberg and Musk - that is Team Cook.
    You mean the MS that has no mobile presence vs Apple with 1.46 B iPhone users?

    Who do you think gains the most financially from OpenAI on iOS: Apple or MS?
    Yes, the Apple with no generative AI / LLM infrastructure vs MS with one of the largest infrastructures in the world.
    Right now, no one knows who will gain more, since there are no details of Apple's agreement with OpenAI. I suppose Apple will be paying far more than the $30M per month it pays Amazon and the $300M per year it pays Google for GCP cloud services. 
    https://www.windowscentral.com/microsoft/the-enormity-of-microsofts-windows-phone-shut-down-mistake-is-becoming-increasingly-clear-in-the-ai-era
    It’s evident that not having a smartphone impacts MS negatively, but the same can be said of Apple for not having their own generative AI / LLM infrastructure.

    Microsoft not having their own quality long-term mobile hardware is far more important than a dubious AI model; however, the Google Tensor being five years behind the leader (Apple) may mitigate that problem. 
    Do you mean the "dubious AI model" that Apple wants to use for their AI services? And I'm aware that MS has been negatively impacted by not having a smartphone.  Still, they managed to work from their strong points, had the vision to see where the market was moving, and now they have one of the largest AI infrastructures in the world. And it looks like Apple will use that infrastructure for their AI services. I suppose MS is doing something right, considering that Copilot was the inspiration for Apple's AI effort, don't you think?
  • Craig Federighi ignited Apple's AI efforts after using Microsoft's Copilot

    nubus said:
    Competent management would have also invested whatever it takes to have their own in-house LLM to power Siri and have their own AI datacenter infrastructure ready to go by now. 
    And this “AI data center infrastructure” you speak of — what would this power, and what income would this generate? 
    With Apple paying OpenAI - 49% owned by MS and hosted on Azure - we see Apple making Microsoft stronger. Actively funding your competitors, having no control over core tech, and fumbling around copying products from Zuckerberg and Musk - that is Team Cook.
    Why do you say that Apple is paying OpenAI anything?  Are you privy to executive level negotiations at Apple?  I don’t believe Apple will be paying them or anyone anything.  The partnership is a rumor, as nothing has been announced, but the ability to be integrated onto 2B+ devices would be attractive to any AI company.

    And what products do you think Apple copied from, of all people, Zuckerberg and Musk?  Apple was supposedly working on an EV technology that no other company (including Tesla) has released yet, and Zuckerberg doesn’t produce anything that resembles Vision Pro in form or function.  Vision Pro does seem to align with Varjo (and maybe Magic Leap/HoloLens) in terms of function and their particular market, but Apple obviously sees the opportunity to expand beyond enterprise.  They made a key AR acquisition in 2017 that led them to develop the technologies that are now in Vision Pro.
    A few years back, Apple was paying Amazon $30M per month and Google $300M per year for cloud services. I assume the cost of the OpenAI agreement will be higher, considering the cost of AI infrastructure. Why do you think Apple will not be paying? 
  • Apple will allow users to opt in to ChatGPT services in iOS 18 after deal with OpenAI

    danox said:
    twolf2919 said:
    Opt-in would make sense since agreeing to it will likely mean giving up some privacy to ChatGPT servers in the cloud.

    If I had the guts, I'd short AAPL stock - it's been rising for the past few months simply on analysts' expectation that Apple will make big announcements regarding AI, i.e., letting the world know that it isn't behind in AI after all.  But if, instead, we find out that they are indeed hopelessly behind - by essentially offering us another company's AI on their devices - the stock will go down, for sure.

    And this sounds about right - Apple hasn't managed to improve Siri in 10+ years.  There's no reason to believe Apple could inject a massive improvement into it in the 1-2 years since AI became a hot topic.
    The only thing that counts is useful AI solutions that the public can actually use; AI hype and parlor tricks won't last. Go ahead and short Apple. Apple is the only tech company with the total package: desktop and mobile OS with best-in-class edge hardware that actually runs on the edge now. How long do you think the public will tolerate phoning home?
    Apple's package is missing the infrastructure and data centers that companies like Microsoft, Google, and Amazon have. That's the reason they will depend on OpenAI's LLM. Also, people have been "phoning home" for years. For example, ChatGPT is still very popular, and I don't see people having issues "phoning" OpenAI. Why do you think this will change?