Apple Intelligence & Private Cloud Compute are Apple's answer to generative AI

Comments

  • Reply 21 of 65
    9secondkox2 Posts: 2,893 member
    ralphabet said:
    So where is the source of its AI images? It has to glean images from somewhere to train its AI tool. Does anyone know?
    There were rumors a while ago that Apple had entered into agreements with a few companies to train its AI capabilities.

    I don’t remember the list right off the bat, but there are some popular image repositories as well as some politically liberal data and news outlets. 
  • Reply 22 of 65
    mpantone Posts: 2,101 member
    scapal said:
    No mention of the HomePods lineup at all.
    That’s the first place where a Siri update is needed.

    I guess it reads as: too embarrassed to even acknowledge their existence in this context; be ready to dump them all and buy a next-gen model next year.
    Apple's WWDC general keynotes are only about 90 minutes long. They don't have time to cover every single feature for every single product in a high-level marketing presentation. You'll have to refer to the Platforms State of the Union or the various HomePod technical sessions throughout this week's WWDC.

    That said, the general keynote indicated that Siri's new AI capabilities require an A17 Pro SoC (or later) or any of the M-series SoCs. Devices with less capable SoCs (Apple TV, Apple Watch, HomePod) don't appear to be candidates for Siri's new functionality. Those devices will continue to work as they have.

    Remember that this excludes every iPhone before the iPhone 15 Pro, including the standard iPhone 15 and 15 Plus with their A16 SoC.

    The Siri & Apple Intelligence engineers probably have a good idea of what sort of processing power is required for a satisfactory user experience. It's not like they don't have access to these devices for extensive testing in their labs.

    Apple does have a long-standing practice of limiting functionality to newer hardware. But for sure, the next-generation HomePod isn't going to have an M4 Pro inside. If Apple wants to bring improved Siri functionality to these lower-specced devices, they will need to bring the processing requirements down in future LLMs or find some way to offload processing to a nearby paired Mac or iPhone.

    For sure, HomePod is far down the hierarchy of Apple product families. They will focus on getting new Siri functionality onto the Apple Watch before they look at Apple TV set-top boxes or the HomePod.
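    For illustration, here's a minimal Swift sketch of that kind of hardware gate. The model-identifier mapping is my assumption for illustration only; the real gating happens inside the OS, not in code like this.

    import Darwin

    // Read the hardware model identifier via sysctl. On iOS this returns
    // something like "iPhone16,1"; on a Mac, "hw.machine" reports the
    // architecture and the model identifier lives under "hw.model" instead.
    func modelIdentifier() -> String {
        var size = 0
        sysctlbyname("hw.machine", nil, &size, nil, 0)
        var buffer = [CChar](repeating: 0, count: size)
        sysctlbyname("hw.machine", &buffer, &size, nil, 0)
        return String(cString: buffer)
    }

    // Hypothetical gate: iPhone16,1 / iPhone16,2 are the A17 Pro
    // iPhone 15 Pro models; a "Mac" prefix is treated as M-series here,
    // a rough assumption that ignores older Intel Macs.
    func meetsAppleIntelligenceBar(_ id: String) -> Bool {
        id == "iPhone16,1" || id == "iPhone16,2" || id.hasPrefix("Mac")
    }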
  • Reply 23 of 65
    scapal Posts: 18 member
    mpantone said:
    […] For sure, HomePod is far down the hierarchy of Apple product families. They will focus on getting new Siri functionality onto the Apple Watch before they look at Apple TV set-top boxes or the HomePod.
    Of course it may also be because there could be an event dedicated to AirPod and HomePod later this year.
  • Reply 24 of 65
    Afarstar Posts: 59 member
    DAalseth said:
    There are a few things that would be useful, an improved Siri for example. But there has never in my recollection been a WWDC keynote where I said “Oh F*** no” quite as many times. Many of their headline abilities I will just want to disable as soon as I can and as completely as I can. I AM an artist. I AM a writer. I have no use for AI generating my images and text. 
    Dinosaur 
  • Reply 25 of 65
    RigiDigi Posts: 14 member
    1) I don't care and only want to turn that AI OFF, and 2) zero updates to iPadOS usability. Pathetic. All that money for an M4 iPad & keyboard, and we get nothing.
  • Reply 26 of 65
    omasou said:
    Will be really interesting to see how these web-based platforms, where you are the product, address questions related to privacy.

    For sure they cannot offer what Apple is doing.

    Also interesting how they played down the Apple data center chips.
    I wonder if this stuff will be running on an M4 Ultra, and if that SoC will include a stronger Neural Engine.

    In the past, I believe all the M-class SoCs of a given generation shared the same 16-core NPU (the Ultra doubling it by fusing two dies), but the new strategy may scale the number of NPU cores up with the higher-end chips.
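    As a rough back-of-the-envelope (using Apple's stated 38 TOPS figure for the M4's 16-core Neural Engine): if an M4 Ultra again fuses two Max dies, NPU throughput would land around 2 × 38 ≈ 76 TOPS, before any architectural changes.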
  • Reply 27 of 65
    welshdog Posts: 1,903 member
    For me, today's event was a huge information and feature overload. I am not a developer, and a pretty low-level user of OS features as it is, so all this new stuff was exciting but also a bit off-putting from an "it's all too much" standpoint. However, it really is very impressive how much effort Apple puts into privacy while expanding features. As someone said above, this puts Apple in a unique position in the AI space compared to literally everyone else. Pretty sure the financial analysts won't understand the implications of today's event and the stock price will drop initially.
  • Reply 28 of 65

    DAalseth said:
    […] I AM an artist. I AM a writer. I have no use for AI generating my images and text.
    AI isn't just for generating images or text. It's also used to better understand commands and requests, and it's probably behind things like categorizing your emails, searching, and identifying objects in your photo library - and interpreting what's happening, and with whom, in your videos.

    There's a lot that can be done with AI, including summarizing data.

    You don't have to use the rewrite feature - though I suspect you will - and you'll probably use the image manipulation features as well.

    Not everything done by generative models has to do with stealing content from the internet.
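    As a concrete example of that non-generative side, a few lines of Swift against Apple's Vision framework will classify what's in a photo entirely on-device - the same kind of model work that already powers Photos search. The confidence cutoff here is an arbitrary choice for the sketch.

    import Foundation
    import Vision

    // On-device image classification - no cloud round trip, nothing generated.
    // VNClassifyImageRequest runs Vision's built-in classifier over an image
    // and returns labeled observations with confidence scores.
    func labels(forImageAt url: URL) throws -> [(String, Float)] {
        let handler = VNImageRequestHandler(url: url, options: [:])
        let request = VNClassifyImageRequest()
        try handler.perform([request])
        guard let results = request.results else { return [] }
        return results
            .filter { $0.confidence > 0.3 }   // arbitrary cutoff for this sketch
            .map { ($0.identifier, $0.confidence) }
    }

    // Usage: print what Vision thinks is in the picture.
    // for (label, confidence) in try labels(forImageAt: photoURL) {
    //     print(label, confidence)
    // }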
  • Reply 29 of 65

    ralphabet said:
    So where is the source of its AI images? It has to glean images from somewhere to train its AI tool. Does anyone know?
    Betcha they paid publishers, image repositories, and maybe some social media platforms with compliant terms of service to train their models.

    Last thing a company like Apple wants is someone claiming their models were trained on stolen IP.
  • Reply 30 of 65
    A rather lame, half-baked, underwhelming attempt at AI. 

    A $3 trillion company with thousands and thousands of developers and a history of spectacular innovations can't come up with its own AI?
  • Reply 31 of 65
    mpantone Posts: 2,101 member
    scapal said:
    […] Of course it may also be because there could be an event dedicated to AirPod and HomePod later this year.
    Sure, that's possible, although a dedicated event for AirPod and HomePod seems highly unlikely. AirPod announcements are typically coupled with iPhone announcements because that's how they are most often used.

    I'm not sure HomePod has ever been center stage at a launch presentation. Compared to other Apple product families, HomePod is very small peanuts; it usually rides on another major product launch's coattails. Hell, Apple TV (the set-top box) never gets top billing either.

    But feel free to dream.
  • Reply 32 of 65
    tht Posts: 5,536 member
    Huh, Apple Private Cloud Compute is a stripped-down, modified iOS running on Apple Silicon clusters.

    No rumors on the cluster setup. Hopefully they have a custom PCIe-based card that slots into a Mac Pro server box: M2 Ultra, M3 Max, or M4-based systems on a PCIe card, 4 to 6 of them in a Mac Pro box.

    If it is stacks of Mac minis or Mac Studios, that seems quite hacky.
  • Reply 33 of 65
    mpantone Posts: 2,101 member
    tht said:
    […] If it is stacks of Mac minis or Mac Studios, that seems quite hacky.
    Apple will likely never divulge any substantial details of their Private Cloud Compute systems. For sure, they are intimately aware of the capabilities of third-party solutions (NVIDIA) and of how Apple's custom silicon stacks up against them in terms of raw performance, performance per watt, performance per dollar, TCO, and probably other metrics.

    Knowing Apple, they are probably using some sort of proprietary interface completely unusable by anyone else. It's not PCIe, nor are they stacks of Mac minis or Mac Studios. There's too much general-purpose computing hardware inside those products that is completely irrelevant to AI cloud workloads.

    During the Platforms State of the Union presentation, they reiterated that there is no persistent storage. That likely means there are no SSDs in standard interfaces (M.2, etc.). They also mentioned that each Private Cloud Compute session is confined to one cluster. This matters because one big issue with more conventional AI computing is that AI accelerators spend something like 30% of their energy communicating with other accelerators.

    Without knowing anything about it, my hunch is they are running some sort of blade configuration, with each blade being a discrete cluster. The actual SoCs are probably heavily modified M-series chips that have all the irrelevant parts yanked out: the display pipeline, the media engines, etc. If the AI workload doesn't require much in the way of standard CPU cores, they will likely jettison most of those too. Maybe they are using a variant of the UltraFusion interconnect; it certainly doesn't need to be an exact copy, and they can yank out unnecessary connections.
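    One detail Apple did describe publicly is the trust model: a device will only hand a request to a node whose signed software measurement appears in a public transparency log. Here's a conceptual Swift sketch of that check; the types and names are made up for illustration, not Apple's actual API.

    import Foundation

    // Hypothetical types - Apple's real attestation format is not public.
    struct NodeAttestation {
        let softwareMeasurement: Data   // hash of the OS image the node booted
        let signature: Data             // produced by the node's hardware key
    }

    // The device refuses to send a request unless the node's measurement is
    // one that has been published for inspection. A real check would also
    // verify the signature chain; that part is omitted in this sketch.
    func shouldSendRequest(to node: NodeAttestation,
                           publishedMeasurements: Set<Data>) -> Bool {
        publishedMeasurements.contains(node.softwareMeasurement)
    }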


  • Reply 34 of 65
    chasm Posts: 3,404 member
    MantlesD said:
    US English only. It will probably be six years until available in my native language. Still waiting for numerous other Apple features that are only available for major languages. Bummer.
    Unless you speak a quite obscure language as your native tongue, I'm very, very doubtful it will be six years, but of course you would know better than I.

    All the major and most of the sizable languages are usually supported by the official release, which in this case should come sometime before the end of the year.
  • Reply 35 of 65
    They sure spent a lot of time talking about generative emojis and stickers. The future is now!
  • Reply 36 of 65
    iOS_Guy80 Posts: 850 member
    nubus said:
    Impressive level of integration making the Apple ecosystem even more sticky. There is a before and after today. A lot of hardware currently and recently sold by Apple is now in the "pre-AI" dumb class. Amazing upgrades to hardware and software ahead.
     Can’t wait to see what third-party app developers come up with to take advantage of Apple intelligence.
  • Reply 37 of 65
    omasou Posts: 606 member
    mpantone said:
    omasou said:
    Will be really interesting to see how these web-based platforms, where you are the product, address questions about privacy.

    For sure they cannot offer what Apple is doing.

    Also interesting how they played down the Apple data center chips.
    These general WWDC keynotes are about 90 minutes long; they simply don't have the time to delve deeply into any one topic. There's a platform keynote that offers more meat for developers, plus the individual WWDC sessions and documentation.

    I'm not really sure what you would expect Apple to say about their datacenter silicon. They aren't the kind of company that will list specs on their server configurations. And they won't be selling these chips; they are for Apple's own exclusive use. For many years I have repeatedly speculated that Apple was testing Apple Silicon in their own data centers.

    In the same way, they don't describe which espresso machines are used in the cafes at the Infinite Loop campus, or what sort of HVAC units or brand of networking cable is in the spaceship Apple Campus. When you go to a pizzeria, you don't know the brand and model of the industrial-grade mixer used to make the pizza dough, or of the sewing machine used to make your jeans.

    The most important thing is that the data is private and the system architecture can be inspected by third-party observers. The model number on the chip isn't really relevant to anyone but a fraction of Apple staff members.
    You completely misunderstood my comment.

    AWS, GCP, and MS Azure have announced and are working on their own proprietary data center chips aligned to their specific needs (similar to why Apple dropped Intel) and to reduce their reliance on Nvidia.

    MS has announced ARM-based Copilot PCs.

    And all the news and financial analysts have been saying Apple is behind and needs to announce how it is not.

    Whereas Apple has been developing ARM-based A- and M-series chips for years and has years of real-world experience.

    Apple isn't behind; they are miles ahead, and they should be bragging about it for those who can't see it.

    Just like Tim has had to start using the word AI in earnings calls because people are too stupid to know that AI is more than chat, LLMs, and GenAI.
  • Reply 38 of 65
    StrangeDays Posts: 12,945 member
    These images are created in three styles: Sketch, Animation, and Realism.
    This is incorrect. The three styles are Sketch, Illustration, and Animation.
  • Reply 39 of 65
    tht Posts: 5,536 member
    mpantone said:
    […] Without knowing anything about it, my hunch is they are running some sort of blade configuration, with each blade being a discrete cluster. The actual SoCs are probably heavily modified M-series chips that have all the irrelevant parts yanked out: the display pipeline, the media engines, etc.
    Inference processing can be done on the CPU, GPU, or NPU. I don't think they are ripping anything out of the SoC, as they will need as much inference compute as they can get. They could use all of the failed SoCs too, the ones with nonfunctioning cores in their CPU, GPU, or NPU clusters. They would want the cheapest possible hardware and the cheapest way to run it. So Apple Silicon on a PCIe card slotted into Mac Pros is my initial thought, as that's already rack-mountable. Which Apple Silicon? Whatever they have now. A bunch of A16s? A17 Pros? M-series silicon? Would be interesting to find out.

    Just an interesting thought on getting as much inference compute into as small a volume as possible. Stacks of Mac mini/Studio devices seem space-inefficient. Hmm, they could put something like 5 to 10 A17 Pros on a PCIe board if they wanted to.

    Apple doesn't use slotted NVMe controllers. The storage controller is in the SoC on the switched fabric, and they solder dumb NAND off-package. So there wouldn't even be NAND pads or NAND daughterboard slots, let alone NAND.

    The no-persistent-storage thing is interesting: how would they update the OS on the thing? They could just have a boot ROM that downloads the remainder of the OS across the network every time the cluster is booted. Also, the page file size is set to zero. What about power redundancy? What OS is controlling everything?
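    That boot-ROM idea is roughly how diskless netboot systems already work: a tiny immutable ROM holds only the expected digest of the OS image, the image arrives over the network, and it is verified before anything runs. A conceptual Swift sketch of the verify step; the pinned digest is a placeholder, not anything Apple has published.

    import CryptoKit
    import Foundation

    // Conceptual diskless-boot check. The ROM stores just a digest; the OS
    // image is fetched over the network and refused if it doesn't match.
    // Real systems verify a signature chain rather than one bare hash.
    let pinnedDigestHex = "<burned into ROM at manufacture>" // placeholder

    func imageIsTrusted(_ image: Data) -> Bool {
        let digest = SHA256.hash(data: image)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return hex == pinnedDigestHex
    }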
  • Reply 40 of 65
    entropys Posts: 4,217 member
    MantlesD said:
    US English only. It will probably be six years until available in my native language. Still waiting for numerous other Apple features that are only available for major languages. Bummer.
    No doubt even longer for Australian.