Apple's bad blood with Nvidia continues, after decades of fighting

Posted in Future Apple Hardware

Apple is ramping up research and development on its own AI chip to reduce its reliance on third-party developers, potentially bringing a final end to its decades-long unhappy relationship with Nvidia.

Siri icon in a datacenter



In November 2020, Apple announced the M1 chip, its first foray into in-house designed processors for its Mac lineup. The move effectively severed ties between Apple and Intel, the latter responsible for previous processors in Apple's computers.

Now, it seems like Apple is gearing up to reduce its reliance on another third-party developer -- Nvidia. Currently, Apple works with Nvidia to power many of the features behind Apple Intelligence.

Nvidia currently controls somewhere between 70% and 95% of the market for AI chips, Technology Magazine points out. That dominance rocketed Nvidia to the top of the list of the world's most valuable companies, even briefly eclipsing Apple's top spot, as noted by CNBC.

Interestingly enough, Apple doesn't buy Nvidia chips; rather, it rents access to them from cloud services run by Amazon and Microsoft. But Apple is likely gearing up to sever ties even further by allegedly partnering with Broadcom to design its own AI server chip.

A long, unhappy relationship



Apple's relationship with Nvidia took off in the early 2000s when the company began using Nvidia's chips in its Macs to improve graphics performance. But even then, the relationship between the companies was strained.

Allegedly, during a meeting with a senior Nvidia executive, then-CEO Steve Jobs said that Nvidia products contained technology copied from Pixar, sources told The Information. At the time, Jobs held a controlling stake in the animation studio. The executive pushed back on the claim, but Jobs simply ignored him for the remainder of the meeting.

For its part, Nvidia doesn't seem to enjoy working with Apple, either. It has viewed Apple as overly demanding, especially for a company that consistently fails to rank among Nvidia's top 10 customers.

According to former employees, Apple saw Nvidia as exceptionally difficult to work with. Nvidia's stock chips were far from energy-efficient and produced a lot of heat, both undesirable qualities for laptops. When Apple approached Nvidia about the prospect of designing custom chips for MacBooks, Nvidia balked at the idea.

Tensions ramped up in 2008 when a faulty graphics chip designed by Nvidia made its way into Apple computers, as well as those created by Dell and HP. The event, dubbed "Bumpgate," became a driving force for Apple to switch to AMD, eventually playing a role in Apple developing Apple Silicon.

In the 2010s, Nvidia began to suspect that Apple, Samsung, and Qualcomm were using its patented techniques for rendering graphics on their smartphones. Nvidia would go on to demand licensing fees from the suspected offenders.

In 2019, Apple stopped cooperating with Nvidia on drivers for macOS Mojave. Not only did this cut off most future support, but the lack of driver work also prevented more modern cards from working in Macs with PCI-E slots, or in Macs with eGPUs.

It wasn't as though either company was especially unwilling to work with the other -- at least at the development level. Apple developers told AppleInsider that support for Nvidia's higher-end cards would have been welcome, and even went as far as praising Nvidia's engineers.

Allegedly, the change was because someone higher up in the company did not want Nvidia support. By this point, many acknowledged the bad blood between the companies, but no one was quite sure who was responsible for pulling driver support.

Apple's unwillingness to end the feud



Currently, Nvidia executives maintain that the fight is mostly one-sided. Nvidia told The Information that the company remains open to collaborating with Apple.

As it stands, Apple appears to be aiming to release its own AI processor, codenamed Baltra, in 2026. Baltra is expected to be manufactured by TSMC using its N3P process. The N3P process, announced in April 2024, is expected to debut in the processors for the iPhone 17 Pro.




Comments

  • Reply 1 of 13
    jintech Posts: 1,085 member
    Wasn't there just a report about how Apple and Nvidia were collaborating again?
  • Reply 2 of 13
    JinTech said:
    Wasn't there just a report about how Apple and Nvidia were collaborating again?
    You mean renting?
  • Reply 3 of 13
    As AppleInsider noted, collaboration on the developer side exists, as this recent announcement by NVIDIA states.

    Nvidia said in a blog post:

    “This collaboration between NVIDIA and Apple, has made TensorRT-LLM more powerful and more flexible, enabling the LLM community to innovate more sophisticated models and easily deploy them with TensorRT-LLM to achieve unparalleled performance on NVIDIA GPUs. These new features open exciting possibilities, and we eagerly anticipate the next generation of advanced models from the community that leverage TensorRT-LLM capabilities, driving further improvements in LLM workloads.”


    This collaboration is based on Apple's AI work "Recurrent Drafter (ReDrafter)". I think that both companies are eager to share and grow AI opportunities. Rather, it is on the business side (providing hardware chips, quality control, IP) that the relationship is strained. This is not a surprise.

    Of all of the tech companies, only Tesla provides a truly altruistic and open environment, sharing patents, IP, etc. to further their stated goals: energy efficiency, decarbonization, human settlement on Mars. Apple didn't start off with its "walled garden"; it was developed over decades to maximize profitability.

    Apple would be wise to refocus more on its stated goals by encouraging/highlighting their collaborations in support of those goals.

  • Reply 4 of 13
    So... NVidia is using Apple's work in their products and they're acting like they had a hand in it - calling it "collaboration?" Weird.
  • Reply 5 of 13
    blastdoor Posts: 3,669 member
    I wonder if it’s not really “bad blood” but an inability to find common ground in negotiation. 

    Nvidia wants to charge a premium for their GPUs; Apple doesn't want to pay a premium. Nvidia has other customers willing to pay their price, and Apple finds other suppliers willing to charge Apple's price. 

    In the long run, though, I predict Nvidia gets squeezed. I don't mean that they will go out of business, only that their profit rate and stock price are going to come down a fair bit. 
  • Reply 6 of 13
    jintech Posts: 1,085 member
    ForumPost said:
    JinTech said:
    Wasn't there just a report about how Apple and Nvidia were collaborating again?
    You mean renting?
    Whatever this is:
    https://machinelearning.apple.com/research/redrafter-nvidia-tensorrt-llm

  • Reply 7 of 13
    danox Posts: 3,516 member
    Everything Nvidia does uses enough energy to burn down a barn; nothing they do is energy efficient. How long do you think the era will last where a data center/AI center uses the power of a small city? Not too long…. The politicians/government are slow-witted, but they ain't that slow.

    https://www.digitaltrends.com/computing/why-nvidia-rtx-4090s-are-melting/
  • Reply 8 of 13
    JinTech said:
    Wasn't there just a report about how Apple and Nvidia were collaborating again?
    Yes, but you have to bait engagement, and that works better when you make stuff up.
  • Reply 9 of 13
    It makes me feel that Apple Server might make a resurgence. I don't mean a specific Xserve-type arrangement, but that features in the server OS might come back to macOS. If you could Xgrid a server farm of, say, 100 M4 Ultra Mac Pros or Mac Studios, then you'd have a great start to pumping up AI capabilities.

    But what about a grid array of sufficiently connected Mac Minis or MacBook Airs from the average punter like you and me, all feeding data and processing chunks of data, a kind of Apple AI-run blockchain? A world of connected Macs and iPhones and iPads working together to form the largest AI data centre the planet has ever seen. It would be easy to do with a rework of Xgrid for macOS.

    Apple has most of the tech available, it just needs consumer buy in. I’m sure they could spin it easy enough to make us realise we wanted it anyway.
  • Reply 10 of 13
    …But what about a grid array of sufficiently connected Mac Minis or MacBook Airs from the average punter like you and me, all feeding data and processing chunks of data, a kind of Apple AI run blockchain? A world of connected Macs and iPhones and iPads working together to form that largest AI data centre the planet has ever seen. Be easy to do with a rework of Xgrid for macOS.
    Won’t work for AI. AI workloads are often memory bandwidth constrained. That’s why NVIDIA bought a networking gear company to make proprietary connections between servers in a pod. The upcoming 5090 is still a consumer device and has 1,792 GB/s of memory bandwidth. An M4 in an iPad or base MacBook Pro has 120 GB/s. And much, much worse is slow gigabit Ethernet (1 Gb/s is 0.125 GB/s). Even if you use 10Gb Ethernet, you are still getting about 1.25 GB/s. Thunderbolt 5 networking would be better. So would custom networking, like the 400 Gb/s Ethernet Tesla is using on servers with a stripped-down and incompatible TCP/IP to improve bandwidth and latency. It’s really hard to get the bandwidth you need even inside a server rack with specialized kit. Consumer stuff is just too darn slow.
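    The bandwidth gap described in that reply is easy to sanity-check with a back-of-the-envelope script. The figures below are the ones quoted in the comment; Thunderbolt 5's 80 Gb/s rate is an assumed nominal value, not from the thread.

```python
# Back-of-the-envelope comparison of memory vs. network bandwidth,
# using the figures quoted in the comment above.
# Assumption: Thunderbolt 5 at its nominal 80 Gb/s symmetric rate.

M4_MEMORY_GB_S = 120.0    # M4 unified memory bandwidth (GB/s)
RTX_5090_GB_S = 1792.0    # RTX 5090 memory bandwidth (GB/s)

# Network links in Gb/s; divide by 8 to convert to GB/s (8 bits per byte).
links_gbit_s = {
    "Gigabit Ethernet": 1,
    "10Gb Ethernet": 10,
    "Thunderbolt 5": 80,
    "400Gb Ethernet": 400,
}

for name, gbit in links_gbit_s.items():
    gbyte = gbit / 8
    ratio = M4_MEMORY_GB_S / gbyte
    print(f"{name}: {gbyte:g} GB/s -- {ratio:g}x slower than M4 memory")

print(f"RTX 5090 memory vs M4 memory: {RTX_5090_GB_S / M4_MEMORY_GB_S:.1f}x")
```

    Even the fastest consumer link here moves data an order of magnitude slower than the M4's own memory bus, which is the commenter's point: the bottleneck for a distributed grid isn't compute, it's moving the data.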
  • Reply 11 of 13
    sroussey2 said:
    …But what about a grid array of sufficiently connected Mac Minis or MacBook Airs from the average punter like you and me, all feeding data and processing chunks of data, a kind of Apple AI run blockchain? A world of connected Macs and iPhones and iPads working together to form that largest AI data centre the planet has ever seen. Be easy to do with a rework of Xgrid for macOS.
    Won’t work for AI. AI workloads are often memory bandwidth constrained. That’s why NVIDIA bought a networking gear company to make proprietary connections between servers in a pod. The upcoming 5090 is still a consumer device and has 1,792 GB/s of memory bandwidth. An M4 in an iPad or base MacBook Pro has 120 GB/s. And much, much worse is slow gigabit Ethernet (1 Gb/s is 0.125 GB/s). Even if you use 10Gb Ethernet, you are still getting about 1.25 GB/s. Thunderbolt 5 networking would be better. So would custom networking, like the 400 Gb/s Ethernet Tesla is using on servers with a stripped-down and incompatible TCP/IP to improve bandwidth and latency. It’s really hard to get the bandwidth you need even inside a server rack with specialized kit. Consumer stuff is just too darn slow.
    I don't agree that it won't work. Remember when, a few years ago, the most powerful supercomputer on the planet was a grid array of over 1,000 Mac Minis? Now imagine that spread over the entire planet, on every Mac, iPhone, and iPad that can run Apple Intelligence. It has potential.
  • Reply 12 of 13
    NVIDIA always wanted a platform-brand presence everywhere it went, including Apple. And that was philosophically incompatible with Apple. NVIDIA knew that, and as it was Apple's literal OWN platform, they were the ones souring any deal by refusing to budge on that. 
  • Reply 13 of 13
    danox Posts: 3,516 member
    Apple has everything it needs in-house to make its own server hardware along with the software. The only question over the last five years is what's taking so long. And the same can be said in the router department, where many Apple users ask that question every day when faced with choosing between Google or Amazon for a router (the foxes guarding the hen house).