kju3

About

Username
kju3
Joined
Visits
0
Last Active
Roles
member
Points
7
Badges
0
Posts
1
  • Apple is reportedly investing heavily into Nvidia servers for AI development

    blastdoor said:
    avon b7 said:
    No doubt CUDA is vital here as I haven't heard anything about a complete Apple AI training stack for use with the heavy lifting.

    Nvidia has CUDA. Huawei has CANN. 

    Has Apple released an equivalent solution? 

    Apple has Metal Performance Shaders and MLX. 

    I'm not qualified to say whether they are 'equivalent' to CUDA. But I believe they are focused on doing the same general job. 

    Metal Performance Shaders? No. MLX? Yes. The issue with CUDA is that it has been around since 2007, and for most of that time no one else had an incentive to put out a competitor, so the best software toolkits have been built on top of what was the only game in town, and it is what the most experienced programmers and engineers use. So the tooling Apple needs to create competitive AI applications doesn't work with MLX, and not enough people know the stuff that does.

    If this were three years ago, before Microsoft triggered the genAI boom, and Apple had made it a huge priority, that wouldn't have been a problem: Apple could have created whatever it needed for MLX/Apple Silicon and trained enough developers to do the work. Since neither was the case, and Apple now needs to make as much progress as quickly as possible, it has to go with a solution it can just plug in: hire proven programmers and engineers who have done this job before and get going.
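
    For anyone wondering what MLX actually looks like: it is much closer in spirit to NumPy and JAX than to hand-written CUDA kernels. Here is a minimal sketch, assuming MLX is installed via pip install mlx on an Apple Silicon Mac (check the MLX docs, as the API may have shifted since I wrote this):

        # Minimal MLX sketch: NumPy-style arrays with lazy evaluation on the GPU via Metal.
        import mlx.core as mx

        # Build two random matrices; nothing is computed yet (MLX builds a lazy graph).
        a = mx.random.normal(shape=(1024, 1024))
        b = mx.random.normal(shape=(1024, 1024))

        # Compose operations; still lazy.
        loss = mx.mean(mx.matmul(a, b) ** 2)

        # Force evaluation on the default device (the GPU on Apple Silicon).
        mx.eval(loss)
        print(loss.item())

        # Gradients come from function transformations, JAX-style,
        # rather than from writing kernels by hand as in CUDA.
        def f(x):
            return mx.sum(mx.sin(x) ** 2)

        grad_f = mx.grad(f)
        print(grad_f(mx.arange(8, dtype=mx.float32)))

    The point being: the API itself is perfectly usable. What's missing is the fifteen-plus years of toolkits, tuned kernels and trained engineers that have accumulated on top of CUDA.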

    Everyone is just going nuts over this because it is Nvidia, which is on the rather long list of companies that Apple fans are supposed to despise (along with Microsoft, Google, Intel, Samsung, Qualcomm, Masimo, Amazon, and I am certain I am leaving out more), despite Apple's own, er, history of doing stuff, such as Steve Jobs accusing Nvidia of IP theft and Apple getting upset at Nvidia's refusal to make custom MacBook GPUs under terms that likely would have bankrupted Nvidia. But honestly, it is only 250 servers for less than $1 billion. Lots of companies far smaller than Apple are paying far more to buy way more.

    They are just going to be used to cover gaps that Apple can't immediately fill with its own tech: short-term stuff. Other companies have already spent far more time and money being among the first to do what Apple needs to get done now, and Apple will be able to retrace their steps at a fraction of the time and cost while avoiding their mistakes. Once Apple is finished using the servers and CUDA to play catch-up, it will be done with them and will probably donate them to some university or nonprofit for a tax write-off. The engineers hired to work on this will make top dollar for a relatively brief gig and will leave with Apple experience on their resumes that will let them work wherever Apple's noncompete clause allows.

    And yes, this means that next time Apple will go with Nvidia when it wants to instead of when it has to, which is the way it should be anyway. Since Apple already works with companies it has literally sued (or threatened to), like Microsoft, Samsung, Google and Amazon, there was never any reason to try to freeze Nvidia out in the first place. That MacBook GPU thing? Apple wound up using AMD GPUs that weren't nearly as good, which forced a ton of people who needed the best graphics to buy Windows machines with Nvidia cards instead. So Apple really showed them, didn't they?