thadec

About

Username
thadec
Joined
Visits
18
Last Active
Roles
member
Points
469
Badges
0
Posts
97
  • New Apple Silicon has arrived with M3, M3 Pro, and M3 Max chips

    I have seen in forums that the M3 still only supports two displays, meaning that people are going to pay $1,600 for an M3 MacBook Pro that can only drive a single external monitor. Granted, the Lenovo IdeaPad Duet 3 Chromebook can also only support a single external monitor, but that is because it is a $280 device running a 32-bit SoC designed in 2017. (Moreover, it is technically a tablet, not a laptop.) So can someone please explain this limitation in Apple Silicon’s base chips? Whatever it is, you can bet that the Qualcomm chips in 2024, as well as the Nvidia and AMD ARM chips in 2025, won’t have it.
  • Apple could spend $5B on servers to catch up in AI race

    danox said:
    Yep, Apple’s gonna buy Nvidia AI servers, when they have everything in house to build their own and would learn more from the experience. Nvidia, Intel, and AMD servers are barn burners. I don’t think Apple will go down that path, not at that wattage, when they can build their own without the need for additional coolant fluid.

    https://www.youtube.com/watch?v=5dhuxRF2c_w&pp=ygUMb3B0aW11bSB0ZWNo (13:12 mark): high wattage and MHz at all costs; Apple isn’t on that path. Long term, Apple appears to be building up from ground level with a different solution.

    https://www.notebookcheck.net/Disappointing-Core-i9-14900K-performance-watt-vs-Ryzen-7-7800X3D-showcased-as-Intel-CPU-appears-to-guzzle-energy.761067.0.html

    https://www.reddit.com/r/hardware/comments/17esez1/exclusive_nvidia_to_make_armbased_pc_chips_in/ Oops, Nvidia wants to make Arm chips; I think that cancels them out.
    Good grief. Let me tell you this: Apple is a consumer products company. Apple doesn’t even have a workstation chip good enough to compete with the latest Intel Xeon W, which itself (by Intel’s own admission) can’t compete with the latest 96-core, 192-thread AMD Threadripper workstation chip. So where on earth are they going to get a server-class chip design? Before you answer: Apple Silicon doesn’t come anywhere close to the single-core performance of an x86 server or even a generic ARM Neoverse V2 or V1 server (maybe a Neoverse N1/N2). As for multicore, Nvidia Grace chips have 144 cores, and in 1Q2024 Intel Sierra Forest will have 196 cores. And that is just the CPU.

    Where on earth is Apple going to get the parallel processing power from? (AI/ML uses GPUs as parallel processors.) The M2 Ultra only provides raw performance - and no, we aren’t talking about the hardware accelerators that are only good for video, audio and photo processing - equivalent to a midrange Nvidia card that you can pick up off the shelf at a retail store. Comparing it to an Nvidia Hopper data center GPU, or to the competing products that AMD has and Intel is trying to make, is hilarious. And that is just the hardware. Where are the software tools going to come from? You know, like Nvidia CUDA, AMD ROCm or even Intel’s DPC++?
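
    To make the "software tools" point concrete, here is a minimal sketch of what that ecosystem gap looks like in practice, assuming a PyTorch build with both the CUDA and MPS backends available (the pick_device helper is purely illustrative, not anything from Apple or Nvidia): frameworks are written against Nvidia’s CUDA first, and Apple’s Metal Performance Shaders (MPS) backend is treated as the fallback.

        # Hypothetical illustration: prefer CUDA (Nvidia), fall back to Apple's MPS, then CPU.
        import torch

        def pick_device() -> torch.device:
            """Return the best available compute backend."""
            if torch.cuda.is_available():           # Nvidia GPU with the CUDA stack
                return torch.device("cuda")
            if torch.backends.mps.is_available():   # Apple Silicon via Metal Performance Shaders
                return torch.device("mps")
            return torch.device("cpu")               # plain CPU as the last resort

        device = pick_device()
        x = torch.randn(4096, 4096, device=device)
        y = x @ x  # the same matrix multiply runs on whichever backend was found
        print(f"ran on {device}")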

    You are aware that Apple doesn’t use their own chips for their data centers, right? They use Intel Xeons - with some AMD Epyc - running Linux, not macOS or some home-grown Apple server OS, just like everybody else. Apple loves to talk about how their data centers run on green energy, but the tech that they use in them? Not so much. Nor do they like to talk about how much they rely on Amazon AWS (plus some Google Cloud Platform for redundancy and data analytics) to deliver iCloud and their other cloud products. It would demystify everything, I guess.

    So just like Apple has been unable to design a 5G modem as good as Qualcomm’s, they won’t be able to snap their fingers and create their own genAI server stack at the drop of a hat either. And with the modem they weren’t even starting from scratch, because they bought Intel’s 5G division way back in 2019. They just re-upped with Qualcomm through 2026, so we are talking about at least 7 years of work, meaning they will finally have 5G done for 2027 - right about the time the industry starts shifting to 6G (testing begins in 2028, commercial release in 2030).
  • Goldman Sachs regrets Apple Card, and is trying to escape the deal

    @hmlongco:

    More like "they didn't have to do business with Apple unless they knew beforehand what terms were favorable to them - not to Apple - and had the clout to insist on those terms and lawyers good enough to hold Apple accountable should Apple try to change the deal later."

    In other words, be Samsung, who refuses to sign contracts with Apple unless they guarantee a large profit for Samsung.
    Be Qualcomm, who had the legal resources to take on Apple and win when Apple tried to force Qualcomm to accept significantly less per modem than both had mutually agreed to AND less than Qualcomm charges every other customer, using some "you'll still be getting billions from us anyway so quit whining" nonsense logic.
    At the very least, insist on a deal that gives you the same ability Apple has to walk away if it is not working out for you.

    Do not be like the tons of companies who signed deals with Apple only to have Apple walk away with all the cash while their "partners" were left holding the bag. No other bank, credit card company or lender in the world would have agreed to Apple's terms. Goldman Sachs did because they were inexperienced in direct-to-consumer banking, wanted to get into it, and figured that hopping onto the Apple infrastructure and brand name would lower their startup costs and risks. Goldman Sachs tried to use Apple to cut corners and got burned, so they don't deserve a lot of sympathy, if any.

    They should have hired Bain to review their deal with Apple, and if Apple didn't allow a reputable third party to review the deal, they should have seen that as a red flag and walked away. Goldman Sachs should have put in the work to build up their own consumer division from scratch, or they should have just bought a smaller bank or lending company and built on top of it (similar to how Apple Music was built on top of Beats Radio). Now, thanks to all the money that they are going to lose on this deal, they can't do either.

    But yes, stuff like this is exactly why there is no "Apple Car." Every single manufacturer they talked to knew that they would be locked into an extremely long deal they couldn't get out of, one that would have them investing tens of billions of dollars (or more) while Apple walked away with the profits AND the IP. The partnering car company wouldn't even be able to say "buy a Nissan because we help Apple make the Apple Car!" because ... that isn't a very effective advertising campaign.
  • LabView design & test app abandons the Mac after four decades

    mknelson said:
    I can understand Valve abandoning Counter-Strike on the Mac. I don't believe any of them were ported to 64-bit, so they were abandonware at best. None would run on a current Mac/macOS.

    LabVIEW 2021 was 64-bit.
    Lots of the work that was being done on Macs 25 years ago is being done on Linux now. I guess you could say that Windows lost client market share to Linux (since ChromeOS runs on top of Gentoo Linux) and macOS lost workstation market share to Linux.
  • Valve kills CS:GO on macOS, won't launch Mac Counter-Strike 2 either

    Honkers said:
    Macs have proven that a 100% native version of a game outperforms a better-specced PC version, and Metal 3 makes that even more the case.
    Really?  Where have they proven that?
    Back in the day when Blizzard made OpenGL versions of their games like World of Warcraft etc., they did performance tests and proved that the Mac played consistently better than the PC with DirectX. Those tests were done by Blizzard themselves and independently verified.

    More recently they did tests of Metal vs DirectX and found that Metal could outperform DX in many tasks. Not all, but a hell of a lot.

    A good proof that it comes down to the developers is Eve Online. Eve has always sucked on the Mac because it was always a Cider-wrapped DirectX game (think CrossOver but more dedicated to one game). There was always a massive performance hit and the game performed slower than a snail in molasses. Now, however, it is a native Metal app, largely down to Apple Silicon, and it performs like a hot knife through butter even on a 2014 MacBook Air.

    The lazy Cider approach means they can port to other platforms quicker but then those platforms get massive performance hits and people claim the Mac is crap for games. It’s NOT the platform that is crap, it’s lazy developers not making native versions of games.
    Yeah ... back in the day doesn’t mean Apple Silicon, does it? Fascinating that you would leave that out. Back then we weren’t talking about ARM machines limited to integrated graphics. This very website has acknowledged multiple times that - again, M2 Ultra aside - the integrated graphics only perform at about the level of a midrange discrete GPU. Apple Silicon’s graphics are world class among integrated graphics, and they are even better for photo and video editing because of the built-in codecs. But they can’t compete with dedicated graphics cards. That is why I made the comparisons that I did.

    The 3-year-old MacBook Air costs $1,000. That will buy you a gaming laptop with a lower-midrange graphics card like the AMD Radeon RX 7600S that will crush the Air’s 7-core GPU. The 16" MacBook Pro costs $3,300; that will get you a gaming laptop with an RTX 4080 whose graphics outperform the 76-core M2 Ultra. And the cheapest M2 Ultra device costs $5,000 and isn’t even a laptop. We aren’t even going to talk about comparing the M2 Ultra to a $5,000 gaming rig with the desktop version of the AMD Ryzen 9 (16 cores and 32 threads) and the desktop NVIDIA GeForce RTX 4090.

    Video game studios aren’t lazy. They aren’t going to build games for a platform that gets 7% market share in a good year, with the vast majority of that 7% being multimedia pros and blue bubble life poseurs. And bottom line, you flat out weren’t telling the truth when you claimed that modern Macs beat modern PCs with discrete GPUs in gaming performance. They don’t, and they never will. At least with an Intel-based Mac you could get an eGPU. That setup would cost twice as much as a PC with the same GPU included, but at least you would get the same performance. Now? No dice, even if the game runs natively.
