thadec
About
- Username: thadec
- Joined
- Visits: 18
- Last Active
- Roles: member
- Points: 469
- Badges: 0
- Posts: 97
Reactions
New Apple Silicon has arrived with M3, M3 Pro, and M3 Max chips
I have seen in forums that the M3 still only supports two displays, meaning that people are going to pay $1,600 for an M3 MacBook Pro that can only drive a single external monitor. Granted, the Lenovo IdeaPad Duet 3 Chromebook can also only support a single external monitor, but that is because it is a $280 device running a 32-bit SoC designed in 2017. (Moreover, it is technically a tablet, not a laptop.) So can someone please explain this limitation of Apple Silicon's base chips? Whatever it is, you can bet that the Qualcomm chips in 2024, as well as the Nvidia and AMD ARM chips in 2025, aren't going to have it.
Apple could spend $5B on servers to catch up in AI race
danox said: Yep, Apple's gonna buy Nvidia AI servers, when they have everything in house to build their own and learn more from the experience. Nvidia, Intel, and AMD servers are barn burners. I don't think Apple will go down that path, not at that wattage, when they can build their own without the need for additional coolant fluid.
https://www.youtube.com/watch?v=5dhuxRF2c_w&pp=ygUMb3B0aW11bSB0ZWNo (13:12 mark) shows high wattage and MHz at all costs. Apple isn't on that path. Long term, Apple appears to be building up from ground level with a different solution.
https://www.notebookcheck.net/Disappointing-Core-i9-14900K-performance-watt-vs-Ryzen-7-7800X3D-showcased-as-Intel-CPU-appears-to-guzzle-energy.761067.0.html
https://www.reddit.com/r/hardware/comments/17esez1/exclusive_nvidia_to_make_armbased_pc_chips_in/ Oops, Nvidia wants to make ARM chips. I think that cancels them out.
You are aware that Apple doesn't use its own chips in its data centers, right? They use Intel Xeons, with some AMD Epyc, just like everybody else, and those servers run Linux, not macOS or some Apple home-grown server OS, just like everybody else. Apple loves to talk about how its data centers run on green energy, but the tech inside them? Not so much. Just as they don't like to talk about how much they rely on Amazon AWS (plus some Google Cloud Platform for redundancy and data analytics) to deliver iCloud and their other cloud products. It would demystify everything, I guess.
So just like Apple was unable to design a 5G modem as good as Qualcomm's (and they weren't even starting from scratch, because they bought Intel's 5G division back in 2019), they won't be able to snap their fingers and create their own genAI server stack at the drop of a hat either. They just re-upped with Qualcomm through 2026, so we are talking about at least seven years here, meaning they will finally have 5G done by 2027, right about the time the industry starts shifting to 6G (testing begins in 2028, commercial release in 2030).
Goldman Sachs regrets Apple Card, and is trying to escape the deal
@hmlongco
More like "they didn't have to do business with Apple unless they knew beforehand what terms were favorable for them (and not Apple), had the clout to insist on those terms, and had lawyers good enough to hold Apple accountable should Apple try to change the terms of the deal."
In other words, be Samsung, which refuses to sign contracts with Apple unless they guarantee Samsung a large profit.
Be Qualcomm, which had the legal resources to take on Apple and win when Apple tried to force it to accept significantly less per modem than both had mutually agreed to AND less than Qualcomm charges every other customer, using some "you'll still be getting billions from us anyway, so quit whining" nonsense logic.
At the very least, insist on a deal that gives you the same ability Apple has to walk away if it is not working out for you.
Do not be like the many companies that signed deals with Apple only to have Apple walk away with all the cash while their "partners" were left holding the bag. No other bank, credit card, or lending company in the world would have agreed to Apple's terms. Goldman Sachs did because they were inexperienced in direct-to-consumer banking, wanted to get into it, and figured that hopping onto Apple's infrastructure and brand name would lower their startup costs and risks.

Goldman Sachs tried to use Apple to cut corners and got burned, so they don't deserve much sympathy, if any. They should have hired Bain to review the deal, and if Apple didn't allow a reputable third party to review it, they should have seen that as a red flag and walked away. Goldman Sachs should have put in the work to build up their own consumer division from scratch, or simply bought a smaller bank or lending company and built a consumer division on top of it (similar to how Apple Music was built on top of Beats Radio). Now, thanks to all the money they are going to lose on this deal, they can't do either.
But yes, stuff like this is exactly why there is no "Apple Car." Every manufacturer Apple talked to knew they would be locked into an extremely long deal they couldn't get out of, one that would have them investing tens of billions of dollars (or more) while Apple walked away with the profits AND the IP. The partnering car company wouldn't even be able to say "buy a Nissan because we help Apple make the Apple Car!" because that isn't a very effective advertising campaign.
LabView design & test app abandons the Mac after four decades
mknelson said: I can understand Valve abandoning Counter-Strike on the Mac. I don't believe any of the titles were ported to 64-bit, so they were abandonware at best; none would run on a current Mac/macOS.
LabView 2021 was 64-bit.
Valve kills CS:GO on macOS, won't launch Mac Counter-Strike 2 either
lowededwookie said: Honkers said: Macs have proven that a 100% native version of a game outperforms a better-specced PC version, and Metal 3 makes that even more the case.
More recently they ran tests of Metal vs. DirectX and found that Metal could outperform DX in many tasks. Not all, but a hell of a lot.
A good proof that it comes down to the developers is Eve Online. Eve has always sucked on the Mac because it was always a Cider-wrapped DirectX game (think CrossOver, but dedicated to a single game). There was always a massive performance hit, and the game ran slower than a snail in molasses. Now, however, it is a native Metal app, largely thanks to Apple Silicon, and it runs like a hot knife through butter even on a 2014 MacBook Air.
The lazy Cider approach means they can port to other platforms more quickly, but those platforms take massive performance hits, and then people claim the Mac is crap for games. It's NOT the platform that is crap; it's lazy developers not making native versions of their games.
The three-year-old MacBook Air costs $1,000. That will buy you a gaming laptop with a lower-midrange graphics card like the AMD Radeon RX 7600S that will crush the Air's 7-core GPU. The 16" MacBook Pro costs $3,300; that will get you a gaming laptop with an RTX 4080 whose graphics outperform the 76-core M2 Ultra. And the cheapest M2 Ultra device is $5,000 and isn't a laptop at all. We aren't even going to talk about comparing the M2 Ultra to a $5,000 gaming rig with a desktop AMD Ryzen 9 (16 cores, 32 threads) and a desktop NVIDIA GeForce RTX 4090.
Video game studios aren't lazy. They aren't going to build games for a platform that gets 7% market share in a good year, with the vast majority of that 7% being multimedia pros and blue-bubble-life poseurs. And bottom line: you flat out weren't telling the truth when you claimed that modern Macs beat modern PCs with discrete GPUs in gaming performance. They don't, and they never will. At least with an Intel-based Mac you could get an eGPU; that setup would cost twice as much as a PC with the same GPU included, but at least you would get the same performance. Now? No dice, even if the game runs natively.