thadec
About
- Username: thadec
- Joined
- Visits: 18
- Last Active
- Roles: member
- Points: 469
- Badges: 0
- Posts: 97

Reactions
iPhone 15 USB-C will fix some problems, but create issues for most
Disagree with the writer. USB-C has been the standard for gadgets for years. Even if you only use Apple products whenever possible, the author himself states that Macs adopted USB-C in 2015. Everyone else? Headphones and pretty much every other gadget you can think of use USB-C, including $10 USB-C earbuds for (presumably Android) phones and tablets. USB-C docks, dongles etc. are among the most popular items on shopping sites for a reason. If that weren't the case, the EU would never have had a basis to mandate USB-C for Apple to begin with. So this is nothing like the switch from the 30-pin connector to Lightning eleven years ago. Technology, the market etc. were entirely different back then. And it is precisely because USB-C has been so widely used for so long that finding good cables that work for both charging and data is easy. Even some bargain brands like Amazon Basics work quite well for everyone who isn't a digital media pro or someone else with the most demanding requirements, and those people are going to buy their cables from Apple anyway. Also, you are pretending that third-party Lightning cables don't exist.
Whatever reasons Apple had for avoiding USB-C on its best-selling products, unless you work for Apple and are contractually obligated to toe the company line, you can't pretend that the reason for it was the consumer. If anything, the opposite is true: there will be plenty of options out there for high-quality USB-C cables if you need them to transfer data from your Android phone to your Windows computer (for example), and Mac and iPad Pro users will be able to use the Apple-provided cables from those products on iPhones etc. too.
Apple will keep using Qualcomm 5G modems until 2026
tmay said:
avon b7 said:
Keep digging!
My first post!
My first line!
It was a joke.
"money for nothing and your chicks for free" - Dire Straits
A reference to how Qualcomm ended up with a triple windfall and all without doing anything.
1. Lawsuit settled.
2. Millions in totally unexpected revenue from Apple.
3. Millions in totally unexpected revenue from Huawei.
Technically two potential competitors were also caged as a result. Could things get any better?
Well, yes actually they could. Samsung ran into Exynos problems. Unbelievable.
None of it had formed any part of its future projections. It was laughing all the way to the bank. And then some. That is my point.
Losing that revenue didn't change anything on its core roadmap because none of it was there from the start. It was simply the icing on the cake.
Any delay to Apple doing what Huawei just did to Qualcomm (Qualcomm formally announced a few weeks ago that there would be no further material revenue from Huawei) is simply even more Apple revenue to line Qualcomm's pockets.
And whenever Apple does finally stop ordering Qualcomm modems, it will still have to pay patent fees to both Qualcomm and Huawei.
Everything else you threw in here about market share, process nodes, unit sales, who gets to wait etc. was simply chaff. Irrelevant to what I said.
I have no idea what drove you to post your 'you don't understand capitalism' rant because it had nothing at all to do with my point!
From there on you simply doubled down with chaff after chaff in an effort to avoid replying to my very open, direct and simple question.
Where was what I said wrong?
Not answering that didn't leave you in the best of positions so I said you should let it go.
You chose to drag things on.
I'll bow out now because it's pointless carrying on with someone who refuses to answer the question.
Qualcomm does $44B a year in revenue, so the loss of an order from Apple is about 20% of its revenue, but that same order is less than 2% of Apple's revenue. Apple's modem operation is an R&D expense, a fraction of the $24B that they will spend this year on R&D.
From Apple's frame of reference, it is a rounding error.
Meanwhile, Qualcomm has its hands in many other sectors, but who knows if any of those will compensate for the loss of Apple's modem business.
I don't know what Huawei's business is worth to Qualcomm, but there's no question that Huawei's margins are much lower than Apple's, so even with silicon from SMIC, Huawei isn't going to see much movement in margins.
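For anyone who wants to follow the arithmetic in the quoted reply above, here is a rough sketch of the comparison. The $44B Qualcomm revenue figure and the 20% / 2% shares come from the post itself; Apple's annual revenue of roughly $390B is an assumption added here for illustration, not something the post states.

```python
# Rough sanity check of the revenue asymmetry described in the quoted post.
# The $44B figure and the 20% share come from the post; Apple's ~$390B annual
# revenue is an assumed round number for the same period, not from the post.
QUALCOMM_ANNUAL_REVENUE = 44e9   # ~$44B per year
APPLE_ANNUAL_REVENUE = 390e9     # ~$390B per year (assumption)

apple_order = 0.20 * QUALCOMM_ANNUAL_REVENUE          # 20% of Qualcomm's revenue
share_of_apple = apple_order / APPLE_ANNUAL_REVENUE   # the same dollars, seen from Apple's side

print(f"Implied Apple modem order: ${apple_order / 1e9:.1f}B")   # ~$8.8B
print(f"Share of Apple's revenue:  {share_of_apple:.1%}")        # roughly 2%
```

The two stated percentages land in the same ballpark, roughly $8-9B a year; the exact share on Apple's side depends on which fiscal year's revenue you plug in, but the point stands that an order worth a fifth of Qualcomm's business is a low-single-digit line item for Apple.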
Qualcomm is the same. Yes, the Nuvia purchase to compete with Apple Silicon is officially a failure, as the 4nm Snapdragon 8cx Gen 4 chips will only beat Intel's 5nm Arrow Lake to market by mere weeks - https://wccftech.com/qualcomm-oryon-problems-potential-snapdragon-8cx-gen-4-delay/ - and Microsoft won't publicly commit to using them in Surface devices anyway. But so long as Android remains vital - and despite 10 years of "all the cool kids love iPhones" talk, it will, especially overseas - the best devices on that platform will have Qualcomm chips. Yes, they do need another revenue stream, but to get it they can ditch their agreement with Microsoft - who hasn't been upholding their end of the bargain anyway - and push ARM Linux devices, which unlike Windows on ARM actually has a market. That was Nuvia's original goal anyway: to make chips for ARM servers.
As with Intel, for Qualcomm Apple was merely a customer. A big customer, sure, but customers come and go. And like the person who started this whole exchange said, Qualcomm has known for some time that Apple was leaving. So for them, getting Apple's business until 2026 is basically free money - a bonus - that they weren't planning on getting. Who knows, maybe they can use it to buy foundry capacity from whoever has it, or to break their bad exclusivity deal with Microsoft.
Mac Pro in danger after fumbled Apple Silicon launch
dewme said:
Just checked Apple’s leadership team profiles on https://www.apple.com/leadership/. Did I miss something? I don’t see Mark Gurman’s profile anywhere on that site. Where does he sit in Apple’s leadership team that decides the execution of Apple’s product strategy?
1. Apple has released four Mac Pro designs in the entire line's existence: 2006, 2013, 2019 and 2023. No idea why anyone ever expected the shift to Apple Silicon to mean adopting some 2- or even 3-year refresh schedule, especially when you consider that the Mac Studio - and, according to some people, the 16" MacBook Pro - can replace the Mac Pro (at a lower price in default configurations) for a ton of users.
2. Intel and AMD aren't done. The 5nm Threadripper PRO 7985WX that AMD will announce in September will max out at 64 cores. And in September 2025, Intel is going to announce a Core i9 - a regular desktop chip, not a workstation or server chip - with 40 cores. Apple knows what Intel and AMD are doing. (They also know that a ton of the Mac Pro customer base has since switched to Linux - before you snark "Linux has about 100 users," take a good look at the Mac Pro sales figures - and this is what they will be using.) As this group doesn't care nearly as much about power efficiency as Apple fans think they do (or should), Apple knows that keeping up with Intel and AMD in this space will be a challenge.
3. Why? I have been saying this on this board for some time. And this is why the "Apple bungled the Silicon switch" argument is wrong: Apple's main problem is that they have only one basic CPU core design. Give them credit: they get a lot out of it. They reuse old performance cores as efficiency cores, and for the performance cores they vary the number and power draw, from 4 in a low-power mode in a thin, inexpensive fanless device like the M1 iPad Air to 24 cores that they pump massive power into in the M2 Ultra Mac Pro. But still: they use an outdated version of this core as efficiency cores and the modern version as performance ones (you can see the split for yourself with the sysctl sketch after point 3 below). Intel, for example? Does not have this problem.
Intel has an actual efficiency core that is fundamentally different from their performance cores (1 thread instead of 2, and for now it doesn't support the AVX-512 ISA). They also have Core i3, Core i5, Core i7, Core i9 and Xeon cores, for 6 total (they also had Pentium and Celeron, but starting with 13th gen they ditched those in favor of the faster e-core). AMD? Same. Athlon/A-Series (which seems to have met the same fate as Intel's Pentium and Celeron), Ryzen 3, Ryzen 5, Ryzen 7, Ryzen 9, Threadripper, Epyc. The Core i9/Ryzen 9/Threadripper/Xeon/Epyc batch uses a ton of power - though notably not so much in the Core i9/Ryzen 9 laptop versions - but Apple simply cannot match the single-core performance of Core i9/Ryzen 9 and above. And no, this isn't x86 Windows fanboy nonsense. Apple themselves admitted the same. Yes, the 3nm M3 Ultra will beat anything that Intel and AMD have ... because Intel will be on 7nm (10nm for desktop/workstation/server) and because AMD, though on 5nm, hasn't adopted big.LITTLE on desktop yet (their big.LITTLE chips for laptops and handheld gaming consoles are just now coming out). But when Intel and AMD get to 3nm in 2025?
Yes, Apple could spend billions in R&D on developing cores to compete with Ryzen 9 and above. Or, to save time and money, they could license the Neoverse ARM server cores and customize them. But why? Remember: Apple only sells 25 million Macs in a good (actually great) year, with the vast majority being M2 Pro and below. Were Apple to develop a Mac Pro capable of beating Linux workstations running the AMD Threadripper Pro in single-core and multi-core performance, how many more people are actually going to buy it? Apple knows the answer to this question already, even if the many Apple fan sites out there aren't going to admit it. Consider that even were Apple to start selling 10 million workstations a year, you would see Nvidia, Qualcomm and the rest react with their own ARM workstation chips. Nvidia would release a scaled-down version of their Grace server/cloud SoCs - which use Neoverse V2 cores and their excellent GPUs - for workstations the very day Apple provides evidence of a market for it. There is even an ARM Ubuntu workstation and server stack already in place to take advantage of it.
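As an aside on the performance/efficiency split described in point 3: you can inspect how many cores of each type a given Apple Silicon Mac has straight from the OS. Below is a minimal sketch, assuming macOS 12 or later where the hw.perflevel* sysctl keys are exposed; the helper functions are just for illustration.

```python
# Minimal sketch: read the performance/efficiency core split on an Apple Silicon Mac.
# Assumes macOS 12+ where the hw.perflevel* sysctl keys exist; it will not work
# on Intel Macs or other operating systems.
import subprocess

def sysctl(key: str) -> str:
    """Return the value of a sysctl key as a stripped string."""
    return subprocess.run(
        ["sysctl", "-n", key], capture_output=True, text=True, check=True
    ).stdout.strip()

def core_layout() -> dict:
    """Map each performance level name (e.g. 'Performance', 'Efficiency') to its physical core count."""
    levels = int(sysctl("hw.nperflevels"))
    return {
        sysctl(f"hw.perflevel{i}.name"): int(sysctl(f"hw.perflevel{i}.physicalcpu"))
        for i in range(levels)
    }

if __name__ == "__main__":
    # On an M2 Pro this prints something like {'Performance': 8, 'Efficiency': 4}.
    print(core_layout())
```

The matching hw.perflevelN.logicalcpu keys report logical CPU counts, which on Apple Silicon equal the physical counts because the cores don't use SMT; that is part of the contrast with Intel's 2-thread performance cores drawn above.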
Sorry guys, while I do buy and use Apple products, I am not an "Apple knows best" sort by any stretch of the imagination. But this is one instance where you are flat-out wrong. Making a competitive Mac Pro requires making a competitive workstation CPU, and even if Apple could make one it wouldn't benefit them financially or in terms of market position. So were Apple to abandon the Mac Pro down the line, big deal: Google and Intel both exited the PC market because they were faced with the same set of facts.
Apple on the hook for $1.1 billion in Caltech Wi-Fi patent case
godofbiscuits said:
Given that the patent system was created to protect entities like Cal Tech AGAINST enormous corporations...
M2 Ultra benchmarks show performance bump over M1 Ultra
coolfactor said:
Double the Intel performance — ouch, indeed!
Way to go, Apple. Keep up the good work.
22july2013 said:
Someone should sell stickers that say "Apple Inside."
https://wccftech.com/apple-m2-ultra-soc-isnt-faster-than-amd-intel-last-year-desktop-cpus-50-slower-than-nvidia-rtx-4080/
And even the Intel 13900K and AMD 7950X are 2022 chips. Also, neither is a workstation chip. A comparison with the AMD Threadripper 7000 and 7000 Pro that get released in September, which will be made on the same node as the M2 Ultra, won't be particularly favorable. The M3 Extreme - which I am betting Apple is going to launch on TSMC's 2nd-gen 3nm process in 2025 - will be needed to compete, except that in 2025 the AMD Threadrippers made on TSMC's 1st-gen 3nm process will be out, as will - and this is a worst-case scenario for them - Intel's 5nm desktop and workstation chips.
Going forward, Apple is likely going to de-emphasize direct comparisons between Apple Silicon and Intel - note that they have avoided mentioning AMD altogether - in favor of comparisons with previous generations of Apple Silicon. In a way, that will be appropriate. The software that most people are going to run on the Mac Pro and Mac Studio is going to be so different from the software that most people are going to run on Threadripper and Xeon-W workstations and servers that direct comparisons will be impossible anyway.