Why Apple should buy Advanced Micro Devices
I’ll admit it; I’m an AMD fanboi. Let’s just get that on the table so we can move on. I won’t describe how, if it were not for AMD, we would still be paying $600 for 800MHz CPUs. I won’t talk about how AMD was first with SIMD instructions for 3D processing, or how they were first to 1 GHz, or how they were first with an on-die memory controller, or how they were first to true quad-core processing. I won’t even bring up the bad old days when Intel added an extra step to the production of their 486SX processors to disable the math co-processor in order to make a chip that would be obsolete within a year. I promise not to mention any of that in my argument for having Apple buy AMD.
Here it is – AMD is in trouble. Intel (in response to AMD’s Athlon 64) actually got off their collective seats and designed a good (perhaps great) CPU in the form of the Core Duo. Then they made it better with the Core 2 Duo, and now they’re about to release an even better version with SSE4 extensions that I predict will kick some serious math. AMD is struggling to release their “Phenom” processors, which will probably be pretty good once AMD has time to iron out the kinks, but are currently getting chewed up in hardware reviews. Consumers are disappointed, vendors are unenthused, and shareholders are pissed. From most angles, things look pretty bleak for AMD.
However, AMD is actually positioned extremely well for two large (and growing) markets – Apples and laptops. Here’s why:
While most recent benchmarks won’t show it, AMD’s Phenom chip is actually very fast – clock-for-clock as fast as Intel’s offerings on some software. The problem is that software is frequently much better optimized to run on Intel-based systems. But here’s the thing – after their purchase of ATI (the number-two graphics card maker), AMD has been exploring ways to make the CPU (the chip in the computer) and the GPU (the chip in the graphics card) work together to crank through extremely processor-intensive tasks like video encoding. This is a bigger deal than it may sound, as GPUs have become insanely powerful over the past three years, yet they don’t contribute to anything other than video games and some 3D design apps. The problem is actually getting the two chips to work efficiently together. To do that, you almost need complete control over the hardware and very tight integration between that hardware and the software that runs on it. And, well, where can that be found? You guessed it – Apple Inc.
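To make the CPU/GPU point concrete, here's a toy sketch (my own illustration, nothing to do with AMD's or Apple's actual code) of the kind of per-pixel, embarrassingly parallel work that video encoding is full of. Every pixel is computed independently of every other, which is exactly the shape of work a GPU's hundreds of stream processors can chew through in parallel while a CPU grinds along a few pixels at a time:

```python
# Toy example: brightness scaling, a per-pixel operation typical of video work.
# Each pixel is independent, so a GPU could process thousands of them at once;
# this plain-Python version just makes the data-parallel structure visible.
def scale_brightness(frame, factor):
    """Scale every pixel by `factor`, clamping to the 0-255 range."""
    return [[min(255, int(px * factor)) for px in row] for row in frame]

frame = [
    [10, 20, 30],
    [40, 50, 60],
]
print(scale_brightness(frame, 2.0))  # → [[20, 40, 60], [80, 100, 120]]
```

The point isn't the code itself; it's that none of those pixel computations depend on each other, so the more execution units you can throw at them, the faster the whole frame finishes.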
While Microsoft’s many versions of Windows run on a gazillion different types of hardware and have ten times as many applications, Apple has one OS that runs on a relative handful of hardware configurations, with a (relatively) small number of third-party apps. And what does Apple do best? That’s right – video and graphics! This provides the perfect platform to run code that is highly optimized to use all available resources as efficiently as possible for the applications that are both the most demanding and the most appropriate. So the benefit to AMD is obvious, but what’s in it for Apple? You’re right again! Control and graphics performance!
If you know anything about Steve Jobs, chairman and CEO of Apple, you know that he likes his power, which he does have. In fact, on November 27, 2007, Jobs was named the most powerful person in business by Fortune magazine. Sooner or later, Intel is going to make Steve angry, and trust me, Intel (and the consumer) would not like Steve Jobs when he is angry. Owning AMD would give Jobs carte blanche over CPU speeds and availability, and since AMD owns ATI and designs their own motherboard chipsets, it’s all wrapped up in one nice package with a pretty red bow. And let’s face it, Apple fans, the graphics cards in most Macs need some help. Take the MacBook, which, like most other new laptops with integrated Intel graphics, uses Intel’s X3100 solution. The GPU is where Intel simply and irrefutably blows chunks. All of Intel’s integrated graphics chipsets suck. It might be because of their weak hardware, it might be because the drivers are crap, or it might just be because they’re integrated graphics. The truth is, we may never know exactly why they blow chunks so hard, but they do. AMD can fix this using their “Hybrid CrossFire” technology. In true AMD form, they are working to make use of every bit of computing power available by letting programs combine the power of onboard graphics with an add-in board. HardOCP has an article describing how they ran one of the most graphically demanding new games quite well on a computer with an ATI integrated chipset plus a cheap $49 ATI graphics card. I don’t know about you, but Apple’s MacBook would suddenly be a lot more appealing if I could buy a version for $50 more that would actually (gasp!) run games or Second Life well.
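For a rough picture of what combining two GPUs can look like, here's a toy sketch (mine, not ATI's actual driver logic) of alternate-frame rendering, one common multi-GPU load-sharing scheme; whether Hybrid CrossFire splits work exactly this way is an assumption on my part:

```python
# Toy scheduler: hand alternating frames to the integrated chipset GPU and
# the cheap add-in card, so both contribute to the overall frame rate.
def assign_frames(num_frames):
    """Return (frame_index, gpu) pairs, alternating between the two GPUs."""
    gpus = ["integrated", "add-in card"]
    return [(i, gpus[i % 2]) for i in range(num_frames)]

print(assign_frames(4))
# → [(0, 'integrated'), (1, 'add-in card'), (2, 'integrated'), (3, 'add-in card')]
```

In practice the scheduling is far hairier (the two GPUs aren't equally fast, and frames still have to come out in order), but the basic idea is just this: more frames in flight at once, higher frame rate.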
And now that all of ATI’s cards are available to Apple at cost, let’s get something a little more, shall we say, “competitive” into the iMac. And while we’re at it, let’s turn the Mac Pro and MacBook Pro into the graphics powerhouses they should be. I would bet $100 that Apple’s programmers could get iMovie, Final Cut, and possibly even Photoshop to double in speed if they coupled the power of the new Phenom processor with a mid-range ATI video card. Meanwhile, Microsoft would try to accomplish the same thing but would ultimately fail, not just because their programmers lack the finesse, but also because there are just too many applications and hardware components to accommodate. Apple wins, AMD wins, and the consumer wins. Yes, Microsoft loses, but they should be getting used to that by now.
So, in short, please send Steve Jobs a certified letter asking him to buy AMD with part of Apple’s current $15 billion cash surplus. You have my full permission to include the full text of this rant in your letter. I think we will all be happy with the results. Except maybe Microsoft, but who really cares what they think anyway?
-Switchy McSwitcher
P.S. In the spirit of full disclosure, this is an idea stolen (but expanded upon) from a comment posted at Tom's Hardware.
Comments
We know that Apple likes control, but would AMD be willing to give up power to Apple? I doubt it.
I don't see it happening. Intel are racing away from AMD and the Phenom thing just shows AMD are scared and desperate. It under-delivers and is more expensive than Intel's offering.
Apple currently has the best chips on the market, their sales are strong, and Intel has provided them with a development plan that probably stretches out over the next few years (if it didn't, Apple wouldn't have switched in the first place).
I can see what benefits there would be but I just don't see it happening.
It sounds nice to think about Apple having access to the fruits of AMD's labor, but as the owner, the funding for that R&D would come out of Apple's own pockets. In the end you still have the 800 lb gorilla, Intel, that must be beaten, and that's no small order.
If AMD is having issues getting Phenom up to snuff, then they must really be dreading the possibility of Nehalem shipping on time and living up to expectations. Every lead AMD has had over Intel in CPU design is gone. Integrated memory controller... check. True quad core... check. GPU integrated into the die... check. HyperTransport-like bus... check. It's all in there and will be running on a mature 45nm process.
As magical as Apple is, they surely don't want to expend all of their energy fighting that battle. I think the best thing for AMD to do is find a complementary company to merge with, and that company does not appear to be Apple at this juncture, IMO.
Ok, let's get into it.
I’ll admit it; I’m an AMD fanboi. Let’s just get that on the table so we can move on. I won’t describe how, if it were not for AMD, we would still be paying $600 for 800MHz cpus. I won’t talk about how AMD was first with SIMD instructions for 3D processing, or how they were first to 1 GHz, or how they were first to on-die memory controller, or how they were first to true quad-core processing.
The Pentium era was pretty bad for Intel. The original Pentium got trounced by the PowerPC G3 and G4. The Pentium 2 was not too bad but needed a pretty chunky heatsink/fan thingy. The Pentium 3 was by far Intel's best achievement of the Pentium era, and I believe it is the strongest contributor to the Core stuff. The Pentium 4, of course, was totally rubbish and let the Athlon and Turion kick some serious Intel butt, establishing AMD as truly legendary, particularly in the modern PC gaming era.
I won’t even bring up the bad old days when Intel added an extra step to the production of their 486sx processors to disable the math co-processor in order to make a chip that would be obsolete within a year...
Heh. Old skool, bro -- nice. The 386SX/DX and 486SX/DX days were dark, dark days. The "would you need a math co-processor?" question from back then seems incredibly stupid today. Then again, today there is the "would you need SSE3 and SSE4?" question.
Here it is – AMD is in trouble. Intel (in response to AMD’s Athlon 64) actually got off their collective seats and designed a good (perhaps great) CPU in the form of the Core Duo. Then they made it better with the Core 2 Duo, and now they’re about to release an even better version with SSE4 extensions that I predict will kick some serious math. AMD is struggling to release their “Phenom” processors, which will probably be pretty good once AMD has time to iron out the kinks, but are currently getting chewed up in hardware reviews. Consumers are disappointed, vendors are unenthused, and shareholders are pissed. From most angles, things look pretty bleak for AMD.
Core and Core 2 are truly, truly amazing. Penryn, the "Core 3" of 2008, will probably demolish all expectations. IMO there has been a lot of whining about how PC systems are "limited" by GPU, RAM, hard disk speed, etc. The emergence of the Core 2 Duo and Quad pretty much showed that even with those bottlenecks, a good ~2GHz Core 2 Duo can improve things, even with 512MB of RAM and a 5400rpm drive (e.g. the MacBook).
Then there is the overclocking headroom. 2GHz Core 2 Duo Conroes go up to 3GHz on *air*. 2+GHz Penryn chips have been pushed to 4GHz on *air*. My PC, running a 2GHz Core 2 Duo Conroe, sits comfortably at 2.5GHz (on air, with a big-ass Zalman CPU heatsink/fan).
...AMD is struggling to release their “Phenom” processors, which will probably be pretty good once AMD has time to iron out the kinks, but are currently getting chewed up in hardware reviews. Consumers are disappointed, vendors are unenthused, and shareholders are pissed. From most angles, things look pretty bleak for AMD.
AMD is in deep, deep, deep trouble. The German government gave them a few hundred mil. There is some highly speculative investment from Middle East interests. The ATI merger is still causing a lot more chaos than benefits at this stage.
...While most recent benchmarks won’t show it, AMD’s Phenom chip is actually very fast – clock-for-clock as fast as Intel’s offerings on some software. The problem is that software is frequently much better optimized to run on Intel-based systems. But, here’s the thing - after their purchase of ATI (the number 2 graphics card maker), AMD has been exploring ways to make the CPU (the chip in the computer) and the GPU (the chip in the graphics card) work together to crank through extremely processor intensive tasks like video encoding. This is a bigger deal than it may sound, as GPUs have become insanely powerful over the past three years, yet they don’t contribute to anything other than video games and some 3D design apps. The problem is actually getting the two chips to work efficiently together. To do that, you almost need complete control over the hardware and very tight integration between that hardware and the software that runs on it. And, well, where can that be found? You guessed it – Apple Inc.
That's the pickle right there. Phenom has a lot of potential. The ATI stuff going up against nVidia is promising. The ATI-AMD integrated chipset is interesting.
However, the *R&D pipeline* for ATI-AMD is totally fracked.
Phenom and the ATI 3000-series stuff are really good tech pushed out by AMD and ATI: very good products and technologies that have been in thorough development for a long while.
However, what comes after this? If they have been bleeding cash badly to get the good stuff out, how are they going to continue this? What if their releases in 2008 and 2009 are not so great? That's what I fear. AMD-ATI will hobble along, but for me Intel-nVidia is extremely threatening going into the last part of this decade.
...And now that all of ATI’s cards are available to Apple at cost, let’s get something a little more, shall we say “competitive”, into the iMac. And while we’re at it, let’s turn the Mac Pro and MacBook Pro into the graphics powerhouses they should be. I would bet $100 that the Apple programmers could get iMovie, Final Cut and possibly even Photoshop to double in speed if they coupled the power from the new Phenom processor and a mid-range ATI video card. Meanwhile, Microsoft would try to accomplish the same thing, but would ultimately fail, not just because their programmers lack the finesse, but also because there are just too many applications and hardware components to accommodate. Apple wins, AMD wins, and the consumer wins. Yes, Microsoft loses, but they should be getting used to that by now...
The thing is, you have to consider how Apple is positioning its products, and how Intel-nVidia-ATI fits into this. Here's the way I see it:
Let's look at the MacBook. Apple specifically wants a crappy GPU in it. It is not *that* bad, given that Core Image stuff runs decently on an integrated X3100 at resolutions around 1280x1024. For any more demanding graphical work, even "just" World of Warcraft, Apple *strongly wants* to upsell you to a MacBook Pro. There, let's face it, the nVidia 8600 GTS/GT/M GT etc. is finally living up to the nVidia 7600GT and 6600GT pedigree. Upsell and profit margin: that's what makes Apple successful.
iMac: ATI cards have been horrendous in the 20" and 24" Alu designs. I'm curious as to your take on this.
A lot of people have wondered about the Mac Pro and why the GPU options are so outdated. Anyone want to explain?
iMac: ATI cards have been horrendous in the 20" and 24" Alu designs. I'm curious as to your take on this.
Apple chose the 2400 and 2600, not because they are good performers, but because they support Vista's DirectX 10 APIs. I can actually understand Apple's logic, but I wish they had chosen a faster card.
The 2400 is way slow, but that's its only real problem. Unfortunately, a good number of the infamous 2600s had defective memory, causing all those pesky lock-ups that aluminum iMac owners kept bringing up in the forums. That ATI allowed these defective cards to be sold is inexcusable and embarrassing. It was probably due to poor quality control at one of their memory suppliers, and at ATI as well. In any case, my take is that ATI really screwed up on that one, and it took them too long to identify and "fix" the problem. What really annoys me is that the fix seems to have been just to make that part of the card's memory unavailable to the system and/or to lower the memory clock speed. Benchmarks show a very slight (within-the-margin-of-error slight) but consistent drop in performance. Most owners of the "defective" cards just seem to be happy that their machines are running smoothly, but I think I'd want a new system.
Not AMD/ATI's most shining moment, for sure.
-Switchy McSwitcher
It would help them spread out the costs of fabbing chips; the cost of getting down to 32nm is something like $4 billion.