AppleInsider › Forums › General › General Discussion › AMD to acquire graphics chip giant ATI

AMD to acquire graphics chip giant ATI

post #1 of 147
Thread Starter 
AMD on Monday upped the ante in its fierce battle with rival Intel Corp., announcing that it plans to acquire graphics chip maker ATI Technologies in a part stock, part cash transaction valued at $5.4B.

The combined company "will create a processing powerhouse by bringing AMD's technology leadership in microprocessors together with ATI's strengths in graphics, chipsets and consumer electronics," the two companies said in a statement.

AMD, the world's second largest chip maker, is hoping the acquisition will strengthen its position within the microchip industry, allowing it to expand beyond computational processors for PC systems as it strives to tear additional market share away from world leader Intel Corp.

By 2008, AMD said it plans to move beyond current technological configurations to transform processing technologies, with silicon-specific platforms that integrate microprocessors and graphics processors to address the growing need for general-purpose, media-centric, data-centric and graphic-centric performance.

"ATI shares our passion and complements our strengths: technology leadership and customer centric innovation," said AMD Chairman and CEO Hector Ruiz. "Bringing these two great companies together will allow us to transcend what we have accomplished as individual businesses and reinvent our industry as the technology leader and partner of choice."

Under the terms of the transaction, unanimously approved by both companies' board of directors, AMD will acquire all of the outstanding common shares of ATI for a combination of $4.2 billion in cash and 57 million shares of AMD common stock, based on the number of shares of ATI common stock outstanding on July 21, 2006.

Based upon the closing price of AMD common stock on July 21, 2006 of $18.26 a share, the consideration for each outstanding share of ATI common stock would be $20.47, comprised of $16.40 of cash and 0.2229 shares of AMD common stock.
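The per-share math in the release checks out; here is a quick back-of-the-envelope verification (illustrative Python, not part of the announcement; the implied ATI share count is derived, not stated):

```python
# Per-share consideration for ATI holders, using the announced terms.
# All input figures come from the article; the script is just arithmetic.
amd_close_jul21 = 18.26   # AMD closing price on July 21, 2006 ($)
cash_per_share = 16.40    # cash component per ATI share ($)
exchange_ratio = 0.2229   # AMD shares issued per ATI share

stock_component = exchange_ratio * amd_close_jul21
per_share_value = cash_per_share + stock_component
print(round(stock_component, 2))  # 4.07
print(round(per_share_value, 2))  # 20.47, matching the quoted $20.47

# Implied ATI share count from the aggregate cash figure (an inference):
total_cash = 4.2e9  # $4.2 billion
implied_ati_shares = total_cash / cash_per_share
print(round(implied_ati_shares / 1e6))  # ~256 (million shares)
# Sanity check: 57M AMD shares / ~256M ATI shares is roughly 0.2229
```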

The Sunnyvale, Calif.-based AMD said it plans to finance the cash portion of the transaction with the help of a $2.5 billion loan from Morgan Stanley.

Graphics processors from ATI have been used in personal computer systems from Apple Computer throughout the years. Although Apple turned away from ATI several years ago in favor of using graphics chips from Nvidia Corp., the company's recent transition to Intel microprocessors has signaled the return of ATI as the primary supplier of graphics chips for the company's Mac line.
post #2 of 147
Well, I guess my MacBook Pro is now obsolete. Can't wait to get one with an AMD CPU next year!
post #3 of 147
Quote:
Originally posted by krankerz
Well, I guess my MacBook Pro is now obsolete. Can't wait to get one with an AMD CPU next year!

I can't agree with that.
post #4 of 147
Quote:
Well, I guess my MacBook Pro is now obsolete. Can't wait to get one with an AMD CPU next year!

Quote:
I can't agree with that.


Both of you two sure know what good dialog is about. Drama, conflict and revealing the necessary info at the latest point possible. That keeps the tension up.
" I will not commit anything to memory that I can get from another source . . . "
ALBERT EINSTEIN
post #5 of 147
Quote:
Originally posted by Vox Barbara

Both of you two sure know what good dialog is about. Drama, conflict and revealing the necessary info at the latest point possible. That keeps the tension up.

Yes, and your post added to it.
post #6 of 147
Quote:
Originally posted by krankerz
Well, I guess my MacBook Pro is now obsolete. Can't wait to get one with an AMD CPU next year!

Can't wait to see a real laptop CPU from AMD first. The Turion 64 is a nice try, but it's gonna need lots and lots of refinement.
post #7 of 147
Why doesn't everybody just log on each day... "GO ON DO IT"... and just type "dumb ass!!!" randomly, seemingly directed at no one in particular, so things seem to be functioning normally around here. Btw, I love off-topic posts.
post #8 of 147
I prefer ATI video cards to NVidia, I hope this doesn't mean Apple will stop using ATI out of loyalty to Intel.
post #9 of 147
Quote:
Originally posted by AppleInsider
Although Apple turned away from ATI several years ago in favor using graphics chips from Nvidia Corp., the company's recent transition to Intel microprocessors has signaled the return of ATI as the primary supplier of graphics chips for the company's Mac line.

Not any more... heh heh.


Quote:
I prefer ATI video cards to NVidia, I hope this doesn't mean Apple will stop using ATI out of loyalty to Intel.

I'm thinking it will mean just that. Check this news out for example: Intel pulls ATI (AMD)'s Chip license.
post #10 of 147
If you were a PC builder you wouldn't say that, IMO. ATI drivers are the worst drivers I have ever had to work with. They are 2 binary digits short of spyware/bloatware. ATI's bleeding-edge technology is sub-par compared to Nvidia's. Every time Apple does multiple offerings for graphics cards, Nvidia is the top card at the time.

I'm actually happy about this announcement, because if it means anything, it's that Apple will use more Nvidia chips. I'll take more performance and easier use any day!

 

 

Quote:
The reason why they are analysts is because they failed at running businesses.
post #11 of 147
Not to stray away from the conversation, but...

Worst-case possibilities:

* Intel keeps their lead, but AMD refuses to sell ATI graphics to PC makers (like Apple) who buy from Intel. They are forced to use nVidia or settle for lesser chips from AMD.

* Or Intel keeps their lead, but refuses to sell CPUs to PC makers (like Apple) who use ATI graphics. They are forced to use nVidia or settle for AMD.

* Or AMD develops new CPUs to rival anything from Intel, even for laptops, but Intel punishes severely or turns away companies who use both kinds of processors--they want only exclusive customers. Apple is forced to choose either one super-fast processor family or the other; they can't pick from both at once.

All of the above seem unlikely, especially since Intel is so vocal about WANTING to work with Apple. Bullying Apple is not something Intel can do lightly.

Best-case possibilities:

* AMD develops new CPUs to rival Intel. And, like other PC makers, Apple can choose to use CPUs from BOTH companies. Two suppliers are better than one. (Exclusive deals might be lost--which can mean higher prices. Competition would increase--which can mean lower prices.)

* Or AMD develops new ATI graphics technologies that nVidia can't touch, and Apple can use those new GPUs too.

* Or AMD develops low-cost integrated graphics with better performance than Intel. And Apple can use those too.

* Or AMD and ATI push Intel and nVidia to do even more, releasing even better products. The resulting competition helps everyone.

I'm not too dismayed at this point. ATI and Intel are compatible. I say leave the Intel-AMD flamewars to PC users. We don't need them any more than we needed IBM vs. Motorola flamewars.
post #12 of 147
Quote:
Originally posted by nagromme
Not to stray away from the conversation, but...

Worst-case possibilities: *snip*

Best-case possibilities: *snip*

I'm not too dismayed at this point. ATI and Intel are compatible.

ramblings of a mad man...
"i find that if you keep talkin', your mouth comes up with stuff..." Karl Pilkington
post #13 of 147
How about "nothing whatsoever changes for Apple".

*bows*

End of topic, thanks for listening.
post #14 of 147
Quote:
How about "nothing whatsoever changes for Apple".

*bows*

End of topic, thanks for listening.

i concur...
"i find that if you keep talkin', your mouth comes up with stuff..." Karl Pilkington
post #15 of 147
Quote:
Originally posted by nagromme
Not to stray away from the conversation, but...

Worst-case possibilities: *snip*

Best-case possibilities: *snip*

I'm not too dismayed at this point. ATI and Intel are compatible. I say leave the Intel-AMD flamewars to PC users.

Great analysis. Except the worst-case scenarios are completely unrealistic, and the best cases just bypass the fact that there's almost certainly an exclusive, long-term contract with Intel. I seriously doubt Apple would just be able to jump to AMD. Add to that the fact that you imply OS X doesn't need any updates or optimizations to work with AMD rather than Intel.
post #16 of 147
Not of that much importance, I think...

If Intel forces Apple to use Nvidia again, big deal; it's not like Apple is using top-of-the-line ATI stuff atm to which no Nvidia can compare.
post #17 of 147
Quote:
Originally posted by Louzer
Add to that the fact that you imply that OS X doesn't need any updates or optimizations to work with AMD rather than intel.

Does it?



I think Apple will continue to use ATI/AMD graphics. AMD doesn't want to lose ATI's current customers, and Intel isn't gonna turn its back on Apple just for buying from AMD. That'd be dumb.
post #18 of 147
Quote:
Originally posted by Louzer
*snip* I seriously doubt Apple would just be able to jump to AMD. Add to that the fact that you imply that OS X doesn't need any updates or optimizations to work with AMD rather than intel.

OS X wouldn't need any updates or optimizations to run on AMD rather than Intel. Processor-wise, their new models both support the same feature set (SSE3, etc.). If this weren't true, people would not be getting OS X to run on their generic AMD PCs. It's not like Windows XP required special optimizations to run on AMD chips - AMD chips have been designed from day one to be a 100% compatible competitor to Intel chips.

The only problem would be getting drivers for AMD motherboard chipsets. If Apple were to sell both AMD and Intel systems, they could likely use some kind of chipset like the new nForce models that are used in both types of motherboards so they wouldn't need to write as many drivers.
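The feature-set point is checkable in practice: both vendors report their supported instruction-set extensions via CPUID, and Linux exposes them as a "flags" line in /proc/cpuinfo. Here is an illustrative sketch of that kind of check; the required set and the sample cpuinfo excerpt are made up for the example, not Apple's actual requirements:

```python
# Sketch: decide whether a CPU's reported feature flags cover a required
# baseline (e.g. the SSE3-era set both Intel and AMD chips shared by 2006).
# Flag names follow Linux's /proc/cpuinfo conventions; the REQUIRED set
# is illustrative only.
REQUIRED = {"sse", "sse2", "pni"}  # "pni" is Linux's name for SSE3

def parse_flags(cpuinfo_text: str) -> set:
    """Extract the feature-flag set from /proc/cpuinfo-style text."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return set(line.split(":", 1)[1].split())
    return set()

def meets_baseline(cpuinfo_text: str) -> bool:
    """True if every required flag is reported by the CPU."""
    return REQUIRED <= parse_flags(cpuinfo_text)

# Made-up, trimmed cpuinfo excerpt for an AMD chip of the era:
sample = """\
vendor_id : AuthenticAMD
model name : AMD Athlon(tm) 64 X2 Dual Core Processor 4800+
flags : fpu mmx sse sse2 pni lm 3dnowext
"""
print(meets_baseline(sample))  # True: sse, sse2 and pni are all present
```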
post #19 of 147
Sweet!

There have been mutterings that ATI has developed an API for their GPUs so that you can use them as a DSP for music, video processing like AVIVO, massive FPU performance (for 3D rendering like Lightwave or Cinema 4D), or physics calculations.

Meanwhile, there have been mutterings that AMD wants a DSP to connect to their Opterons/X2s using HyperTransport in the second AM2 slot--either a second CPU, or a custom DSP, like a GPU with a special API.

Put 2 and 2 together and this screams...
post #20 of 147
Quote:
Originally posted by Louzer
the fact that there's almost definitely an exclusive, long-term contract with Intel. I seriously doubt Apple would just be able to jump to AMD.

There may well be an exclusive contract, but I doubt there would be one that's REALLY long term. Because a) Apple wouldn't want to sign one, especially after being burned by CPU vendors in the past. And b) Intel wouldn't play hardball because they've really wanted Apple as a customer for some time. And c) There's no need for Intel to worry in the short-term anyway: Core 2 is that good!

Beyond the short term, looking ahead a few years--which is the soonest I see there being any big threat to Intel caused by ATI joining AMD--I think Apple will have the freedom to shop around.

Imagine in 2008, if AMD comes out with a chip twice as fast as Intel's best, and Intel has nothing in the pipeline... I bet Apple would go to great lengths to get out of any exclusive relationship with Intel. And we'd reap the benefits

Now, I don't think such a dramatic scenario is likely, of course.
post #21 of 147
Quote:
Originally posted by krankerz
Well, I guess my MacBook Pro is now obsolete. Can't wait to get one with an AMD CPU next year!

AMD has minimal focus on low power computing, an area that Intel has sunk billions into. This is why they're so far ahead of AMD in the laptop/Small-PC market. I don't see AMD posing a threat anytime soon. Intel is also pulling well ahead of TSMC in fab technology, which is only going to further separate them from the rest of the low-power market.

It will be interesting to see if AMD is even permitted to purchase ATI. Big mergers and buyouts are always investigated by the regulatory agencies. Nonetheless, AMD and ATI are both pretty good shops, and if I were a betting man, I'd bet that one of AMD's primary goals in buying ATI is to develop a Cell-like line of processors as a high-end extension of their Alchemy line. Doing so would make them a major presence in markets they've had some difficulty with in the past.
Cat: the other white meat
Reply
Cat: the other white meat
Reply
post #22 of 147
Quote:
Originally posted by MarcUK
Sweet!

There have been mutterings that ATI has developed an API for their GPUs so that you can use them as a DSP... *snip*

Put 2 and 2 together and this screams...

HTX slots as well
post #23 of 147
Quote:
Originally posted by emig647
If you were a PC builder you wouldn't say that IMO. ATI drivers are the worst drivers I have ever had to work with. They are 2 binary digits short of spyware / bloatware.

I have never had any stability problems or bloat with ATI drivers in Windows, whether I assembled the computer or someone else did. In my case it wasn't necessary to install Catalyst; for what I have, I can just get the driver by itself from ATI's site. The ATI software so far has not had any incoming or outgoing traffic either, so the claim of being nearly spyware is quite dubious.
post #24 of 147
Quote:
Originally posted by Splinemodel
AMD has minimal focus on low power computing, an area that Intel has sunk billions into. This is why they're so far ahead of AMD in the laptop/Small-PC market. I don't see AMD posing a threat anytime soon. Intel is also pulling well ahead of TSMC in fab technology, which is only going to further separate them from the rest of the low-power market.

But Intel's low-end video may become the one big thing that Windows Vista blows up into a mass dumping of Intel chipsets.
post #25 of 147
Quote:
Originally posted by tirefire
OS X wouldn't need any updates or optimizations to run on AMD rather than Intel. Processor-wise, their new models both support the same feature set (SSE3, etc.). If this weren't true, people would not be getting OS X to run on their generic AMD PCs. It's not like Windows XP required special optimizations to run on AMD chips - AMD chips have been designed from day one to be a 100% compatible competitor to Intel chips.

The only problem would be getting drivers for AMD motherboard chipsets. If Apple were to sell both AMD and Intel systems, they could likely use some kind of chipset like the new nForce models that are used in both types of motherboards so they wouldn't need to write as many drivers.

Apple wanted to put OS X on the $100 laptop, which is running an AMD chip.
post #26 of 147
Quote:
Originally posted by nagromme
Not to stray away from the conversation, but...

Worst-case possibilities:

* Intel keeps their lead, but AMD refuses to sell ATI graphics to PC makers (like Apple) who buy from Intel. They are forced to use nVidia or settle for lesser chips from AMD.

*snip*

Best-case possibilities: *snip*

I'm not too dismayed at this point. ATI and Intel are compatible. I say leave the Intel-AMD flamewars to PC users.

If AMD is stupid enough to not sell to Intel customers, their sales will tank, and their stock will continue to tank. I can only hope that they would not be that stupid.

The main reason they are buying ATI is to have a decent set of chipsets for their own CPUs, in-house. Their own chipsets suck, as is well known, and they have to rely on others for them.

The other reason is to get a chipset with graphics capabilities, a la Intel, so that they can compete in that area as well.

Insofar as the mainline GPUs are concerned, AMD can't stop selling them to all who want them, or their investment will go up in flames.

Business 101.

Your second part is much more likely.
post #27 of 147
Quote:
Originally posted by tirefire
OS X wouldn't need any updates or optimizations to run on AMD rather than Intel. Processor-wise, their new models both support the same feature set (SSE3, etc.). If this weren't true, people would not be getting OS X to run on their generic AMD PCs. It's not like Windows XP required special optimizations to run on AMD chips - AMD chips have been designed from day one to be a 100% compatible competitor to Intel chips.

The only problem would be getting drivers for AMD motherboard chipsets. If Apple were to sell both AMD and Intel systems, they could likely use some kind of chipset like the new nForce models that are used in both types of motherboards so they wouldn't need to write as many drivers.

Don't be so sure about that. It's well known in the PC industry that programs, and possibly even the OS, can be optimised for either AMD or Intel. There is just enough difference in the way the chips work that it can matter.

With the Core 2 chips, there is more of a difference, and, so far, a performance difference. The fact that Intel's new chips are four-issue rather than three-issue, and that SSE instructions are executed in a single 128-bit pass rather than two 64-bit ones, also makes a big difference. The way they handle memory, and differences in cache size, matter as well.

All of these, and other differences, can result in performance issues if not optimised for.
post #28 of 147
Quote:
Originally posted by Joe_the_dragon
But Intel's low-end video may become the one big thing that Windows Vista blows up into a mass dumping of Intel chipsets.

I wish people would stop saying this. The GMA950 works fine for all of Tiger's eye candy (despite many fanboy predictions and claims otherwise), there's no reason to expect them to not work well for Vista.
post #29 of 147
Quote:
Originally posted by nagromme
There may well be an exclusive contract, but I doubt there would be one that's REALLY long term. Because a) Apple wouldn't want to sign one, especially after being burned by CPU vendors in the past. And b) Intel wouldn't play hardball because they've really wanted Apple as a customer for some time. And c) There's no need for Intel to worry in the short-term anyway: Core 2 is that good!

Beyond the short term, looking ahead a few years--which is the soonest I see there being any big threat to Intel caused by ATI joining AMD--I think Apple will have the freedom to shop around.

Imagine in 2008, if AMD comes out with a chip twice as fast as Intel's best, and Intel has nothing in the pipeline... I bet Apple would go to great lengths to get out of any exclusive relationship with Intel. And we'd reap the benefits

Now, I don't think such a dramatic scenario is likely, of course.

I can't see any reason why Apple would be interested in AMD. It's highly unlikely that AMD will come out with anything in 2007, or through the foreseeable future, that would trounce Intel anymore.

Intel has learned its lesson. It won't go down one road anymore. This has been a wake-up call, and they have responded well.
post #30 of 147
Quote:
Originally posted by Joe_the_dragon
But Intel's low-end video may become the one big thing that Windows Vista blows up into a mass dumping of Intel chipsets.

Nonsense!

Intel's latest chipsets are designed to run Vista just fine. What they are not designed to do is to be high-end gaming chips. Though Intel is showing some interest in that again.
post #31 of 147
So Intel decides to buy Nvidia... who knows, but I'm pretty sure it won't affect Apple in any way.

G5 2GHZ Power Mac, iPod Shuffle (1st Gen),iPod Nano (2nd Gen),iPod (5th Gen), Apple TV, Apple TV 2G x2, iPad 2,iPhone 4S, rMBP 15" 2.6

post #32 of 147
Quote:
Originally posted by Joe_the_dragon
Apple wanted to put OS X on the $100 laptop, which is running an AMD chip.

That was an AMD Geode. It has its own brand of onboard video, which will not be impacted in the least by the ATI buyout.

The Geode is more comparable to a VIA, a Crusoe, or even a PXA270 than to any PC chip. The Geode targets the low end of the market that buys ULV Pentium M or Core Solo chips (i.e. the 1.0 to 1.2GHz variants).
Cat: the other white meat
post #33 of 147
Quote:
Originally posted by Splinemodel
That was an AMD Geode. It has its own brand of onboard video, which will not be impacted in the least by the ATI buyout. *snip*

Of course, it's no longer an AMD Geode.
post #34 of 147
Yes, here it is... more obscure wisdom from ReCompile.

I personally believe that, as far as technology goes, AMD would have been Apple's first choice when deciding which chipmaker to go with. But all things considered, they had such a bad time with Motorola and IBM not being able to keep up the pace that they would lose all the ground they would gain in a technological advantage. This was most evident when the G3 chip came out, followed by the G4 chips. Intel confessed to me that Apple had gained a three-year technological advantage in the industry. This is huge. But without the ability to move forward for more than three years, the playing field once again became even.

I believe that Apple needed to have none other than the largest, most able chipmaker as the supplier in this delicate transition. I think it was also a choice of perception. Most people see Windows and Intel as one. The last thing Apple needed was people seeing the windoze machines as having a better or different chip than was available on Macs. Once again.

But... Apple knows that once the transition is complete, they will have the ability to use AMD as well as Intel, or just AMD. I think of AMD as the Mac of the chip world. They are the innovators. They are also the underdogs. But they are outperforming Intel. Intel has responded with hurried releases and price wars to try to stop the bleeding AMD is causing by gaining Intel's market share daily.

Intel just released a statement that they will no longer use ATI chip technology after the end of this year, so as not to feed their competitor AMD. This could hurt Intel in the long run. It may also make Nvidia's stock climb, as it will open up that account to Nvidia.
-ReCompile-
"No matter where you go, There you are"
- Buckaroo Bonzai
post #35 of 147
Quote:
Originally posted by ReCompile
Yes, here it is... more obscure wisdom from ReCompile. *snip* Intel just released a statement that they will no longer use ATI chip technology after the end of this year, so as not to feed their competitor AMD.

While I can agree with most of what you said, you left a couple of things out.

The most important is that Apple had a GOOD look at Intel's roadmap well before the deal was consummated. You can be sure of that.

Remember when Jobs was up on stage and talked about the performance/power situation? Many people were thinking, "What is he smoking, the Prescott, and the Xeons use so much power, and they are being killed by AMD, and IBM's G5 is pretty close, and uses less power?"

Going by that, even though Intel is the gorilla, the performance still sucked.

Now, we see otherwise. Apple knew what we didn't.

Apple isn't going to AMD. At this time, they would be fools to do so. And AMD is having many pricing problems, which are going to destroy their profits. Intel can afford it, but the still much smaller AMD may not be able to.
post #36 of 147
Quote:
Originally posted by ReCompile
. . . I think of AMD as the Mac of the chip-world. They are the innovators. . .

IBM has been the innovator of the chip world for some time now. The trouble is, we've seen that software authors have been too lazy or too stupid to keep up with innovation. We're right on the cusp of seeing electronics dictate the industry again, but the personal computer market is still wrapped up in a lot of legacy. In twenty years, when the PC is dead, lightweight stuff will all be Java -- the extension of modern software development -- and the rest will be architecture specific.
Cat: the other white meat
post #37 of 147
Quote:
Originally posted by Splinemodel
IBM has been the innovator of the chip world for some time now. Trouble is, we've seen that software authors have been too lazy or too stupid to keep up with innovation. We're right on the cusp of seeing electronics dictate the industry again, but the personal computer market is still wrapped up in a lot of legacy. In twenty years when the PC is dead, lightweight stuff will all be Java -- the extension of the modern software development -- and the rest will be architecture specific.

I hope it will be Java. Java has suffered the past few years.
post #38 of 147
Quote:
Originally posted by melgross
I hope it will be Java. Java has suffered the past few years.

It has actually done quite well in embedded devices as an AOT-compiled package. Most cable boxes run Java apps and Java display layers.
Cat: the other white meat
post #39 of 147
Quote:
Originally posted by Splinemodel
Trouble is, we've seen that software authors have been too lazy or too stupid to keep up with innovation. [..] In twenty years when the PC is dead, lightweight stuff will all be Java -- the extension of the modern software development -- and the rest will be architecture specific.

Wha-what now? You speak of innovation and then mention "when the PC is dead", Java, and architecture-agnostic (read: low-quality) software?

Peh.
post #40 of 147
Quote:
Originally posted by melgross
Java has suffered the past few years.

Possibly because it offers nothing worth considering. Its syntax is a weak rip-off of C++ (which is in itself horrible), and its alleged "feature" (being platform-agnostic) is, when you really think about it, completely useless. For a console application (including server daemons), you won't want that for performance reasons. For a front-end, you won't want that because UIs are platform-specific for a reason. If every OS had the same UI, what would be the deciding factor of one OS over another? Exactly.

There is nothing Java can offer you that a clean separation between a platform-agnostic (but compiled!) framework/back-end, written in an efficient language (e.g. plain C) and a set of platform-specific (and not necessarily compiled; interpreted can be good enough) front-ends, written in high-level APIs (e.g. Cocoa, .NET) can't give you.