AMD to acquire graphics chip giant ATI


Comments

  • Reply 21 of 146
    Quote:

    Originally posted by MarcUK

    Sweet!



    There have been mutterings that ATI have developed an API for their GPUs so that you can use them as a DSP for music, video processing like AVIVO, massive FPU performance (for 3D rendering like LightWave or Cinema 4D), or physics calculations.



    Meanwhile, there have been mutterings that AMD wants a DSP to connect to their Opterons/X2s over HyperTransport in a second AM2 socket: either a second CPU, or a custom DSP, like a GPU with a special API.



    Put 2 and 2 together and this screams...




    HTX slots as well
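    To make the GPU-as-DSP idea concrete: such an API would let host code upload a buffer, run a small per-element kernel across the GPU's many parallel ALUs, and read the result back. Below is a minimal, purely hypothetical sketch -- none of the names are real ATI calls, and the stubs just emulate on the CPU what the GPU would do in parallel:

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        /* Hypothetical stream API -- illustrative names only, not real
           ATI calls. A real driver would run the kernel across the GPU's
           parallel ALUs; these stubs emulate that on the CPU. */
        typedef struct { float *data; size_t n; } stream_t;

        static stream_t stream_upload(const float *src, size_t n) {
            stream_t s = { malloc(n * sizeof *src), n };
            memcpy(s.data, src, n * sizeof *src);   /* host -> "GPU" memory */
            return s;
        }

        static void stream_map(stream_t s, float (*kernel)(float)) {
            for (size_t i = 0; i < s.n; i++)        /* GPU: one ALU per element */
                s.data[i] = kernel(s.data[i]);
        }

        static void stream_read(stream_t s, float *dst) {
            memcpy(dst, s.data, s.n * sizeof *dst); /* "GPU" -> host memory */
            free(s.data);
        }

        /* Example kernel: a gain stage -- the kind of per-sample FPU work
           (audio DSP, video filtering, physics) mentioned above. */
        static float gain(float x) { return 0.5f * x; }

        int main(void) {
            float buf[4] = { 1.0f, 2.0f, 3.0f, 4.0f };
            stream_t s = stream_upload(buf, 4);
            stream_map(s, gain);
            stream_read(s, buf);
            printf("%g %g %g %g\n", buf[0], buf[1], buf[2], buf[3]);
            return 0;
        }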
  • Reply 22 of 146
    jeffdm Posts: 12,953 member
    Quote:

    Originally posted by emig647

    If you were a PC builder you wouldn't say that IMO. ATI drivers are the worst drivers I have ever had to work with. They are 2 binary digits short of spyware / bloatware.



    I have never had any stability problems or bloat with ATI drivers in Windows, whether I assembled the computer or someone else did. In my case, it wasn't necessary to install Catalyst; for what I have, I can just get the driver by itself from ATI's site. The ATI software so far has not had any incoming or outgoing traffic either, so the claim of it being nearly spyware is quite dubious.
  • Reply 23 of 146
    Quote:

    Originally posted by Splinemodel

    AMD has minimal focus on low-power computing, an area that Intel has sunk billions into. This is why Intel is so far ahead of AMD in the laptop/small-PC market. I don't see AMD posing a threat anytime soon. Intel is also pulling well ahead of TSMC in fab technology, which is only going to further separate them from the rest of the low-power market.





    But Intel's low-end video may become the one big thing that Windows Vista blows up into a mass dumping of Intel chipsets.
  • Reply 24 of 146
    Quote:

    Originally posted by tirefire

    OS X wouldn't need any updates or optimizations to run on AMD rather than Intel. Processor-wise, their new models both support the same feature set (SSE3, etc.). If this weren't true, people would not be getting OS X to run on their generic AMD PCs. It's not like Windows XP required special optimizations to run on AMD chips - AMD chips have been designed from day one to be a 100% compatible competitor to Intel chips.



    The only problem would be getting drivers for AMD motherboard chipsets. If Apple were to sell both AMD and Intel systems, they could likely use some kind of chipset like the new nForce models that are used in both types of motherboards so they wouldn't need to write as many drivers.




    Apple wanted to put OS X on the $100 laptop, which is running an AMD chip.
  • Reply 25 of 146
    melgross Posts: 33,600 member
    Quote:

    Originally posted by nagromme

    Not to stray away from the conversation, but...



    Worst-case possibilities:



    * Intel keeps their lead, but AMD refuses to sell ATI graphics to PC makers (like Apple) who buy from Intel. They are forced to use nVidia or settle for lesser graphics from Intel.



    * Or Intel keeps their lead, but refuses to sell CPUs to PC makers (like Apple) who use ATI graphics. They are forced to use nVidia or settle for AMD.



    * Or AMD develops new CPUs to rival anything from Intel, even for laptops, but Intel punishes severely or turns away companies who use both kinds of processors--they want only exclusive customers. Apple is forced to choose one super-fast processor family or the other; they can't pick from both at once.



    All of the above seem unlikely, especially since Intel is so vocal about WANTING to work with Apple. Bullying Apple is not something Intel can do lightly.



    Best-case possibilities:



    * AMD develops new CPUs to rival Intel. And, like other PC makers, Apple can choose to use CPUs from BOTH companies. Two suppliers are better than one. (Exclusive deals might be lost--which can mean higher prices. Competition would increase--which can mean lower prices.)



    * Or AMD develops new ATI graphics technologies that nVidia can't touch, and Apple can use those new GPUs too.



    * Or AMD develops low-cost integrated graphics with better performance than Intel. And Apple can use those too.



    * Or AMD and ATI push Intel and nVidia to do even more, releasing even better products. The resulting competition helps everyone.



    I'm not too dismayed at this point. ATI and Intel are compatible. I say leave the Intel-AMD flamewars to PC users. We don't need them any more than we needed IBM vs. Motorola flamewars.




    If AMD is stupid enough not to sell to Intel customers, their sales will tank, and their stock will continue to slide. I can only hope that they would not be that stupid.



    The main reason they are buying ATI is to have a decent set of chipsets for their own CPUs, in-house. Their own chipsets suck, as is well known, and they have to rely on others for them.



    The other reason is to get a chipset with graphics capabilities, a la Intel, so that they can compete in that area as well.



    Insofar as the mainline GPUs are concerned, AMD can't stop selling them to all who want them, or their investment will go up in flames.



    Business 101.



    Your second part is much more likely.
  • Reply 26 of 146
    melgross Posts: 33,600 member
    Quote:

    Originally posted by tirefire

    OS X wouldn't need any updates or optimizations to run on AMD rather than Intel. Processor-wise, their new models both support the same feature set (SSE3, etc.). If this weren't true, people would not be getting OS X to run on their generic AMD PCs. It's not like Windows XP required special optimizations to run on AMD chips - AMD chips have been designed from day one to be a 100% compatible competitor to Intel chips.



    The only problem would be getting drivers for AMD motherboard chipsets. If Apple were to sell both AMD and Intel systems, they could likely use some kind of chipset like the new nForce models that are used in both types of motherboards so they wouldn't need to write as many drivers.




    Don't be so sure about that. It's well known in the PC industry that programs, and possibly even the OS, can be optimised for either AMD or Intel. There is just enough difference in the way the chips work that it can matter.



    With the Core 2 chips, there is more of a difference, and, so far, a performance difference. The fact that Intel's new chips are four-issue rather than three, and that SSE runs through a single 128-bit pipeline rather than two 64-bit ones, also makes a big difference. The way they handle memory, and differences in cache size, matter as well.



    All of these, and other differences, can result in performance issues if not optimised for.
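    As a concrete illustration of how code can be tuned per vendor, a program (or a runtime library) can query the CPU with the CPUID instruction and pick an optimised path. This is a minimal sketch, assuming GCC-style inline assembly on x86-64; the dispatch targets are placeholders, not real tuned routines:

        #include <stdio.h>
        #include <string.h>

        /* Read the 12-byte CPU vendor ID via CPUID leaf 0. The string
           comes back in EBX, EDX, ECX, in that order. On non-x86-64
           builds this falls through to an empty (unknown) vendor. */
        static void cpu_vendor(char out[13])
        {
            unsigned int ebx = 0, ecx = 0, edx = 0;
        #if defined(__GNUC__) && defined(__x86_64__)
            unsigned int eax = 0; /* leaf 0 */
            __asm__ volatile("cpuid"
                             : "+a"(eax), "=b"(ebx), "=c"(ecx), "=d"(edx));
        #endif
            memcpy(out + 0, &ebx, 4);
            memcpy(out + 4, &edx, 4);
            memcpy(out + 8, &ecx, 4);
            out[12] = '\0';
        }

        int main(void)
        {
            char vendor[13];
            cpu_vendor(vendor);
            if (strcmp(vendor, "GenuineIntel") == 0)
                puts("Intel path: e.g. loops tuned for 128-bit SSE, 4-issue core");
            else if (strcmp(vendor, "AuthenticAMD") == 0)
                puts("AMD path: e.g. loops tuned for K8 memory behaviour");
            else
                puts("generic path");
            return 0;
        }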
  • Reply 27 of 146
    jeffdm Posts: 12,953 member
    Quote:

    Originally posted by Joe_the_dragon

    But Intel's low-end video may become the one big thing that Windows Vista blows up into a mass dumping of Intel chipsets.



    I wish people would stop saying this. The GMA950 handles all of Tiger's eye candy fine (despite many fanboy predictions and claims otherwise); there's no reason to expect it not to work well for Vista.
  • Reply 28 of 146
    melgross Posts: 33,600 member
    Quote:

    Originally posted by nagromme

    There may well be an exclusive contract, but I doubt there would be one that's REALLY long term. Because a) Apple wouldn't want to sign one, especially after being burned by CPU vendors in the past. And b) Intel wouldn't play hardball because they've really wanted Apple as a customer for some time. And c) There's no need for Intel to worry in the short-term anyway: Core 2 is that good!



    Beyond the short term, looking ahead a few years--which is the soonest I see there being any big threat to Intel caused by ATI joining AMD--I think Apple will have the freedom to shop around.



    Imagine in 2008, if AMD comes out with a chip twice as fast as Intel's best, and Intel has nothing in the pipeline... I bet Apple would go to great lengths to get out of any exclusive relationship with Intel. And we'd reap the benefits.



    Now, I don't think such a dramatic scenario is likely, of course.




    I can't see any reason why Apple would be interested in AMD. It's highly unlikely that AMD will come out with anything, in 2007 or the foreseeable future, that would trounce Intel again.



    Intel has learned its lesson. It won't go down one road anymore. This has been a wake-up call, and they have responded well.
  • Reply 29 of 146
    melgross Posts: 33,600 member
    Quote:

    Originally posted by Joe_the_dragon

    But Intel's low-end video may become the one big thing that Windows Vista blows up into a mass dumping of Intel chipsets.



    Nonsense!



    Intel's latest chipsets are designed to run Vista just fine. What they are not designed to do is to be high-end gaming chips. Though Intel is showing some interest in that again.
  • Reply 30 of 146
    jimbo123 Posts: 153 member
    So Intel decides to buy NVidia... who knows, but I'm pretty sure it won't affect Apple in any way.
  • Reply 31 of 146
    splinemodel Posts: 7,311 member
    Quote:

    Originally posted by Joe_the_dragon

    Apple wanted to put OS X on the $100 laptop, which is running an AMD chip.



    That was an AMD Geode. It has its own brand of onboard video, which will not be impacted in the least by the ATI buyout.



    The Geode is more comparable to a VIA, Crusoe, or even a PXA270 than any PC chip. The Geode targets the low end of the market that buys ULV Pentium M or Core Solo chips (i.e., the 1.0 to 1.2GHz variants).
  • Reply 32 of 146
    melgross Posts: 33,600 member
    Quote:

    Originally posted by Splinemodel

    That was an AMD Geode. It has its own brand of onboard video, which will not be impacted in the least by the ATI buyout.



    The Geode is more comparable to a VIA, Crusoe, or even a PXA270 than any PC chip. The Geode targets the low end of the market that buys ULV Pentium M or Core Solo chips (i.e., the 1.0 to 1.2GHz variants).




    Of course, it's no longer an AMD Geode.
  • Reply 33 of 146
    recompile Posts: 100 member
    Yes, here it is... more obscure wisdom from ReCompile.

    I personally believe that, as far as technology goes, AMD would have been Apple's first choice when deciding which chipmaker to go with. But all things considered, they had such a bad time with Motorola and IBM not being able to keep up the pace that they would lose all the ground they would gain in a technological advantage. This was most evident when the G3 chip came out, followed by the G4 chips. Intel confessed to me that Apple had gained a 3 YEAR technological advantage in the industry. This is huge. But without the ability to move for more than 3 years, the playing field once again became even.

    I believe that Apple needed no less than the largest, most able chipmaker to be the current supplier in this delicate transition. I think it was also a choice of perception. Most people see Windows and Intel as one. The last thing Apple needed was people seeing the windoze machines as having a better or different chip than available on Macs. Once again. But... Apple knows that once the transition is complete, they will have the ability to use AMD as well as Intel, or just AMD.

    I think of AMD as the Mac of the chip world. They are the innovators. They are also the underdogs. But they are outperforming Intel. Intel has responded with hurried releases and price wars to try and stop the bleeding AMD is causing by taking Intel's market share daily.

    Intel just released a statement that they will no longer use ATI chip technology after the end of this year, so as not to feed their competitor AMD. This could hurt Intel in the long run. It may also make Nvidia's stock climb, as it will open that account up to Nvidia.
  • Reply 34 of 146
    melgross Posts: 33,600 member
    Quote:

    Originally posted by ReCompile

    Yes, here it is... more obscure wisdom from ReCompile.

    I personally believe that, as far as technology goes, AMD would have been Apple's first choice when deciding which chipmaker to go with. But all things considered, they had such a bad time with Motorola and IBM not being able to keep up the pace that they would lose all the ground they would gain in a technological advantage. This was most evident when the G3 chip came out, followed by the G4 chips. Intel confessed to me that Apple had gained a 3 YEAR technological advantage in the industry. This is huge. But without the ability to move for more than 3 years, the playing field once again became even.

    I believe that Apple needed no less than the largest, most able chipmaker to be the current supplier in this delicate transition. I think it was also a choice of perception. Most people see Windows and Intel as one. The last thing Apple needed was people seeing the windoze machines as having a better or different chip than available on Macs. Once again. But... Apple knows that once the transition is complete, they will have the ability to use AMD as well as Intel, or just AMD.

    I think of AMD as the Mac of the chip world. They are the innovators. They are also the underdogs. But they are outperforming Intel. Intel has responded with hurried releases and price wars to try and stop the bleeding AMD is causing by taking Intel's market share daily.

    Intel just released a statement that they will no longer use ATI chip technology after the end of this year, so as not to feed their competitor AMD. This could hurt Intel in the long run. It may also make Nvidia's stock climb, as it will open that account up to Nvidia.




    While I can agree with most of what you said, you left a couple of things out.



    The most important is that Apple had a GOOD look at Intel's roadmap well before the deal was consummated. You can be sure of that.



    Remember when Jobs was up on stage and talked about the performance/power situation? Many people were thinking, "What is he smoking? The Prescotts and the Xeons use so much power, and they are being killed by AMD, and IBM's G5 is pretty close and uses less power."



    Going by that, even though Intel is the gorilla, the performance still sucked.



    Now, we see otherwise. Apple knew what we didn't.



    Apple isn't going to AMD. At this time, they would be fools to do so. And AMD is having many pricing problems, which are going to destroy their profits. Intel can afford it, but the still much smaller AMD may not be able to.
  • Reply 35 of 146
    splinemodel Posts: 7,311 member
    Quote:

    Originally posted by ReCompile

    . . . I think of AMD as the Mac of the chip world. They are the innovators. . .



    IBM has been the innovator of the chip world for some time now. Trouble is, we've seen that software authors have been too lazy or too stupid to keep up with innovation. We're right on the cusp of seeing electronics dictate the industry again, but the personal computer market is still wrapped up in a lot of legacy. In twenty years, when the PC is dead, lightweight stuff will all be Java -- the extension of modern software development -- and the rest will be architecture-specific.
  • Reply 36 of 146
    melgross Posts: 33,600 member
    Quote:

    Originally posted by Splinemodel

    IBM has been the innovator of the chip world for some time now. Trouble is, we've seen that software authors have been too lazy or too stupid to keep up with innovation. We're right on the cusp of seeing electronics dictate the industry again, but the personal computer market is still wrapped up in a lot of legacy. In twenty years, when the PC is dead, lightweight stuff will all be Java -- the extension of modern software development -- and the rest will be architecture-specific.



    I hope it will be Java. Java has suffered the past few years.
  • Reply 37 of 146
    splinemodel Posts: 7,311 member
    Quote:

    Originally posted by melgross

    I hope it will be Java. Java has suffered the past few years.



    It has actually done quite well in embedded devices as an AOT-compiled package. Most cable boxes run Java apps and Java display layers.
  • Reply 38 of 146
    chucker Posts: 5,089 member
    Quote:

    Originally posted by Splinemodel

    Trouble is, we've seen that software authors have been too lazy or too stupid to keep up with innovation. [..] In twenty years, when the PC is dead, lightweight stuff will all be Java -- the extension of modern software development -- and the rest will be architecture-specific.



    Wha-what now? You speak of innovation and then mention "when the PC is dead", Java, and architecture-agnostic (read: low-quality) software?



    Peh.
  • Reply 39 of 146
    chucker Posts: 5,089 member
    Quote:

    Originally posted by melgross

    Java has suffered the past few years.



    Possibly because it offers nothing worth considering. Its syntax is a weak rip-off of C++ (which is in itself horrible), and its alleged "feature" (being platform-agnostic) is, when you really think about it, completely useless. For a console application (including server daemons), you won't want that for performance reasons. For a front-end, you won't want that because UIs are platform-specific for a reason. If every OS had the same UI, what would be the deciding factor of one OS over another? Exactly.



    There is nothing Java can offer you that a clean separation between a platform-agnostic (but compiled!) framework/back-end, written in an efficient language (e.g. plain C) and a set of platform-specific (and not necessarily compiled; interpreted can be good enough) front-ends, written in high-level APIs (e.g. Cocoa, .NET) can't give you.
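    A toy sketch of that split: the back-end below is plain, portable C that compiles natively everywhere, and the front-end is whatever each platform does best. A console printf stands in here for a Cocoa or .NET front-end:

        #include <stdio.h>

        /* Back-end: platform-agnostic, compiled C. No UI, no OS-specific
           calls -- the same logic builds natively on every platform. */
        static double average(const double *v, int n)
        {
            double sum = 0.0;
            for (int i = 0; i < n; i++)
                sum += v[i];
            return n > 0 ? sum / n : 0.0;
        }

        /* Front-end: written per platform against the native UI toolkit
           (Cocoa on a Mac, .NET on Windows). This console version is a
           stand-in for those. */
        int main(void)
        {
            double data[] = { 1.0, 2.0, 3.0 };
            printf("average: %g\n", average(data, 3));
            return 0;
        }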
  • Reply 40 of 146
    melgross Posts: 33,600 member
    Quote:

    Originally posted by Splinemodel

    It has actually done quite well in embedded devices as an AOT-compiled package. Most cable boxes run Java apps and Java display layers.



    Embedded devices, yes. But not in computers that we can use.