AMD to acquire graphics chip giant ATI


Comments

  • Reply 61 of 146
    irelandireland Posts: 17,798member
    Quote:

    Originally posted by mbaynham

    ramblings of a mad man...



    One of the best replies I've heard all month!
  • Reply 62 of 146
    sunilramansunilraman Posts: 8,133member
    Quote:

    Originally posted by nagromme

    Not to stray away from the conversation, but...



    Worst-case possibilities:



    * Intel keeps their lead, but AMD refuses to sell ATI graphics to PC makers (like Apple) who buy from Intel. They are forced to use nVidia or settle for lesser chips from AMD.



    * Or Intel keeps their lead, but refuses to sell CPUs to PC makers (like Apple) who use ATI graphics. They are forced to use nVidia or settle for AMD.



    * Or AMD develops new CPUs to rival anything from Intel, even for laptops, but Intel punishes severely or turns away companies who use both kinds of processors--they want only exclusive customers. Apple is forced to choose either one super-fast processor family or the other; they can't pick from both at once.



    All of the above seem unlikely, especially since Intel is so vocal about WANTING to work with Apple. Bullying Apple is not something Intel can do lightly.



    Best-case possibilities:



    * AMD develops new CPUs to rival Intel. And, like other PC makers, Apple can choose to use CPUs from BOTH companies. Two suppliers are better than one. (Exclusive deals might be lost--which can mean higher prices. Competition would increase--which can mean lower prices.)



    * Or AMD develops new ATI graphics technologies that nVidia can't touch, and Apple can use those new GPUs too.



    * Or AMD develops low-cost integrated graphics with better performance than Intel. And Apple can use those too.



    * Or AMD and ATI push Intel and nVidia to do even more, releasing even better products. The resulting competition helps everyone.



    I'm not too dismayed at this point. ATI and Intel are compatible. I say leave the Intel-AMD flamewars to PC users. We don't need them any more than we needed IBM vs. Motorola flamewars.






    .............................

    Let's go chronologically:



    2nd half 2006:

    Intel CPUs are looking rock solid. Apple is using Intel chipset-motherboards. For the rest of this year, Intel won't mind Apple going with ATI or nVidia, whatever works best for Apple. Though I suspect the entire Mac Intel line will be ATI. (I hear onlooker screaming that FireGL sucks compared to nVidia's Quadros.) Crossfire or SLI is, I think, highly unlikely in the Mac Pro; it's a PC-gaming enthusiast thing. And for those who want SLI'ed top-of-the-line Quadros, I think you're better off with a nice dual-Woodcrest PC-Windows workstation that supports that.



    2007:

    Assume that by now the AMD-ATI merger is mostly approved and starting to gain momentum. Intel's pipeline still looks very strong. Intel may start to push Apple to move over to nVidia in their new models, which would be fine. Or Intel (25% chance IMO) will let Apple continue to ride with ATI or nVidia.



    2008:

    We start to see the first glimmer of what AMD-ATI are really starting to offer. Steve J keeps a close eye on things, but given the momentum of Intel with kickass CPUs in 2006 and 2007 and new developments for 2008, Apple-Intel-nVidia look fine.



    If Steve J starts to see some really interesting stuff in the pipeline (AMD-ATI invites him for a nice magic show) he can start some internal Apple R&D to scope out what ATI-AMD's magic may deliver.



    2009:

    By now it will be clear whether AMD-ATI really has anything solid to offer. If Steve J was really impressed, through the 2nd half of 2008 he'd start loosening up from Intel, and 2009 would look good for Apple, with wide options for either Intel-nVidia or AMD-ATI, depending on models and stuff, and on other new exciting Apple products.



    From July 2007 onwards, Apple can easily start to steer the boat in a more platform-agnostic direction if needed. At the end of the day, AMD-Intel-nVidia-ATI all duking it out gives Apple the widest options available. And finally, the OSx86 transition will be heralded as Apple's smartest move in the 2nd half of this decade. The first half of the decade was marred by CPU challenges, but supremely boosted by sexy new, compelling products (in spite of all the CPU problems) and, of course, the birth of the iPod revolution.



    I just hope that, with Apple now having to keep pace with the breakneck PC component world, their Hardware Engineering will continue to focus on the quality and long-lasting products that have been a hallmark, for the most part, of their G3, G4, and G5 stuff. Same on the OSx86 side: I hope the quality standards continue to be upheld and refined.
  • Reply 63 of 146
    melgrossmelgross Posts: 33,580member
    Quote:

    Originally posted by sunilraman

    [sunilraman's Reply 62, quoted in full above]



    Sunil, this is all very interesting. But, while I hate to burst your bubble, none of this matters. What Intel is not going to be buying from ATI is their chipsets; sometimes Intel does that to fill a need. Apple will have no problem continuing to use Intel's chipsets. It doesn't affect video if Apple uses separate GPUs. Where they won't, such as in the Mini and the MacBook, they will be using IG anyway.



    As far as Crossfire (and SLI) goes, that's something we've been debating here for a while. I think the consensus has been that we won't see either, so it won't matter whether Intel's chipsets support them or not.



    Apple is likely just where they were before it was announced.



    I'm not happy about the whole thing just on principle. If there were four or five major GPU companies out there, it wouldn't matter much, but there aren't.



    So, now what happens to Nvidia? And does this mean that the toe Intel has been putting back into the water of the high-end GPU market of late gets plunged all the way in? Or do they look to buy Nvidia?



    If the AMD ATI deal is approved, an Intel Nvidia deal may have to be as well.



    The analyst who saw this coming almost two weeks ago has to get some credit. Now, in retrospect, we can see AMD shedding assets for the purpose of getting enough cash to make this happen.
  • Reply 64 of 146
    splinemodelsplinemodel Posts: 7,311member
    Quote:

    Originally posted by vinea

    In 20 years the next language will have more levels of abstraction and not less. Eh, I just don't see moving to Verilog or VHDL...if you're going to go to that much trouble for "high performance" then burn the ASIC or load the FPGA.



    "Low performance" software is replacing what traditionally was done by RT software because system performance is "fast enough".



    Vinea




    Have you seen the code for Cell? Have you seen VHDL? The Process statement is very elegant, whereas the way Cell coding does parallelism is certainly not. Behavioral VHDL is actually quite easy to follow as well. RTL and structural HDL are much more confusing, and are required for programmable logic synthesis, but certainly not for parallel CPUs.



    And let's face it, fast-clocked sequential OOE is deader than a doornail.



    One of the topics often covered by the industry rags is how ever-more-complicated software has become less and less stable. Whereas a plane or car with a million parts works great, a million lines of code will always be full of lingering problems and take as much money to develop. The reason is that coding paradigms are too dated to be effective in modern technology. OO is hardly a step forward: more descriptive syntax, greater emphasis on varied subroutine/function usage rather than forcing things into classes, and a more inherently parallel coding structure are all features that seem to be agreed on as the necessary next steps.
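    To ground that last point, here is a minimal Python sketch (invented for illustration; nothing like it appears in the thread, and the function names are hypothetical) of the "more inherently parallel coding structure" being described: a small pure function mapped over data, with the parallelism visible in the shape of the code rather than buried in class plumbing.

    ```python
    # Hypothetical sketch: a pure function mapped over data in parallel.
    from concurrent.futures import ProcessPoolExecutor

    def scale_sample(sample: float) -> float:
        """Pure, side-effect-free work: safe to run on any core."""
        return sample * 2.0

    def process_block(samples):
        # The map expresses the data parallelism directly; the runtime
        # decides how to spread the calls across worker processes.
        with ProcessPoolExecutor() as pool:
            return list(pool.map(scale_sample, samples))

    if __name__ == "__main__":
        print(process_block([1.0, 2.0, 3.0, 4.0]))  # [2.0, 4.0, 6.0, 8.0]
    ```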
  • Reply 65 of 146
    splinemodelsplinemodel Posts: 7,311member
    On topic:



    I just don't think there's any reason for Intel to buy nVidia when they could just spin something up themselves. A lot of the price AMD is paying for ATI will undoubtedly go towards portions of the company that overlap heavily with AMD. For Intel, which does one hell of a lot more R&D than AMD does, probably 90%, if not more, of the cost of nVidia would be wasted on overlapping technology.



    If Intel wants to counter AMD tit-for-tat (which I don't think they will), they will more likely invest the $600M from the XScale sell-off in some hot new IP in the video/GPU world. The rest they'd pull from existing projects. This wouldn't be a bad thing: 45nm GPUs while AMD-ATI is still debugging 90nm.
  • Reply 66 of 146
    melgrossmelgross Posts: 33,580member
    Quote:

    Originally posted by Splinemodel

    On topic:



    I just don't think there's any reason for Intel to buy nVidia when they could just spin something up themselves. A lot of the price AMD is paying for ATI will undoubtedly go towards portions of the company that overlap heavily with AMD. For Intel, which does one hell of a lot more R&D than AMD does, probably 90%, if not more, of the cost of nVidia would be wasted on overlapping technology.



    If Intel wants to counter AMD tit-for-tat (which I don't think they will), they will more likely invest the $600M from the XScale sell-off in some hot new IP in the video/GPU world. The rest they'd pull from existing projects. This wouldn't be a bad thing: 45nm GPUs while AMD-ATI is still debugging 90nm.




    Well, I do agree with that. But, sadly, business is not always 100% rational. There is the element of "If you do that, I'll do it as well." Intel could afford it better than AMD. I've seen many businesses buy others for a product line when it would have been far cheaper to do it themselves. But there are reasons for that as well. Sometimes it's the brand name, sometimes the customer list, sometimes the speed of getting into the business RIGHT NOW rather than next year, or the year after that. It depends on why it's being done.



    I mentioned that Intel has been investigating getting back into high-end GPUs, an area in which they were NOT successful the last time they tried. That lack of success could be another reason why they might want to buy in. They have a lot on their minds now. It's hard to say which would be more distracting: buying a company, or starting up an entire division of R&D.



    Before, they were just investigating. Now, they may feel it to be more urgent.
  • Reply 67 of 146
    vineavinea Posts: 5,585member
    Quote:

    Originally posted by Splinemodel

    Have you seen the code for Cell?



    Nope.



    Quote:

    Have you seen VHDL?



    Yep.



    Quote:

    The Process statement is very elegant, whereas the way Cell coding does parallelism is certainly not.



    Which is why it will all get hidden by the compiler eventually.



    Quote:

    And let's face it, fast-clocked sequential OOE is deader than a doornail.



    Which matters not at all to the .NET developer, and currently only to console developers, because the toolset isn't mature.



    Quote:

    One of the topics often covered by the industry rags is how ever-more-complicated software has become less and less stable. Whereas a plane or car with a million parts works great, a million lines of code will always be full of lingering problems and take as much money to develop.



    The reason is that coding paradigms are too dated to be effective in modern technology. OO is hardly a step forward: more descriptive syntax, greater emphasis on varied subroutine/function usage rather than forcing things into classes, and a more inherently parallel coding structure are all features that seem to be agreed on as the necessary next steps.



    Agreed by whom? Hardware engineers?



    The reason a million lines of code has defects is that even at the highest levels of quality (<0.1 defects per KSLOC...which no one will get with a million SLOC) you still have 100 defects, some of which can be fatal.
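    To make the arithmetic concrete, here is a tiny back-of-the-envelope sketch (the defect-density and SLOC figures are the ones quoted in this post; the helper function itself is invented for illustration):

    ```python
    # Expected residual defects for a given code size and defect density
    # (defects per thousand source lines of code).
    def expected_defects(sloc: int, defects_per_ksloc: float) -> float:
        return sloc / 1_000 * defects_per_ksloc

    print(expected_defects(1_000_000, 0.1))   # 100.0 -- the figure cited above
    print(expected_defects(40_000_000, 0.1))  # 4000.0 -- a Windows XP-sized code base at the same density
    ```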



    A Boeing 777 has only 123K unique parts out of a total 3M. It has roughly 2.3M SLOC. Yea and verily, 777s do come off the line with defects...and that's manufacturing from a proven template.



    Software development is a lot more like building the 777 design and prototype. You don't think there were defects during that process? The Aegis weapon system is 31M SLOC. How many Verilog projects have 31M SLOC?



    And why SHOULDN'T a million lines of code cost as much as an object with a million parts? And name a car with a million parts? The average car has 3,800 unique parts and about 35,000 total items. Windows XP has 40 million SLOC. Linux is 30-100 million SLOC.



    http://www.theautochannel.com/news/p...ess018372.html



    And if you had a car with 40 million moving parts, you'd see it fail three orders of magnitude more often than cars do now.



    Even with five 9s quality there are still 30K defects.



    You vastly underestimate the complexity of modern software, and what is more staggering is that each line of code today does more work than it did in the past (because higher-level, more abstract languages do more work per line than lower-level languages...like, say, Verilog).



    Software developer productivity has kept pace with or exceeded the rapid advances in other disciplines. While we will continue to evolve and provide even greater productivity we certainly have nothing to be ashamed of as an industry.



    Vinea
  • Reply 68 of 146
    sunilramansunilraman Posts: 8,133member
    Quote:

    Originally posted by vinea

    Software developer productivity has kept pace with or exceeded the rapid advances in other disciplines. While we will continue to evolve and provide even greater productivity we certainly have nothing to be ashamed of as an industry....






    If you (generic you, not you vinea specifically) made OSX, or perhaps the software for the space shuttle, yes, you have something to be very proud of. If you made Linux, yes, be very proud of that, though the end-user side needs a lot of work.



    If you (generic you, not you vinea specifically) have been involved with Win95, 98/Me, 2000, XP, Office, and ShiteOnTheVista, you need to put a paper bag over your head.
  • Reply 69 of 146
    sunilramansunilraman Posts: 8,133member
    Quote:

    Originally posted by melgross

    Well, I do agree with that. But, sadly, business is not always 100% rational. There is the element of "If you do that, I'll do it as well." Intel could afford it better than AMD. I've seen many businesses buy others for a product line when it would have been far cheaper to do it themselves. But there are reasons for that as well. Sometimes it's the brand name, sometimes the customer list, sometimes the speed of getting into the business RIGHT NOW rather than next year, or the year after that. It depends on why it's being done.



    I mentioned that Intel has been investigating getting back into high-end GPUs, an area in which they were NOT successful the last time they tried. That lack of success could be another reason why they might want to buy in. They have a lot on their minds now. It's hard to say which would be more distracting: buying a company, or starting up an entire division of R&D.



    Before, they were just investigating. Now, they may feel it to be more urgent.






    This is, IMHO, a good post on where Intel stands in all this: pursue their own R&D, or knee-jerk into talking with nVidia.



    Intel has CPUs, chipsets, and the whole wireless Centrino platform. Intel integrated graphics is about 50% of the market, IIRC. If they get into high-end GPUs, particularly with their fab skillz at 65nm and 45nm down the line, Intel-nVidia could have something really interesting.



    Think about Conroe demolishing Athlons at lower TDP. Now think about nVidia GPUs wiping the floor with ATI at lower TDP. Think about Intel integrated graphics where you could actually play a game that isn't more than three years old.
  • Reply 70 of 146
    pbpb Posts: 4,255member
    Fresh news here, but I am not sure if I understand well what this means.
  • Reply 71 of 146
    sunilramansunilraman Posts: 8,133member
    Actually, PB, someone mentioned this earlier already.



    ATI makes the Radeon Xpress 200 chipset that goes into various motherboards. It's a chipset with integrated graphics based on the Radeon X300 (pretty crap for games, but whatever...).

    http://www.ati.com/products/radeonxp...tel/index.html



    Those motherboards are for Intel CPUs. When ATI's license runs out, they will discontinue making chipset/motherboard stuff for Intel. They will of course continue to sell the Radeon® Xpress 1100 Series for AMD Desktops and the Radeon® Xpress 200 Series for AMD Processors chipset/motherboard thingy.

    http://www.ati.com/products/radeonxp...dsk/index.html

    http://www.ati.com/products/radeonxp...eries/amd.html



    In the long run, maybe within a year, we'll see some integrated ATI stuff on AMD's AM2 platform. And who knows what other goodies will come from "a transaction that will combine AMD's technology leadership in microprocessors with ATI's strengths in graphics, chipsets and consumer electronics... [resulting in] ...a processing powerhouse: a new competitor, better equipped to drive growth, innovation and choice for its customers in commercial and mobile computing segments and in rapidly-growing consumer electronics segments."

    http://www.ati.com/companyinfo/about/amd-ati.html
  • Reply 72 of 146
    sunilramansunilraman Posts: 8,133member
    AMD and ATI will face fierce competition. They're hoping synergy will emerge from this merger/buyout. Intel Core, Core 2, etc. are killer in the desktop and laptop space: the Athlon X2 loses out to Conroe, and Xeon Woodcrests take on Opterons hard and fast. AMD's Turion for laptops has not been that hot. Well, figuratively, not literally. Intel makes chipsets. ATI is trying to compete in the motherboard chipset space. nVidia also makes chipsets - nForce for PC is pretty good.



    Then we look at GPUs, ATI and nVidia neck and neck, with Intel Integrated graphics picking up and dominating the low-end.



    AMD-ATI has its work cut out for it. But indeed, maybe we will see some fantastic stuff down the line from the underdogs.



    I'm an nVidia fanboi, and now an ex-AMD fanboi (a Conroe descendant will be in my next PC, I suspect...). Still, best wishes to AMD-ATI.
  • Reply 73 of 146
    sunilramansunilraman Posts: 8,133member
    Indulge me a little; again, I'll say it: what do you think will happen if nVidia got access to Intel's 65nm and 45nm manufacturing skillz? Think about it.
  • Reply 74 of 146
    sunilramansunilraman Posts: 8,133member
    Quote:

    Originally posted by PB

    Fresh news here, but I am not sure if I understand well what this means.






    Hardmac.com gave a good short explanation: "As one would have expected it, now that ATI belongs to AMD, Intel will not renew the license to ATI for having the right to develop and manufacture Intel-compatible chipsets.

    For ATI, it is worth 100 million $, and it allows AMD to isolate Intel for the integrated chipset market. Of course Intel develops its own graphical chipsets, but their performance levels can not compete with dedicated solutions provided by ATI or nVidia. The previous scheme where Intel and ATI were partners to compete with the long partnership between AMD and nVidia might evolve in the coming months. Either Intel and nVidia will team up, or Intel will push R&D to develop more competitive integrated chipsets and/or GPUs. The latter has been rumored to be already in action; the forthcoming GMA X3000 being the first result of such efforts."
  • Reply 75 of 146
    pbpb Posts: 4,255member
    Quote:

    Originally posted by sunilraman

    Quote:

    Originally posted by PB

    Fresh news here, but I am not sure if I understand well what this means.


    Hardmac.com gave a good short explanation:



    Thanks sunilraman, I just saw this one. So, no clear impacts for the Mac and nVidia's position in the new order as of yet.
  • Reply 76 of 146
    sunilramansunilraman Posts: 8,133member
    Quote:

    Originally posted by PB

    Thanks sunilraman, I just saw this one. So, no clear impacts for the Mac and nVidia's position in the new order as of yet.






    Heh. It's kinda like, if the sun explodes, we'll only feel it 8 minutes later (excluding technical details of gravity and physics and sh*t). We can be sure Steve Jobs is meditating for a little while on this news today, but yeah, it's a long term thing to see what really transpires.
  • Reply 77 of 146
    splinemodelsplinemodel Posts: 7,311member
    Quote:

    Originally posted by vinea

    . . .

    Agreed by whom? Hardware engineers? . . .




    Well, the electronic hardware industry rags don't concern themselves too much with software. It's the sentiment of mostly embedded software developers, although these days embedded design is more common than ever, and it results in more revenue and more code than high-level software. Increasing globalization of the labor force, coupled with an increased focus on embedded devices, is only going to fuel the development of more efficient paradigms.



    One of the analogies I remember reading compared the contemporary software paradigm to a "universal bolt." A lot of developers champion the idea of code re-use to a non-advantageous extent. Whereas a machine will include many types of bolts, each type selected for its suitability in a specific case, software development seems to favor the idea of using fewer total parts, but parts that are much more complex, much less reliable, and much less well suited to each individual task. Put simply, this approach has failed, and continues to be a risk for failure whenever it is used.
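    To make the "universal bolt" idea concrete, here is a toy Python sketch (invented for this comparison; the record-formatting scenario and the function names are hypothetical, not anything from the thread): one over-general routine driven by mode flags versus small, purpose-built routines.

    ```python
    import json

    # "Universal bolt": one routine tries to cover every case with flags,
    # so every caller carries complexity it does not need.
    def format_record(record: dict, as_json: bool = False, as_csv: bool = False,
                      uppercase: bool = False, delimiter: str = ",") -> str:
        if as_json:
            text = json.dumps(record)
        elif as_csv:
            text = delimiter.join(str(v) for v in record.values())
        else:
            text = str(record)
        return text.upper() if uppercase else text

    # Purpose-built alternatives: each does one job and is trivial to verify.
    def record_to_json(record: dict) -> str:
        return json.dumps(record)

    def record_to_csv(record: dict, delimiter: str = ",") -> str:
        return delimiter.join(str(v) for v in record.values())

    if __name__ == "__main__":
        r = {"vendor": "ATI", "buyer": "AMD"}
        print(format_record(r, as_json=True))  # works, but through a thicket of flags
        print(record_to_json(r))               # same result, simpler contract
    ```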



    What's the next step? You don't seem to think there is one, which is probably a bad assumption, since hardware is changing dramatically. At a certain point, there's only so much that can be done in a compiler: if software developers don't want to learn new paradigms, that's fine -- the business will just move to India, China, and Eastern Europe, where developers are more persuadable.
  • Reply 78 of 146
    Quote:

    Originally posted by sunilraman

    AMD and ATI will face fierce competition. They're hoping synergy will emerge from this merger/buyout. Intel Core, Core 2, etc. are killer in the desktop and laptop space: the Athlon X2 loses out to Conroe, and Xeon Woodcrests take on Opterons hard and fast. AMD's Turion for laptops has not been that hot. Well, figuratively, not literally. Intel makes chipsets. ATI is trying to compete in the motherboard chipset space. nVidia also makes chipsets - nForce for PC is pretty good.



    Then we look at GPUs, ATI and nVidia neck and neck, with Intel Integrated graphics picking up and dominating the low-end.



    AMD-ATI has its work cut out for it. But indeed, maybe we will see some fantastic stuff down the line from the underdogs.



    I'm an nVidia fanboi, and now an ex-AMD fanboi (a Conroe descendant will be in my next PC, I suspect...). Still, best wishes to AMD-ATI.




    Intel is likely to hit the FSB wall with their quad-cores, and AMD's true quad-cores will smoke them; AMD is still a lot better with 4 CPUs than Intel is. HyperTransport-based co-processors and HTX cards may force Intel to start using HyperTransport.
  • Reply 79 of 146
    recompilerecompile Posts: 100member
    Quote:

    Originally posted by melgross

    While I can agree with most of what you said, you left a couple of things out.



    The most important is that Apple had a GOOD look at Intel's roadmap well before the deal was consummated. You can be sure of that.



    Remember when Jobs was up on stage and talked about the performance/power situation? Many people were thinking, "What is he smoking? The Prescott and the Xeons use so much power, and they are being killed by AMD, and IBM's G5 is pretty close and uses less power."



    Going by that, even though Intel is the gorilla, the performance still sucked.



    Now, we see otherwise. Apple knew what we didn't.



    Apple isn't going to AMD. At this time, they would be fools to do so. And AMD is having many pricing problems, which are going to destroy their profits. Intel can afford it, but the still much smaller AMD may not be able to.




    I agree with you wholeheartedly. I was not trying to imply that Intel did not have a good roadmap. But as you reiterated, Apple cannot afford any hiccups. The "Gorilla" does have the strength to keep prices low for longer than AMD could, should it come to that. But I do believe that, should AMD keep up the pace they are on, you should look for them in Apple's future. Also, buying ATI gives them leverage in the price war. Apple buys both: if together they are cheaper than buying a chip from Intel and graphics cards from ATI or nVidia, then look out, Intel. Besides, they are now in a position to use both (down the road, I believe, not right now). This way Apple would not have to abandon Intel, just keep them on their toes.
  • Reply 80 of 146
    jeffdmjeffdm Posts: 12,953member
    Quote:

    Originally posted by Joe_the_dragon

    Intel is likely to hit the FSB wall with their quad-cores, and AMD's true quad-cores will smoke them; AMD is still a lot better with 4 CPUs than Intel is. HyperTransport-based co-processors and HTX cards may force Intel to start using HyperTransport.



    They have something like HyperTransport in the works.