AMD chief says Apple will eventually use AMD chips


Comments

  • Reply 101 of 159
    Quote:
    Originally Posted by auxio


    I see the AMD vs Intel battle as being similar in a lot of ways to Apple vs Microsoft.



    Except that Intel does a lot more research, has more advanced technology, and a more elegant roadmap.
  • Reply 102 of 159
    chucker Posts: 5,089 member
    Quote:
    Originally Posted by auxio


    And 7.5 somehow made Mac OS leap ahead into the current state of OS technology? That's what my real argument was. Mac OS was 1980s technology until Mac OS X came out. Windows 95 was still head and shoulders better in many ways, even if it was still pretty bad.



    Windows 95 was way behind Mac OS in other areas, so what are you getting at?



    Quote:

    I can't remember how many times I saw the bomb/unhappy Mac on my wife's old Mac. It certainly felt like a lot more than I ever saw the BSoD when using Windows 95.



    That's too bad. I had the complete opposite experience.



    Quote:

    At least NT was a reality -- and there were plans in the pipeline to combine it with 95 (which essentially happened with Windows 2000 -- which I used happily for a few years before XP finally came out). Apple had nothing but vaporware in its pipeline at the time.



    No, it didn't happen until XP. 2000 was not a consumer OS. You can whine all you want about how you could use it at home, but it was never meant that way.



    Quote:

    Perhaps some of the components were better than PCs (like the audio system and the Apple monitors), but I didn't need that at the time.



    You're twisting reality. Example? At the time, it wasn't unusual for a PC not to have a sound chip at all; you had to have a sound card on PCI. Built-in audio wasn't common until a few years later.



    It also wasn't until Windows 98 that Windows supported multiple monitors, and even then only in a very buggy manner.



    Quote:

    I needed a computer to learn software development, telnet into my computer labs, email, and write a couple of reports in Word with. The PC I bought happily filled those needs and I learned to use it pretty well. I learned all the quirks of Windows 95 and Linux rather than paying $1000 more and learning all of the quirks of Mac OS.



    Yeah, yeah.
  • Reply 103 of 159
    Quote:
    Originally Posted by Splinemodel


    Except that Intel does a lot more research, has more advanced technology, and a more elegant roadmap.



    True. I'm rather concerned about whether AMD can afford to follow Intel to 32nm, and Intel still has the NUMA, CSI and IMC cards to pull out if they can't do anything else. Though sometimes Intel just seems to go ahead on raw power, as in the case of Kentsfield and Clovertown. I can't see anything elegant in those implementations of a four-way core.



    I still wonder how effective NetBurst could have been if one had eliminated the latencies and bandwidth obstructions. I'm still amazed that Intel could increase the pipeline length of the Northwood by roughly 50% and still retain the same performance at the same clock speeds. Those engineers are good.
  • Reply 104 of 159
    Quote:
    Originally Posted by jamezog


    Perhaps... I can't see Apple offering both AMD and Intel at the same time - that's way too much like Dell's "customize it all" business model. I could definitely see Apple jumping ship, though, or at least threatening to jump ship in order to squeeze Intel a bit.



    Meh... it's still good press (rumor-mongering) for Apple fiends.
  • Reply 105 of 159
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by sunilraman


    Yup, exactly. It seems they had specific performance targets within thermal-envelope targets. It turns out there's all this enormous headroom if you're not obsessed with power draw and decibels.



    Agreed. When AMD brings out their 65nm stuff, Intel will have higher-clocked gear that still fits a low thermal envelope. I can't imagine clock speeds going DOWN from this point onwards; I really think the next stage (Core 3, say) will be 3 GHz and upwards. But maybe I'm still locked into the MHz race, especially from visiting all those overclocking websites.



    The MHz, now GHz race, is quite valid, as long as one is comparing oranges to oranges.



    AMD K8 designs have better performance with higher clocked chips. So do Core 2 chips.



    But one speed on a K8 doesn't compare directly to the same speed on a Core 2.



    AMD used to name its chips for the clock rate an Intel chip would have needed to match their performance.



    Of course, with the new chips, that doesn't work anymore. There are too many complexities in the new designs.
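    As an illustration of those old model numbers (the per-clock ratio below is invented for illustration; AMD never published an actual formula):

    ```python
    # Hypothetical sketch of a PR-style model number: scale the actual clock
    # by an assumed performance-per-clock ratio versus the competing chip.
    # The ratio used here is made up -- this is not AMD's real method.

    def pr_rating(clock_mhz: float, ipc_ratio: float) -> int:
        """Rough 'equivalent clock' model number, rounded to the nearest 100 MHz."""
        return int(round(clock_mhz * ipc_ratio / 100.0) * 100)

    # e.g. a 2200 MHz part assumed to do ~45% more work per clock
    print(pr_rating(2200, 1.45))  # -> 3200
    ```

    Which is roughly how a 2.2 GHz part could end up wearing a "3200+" badge.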
  • Reply 106 of 159
    It will be good to use AMD's next generation chipsets in the Mac. Hypertransport access to secondary devices including a co-processor will greatly boost processing power for specific tasks. If Apple were to put in a special processor/board for improving the performance of their multimedia/whatever by a large magnitude, it would benefit everyone. Intel is trying to come up with a similar offering, but AMD is ahead in this game.



    http://www.anandtech.com/cpuchipsets...oc.aspx?i=2768
  • Reply 107 of 159
    Quote:
    Originally Posted by melgross


    The MHz, now GHz race, is quite valid, as long as one is comparing oranges to oranges.



    AMD K8 designs have better performance with higher clocked chips. So do Core 2 chips.



    But one speed on a K8 doesn't compare directly to the same speed on a Core 2.



    AMD used to name its chips for the clock rate an Intel chip would have needed to match their performance.



    Of course, with the new chips, that doesn't work anymore. There are too many complexities in the new designs.



    Fair enough. But personally I want to see a 45nm, 5 GHz, 50W TDP Core-based quad-core from Intel by the end of 2007.
  • Reply 108 of 159
    Quote:
    Originally Posted by talksense101


    It will be good to use AMD's next generation chipsets in the Mac. ... Intel is trying to come up with a similar offering, but AMD is ahead in this game.



    Sorry, but where have you been this past year? Intel's Core and Core 2 on mobile and desktop platforms now clearly edge out AMD.



    Intel's "similar offering" is here, the Core MicroArchitecture and other goodies now starting to be realised on Core2 and going forward...



    AMD's next-generation 65nm stuff will compete with Intel's next-generation 45nm stuff in a year's time and going forward.
  • Reply 109 of 159
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by sunilraman




    The key to Intel's success in regaining the CPU crown is going down to 65nm and 45nm. That gave them the jump on everyone. "Hitting the wall at 90nm" was a pretty catastrophic scenario in CPU-land. IBM/Moto couldn't hack it; AMD managed to, and still does, produce some nice stuff at 90nm with decent clocks and thermal envelopes in their current range.



    None of them "hacked" it. AMD least of all. AMD was so far behind everyone else in moving to 90nm (they just finished a little while ago) that they were able to take advantage of the solutions that both IBM and Intel had found.



    AMD's thermals are pretty bad right now: up to 125 watts. Right there with the old Intel chips, and well above any of the new Core and Core 2 designs.



    Quote:

    It was clear for a few years IBM/Freescale would not be able to pull 65nm in any reasonable amount of time to save Apple.



    Both companies could have, if they wanted to. But neither did.



    Quote:

    Aside from the Core Microarchitecture and other chip designer-y stuff, is it not that they said that the way to keep Moore's Law going is to go down to 65nm and onwards to 45nm.



    Beyond 45nm, I wonder what's on the horizon. And WTF happened to the promise of optical computing? Shuffling photons around could be much cooler (literally and figuratively).





    Now we're getting into some VERY interesting stuff.



    Each time they move to a smaller die shrink, they are going to encounter even greater thermal problems. It's a matter of physics. Intel is working on vertical transistors as a way of getting the same (or fairly close to the same) number of atoms into the gates. This is a tough road to travel. Other technologies are being worked on.



    45nm will be attainable. But, after that it's a crapshoot. The best figuring at this time is that the smallest they can go with current thinking is somewhere between 32 and 20 nm. That's even with better materials and designs.



    They used to think it was about 15 to 10 nm, but those thermal problems hadn't been considered. It was thought that the difficulty would be confined to being able to make masks at that size, and that this would be the smallest they could go, even with the theoretical x-ray beam equipment that they had no idea how to produce.



    But, now they know otherwise.



    The leakage, and other problems which are even more daunting, increase geometrically, as the square (height x width) of the lines matter more than the width alone at these sizes.
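    A toy model of that trend (all constants invented for illustration; this is not real process data):

    ```python
    # Toy model of why leakage looms larger at each node. Dynamic power per
    # transistor falls roughly with area under classic scaling, while leakage
    # per transistor falls far more slowly. Every number below is made up
    # purely to show the shape of the curve.

    def power_share(scale: float, leak_exponent: float = 0.5) -> float:
        """Fraction of per-transistor power that is leakage, where scale is
        the linear shrink factor (1.0 = the starting node)."""
        dynamic = scale ** 2                     # ~ C * V^2 * f shrinks with area
        leakage = 0.1 * scale ** leak_exponent   # leakage shrinks much more slowly
        return leakage / (dynamic + leakage)

    for s in (1.0, 0.7, 0.5):  # roughly 90nm -> 65nm -> 45nm linear factors
        print(f"shrink {s}: leakage is {power_share(s):.0%} of per-transistor power")
    ```

    Even with invented constants, the leakage share climbs with every shrink, which is the point being made above.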



    There have been some major breakthroughs in optical computing. Just recently, a chip was produced in the lab that contains hundreds of lasers. This has been done before, but it's the first time it has been done on silicon: the holy grail.



    But it will take some time before this can be used for actual computing purposes, and it is more useful for transmitting information between chips than for doing actual computations.



    For example, it will decrease the cost of bringing optical fiber the last mile, and the last hundred feet.



    It will also be instrumental in allowing supercomputers to increase their performance, and in providing an extremely high-speed link between them.



    Shades of Colossus: The Forbin Project.



    Quote:

    [Side Geek Note] Apparently in Star Trek: Next Gen somewhere in there they talk about the computers, where imagine instead of electrons flying about you have photons or subatomic particles or something moving about, not only in real space (not fast enough), it moves in "subspace" (standard term for anything faster-than-light in the Star Trek universe).



    Star Trek was cute, wasn't it?
  • Reply 110 of 159
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by Zandros


    Didn't Tulsa beat those 8xx Opterons pretty good? Granted, most of it seems to be thanks to the 64 MB cache on the IBM chipset, but still. What could these chips do with HT and the integrated memory controller... NetBurst must have been severely handicapped by latencies and bandwidth.



    According to more recent tests, that holds as far as the two-core models are concerned; when one goes to 4 and 8 sockets, memory bandwidth becomes much more important.
  • Reply 111 of 159
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by auxio


    I guess the timeline on Wikipedia is wrong then... maybe you should submit a correction to the article. I wasn't using a Mac at the time, so I don't know the exact dates. Regardless, Mac OS was still a historical footnote in any operating systems textbook at the time.



    Right, sorry, you got me. That's what happens when I skim through a Wikipedia article and post too fast....



    Sure. But at least it was there and working to some extent (ie. running more than one major app at a time wasn't quite as much of a gamble on the PC side).



    I remember shopping for a computer at the time and I found that Macs were at least $1000 more than an equivalent PC. Sure you can't directly compare the Motorola CPUs to the Intel CPUs (the slippery argument Apple fanboys love to use), but you also can't argue that the PC I got for $1000 cheaper would do everything I needed to do at the time as well as the Mac. Sure it wouldn't play Marathon, but it did play Leisure Suit Larry pretty well.



    Windows 95 had a theoretical advantage over the Mac OS at the time, but did Wiki bother to tell you that it didn't have much practical advantage?



    It crashed at least as much. If you ran a 16-bit program along with a "protected" 32-bit one, neither was protected.



    Because the ISA bus was carried over inside the machines, "Plug N Play" didn't work. Neither did USB. And it was based on DOS, even though MS, at the time, denied it.



    There were so many problems that it's hardly worth mentioning all of them. By the time Apple came out with System 9, both 95 and 98 were outdated, and a total mess.



    Then MS came out with "me".



    Oh, the tales we can tell...
  • Reply 112 of 159
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by Zandros


    True. I'm rather concerned about whether AMD can afford to follow Intel to 32nm, and Intel still has the NUMA, CSI and IMC cards to pull out if they can't do anything else. Though sometimes Intel just seems to go ahead on raw power, as in the case of Kentsfield and Clovertown. I can't see anything elegant in those implementations of a four-way core.



    I still wonder how effective NetBurst could have been if one had eliminated the latencies and bandwidth obstructions. I'm still amazed that Intel could increase the pipeline length of the Northwood by roughly 50% and still retain the same performance at the same clock speeds. Those engineers are good.



    Those two chips are simply an intermediate step between the current two-chip designs and new ones with built-in memory controllers.



    They aren't as bad as all that, either. The cache on Intel chips is not only larger but more sophisticated: either core can use the entire shared cache if needed, or the two can share it. AMD can't do that. Also, each of the two cores on a die can go to memory separately. That mitigates some of the advantage of the on-die controller.
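    A toy sketch of that shared-versus-split cache difference (sizes and allocation policy heavily simplified for illustration):

    ```python
    # Toy contrast between a shared L2 (as described for Core 2 above) and
    # two private halves: with a shared cache, a busy core can claim the
    # capacity the other core isn't using. Real caches allocate by line and
    # replacement policy; this just shows the capacity argument.

    def usable_cache_kb(total_kb: int, shared: bool, other_demand_kb: int) -> int:
        """Cache available to one busy core, given the other core's demand."""
        if shared:
            # the busy core gets whatever the other core leaves free
            return total_kb - min(other_demand_kb, total_kb)
        # private halves: a fixed split, regardless of the other core's load
        return total_kb // 2

    print(usable_cache_kb(4096, shared=True,  other_demand_kb=0))     # -> 4096
    print(usable_cache_kb(4096, shared=False, other_demand_kb=0))     # -> 2048
    print(usable_cache_kb(4096, shared=True,  other_demand_kb=1024))  # -> 3072
    ```

    With one core idle, the shared design lets the busy core see twice the capacity of the fixed split.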
  • Reply 113 of 159
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by talksense101


    It will be good to use AMD's next generation chipsets in the Mac. Hypertransport access to secondary devices including a co-processor will greatly boost processing power for specific tasks. If Apple were to put in a special processor/board for improving the performance of their multimedia/whatever by a large magnitude, it would benefit everyone. Intel is trying to come up with a similar offering, but AMD is ahead in this game.



    http://www.anandtech.com/cpuchipsets...oc.aspx?i=2768



    There is nothing impressive here.



    By AMD's own timeline, they are at least 18 months to two years behind Intel in moving completely to 65 nm, something Intel has already managed; AMD hasn't even started yet. Intel will be moving to 45 nm around the end of 2007 to the beginning of 2008, so AMD will be about two years behind there as well.



    The K8L has already proven to be a disappointment, as the tests on systems have shown. About a 0 to 5% improvement over the current K8 line doesn't give them much to brag about. And with high-end K8L designs using 125 watts, they are well behind there as well.



    As far as some of the other technologies go, they have to catch up in cache technology. At best, what they have shown will come close.



    The fact that they are still going to use 3 arithmetic units, where Intel now uses 4, will continue to dog them for quite a while. Going to 4 would just consume more power.



    Intel will be moving to on die memory controllers with 45 nm.



    I could go on, but it would be pointless.
  • Reply 114 of 159
    melgross Posts: 33,510 member
    Quote:
    Originally Posted by sunilraman


    Fair enough. But personally I want to see a 45nm, 5 GHz, 50W TDP Core-based quad-core from Intel by the end of 2007.



    As do we all.
  • Reply 115 of 159
    Quote:
    Originally Posted by melgross


    They aren't as bad as all that, either. The cache on Intel chips is not only larger but more sophisticated: either core can use the entire shared cache if needed, or the two can share it. AMD can't do that.



    I know, but I feel that if someone wants to solve a problem now, the answer is "throw more cache at it", as opposed to the previous "throw more clock cycles at it". Same with the four-way chip: "throw more cores at it". That's what I mean by not very elegant. It's a very easy thing to do, and the same reason I'm impressed by how Intel's engineers handled the Prescott.



    Quote:

    Also, each of the two cores on a die can go to memory separately. That mitigates some of the advantage of the on-die controller.



    They can? I thought only Xeon chips supported dual FSBs. The problem is that they still have to go through the MCH, which, as far as I know, will be a bottleneck.



    Are we really sure each process shrink draws more power? Intel always insists on almost doubling the transistor count with each shrink, so I think that plays a large part. If you just kept the same chip, you would most likely reach a lower thermal envelope and higher clock speeds as a result. Transistor leakage is a problem that increases with each shrink, though. But how much of an effect does it have?
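    A quick back-of-envelope on that trade-off (idealised, Dennard-style textbook scaling; leakage is deliberately ignored and the exponent is the classic idealisation, not measured data):

    ```python
    # Sketch of the trade-off raised above: shrinking the SAME design by a
    # linear factor cuts its power, but doubling the transistor count at the
    # new node spends most of those savings again. Idealised scaling only.

    def chip_power(transistors: float, node_scale: float) -> float:
        """Relative total power: per-transistor power ~ node_scale**2
        (capacitance and voltage both scale down), times transistor count."""
        return transistors * node_scale ** 2

    same_chip = chip_power(1.0, 0.7)   # same design after one full shrink
    doubled   = chip_power(2.0, 0.7)   # ~2x transistors at the new node
    print(same_chip, doubled)          # roughly 0.49 vs 0.98 of the original
    ```

    So under these idealised numbers, the same chip at the new node would draw about half the power, while doubling the transistor count lands back near the original budget, which is consistent with the "doubling the transistor count plays a large part" point.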
  • Reply 116 of 159
    Quote:
    Originally Posted by Zandros


    ...Transistor leakage is a problem that increases with each shrink, though. But how much of an effect does it have?



    I think it is a major technical challenge in going to 45nm and 32nm or lower. AFAIK.

    STUPID ELECTRONS!!! We need to find another subatomic particle to use in CPUs.
  • Reply 117 of 159
    Quote:
    Originally Posted by sunilraman


    I think it is a major technical challenge in going to 45nm and 32nm or lower. AFAIK.

    STUPID ELECTRONS!!! We need to find another subatomic particle to use in CPUs.



    My vote is for the bovine-related particles, moo-ons, found only in the rare Upsidaisium element, but then we would need Moose and Squirrel for that... which could give AMD their chance.
  • Reply 118 of 159
    davegee Posts: 2,765 member
    Quote:
    Originally Posted by Gambit


    Marijuanna is not an hallucinogen.*





    True but laced with just the right kind of special spices it makes for one heck of a delivery system.
  • Reply 119 of 159
    "Marijuanna is not an hallucinogen"



    why does everybody always correct me? this is the most pc anal retenant frikin mb i have ever been on... humor doesnt have to be accurate it just has to be silly ironic or humorous,people who are fans of george carlin will get that one... he would say that often during his early standups oh excuse me "stage performances" im gonna sneak over to a few you peoples houses and wipe my ass with your toothbrushes...



    this message board just is not what it used to be. bow go ahead and bash me for my spelling and caps or what the fuck ever.....



    your all so wonderfully reknown with knowledge about all the wonderful things that dont make a damn or wont even two years from now.... every frikin post i make is torn apart and for the negative.. crap man ive been on the internet since the early box modems and green and black screens i think i know how to make a somewhat cohesive post.



    BLAH BLAH APPLE SUCKS I DONT LIKE ITV CAPS HURT MY EYES BICKER BICKER BICKER PC PC PC AND THEN MORE BICKERING AND THEN THEN WHEN IS APPLE GOING TO MAKE A CUBE POSTS BLAH BLAH BLUE RAY IS BETTER



    "NO... "DEATH RAY IS BETTER" DIE DIE DIE



    DID I MENTION THAT THIS BOARD NEEDS AND ENEMA?
  • Reply 120 of 159
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by Catman4d2


    "Marijuanna is not an hallucinogen"



    why does everybody always correct me? this is the most pc anal retenant frikin mb i have ever been on... humor doesnt have to be accurate it just has to be silly ironic or humorous,people who are fans of george carlin will get that one... he would say that often during his early standups oh excuse me "stage performances" im gonna sneak over to a few you peoples houses and wipe my ass with your toothbrushes...



    this message board just is not what it used to be. bow go ahead and bash me for my spelling and caps or what the fuck ever.....



    your all so wonderfully reknown with knowledge about all the wonderful things that dont make a damn or wont even two years from now.... every frikin post i make is torn apart and for the negative.. crap man ive been on the internet since the early box modems and green and black screens i think i know how to make a somewhat cohesive post.



    BLAH BLAH APPLE SUCKS I DONT LIKE ITV CAPS HURT MY EYES BICKER BICKER BICKER PC PC PC AND THEN MORE BICKERING AND THEN THEN WHEN IS APPLE GOING TO MAKE A CUBE POSTS BLAH BLAH BLUE RAY IS BETTER



    "NO... "DEATH RAY IS BETTER" DIE DIE DIE



    DID I MENTION THAT THIS BOARD NEEDS AND ENEMA?



    I think my irony detector exploded.