Nvidia, AMD, and the G5


Comments

  • Reply 21 of 57
    kidred Posts: 2,402 member
    [quote]Originally posted by powerdoc:

    <strong>



    Thanks for being realistic.



    I doubt that AMD will make the G5, since under the AIM alliance Apple has two suppliers for its chips: Motorola and IBM.

    Apple will never enter into direct competition in the high-end server market, where IBM is king and servers cost several tens of thousands of dollars. As many geeks here have said: you can't put a POWER4 in a Mac without it burning up. The best alternative to Mot is IBM. If IBM is able to make a new chip with something AltiVec-like, Apple will choose it for its high-end product. IBM is the only company known to be able to produce 0.13-micron SOI parts with a low-k dielectric.



    Mot seems more interested in the embedded market: the G4 7455 was named embedded processor of the year for 2001.



    The rumors seem to say that the next chip will be made by IBM and will not be called the G5, which explains why some people say there will be no G5.</strong><hr></blockquote>



    I was under the impression that the AIM contract ends this year.





    [quote] I don't see a G4, even at 2 GHz, say, on RapidIO, being able to counter that level of performance. <hr></blockquote>



    I'm sorry, but if we get to 2 GHz within the next two years, a 2 GHz G4 will be PLENTY fast for at least 70% of users. Processor speed is starting to outpace bloated-app speed. I know video and 3D need the speed and will never be happy, but the rest of us would be giddy with a 2 GHz G4 within a year or so.
  • Reply 22 of 57
    jdbon Posts: 109 member
    [quote]





    So what do they do? Motorola has no compelling incentive to produce a high-performance desktop/workstation CPU. None. They make gear for routers and such. If it happens to also be serviceable for desktops, great, but the vast bulk of their profits are elsewhere.<hr></blockquote>



    Exactly! Apple needs a CPU vendor who gives a damn. Forget about bringing CPU development in house; they already have an OS, a slew of apps, and a computer hardware line to develop. When you think about it, Apple is doing quite a bit of work considering their small market share. The CPU and GPU should be someone else's problem. So my answer to Steve would be: call AMD, now! Tell them Apple will sign a five-year contract and commit to a minimum of 1.5 million CPUs a year (for their quad Power Macs, of course, hehe). AMD will get the business as well as the exposure they need to grow. I'd love to see the CEO of AMD get on stage at a Macworld and say something like this: "You all know how fast our Athlon and Opteron processors are on the x86 side running Windows XP. Well, now we've developed a CPU that runs an OS worthy of utilizing the chip's full potential. You've all watched the G4 Power Macs beat Pentiums at AltiVec-enhanced tasks; now you are going to see the AMD G5 destroy the Pentium at every task, while running the world's most advanced, Unix-based OS!"



    I can dream can't I?



    [ 07-05-2002: Message edited by: jdbon ]
  • Reply 23 of 57
    spiffster Posts: 327 member
    I just don't see OS X being run on x86. Wouldn't the PPC decoder, or whatever is used, slow down the processor itself? I also can't see Apple porting OS X to x86. Too much work for the benefit.



    I think IBM would be the best choice. No need for porting, and IBM has already dedicated itself to the desktop market, though I'm not sure about the POWER4 (and they are also already part of AIM).
  • Reply 24 of 57
    ghost_user_name Posts: 22,667 member
    [quote]Originally posted by arbitrary:

    <strong>

    .

    .

    .

    Steve looks at Avie. Avie looks at Steve.



    Which one do they pick?



    </strong><hr></blockquote>



    [Hmmm]



    IBM.

    or Nvidia.



    I like Nvidia better. They are hungry for a new market, and they could do an excellent job building the CPU, the GPU, and the motherboard. I hope I'm proven right.





    mika.
  • Reply 25 of 57
    bigc Posts: 1,224 member
    [quote]Originally posted by Lemon Bon Bon:

    <strong>"so quit whinin' "



    So...who's whining?



    Mr. Big 'C'...?



    (Do you ever have anything useful to say beyond your usual one and a half sentences of 'C...'?)



    Lemon Bon Bon</strong><hr></blockquote>



    exactly
  • Reply 26 of 57
    jet powers Posts: 288 member
    [quote]The cycle has served Intel well, until now. As the specialized chips around it have become commodified, the CPU has survived thanks to its power and versatility. But when it comes to multimedia - and that's where the demand is - the CPU gives way to the graphics chip, which is hundreds of times more efficient. The latest GeForce, scheduled to launch this summer, will have nearly 120 million transistors - more than double those on a Pentium 4. Unlike other specialized chips, the GPU will not likely shrink so much that it will be swallowed by the CPU. If anything, the reverse could happen. After all, no one needs a speedy 2-GHz CPU to run Excel.



    For a perfect example of the changing dynamic between the GPU and CPU, look at the Xbox. It uses a special version of Nvidia's nForce chipset, built around a tricked-out GeForce3 to handle graphics and sound. Microsoft paid Nvidia more than it did Intel for its 733-MHz Pentium III. For Huang, it's a proof of concept. "The Xbox is how the computer will be built in the next 20 years. More semiconductor capacity will go to the user experience," he says. "The microprocessor will be dedicated to other things like artificial intelligence. That trend is helpful to us. It's a trend that's inevitable."<hr></blockquote>



    Either the smartest or stupidest thing I've heard in months.



    I can't decide which.



    ting5
  • Reply 27 of 57
    jdbon Posts: 109 member
    First: the "it's fast enough" argument does not hold water. In fact, with Apple's current direction they need all the CPU power they can get! They really want to make inroads into the cinematic world as well as the sciences, two areas which require fast CPUs for complex calculations.



    Second: I think that the proposed AMD/Apple CPU would not be an x86 CPU emulating PPC, but rather a native PPC chip.



    Now here's the question for you hardware gurus out there: how difficult would it be to take elements of the upcoming Opteron processor and adapt them to PPC so that it could run current Mac software without modification? And how long would that take, if it's possible?
  • Reply 28 of 57
    thresher Posts: 35 member
    [quote]Originally posted by jdbon:

    <strong>First: the "it's fast enough" argument does not hold water. In fact, with Apple's current direction they need all the CPU power they can get! They really want to make inroads into the cinematic world as well as the sciences, two areas which require fast CPUs for complex calculations.



    Second: I think that the proposed AMD/Apple CPU would not be an x86 CPU emulating PPC, but rather a native PPC chip.



    Now here's the question for you hardware gurus out there: how difficult would it be to take elements of the upcoming Opteron processor and adapt them to PPC so that it could run current Mac software without modification? And how long would that take, if it's possible?</strong><hr></blockquote>



    There is no way that Apple/AMD would be suicidal enough to do PPC in emulation. The overhead would be ridiculous.



    What would happen is that the AltiVec instruction set would have to be integrated into the Athlon/Opteron, assuming they still have the licensing. Furthermore, the Athlon/Opteron will have MMX and MMX2 (licensed from Intel) built in, so PC software makers would be able to move to OS X much more easily. AltiVec and MMX2 are basically different ways of doing the same thing.
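
    To make the "same thing" point concrete: here is one four-wide float addition written in both SIMD dialects (a minimal sketch using GCC-style compiler intrinsics; what the post calls MMX2 is essentially what shipped as SSE, and the AltiVec branch assumes a G4-class PPC while the SSE branch assumes a Pentium III-class or later x86):

    [code]
    /* One four-wide float add, in both SIMD dialects. Only the
       branch matching the host CPU will compile. */
    #ifdef __ALTIVEC__
    #include <altivec.h>
    vector float add4(vector float a, vector float b)
    {
        return vec_add(a, b);        /* compiles to one vaddfp */
    }
    #else
    #include <xmmintrin.h>
    __m128 add4(__m128 a, __m128 b)
    {
        return _mm_add_ps(a, b);     /* compiles to one addps */
    }
    #endif
    [/code]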



    The nice thing about OS X is that it is BSD-based, and BSD already runs on x86 machines. Basically, it's all there: some tweaking to get rid of G#-specific hardware calls and you're most of the way there.



    Furthermore, Windows applications could be ported fairly simply. Windows uses a HAL (hardware abstraction layer), so programs don't really talk directly to the hardware; the OS does. So rewriting the programs would mostly be a matter of making sure they call into whatever the Darwin equivalent is.
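
    Roughly what "target the layer, not the hardware" looks like in source: the same C builds untouched on Darwin/PPC or BSD/x86, because it only ever calls the POSIX layer (a minimal sketch, nothing platform-specific assumed):

    [code]
    /* Recompiles unchanged on either CPU: it talks to the POSIX
       abstraction layer, never to the hardware underneath. */
    #include <stdio.h>
    #include <sys/utsname.h>

    int main(void)
    {
        struct utsname u;
        if (uname(&u) == 0)
            printf("%s on %s hardware\n", u.sysname, u.machine);
        return 0;
    }
    [/code]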



    OS X apps would not have much difficulty with the transition either. You could probably patch them to work on the new hardware.



    BTW, I do not see nVidia as a partner in this, not yet anyway. nVidia is lining itself up to be a challenger to ALL CPU manufacturers, including AMD, so I doubt they would want to share the platform. While AMD licenses nVidia on the Athlon platform, that works to AMD's benefit. Making nVidia a full-fledged partner might cause conflict, just like what happened in the Motorola/IBM/Apple triumvirate.
  • Reply 29 of 57
    zaz Posts: 177 member
    It is an interesting proposal...



    Still, I think an IBM-developed PPC (not necessarily a POWER4) coupled to an nVidia-developed and optimized pseudo-nForce is really more up Apple's alley.



    A PPC nForce-type architecture accomplishes all the things Apple needs, gives nVidia a huge and attractive OEM account (one that can be developed alongside their own offerings), and gives IBM a second large customer for its own PPC parts.



    Apple wins with both companies, as they have specific desktop and server markets they are trying to maintain, whereas Mot is effectively ambivalent. In doing so, Apple would not have to redo any OS-level items and could keep software issues to a minimum.



    The nForce is a good board, but no large OEMs have taken it up yet. nVidia would get a huge amount of recognition.



    Apple may only have 5% of desktop market share... but it is the 6th-largest maker, and that is a lot of units.



    [ 07-05-2002: Message edited by: zaz ]
  • Reply 30 of 57
    mac phd Posts: 6 member
    I'm pretty sure that the AMD chips now use a form of x86 translation layered on top of a basically RISC-y core. Not the same as what Transmeta does, though. So conceivably, that same core could be used with a PPC layer replacing the x86 one. But I remember this coming up in a thread before, and there might be a big 'gotcha'.
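
    The idea, in toy form: a swappable decoder in front of a shared core (purely illustrative C; the opcodes and micro-ops below are invented for the sketch, not AMD's actual encodings):

    [code]
    /* Toy model of "replace the x86 front end with a PPC one":
       two external ISAs decode into the same internal micro-ops.
       All opcodes here are made up for illustration. */
    typedef enum { UOP_LOAD, UOP_ADD } uop;

    #define X86_ADD_MEM 0x01  /* add reg, [mem]: memory operand  */
    #define PPC_ADD     0x21  /* add rD, rA, rB: registers only  */

    static int decode(int is_x86, int opcode, uop out[], int max)
    {
        int n = 0;
        if (is_x86 && opcode == X86_ADD_MEM && max >= 2) {
            out[n++] = UOP_LOAD;  /* CISC op cracks into two uops */
            out[n++] = UOP_ADD;
        } else if (!is_x86 && opcode == PPC_ADD && max >= 1) {
            out[n++] = UOP_ADD;   /* RISC op maps one-to-one      */
        }
        return n;  /* the core behind this never sees the ISA */
    }
    [/code]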



    Anyone that's a CPU guru have any thoughts/wisdom?
  • Reply 31 of 57
    jdbon Posts: 109 member
    So what you're saying is that it would require some modification, but nothing so drastic that software would have to be rewritten? Is this more or less drastic than the change from 68k to PPC?
  • Reply 32 of 57
    anand Posts: 285 member
    I think "Yet Another Registration" is the only one to get this possible link between nvidia and Apple. The future is not in CPU's but in GPU's. That is very true with QE. QE is not going to make the simple things faster but rather the really complex things faster. That is where a fast GPU will come in. In a way apple is already moving in that direction with QE. Apple will never use a x86 processor - in any form. They have said that many times. I belive Moki's claim of a version of the Power 4 more that a AMD (x86) powered Apple.
  • Reply 33 of 57
    thresher Posts: 35 member
    [quote]Originally posted by mac phd:

    <strong>I'm pretty sure that the AMD chips now use a form of x86 translation layered on top of a basically RISC-y core. Not the same as what Transmeta does, though. So conceivably, that same core could be used with a PPC layer replacing the x86 one. But I remember this coming up in a thread before, and there might be a big 'gotcha'.



    Anyone that's a CPU guru have any thoughts/wisdom?</strong><hr></blockquote>



    Nope, AMD runs x86 code natively; no translation necessary. The core is not a RISC solution, any more so than the Pentium 4, anyway.



    Adding PPC to x86 is not possible, not natively. They use entirely different architectures (in fact, x86 and PPC ARE different architectures, just to clarify the issue).
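
    One everyday consequence of those different architectures, beyond the instruction sets: PPC is big-endian and x86 is little-endian, so any ported code that touches raw bytes has to care. A quick probe in standard C:

    [code]
    /* Prints the byte order of the CPU running it: a PPC Mac
       answers big-endian, an x86 box answers little-endian. */
    #include <stdio.h>

    int main(void)
    {
        unsigned int x = 1;
        unsigned char *first = (unsigned char *)&x;
        printf("%s-endian\n", *first ? "little" : "big");
        return 0;
    }
    [/code]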



    Since both BSD and Windows run with an abstraction layer, the target would be that layer, not the hardware. It becomes mostly irrelevant what hardware you have, as long as the drivers can talk to the OS and the hardware. That's a bit simplistic, but it would be no more difficult than porting a game from the PS2 to the Xbox or GameCube.
  • Reply 34 of 57
    jdbon Posts: 109 member
    Well, if AMD cannot modify its current chips, how long would it take AMD to develop a PPC chip from scratch (and would they still be interested)?
  • Reply 35 of 57
    zaz Posts: 177 member
    [quote]Originally posted by jdbon:

    <strong>Well, if AMD cannot modify its current chips, how long would it take AMD to develop a PPC chip from scratch (and would they still be interested)?</strong><hr></blockquote>



    Half a decade and, no, probably not. First, they have never built one from the ground up; it has always been an x86 clone of sorts. Second, they will want to focus on their current x86 market. They are already trying to crack Intel... why try to mess with IBM too? The cash simply isn't there.
  • Reply 36 of 57
    thresher Posts: 35 member
    [quote]Originally posted by jdbon:

    <strong>Well, if AMD cannot modify its current chips, how long would it take AMD to develop a PPC chip from scratch (and would they still be interested)?</strong><hr></blockquote>



    It would take a year or two for them to get the engineering implemented and the plants converted.



    The question is, why would they want to? When PPC was introduced, there was a clear advantage for RISC over CISC. In the meantime, x86 vendors have introduced a number of RISC-like features into their processors.



    There is no clear advantage any longer. RISC doesn't seem to scale as well as CISC does. RISC was really meant for a time when CPU horsepower was limited and everyone was trying to wring the last ounce out of the hardware. That really isn't the case any longer. The x86 (and upcoming x86-64) architectures have processing cycles to spare. There really isn't any need for RISC when brute-force computing will do the trick, is readily available, and is cheaper.
  • Reply 37 of 57
    thresher Posts: 35 member
    [quote]Originally posted by zaz:

    <strong>



    Half a decade and, no, probably not. First, they have never built one from the ground up; it has always been an x86 clone of sorts. Second, they will want to focus on their current x86 market. They are already trying to crack Intel... why try to mess with IBM too? The cash simply isn't there.</strong><hr></blockquote>



    That simply isn't true. The Athlon was designed entirely in house; it is compatible and runs x86 code natively, but it was built from the ground up. Same with the K6. At one time AMD was licensed to make 8088, 286, and 386 chips, but they are perfectly capable from an engineering standpoint. What they don't have is deep pockets.



    Another reason AMD would want to use their existing architecture is that they have very limited production facilities. Turning one of them over to producing PPC chips just wouldn't be cost-effective and would cut into the number of Athlons they could manufacture, at least until they could build another plant (the costs associated with a new fab run well over a billion dollars).
  • Reply 38 of 57
    willy_me Posts: 10 member
    Back to reality here...



    The G5 does exist and will eventually be used in Power Macs. It will be made by Motorola, or possibly subcontracted to IBM for production. It will be a fast, 64-bit, Book E-based CPU with a built-in DDR controller and HyperTransport/PCI-X interfaces.



    You can look at the Motorola MPC8540 for a glimpse of what is to come. But be advised: the 8540 is not a 64-bit CPU, it's only 32-bit. The Book E spec allows for both 32- and 64-bit CPUs, and being an embedded design, the 8540 uses the 32-bit core. It can't address as much memory, but it's just as fast for most things. (I had a look at the Book E spec sheets.)
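
    The addressing gap in concrete numbers (simple arithmetic, not specific to the 8540):

    [code]
    /* Address space reachable through 32-bit vs 64-bit pointers. */
    #include <stdio.h>

    int main(void)
    {
        unsigned long long max32 = 1ULL << 32;            /* 2^32 */
        printf("32-bit: %llu bytes = %llu GB\n", max32, max32 >> 30);
        printf("64-bit: 2^64 bytes = 16 exabytes, in principle\n");
        return 0;
    }
    [/code]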



    What Motorola has basically done is redesign their CPU line so that customers can pick and choose the components they want. This is great for Apple: it expands the market for the G5 and thereby reduces the cost of development. In the future, expect to see Motorola selling single-core G5s in their embedded designs while selling multi-core G5s to Apple. One core design serving several markets will greatly reduce development costs.



    Having multiple cores on a chip is actually a really good idea. The Unix base of OS X should be able to get almost 2x the performance from a dual-core chip. The x86 world has taken another route: they simply make more powerful chips. The problem is that the more powerful chips are less efficient. Long pipelines are expensive, and they need 4x the transistors in order to double performance. With multiple cores, 4x the transistors gives you four cores.
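
    You can put numbers on that trade with Amdahl's law: speedup = 1 / ((1 - p) + p/n) for a workload whose fraction p runs in parallel on n cores. A quick check of the "almost 2x" claim (a sketch; it assumes p is the only limit, ignoring memory and bus effects):

    [code]
    /* Amdahl's law: what a dual- or quad-core chip delivers as
       the parallel fraction p of the workload varies. */
    #include <stdio.h>

    static double speedup(double p, int n)
    {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main(void)
    {
        const double fractions[] = { 0.5, 0.9, 0.99 };
        for (int i = 0; i < 3; i++)
            printf("p=%.2f: 2 cores -> %.2fx, 4 cores -> %.2fx\n",
                   fractions[i],
                   speedup(fractions[i], 2), speedup(fractions[i], 4));
        return 0;
    }
    [/code]

    At p = 0.99 the dual core does get nearly the full 2x; at p = 0.5 it manages only 1.33x, which is why threading-friendly workloads like media encoding are the showcase.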



    It's funny: I'm sure that if Microsoft had moved exclusively to NT several years back, we would now be looking at multi-core x86 CPUs. Unfortunately, both Intel and AMD have to make chips that run fast with current software, and until just recently that meant an SMP-unfriendly Windows 9x. The same goes for PPC and Apple. While IBM and Motorola have been talking about it for years, only now does Apple have an OS that will support multiple cores on a single die.



    In the future, expect to see Apple with a single CPU that contains quad 2 GHz G5 cores while the x86 world is at 10 GHz. The x86 chips will probably be faster for most applications (especially games), but they'll also be twice as big and consume a ton of power. And in those applications that are properly threaded (like media encoding), the quad G5 will actually be faster. And if Apple really wanted a fast computer, they could always add another multi-core CPU; after all, it'll probably have half the transistors of an x86 chip. hehe.



    William
  • Reply 39 of 57
    jdbon Posts: 109 member
    Now the question is, who has to conform? Will the chip makers conform to Apple's needs, or will Apple have to conform to what's available in the industry? I think the answer is pretty obvious.
  • Reply 40 of 57
    willy_me Posts: 10 member
    [quote]Originally posted by jdbon:

    <strong>Now the question is, who has to conform? Will the chip makers conform to Apple's needs, or will Apple have to conform to what's available in the industry? I think the answer is pretty obvious.</strong><hr></blockquote>

    A bit of both. Apple will have to conform to what's available in the industry, but Motorola is redesigning their PPC division to be very flexible. Apple will have to limit their requests to what Motorola can produce, but... if Apple wants a quad-core CPU with a DDR interface, HyperTransport, and a big cache, they can get it (assuming the manufacturing technology exists).



    William