IBM unveils dual-core PowerPC chips up to 2.5GHz


Comments

  • Reply 21 of 279
    melgross Posts: 33,510 member
    Quote:

    Originally posted by curiousuburb





    Yeah, I saw that. I'm wondering who will want a dual 1.4GHz chip.



    The enhancements are intriguing though. They don't tell us much about it.



    We can SPECULATE
  • Reply 22 of 279
    brussell Posts: 9,812 member
    Quote:

    Originally posted by melgross

    Will Apple use these singly, or will we see dual duals as well? Hmm.



    I wonder about that too. I expect they'll just go from dual chips to (a single) dual core chip, but who knows? Maybe they'll have a top end with dual dual-core chips.
  • Reply 23 of 279
    melgross Posts: 33,510 member
    Quote:

    Originally posted by BRussell

    I wonder about that too. I expect they'll just go from dual chips to (a single) dual core chip, but who knows? Maybe they'll have a top end with dual dual-core chips.



    I'm afraid that they might just stick with one as well.



    But their case and mobo designs have two interfaces. It might be easy, if the cooling allows it, to have duals.



    A new Express mobo would be something to wonder about.
  • Reply 24 of 279
    aquamac Posts: 585 member
    I would buy a dual-core PM G5, but a G5 PowerBook? ...mmmm, no.
  • Reply 25 of 279
    g3pro Posts: 669 member
    Do people forget that AMD maxes out at around 2.6GHz, JUST LIKE THE G5? Of course they do.



    Apple needs to get low-power chips into laptops, and the G5 is definitely not the way to go. Apple knew that, and that's one reason for the switch. The other reason is GHz and portability. Take a look at Doom 3 on the Mac, for instance.
  • Reply 26 of 279
    9secondko Posts: 929 member
    Who ever said MHz is everything?



    The fact is that a 2GHz chip will handily outperform a 1.6GHz chip of the same architecture.



    MHz mythology aside, that is simply a fact. Get over it.



    2GHz is greater than 1.6GHz. Fact, not myth. Remember, we are speaking of the SAME chip, not differing architectures.



    And the differing architectures Intel is providing are more efficient for mobile applications.



    A 1.6GHz G5 notebook is underpowered now, especially for Apple's flagship PowerBook, which has been suffering a performance deficiency for two years now.



    And the 3GHz Pentium thing...



    The point of that was to speak of architecture. Obviously the M would have to have a few more pipeline stages and other changes to get to 3GHz, and it would consume more power, but it certainly does better than a P4 at 2GHz. And by far. Just think for a sec.



    The M architecture is "pound for pound" superior to both the P4 and the Athlon 64 FX. Clock for clock, it provides more performance. That is why it is the foundation for future Intel processors. When Intel massages the chip to higher frequencies, it gets a greater performance return for the power it consumes than AMD chips, the P4, or the G5.



    I think the 970MP is just about right, but still behind AMD, and when the MP is actually in a computer, AMD will be even further ahead.



    IBM just plain dropped the ball here.

    Again, I think Apple is moving in the right direction. It's about time Apple had the highest-performing chips all around. A premium computer should have premium parts. Intel looks set to provide those parts by mid-2006. The future looks bright; it is just taking too long to get here.
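
    A rough way to put that clock-for-clock point in numbers, assuming the per-clock work (IPC) really is identical between the two parts:

    \[
    \text{performance} \approx \text{IPC} \times f_{\text{clock}}, \qquad
    \frac{\text{perf}_{2.0\,\mathrm{GHz}}}{\text{perf}_{1.6\,\mathrm{GHz}}}
      = \frac{\text{IPC} \times 2.0}{\text{IPC} \times 1.6} = 1.25
    \]

    That is roughly a 25% throughput edge for the higher-clocked part of the same design, memory and thermal limits permitting.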
  • Reply 27 of 279
    macfandave Posts: 603 member
    Hmmm... I thought the code name was Little^2Late^2.



    (I couldn't figure out how to do superscripts.)
  • Reply 28 of 279
    mike12309 Posts: 135 member
    Quote:

    Originally posted by macFanDave

    Hmmm... I thought the code name was Little^2Late^2.



    (I couldn't figure out how to do superscripts.)




    A month ago it was "the G5 is supreme and Intel makes a load of crap"... now it's the other way around... What has changed? The quality of the processors? Hmm, no. Oh, that's right, now Steve likes Intel, so I guess I should too.
  • Reply 29 of 279
    melgross Posts: 33,510 member
    Quote:

    Originally posted by mike12309

    A month ago it was "the G5 is supreme and Intel makes a load of crap"... now it's the other way around... What has changed? The quality of the processors? Hmm, no. Oh, that's right, now Steve likes Intel, so I guess I should too.



    No, it's not that. That's silly.



    Two months ago we still thought that IBM, and maybe Freescale, were going to come up with what Apple needed over the long term. Apparently they won't.



    Jobs did say that they would still have new (presumably better) machines coming out using these chips. But supposedly IBM hasn't presented a plan that Apple could live with after that. Intel has.



    Remember that Intel has been trying to get Apple to use their chips since the Apple II. They must have shown Apple some things that IBM couldn't match in the years ahead.



    I still think that the PPC architecture is better. But if it takes three years to do what x86 does in two, it just gets further behind.



    It's not the basic concept, it's the implementation.
  • Reply 30 of 279
    farve Posts: 69 member
    So the new FX consumes 13-16W. How many watts does the current G4 use?

    And how much will the newer 90nm G4s use?



    Viktor
  • Reply 31 of 279
    kim kap sol Posts: 2,987 member
    Quote:

    Originally posted by mike12309

    A month ago it was "the G5 is supreme and Intel makes a load of crap"... now it's the other way around... What has changed? The quality of the processors? Hmm, no. Oh, that's right, now Steve likes Intel, so I guess I should too.



    No...a month ago it was "the G5 should be at 3GHz by now...wow, IBM is like Freescale...they just don't give a damn"...6 months ago it was "the G5 should be at 3GHz by now...what's taking so long?"...12 months ago it was "damn, the G5 won't be at 3GHz like it should have been...maybe we'll get something good in January"...today it's "looks like there's nothing new in the pipeline, Intel is the way to go".



    Still not convinced? Use the forum search tools.
  • Reply 32 of 279
    kim kap sol Posts: 2,987 member
    The dual-core announcement is interesting, until you realize that it's not an improvement over the current lineup unless Apple decides to sell dual dual-core computers.



    This will not help gaming, or any app that isn't multithreaded, one bit. What it will let you do is run single-threaded apps without a hiccup. Not bad, but not ground-breaking.



    The few well-threaded apps will be monsters on those computers, though.
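
    A minimal sketch of that point, assuming POSIX threads and a made-up worker count of 4 standing in for a dual dual-core box: a single-threaded loop only ever occupies one core, while the same work split across threads lets the scheduler place each worker on its own core.

    /* Sketch: split an embarrassingly parallel loop across worker threads.
     * With one thread the extra cores sit idle; with N_THREADS workers the
     * OS can schedule each one on a separate core.
     */
    #include <pthread.h>
    #include <stdio.h>

    #define N_THREADS  4            /* hypothetical: one worker per core */
    #define WORK_ITEMS 40000000L

    typedef struct {
        long   start, end;          /* half-open range [start, end) */
        double partial;             /* this worker's partial result */
    } slice_t;

    static void *worker(void *arg)
    {
        slice_t *s = (slice_t *)arg;
        double sum = 0.0;
        for (long i = s->start; i < s->end; i++)
            sum += 1.0 / (double)(i + 1);   /* stand-in for real per-item work */
        s->partial = sum;
        return NULL;
    }

    int main(void)
    {
        pthread_t tids[N_THREADS];
        slice_t   slices[N_THREADS];
        long      chunk = WORK_ITEMS / N_THREADS;

        /* Divide the work evenly and start one worker per slice. */
        for (int t = 0; t < N_THREADS; t++) {
            slices[t].start = t * chunk;
            slices[t].end   = (t == N_THREADS - 1) ? WORK_ITEMS : (t + 1) * chunk;
            pthread_create(&tids[t], NULL, worker, &slices[t]);
        }

        /* Join the workers and combine their partial results. */
        double total = 0.0;
        for (int t = 0; t < N_THREADS; t++) {
            pthread_join(tids[t], NULL);
            total += slices[t].partial;
        }
        printf("sum = %f\n", total);
        return 0;
    }

    A game whose main loop lives in one thread gets none of this; an encoder or renderer that slices its work this way scales with the core count.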
  • Reply 33 of 279
    aurora Posts: 1,142 member
    Has anyone asked the golden question of when these will ship, or is this just paper, as Motorola was prone to do for year after agonizing year? AMD rules the desk and Intel the lap, in my view. IBM dragged their arse on these things the whole way, and as I said, where is the product? Steve must have been pissed as hell when told he needed liquid cooling last year for 2.5GHz. By the time IBM has these things shipping, Apple will be on Intel.
  • Reply 34 of 279
    brussell Posts: 9,812 member
    Quote:

    Originally posted by farve

    So the new FX consumes 13-16W. How many watts does the current G4 use?

    And how much will the newer 90nm G4s use?



    Viktor




    According to this, the current G4 is 18-21 watts at about the same GHz rating. According to this, the new G4 will use around 15 watts at approximately the same GHz. So it looks similar to this G5.
  • Reply 35 of 279
    altivec_2.0 Posts: 995 member
    I don't know about anyone else, but I would buy a dual 2.5GHz dual-core Power Mac before I would ever consider buying one from Intel.



    Man, I hate the idea of Apple switching to Intel. It could have at least been AMD... sad.
  • Reply 36 of 279
    mike12309 Posts: 135 member
    Quote:

    Originally posted by Altivec_2.0

    I don't know about anyone else, but I would buy a dual 2.5GHz dual-core Power Mac before I would ever consider buying one from Intel.



    Man, I hate the idea of Apple switching to Intel. It could have at least been AMD... sad.




    Hear, hear, mate!



    Proud owner of:



    - iBook 12-inch

    - Athlon 64 3200+ 2.0GHz Winchester core (with a 20" Apple Cinema Display)

    - future owner of an Athlon 64 X2 4800+
  • Reply 37 of 279
    brussell Posts: 9,812 member
    Quote:

    Originally posted by Altivec_2.0

    I don't know about anyone else, but I would buy a dual 2.5GHz dual-core Power Mac before I would ever consider buying one from Intel.



    How efficient would a dual dual-core be? I'm guessing that most users get little benefit right now from duals, let alone quads. I'm sure there are some who use apps that take advantage of them, or use multiple apps simultaneously. But quads?
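
    A rough way to frame that quad question is Amdahl's law, assuming a fraction p of a job can actually run in parallel across n cores:

    \[
    S(n) = \frac{1}{(1 - p) + p/n}
    \]

    With p = 0.5, two cores give about a 1.33x speedup and four cores only about 1.6x; with p = 0.9, the figures are roughly 1.82x and 3.08x. So a dual dual-core pays off mainly for heavily threaded workloads, which matches the guess above.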
  • Reply 38 of 279
    nak Posts: 101 member
    Quote:

    Originally posted by mike12309

    GHz does not equal performance. Not by a long shot.



    I don't feel a need to go any further than that.




    Perhaps one of my old threads might help. I was confused back in March, but thanks to the numerous and intellectual answers, I am no more.
  • Reply 39 of 279
    tofutodd Posts: 30 member
    IBM is a specialty chip manufacturer. They wouldn't spend millions on development of a mobile G5 unless there were guaranteed customers. Apple is the only consumer of G5 chips...
  • Reply 40 of 279
    melgross Posts: 33,510 member
    Quote:

    Originally posted by BRussell

    How efficient would a dual dual-core be? I'm guessing that most users get little benefit right now from duals, let alone quads. I'm sure there are some who use apps that take advantage of them, or use multiple apps simultaneously. But quads?



    If a program is able to use two CPUs, it should work with four cores as well. Apple has worked on that in 10.4; previous to that, it could only use two.



    Some programs won't get any benefit from four; mostly, games only use one. There's a great deal of talk about this in the PS3 and 360 threads, as well as the dual-processor threads on Ars, etc.
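
    A small sketch of why two-CPU-aware code tends to carry over, assuming the program asks the OS for the core count (the BSD hw.ncpu sysctl available on Mac OS X) rather than hard-coding two; the surrounding names are illustrative only.

    /* Sketch: size the worker pool from the machine's core count instead of
     * assuming two CPUs, so the same binary spreads across a dual, a single
     * dual-core, or a dual dual-core box. Mac OS X / BSD only (hw.ncpu).
     */
    #include <sys/types.h>
    #include <sys/sysctl.h>
    #include <stdio.h>

    int main(void)
    {
        int    ncpu = 1;            /* safe fallback: one core */
        size_t len  = sizeof(ncpu);

        if (sysctlbyname("hw.ncpu", &ncpu, &len, NULL, 0) != 0 || ncpu < 1)
            ncpu = 1;

        printf("creating %d worker threads (one per core)\n", ncpu);
        /* ...spawn ncpu workers, as in the earlier pthread sketch... */
        return 0;
    }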