Intel's 64-bit 4GHz quad processor vs. IBM's future CPUs in Macs.

24 Comments

  • Reply 21 of 72
    Quote:

    Originally posted by wizard69

    Addressable memory is fundamental to being able to offer new and unique capabilities to computing hardware and software. Without a growth in address space the industry will stagnate. 64-bit is as important to computing today as was the processor with the first integrated FP unit, as was the first processor with vector capability, and as was the ability to address more than 640K of RAM.





    Exactly. There are markets (yes, they are very small compared to general computing) where 64-bit addressable memory is important; hence why we still keep a quad Enterprise 450 around. Oh, how I wish I could address more than 32 bits on the G5, but I can wait.



    EDIT: to be clear though, most people are making the claim that 64-bit is not needed in consumer applications. This is true today and probably next year, but with that extra breathing room, maybe some revolutionary product can come out of it.
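    The address-space gap being argued about here can be made concrete with some back-of-the-envelope Python (purely illustrative arithmetic, not a benchmark):

    ```python
    # Address-space ceilings for 32-bit vs 64-bit pointers (illustrative only).
    GIB = 1024 ** 3  # bytes in a gibibyte

    addressable_32 = 2 ** 32  # a flat 32-bit pointer reaches 4 GiB
    addressable_64 = 2 ** 64  # a flat 64-bit pointer reaches 16 EiB

    print(addressable_32 // GIB)  # 4
    print(addressable_64 // GIB)  # 17179869184 (about 16 billion GiB)
    ```

    The 4 GiB ceiling is exactly the "more than 4GB of RAM" limit debated throughout this thread.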
  • Reply 22 of 72
    g-news Posts: 1,107 member
    Quote:

    Frankly, it makes very little sense for most people, regardless of the platform, as of now and the near future.



    If you read my statements again, Dave, you will find that I never said 64-bit was unnecessary in the future. All I said is that right now, consumer desktop systems, and most certainly also for the next 1-2 years ("near future" above), will not need to address more than 4GB of RAM.

    I'm not saying 4GB of RAM will be enough for everybody, forever, as Bill Gates would have put it, nor would I ever say that 64-bit is inferior to 32-bit.



    Read more carefully, instead of jumping the gun.
  • Reply 23 of 72
    My only worry about Longhorn would be whether Apple is ready, which I can't imagine them not being. Nothing in the movies looks like something Quartz could not be programmed to handle. If Longhorn is a year or two out, then hopefully Apple can keep the improvements coming so Longhorn really is long in the tooth.
  • Reply 24 of 72
    dobby Posts: 797 member
    The G5s are already 64-bit and must be reliable enough for the Xserve to go G5 as well. Only the OS needs to be made 64-bit. The ball is in Apple's court.

    Making the OS 64-bit is a huge hurdle. MS are already working feverishly on their desktop version (as Apple is, I hope).

    I personally think that Apple has the advantage that they are already running on a 64-bit processor, and all seems well.

    If IBM introduces new CPU technology, it won't be as big a challenge as moving to a 64-bit OS.

    Does XP run on either Itanic or Opteron?



    Dobby.
  • Reply 25 of 72
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by dobby

    The G5s are already 64-bit and must be reliable enough for the Xserve to go G5 as well. Only the OS needs to be made 64-bit. The ball is in Apple's court.

    Making the OS 64-bit is a huge hurdle. MS are already working feverishly on their desktop version (as Apple is, I hope).

    I personally think that Apple has the advantage that they are already running on a 64-bit processor, and all seems well.

    If IBM introduces new CPU technology, it won't be as big a challenge as moving to a 64-bit OS.

    Does XP run on either Itanic or Opteron?



    Dobby.




    There is an Opteron version of Windows, but Linux is a more worrisome OS than Windows right now for such systems. I'm surprised none of you seem to realize that. Windows isn't the threat; it's Intel's 4-processor 4GHz system.
  • Reply 26 of 72
    yevgeny Posts: 1,148 member
    Some things to remember...



    First of all, a 4-CPU server requires a different license of Windows in order to use all 4 CPUs. This license costs a bit more, and so you won't see a 4-CPU Intel heater on any desktops any time soon.



    Second, 4-CPU servers are expensive because they are intended to be used for different things than just being a computing node, let alone being a desktop. I doubt that such a beast would be price-competitive with two new XServe G5s. A 4-CPU Intel beast would probably run around 10 grand.



    Third, Intel has profound issues with scaling the P4s and Xeons without having them melt down. These chips put out profound amounts of heat, and this is a design issue that must be addressed (adding cost).



    Fourth, the new Xeons are larger than the new G5s and as such fewer of them will fit onto a wafer, meaning Intel has higher per chip costs. Economies of scale don't help Intel because economies of scale only offset the cost of the FAB, not the cost of using more silicon.



    Fifth, don't expect to see Intel ship a 64 bit laptop anytime in the next couple of years. Apple should do so this year.



    Sixth, Win64 isn't out yet and won't be out for a little while longer yet. Of course, when it does come out, it will have better 64 bit support than OS X 10.3.



    There is nothing to worry about regarding Intel's propaganda.
  • Reply 27 of 72
    yevgeny Posts: 1,148 member
    Quote:

    Originally posted by dobby



    Does XP run on either Itanic or Operton?




    Yes on both counts. Of course, it runs in 32 bit mode on an Opteron, but then again there isn't much software out there for 64 bit windows.
  • Reply 28 of 72
    yevgeny Posts: 1,148 member
    Quote:

    Originally posted by onlooker

    Don't forget about Linux. Everybody seems to be noting, and expecting, Windows to be what these processors will rely on, but the coming quad processor systems should/could/would be aimed at the high-end 3D workstation market, and Linux has already become a much more accepted OS in the 3D realm than it was in the past. Do any of you think Apple and IBM would need to counter with a quad processor workstation, probably available as a BTO? Before you get ahead of yourselves denying Apple's interest (or lack thereof) in 3D: we do know Pixar is planning (and has probably finished) bringing RenderMan Server to OS X, which says something of interest about Steve Jobs, I think anyway. Wouldn't Apple need a 4-way processor system to compete with these Intel workstation/servers?



    Although I am all for quad G5s, I don't see the need. Modern CGI rendering is done in the server farm. Who cares if the computing node has 2 or 4 CPUs? It is all about what gives you the best performance for the dollar, and what gives you the best TCO (Total Cost of Ownership: the cost to buy, use, and maintain a box). I think that the XServe competes quite well in this area and that 2 XServes with G5s would beat up on a quad P4.



    I apologize for the triple post and now leave you to talk amongst yourselves.
  • Reply 29 of 72
    bitemymac Posts: 1,147 member
    Why worry about paperware from Intel, when the Athlon 64 FX is already capable of quad operation... no?



    This can be done if the mobo/OS can support such features. However, we can always have a cluster system to match CPU power. So, if the Intel quad CPU system costs $10K, then you can have 2 G5 Xserves clustered for less, or just keep adding more cluster units for future upgradability.
  • Reply 30 of 72
    mike Posts: 138 member
    Has any pricing information been posted anywhere? It would be interesting if Intel replaced the Xeon MP over the next two years with the new 64 bit Xeon. If I can get a 4-way 64-bit Xeon for the same price as the Xeon MP then this really is something!!!
  • Reply 31 of 72
    cubist Posts: 954 member
    Quote:

    Originally posted by Telomar

    Put that down to very poor journalism then because Intel has confirmed 64 bit addressing is already in Prescott, awaiting a new socket, and that the first 90 nm produced Xeons will also have it. ...



    64-bit addressing does not make it a 64-bit processor. Prescott is most definitely a 32-bit processor. The existence of Intel's rumored X86/64 processor, "Yamhill", is officially still being denied by Intel.



    But the folks at Intel are not as stupid as they appear to be; they see AMD is eating their lunch, and they will come out with something to compete with it. It'll just be late, cost more, and be slower.



    Intel and Microsoft are burning through incredible amounts of cash these days, but they have lots more where that came from.
  • Reply 32 of 72
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by Yevgeny

    Although I am all for quad G5s, I don't see the need. Modern CGI rendering is done in the server farm. Who cares if the computing node has 2 or 4 CPUs? It is all about what gives you the best performance for the dollar, and what gives you the best TCO (Total Cost of Ownership: the cost to buy, use, and maintain a box). I think that the XServe competes quite well in this area and that 2 XServes with G5s would beat up on a quad P4.



    I apologize for the triple post and now leave you to talk amongst yourselves.




    I think that's CG, or 3D rendering, and yes, they all use their own render farms, which are all Intel-based running Linux in most cases, I believe. But I'm not talking about render farms for ILM, Weta, Pixar, or Dreamworks.

    I'm talking about machines for serious 3D prosumers/users/freaks who all know they must always have the fastest machine and graphics card available.



    Right now it looks like Intel just made a stand against any naysayers or up-and-comers in that area.



    Quote:

    2 XServes with G5s would beat up on a quad P4.



    You must have missed most of the facts and specs on the internet contesting the G5 and proving it isn't the fastest thing imaginable, as we all hoped it was. The specs Apple finessed and posted were contested by too many, and proven wrong and misleading, IMHO. I was actually ashamed that Steve Jobs still made the claim at MWSF. That being said, 2 Xserves with dual G5s probably wouldn't beat up a quad P4 system, let alone one of these new Intel 4GHz 4-processor systems running a 64-bit version of Linux.



    (note) -> However, I do agree that a quad processor system isn't completely necessary for consumers, but a couple of high-end prosumer machines, if Apple decides to make an inroad into the high-end 3D arena, wouldn't necessarily be a bad idea either.



    The thing about clustering a couple of Xserves together is that the PowerMac line always looks faster and less expensive than an Xserve does. A person could cluster 2 PowerMacs together, but it's messy (in appearance), and that's not how Apple seems to like to appear. If Apple has new machines in the wings, I think this Intel information may bring about a redesign of how these Apple products are presented: possibly a few price shifts for an Xserve, with the Xserve cluster node possibly as a package, and/or possibly a bigger lineup (more choices) for the PowerMac line.



    Apple definitely needs to get on with it though. The Mac has been trailing pretty far behind for the past few years (which is all Motorola's fault anyway). They used to dictate where things were going in personal computers, and they still do to an extent, but now it's like they watch and see if it's working for anybody else before they try it (other than the iPod).

    In computing, power, performance, and having a machine capable of doing the job say a lot more to the consumers who buy your products, and the ones who don't yet, than one may think. It's like subliminal messages or recurring dreams: the thought of the image resonates and is always in the back of your mind. It definitely applies when purchasing a computer. The thought of a super stomper system would help sales tremendously, now and in the future. I just hope we see some really tremendous things from IBM and Apple by the end of 2004 in power, graphics performance, and price vs. price on equally equipped machines from competitors. Because 2005 is just too late.
  • Reply 33 of 72
    mike Posts: 138 member
    Quote:

    Originally posted by Yevgeny

    Some things to remember...



    Second, 4-CPU servers are expensive because they are intended to be used for different things than just being a computing node, let alone being a desktop. I doubt that such a beast would be price-competitive with two new XServe G5s. A 4-CPU Intel beast would probably run around 10 grand.





    Try around $50K for a loaded 4-way Xeon MP setup to run Linux (including memory). For us, this would be our primary machine that would replicate to our cluster nodes (actually 2: one hot standby). The cluster nodes would be Dell 1750s or dual G5 XServe machines.



    You are correct that some people are premature in pointing out that two G5s could equal the power of a 4-way Xeon. There is much more to the equation than this. What should I power my cluster with (as in the primary node)? Apple does not have anything that could handle this for us (and many others) at this time.



    Others would argue that a company could save $40,000 by just using 2 dual G5s. What about employee costs? What about software costs? What about memory costs? What about storage array costs? What about replication node costs? Etc.



    There is so much more to the equation than having two dual-processor machines that would equal the horsepower of one quad-processor machine.
  • Reply 34 of 72
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by cubist

    64-bit addressing does not make it a 64-bit processor. Prescott is most definitely a 32-bit processor. The existence of Intel's rumored X86/64 processor, "Yamhill", is officially still being denied by Intel.



    But the folks at Intel are not as stupid as they appear to be; they see AMD is eating their lunch, and they will come out with something to compete with it. It'll just be late, cost more, and be slower.



    Intel and Microsoft are burning through incredible amounts of cash these days, but they have lots more where that came from.




    You must have missed the presentation.



    Quote:

    Nocona chips for two-processor servers will arrive in the second quarter, Barrett said, followed quickly by Prescott processors with 32/64-bit capability for single-processor servers and workstations. Prescott and Nocona are functionally the same processor but differ in cache size and bus speed. The 32/64-bit technology will then come to chips for servers with four or more processors in 2005, Barrett added.



  • Reply 35 of 72
    tink Posts: 395 member
    Quote:

    Don't forget about Linux.

    -onlooker



    IBM has been ramping up their Linux support, not only in software but in hardware, and more specifically in Power CPUs and their associated compilers.



    The Linux market on cheap x86 boxes has probably instigated Big Blue's charge in this arena, as their Unix big iron ($$$$) has become threatened. Part of the 970 is for this market, and it looks like IBM is moving to blades and the smaller server space with Linux.



    While IBM uses all sorts of processors, I would think they would be gearing towards the whole widget with home-grown Power-derivative CPUs.



    I don't see Chipzilla conceding the market to IBM, or to AMD for that matter.
  • Reply 36 of 72
    shawk Posts: 116 member
    Sources at the Redmond campus report that it's common knowledge on campus that Longhorn is not expected until middle 2007, at best. It could slip to 2008 if the current Longhorn feature set is not reduced.

    Redmond sold major corporations expensive service contracts for OS upgrades. These expensive contracts will expire before Longhorn is expected to ship. Customers are unhappy.

    The Intel 64bit 4GHz CPU may have larger problems than technical issues.
  • Reply 37 of 72
    wizard69 Posts: 13,377 member
    Hi Atom;



    Yes, people seem to miss this point: there are already a large number of applications out there that make use of 64-bit addressing. Then there are a very large number of applications that could make use of 64-bit addressing if low-cost hardware existed for it to run on. This is the software that often has to approach data management from a more complex perspective to enable operation on 32-bit hardware.



    As far as consumer applications go, I think it is only a matter of developers seeing enough 64-bit hardware in the wild for them to target it. Games are one item that could make immediate use of the extended address space 64 bits offers. Media editing programs are not far behind. I would have to say that there are actually a number of potential consumer applications that could take advantage of 64 bits.



    The only hold-up we have is the adoption of 64-bit hardware. There has to be enough hardware out there to enable profitable sales of the software. This is where Apple has the potential to lead, if they can transition their consumer lines quickly to 64-bit.
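    The "more complex data management" point can be sketched with a toy Python example. The chunk and dataset sizes below are arbitrary assumptions chosen for illustration, not figures from this thread: a program confined to a 32-bit address space must window a large dataset through memory in passes, whereas a 64-bit process could map and index the whole thing flat.

    ```python
    # Toy illustration: a dataset bigger than the 4 GiB address ceiling
    # must be processed in windows on 32-bit hardware.
    GIB = 1024 ** 3
    CHUNK = 256 * 1024 ** 2  # assumed 256 MiB working window

    def passes_needed(total_bytes, chunk=CHUNK):
        """Number of windowed passes a 32-bit-style loop needs."""
        passes = 0
        offset = 0
        while offset < total_bytes:
            # a real program would process data[offset : offset + chunk] here
            offset += chunk
            passes += 1
        return passes

    print(passes_needed(6 * GIB))  # 24 windows for a 6 GiB dataset
    ```

    All the buffering, offset tracking, and refilling implied by that loop simply disappears once the whole dataset fits in the address space.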



    Quote:

    Originally posted by atomicham

    Exactly. There are markets (yes, they are very small compared to general computing) where 64-bit addressable memory is important; hence why we still keep a quad Enterprise 450 around. Oh, how I wish I could address more than 32 bits on the G5, but I can wait.



    EDIT: to be clear though, most people are making the claim that 64-bit is not needed in consumer applications. This is true today and probably next year, but with that extra breathing room, maybe some revolutionary product can come out of it.




  • Reply 38 of 72
    yevgeny Posts: 1,148 member
    Quote:

    Originally posted by onlooker

    I think that's CG, or 3D rendering, and yes, they all use their own render farms, which are all Intel-based running Linux in most cases, I believe. But I'm not talking about render farms for ILM, Weta, Pixar, or Dreamworks.

    I'm talking about machines for serious 3D prosumers/users/freaks who all know they must always have the fastest machine and graphics card available.



    Right now it looks like Intel just made a stand against any naysayers or up-and-comers in that area.




    Ummm, no? This won't be the first quad CPU machine from Intel. So far, how many prosumer freaks use quad CPU machines? None. Everyone uses dual Xeons. The price curve for quad processors is too high for most people to use. Such machines are aimed at the server market.



    Quote:

    Originally posted by onlooker

    You must have missed most of the facts and specs on the internet contesting the G5 and proving it isn't the fastest thing imaginable, as we all hoped it was. The specs Apple finessed and posted were contested by too many, and proven wrong and misleading, IMHO. I was actually ashamed that Steve Jobs still made the claim at MWSF. That being said, 2 Xserves with dual G5s probably wouldn't beat up a quad P4 system, let alone one of these new Intel 4GHz 4-processor systems running a 64-bit version of Linux.




    Well, since Linux x86 apps are compiled using the GCC compiler, which is the compiler used in those speed tests, we can say that a dual 2GHz G5 system is faster than a dual Xeon system when the Xeon system is running Linux. Either way, I wouldn't expect Intel's CPUs to scale linearly. Branch mispredicts will be worse than they currently are. Besides, Intel's CPU is vaporware, and the G5 is available in the here and now and is about to get a nice speed bump.



    Further, four 4GHz Intel processors would have to fight for data over one bus to main memory. This would basically starve high bandwidth processes. Why not have two different machines with two busses?
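    The starvation worry can be put in rough numbers. The bus figure below is an assumption made only for illustration, not a published spec:

    ```python
    # Rough model of per-CPU memory bandwidth on one shared front-side bus
    # vs. two machines with independent busses. 6.4 GB/s is an assumed
    # bus figure used only to make the comparison concrete.
    BUS_GBPS = 6.4

    def per_cpu_gbps(cpus_on_bus, bus_gbps=BUS_GBPS):
        """Worst-case even split when every CPU is memory-bound."""
        return bus_gbps / cpus_on_bus

    print(per_cpu_gbps(4))  # 1.6 GB/s each: four CPUs contending on one bus
    print(per_cpu_gbps(2))  # 3.2 GB/s each: two CPUs per bus on separate boxes
    ```

    Whatever the real bus numbers were, the shape of the argument is the same: doubling the CPUs on one bus halves each CPU's worst-case share.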



    Quote:

    Originally posted by onlooker

    (note) -> However, I do agree that a quad processor system isn't completely necessary for consumers, but a couple of high-end prosumer machines, if Apple decides to make an inroad into the high-end 3D arena, wouldn't necessarily be a bad idea either.



    I think that dual G5s are just fine for the high-end prosumer 3D freaks. Again, how many quad Xeon machines does Intel sell to this market? Not many.



    Quote:

    Originally posted by onlooker

    The thing about clustering a couple of Xserves together is that the PowerMac line always looks faster and less expensive than an Xserve does. A person could cluster 2 PowerMacs together, but it's messy (in appearance), and that's not how Apple seems to like to appear. If Apple has new machines in the wings, I think this Intel information may bring about a redesign of how these Apple products are presented: possibly a few price shifts for an Xserve, with the Xserve cluster node possibly as a package, and/or possibly a bigger lineup (more choices) for the PowerMac line.



    XServes are more expensive than PowerMacs because they are servers. They have redundancy designed in. XServes are already dirt cheap for what they offer.
  • Reply 39 of 72
    onlooker Posts: 5,252 member
    Quote:

    Originally posted by Yevgeny

    Ummm, no? This won't be the first quad CPU machine from Intel. So far, how many prosumer freaks use quad CPU machines? None. Everyone uses dual Xeons. The price curve for quad processors is too high for most people to use. Such machines are aimed at the server market.



    I'm not sure where I said it was the first quad CPU machine from Intel, because I know 2 people who have them, and 5 more who have dual Xeons. And yes, they are all 3D freaks who were interested in what direction Apple was headed with the G5, knowing RenderMan was coming and that Apple would possibly have more surprises with that announcement. Probably not as interested now as they were.



    Quote:

    Originally posted by Yevgeny

    Well, since Linux x86 apps are compiled using the GCC compiler, which is the compiler used in those speed tests, we can say that a dual 2GHz G5 system is faster than a dual Xeon system when the Xeon system is running Linux. Either way, I wouldn't expect Intel's CPUs to scale linearly. Branch mispredicts will be worse than they currently are. Besides, Intel's CPU is vaporware, and the G5 is available in the here and now and is about to get a nice speed bump.



    I have no idea what brings you to your conclusions, but you go.



    Quote:

    I think that dual G5s are just fine for the high-end prosumer 3D freaks. Again, how many quad Xeon machines does Intel sell to this market? Not many.



    I don't know how much 3D you do, but I doubt it's much, because the current G5s are not ready for high-end 3D work yet. They don't even have a graphics card, which is the first priority.



    Quote:

    XServes are more expensive than PowerMacs because they are servers. They have redundancy designed in. XServes are already dirt cheap for what they offer.



  • Reply 40 of 72
    EVERYONE NEEDS TO WAKE UP TOMORROW and somehow imagine what it would be like to run LINUX on their x86 machine. (Debian, RedHat, Mandrake, etc.)



    They might get out their 2- or 3-year-old computer and try it out. Go to Barnes and Noble or Borders and get a book, and start out by installing the thing. Upgrade your hard drive if it's below the min. requirements... do the nasty and just do it.



    They need to ask themselves: does it work for me?