Intel shows new chips, outlines platform directions


Comments

  • Reply 81 of 177
smalm Posts: 671 member
    From IDF:

In his keynote, Pat Gelsinger showed the absolute performance of Dempsey and Woodcrest in a chart, but only relative to a current dual Xeon (Irwindale, 3.6 GHz, 2 chips, 2 cores).

Dempsey (2 chips, 4 cores) was said to improve SPECint_rate_base2000 by around 88% (that would be from 38 to 71). Woodcrest (2 chips, 4 cores) then added another 52% (to 108).

An HP ProLiant DL385 with 2.2 GHz Opterons (2 chips, 4 cores) reached 65; an IBM eServer p5 570 (2 chips, 4 cores) reached 74.



The full (German) article from c't can be seen here



My G5 2.5 GHz (2 chips, 2 cores) reaches 22 points.
  • Reply 82 of 177
sunilraman Posts: 8,133 member
- h.264 decoding on an intel XScale handheld --- anybody see this?? anyone thinking what i am thinking?? hmmmm?

- not Cell, not PPC, not vaporware, but intel actually demonstrating some nice h.264 decoding on a handheld



    also,

WiMAX??? hmmmmm.... i need WiMAX in my suburb here real bad... DSL has been quite a pain in the 455.

  • Reply 83 of 177
splinemodel Posts: 7,311 member
    Quote:

    Originally posted by Hiro

    1. Basic computer science. . . .



I'm not entirely sure why everyone finds so much to respond to in a post which started out on the premise of "Cell is different, but it fits into IBM's business model."



    Quote:

    Originally posted by Hiro

It is also poorly optimized for handling the vast quantity of code generated by programmers who only know high-level concepts and are therefore unable to write code that is in-order-hardware friendly. I am a CS professor; this is not a small problem for the VAST majority of existing code.



    I've crossed swords with CS professors in the past, which isn't to say that I have any lack of respect for academia; it is to say that there's a lot of scope in CS beyond low-level computability concepts.



1. A lot of recent projects have been done in Java. It's ironic that, despite how badly Java runs on normal processors, it might actually work on Cell, since the VM can be written by people in the know.



    2. Good compilers already shift code all around.



3. Who cares anyway? Cell won't be sold as a general-purpose computing platform. But until someone can offer a chip with Cell's kind of numbers, optimized or not, it's still very much the big dog on the block.
  • Reply 84 of 177
sunilraman Posts: 8,133 member
    Quote:

    Originally posted by Splinemodel

......Who cares anyway? Cell won't be sold as a general-purpose computing platform. But until someone can offer a chip with Cell's kind of numbers, optimized or not, it's still very much the big dog on the block....





in any case apple's h.264 decoding coprocessor chip, if needed, will be an Intel XScale

    let's forget about Cell for now, shall we? we can file it in the same would-have-been-nice-but-whatevah pile with all the motoROKR stuff
  • Reply 85 of 177
melgross Posts: 32,994 member
    Quote:

    Originally posted by Kickaha

Except that MacOS X on the PPC gets around this, melgross. Do we know for a fact that whatever chip Apple ends up using will certainly cause problems? New architecture coming down the pipe, and all that...



    32bits -> 64bits is only going to be absolutely necessary for memory addresses (pointers). Ints, floats, etc, will still be packed as before, would be my guess, so the memory usage will not double, just increase a bit.



    Yes, more memory will be needed, but I don't believe it's a simple doubling.




    Nothing here is simple. If anyone tells us that, it's because they really don't know, and are just huffing and puffing. Insinuating remarks and all.



This is why it matters if the chipset and OS are a 64-bit set, or a 32/64-bit set. The PPC is different from the x86. Remember that the PPC was designed from the beginning to scale to 64 bits without penalty to 32-bit programming.



The x86 was never designed for 64 bits. It was just added by AMD. So they are very different. I don't pretend to know the x86 64-bit extensions as well as PPC CPUs, because I've not looked at them with any concentration until recently. I'm still catching up. But what I've read says that if the OS and chip are running in 64-bit mode, then the memory increase is necessary.



The talk is that the developer OS Apple is supplying is 32 bits. That's very likely. If Apple supplies a 64-bit OS, unlike what it does now, it would have to operate the same way Windows 64 operates. If there is some way around that, and I'm not saying that there isn't, the OS would have to operate in some 32-bit mode as well. But I can't see how that would be done, because as far as I know, when the OS is 64 bits the chip simply goes into that mode. That's why they need 64-bit drivers. Apple has gotten around this in several ways on the PPC, partly by the way the OS is not fully 64 bits. Whether this can have a satisfactory workaround is something that is in heavy discussion around the web.



    Again, anyone who stamps their little foot and says "No" or "Yes" is just making a fool of themselves, because they don't know.



    I'm just going by what is known at this time. It could change, but something would have to be figured out to get around the limitations. Not simple.



    With some people, everything is a "re-compile". Bull!
  • Reply 86 of 177
hiro Posts: 2,663 member
    Quote:

    Originally posted by Splinemodel

I'm not entirely sure why everyone finds so much to respond to in a post which started out on the premise of "Cell is different, but it fits into IBM's business model."







    I've crossed swords with CS professors in the past, which isn't to say that I have any lack of respect for academia; it is to say that there's a lot of scope in CS beyond low-level computability concepts.



1. A lot of recent projects have been done in Java. It's ironic that, despite how badly Java runs on normal processors, it might actually work on Cell, since the VM can be written by people in the know.



    2. Good compilers already shift code all around.



3. Who cares anyway? Cell won't be sold as a general-purpose computing platform. But until someone can offer a chip with Cell's kind of numbers, optimized or not, it's still very much the big dog on the block.




1. Sun is open-sourcing its Java engines. That is the death knell for performance enhancements. And Sun is nearly dead due to mismanagement; they won't be much help in getting a truly world-class design done either. As a matter of fact, if your name isn't Microsoft you pay Sun for the pleasure of using the Java name for the JVM and do your own port. Sun partially supports a free Linux JVM, but most of that effort is employee volunteer time, not budget-line-item time.



2. Ummm, if compilers really were good at shifting code around, RISC would have left CISC so far in the dust that Intel would have gone completely RISC a long time ago. But existing compilers do a crappy job of shoving code around for purely in-order RISC designs. The compilers' promise has been an empty one since the late '70s, when they were going to solve everyone's performance problems. Still waiting.



3. Why are you arguing if you agree? CELL isn't a general-purpose computing platform. Big numbers in a narrow domain make for good press releases, not great sales across an entire market. Until #2 is fixed, Cell is handcuffed, and if #2 is fixed, that is bigger news than CELL itself.



CELL is a huge risk as a platform, not as a console CPU or HD decoding chip. Sony is following its Emotion Engine design with a rather ambitious and radical extension of the same concept. While I wouldn't call the Emotion Engine a failure, it utterly under-delivered on the hype of its promise; it was a good console CPU which took extremely knowledgeable programmers to use well in the gaming niche. None of the workstations promised for the EE ever got off the ground, and I wouldn't be the least bit surprised if the same thing happened this time around too.
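The in-order scheduling point in #2 can be made concrete with a minimal, hypothetical C sketch (not from any poster here): a sum with one accumulator forms a serial dependency chain that stalls an in-order pipeline, while two independent accumulators give the hardware adjacent instructions it can overlap, which is exactly the rescheduling an in-order design needs the compiler (or programmer) to do for it.

```c
#include <assert.h>
#include <stddef.h>

/* One accumulator: every add depends on the previous one, so an
   in-order pipeline sits idle waiting on the dependency chain. */
double sum_serial(const double *a, size_t n) {
    double s = 0.0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Two independent accumulators: adjacent adds have no dependency,
   so a pipelined in-order FPU can overlap them. An out-of-order
   x86 finds this parallelism on its own; an in-order core cannot. */
double sum_pairwise(const double *a, size_t n) {
    double s0 = 0.0, s1 = 0.0;
    size_t i = 0;
    for (; i + 1 < n; i += 2) {
        s0 += a[i];
        s1 += a[i + 1];
    }
    if (i < n)          /* odd leftover element */
        s0 += a[i];
    return s0 + s1;
}
```

Both functions return the same result; only the shape of the dependency chains differs, which is why naive high-level code can run fine on out-of-order hardware and poorly on in-order hardware.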
  • Reply 87 of 177
sunilraman Posts: 8,133 member
intel and ppc separation strategy is universal binary. cool. but think about it: worst case scenario, 8 distinct targets for each piece of software



    32bit PPC (G4)

    32bit multiPPC (G4 dualcore)

    32/64bit PPC (G5)

    32/64bit multiPPC (G5 multicpu)

    32bit x86 (intel)

32/64bit x86 (intel with EM64T)

    32bit x86 multicore (intel)

32/64bit x86 multicore (intel EM64T multicores)



i would love it if someone could elucidate how the apple APIs, and to what extent cocoa and xcode, allow for this sort of flexibility, where a universal binary can provide for this sort of latitude: exceptional 64bit, multithreaded, multicore performance on intel or ppc, while degrading gracefully to a single-cpu 32bit PPC, for example.
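One low-level piece of the answer: each slice of a universal binary is just a separate compile of the same source, so architecture differences are resolved at build time with predefined compiler macros, while the core-count and 32/64-bit axes are mostly handled at runtime by the OS. A minimal sketch, assuming GCC-style macro spellings (`__ppc__`, `__x86_64__`, etc.; Apple's toolchain may differ in detail):

```c
#include <assert.h>
#include <string.h>

/* Each slice of a fat binary is an ordinary compile of this same
   file for one architecture, so the preprocessor picks the right
   branch per slice. Macro names follow common GCC conventions. */
const char *build_arch(void) {
#if defined(__ppc64__)
    return "ppc64";
#elif defined(__ppc__) || defined(__POWERPC__)
    return "ppc";
#elif defined(__x86_64__)
    return "x86_64";
#elif defined(__i386__)
    return "i386";
#else
    return "unknown";   /* any other target this sketch doesn't name */
#endif
}
```

This is why the eight-way matrix collapses in practice: the binary carries one slice per architecture, and threading/core-count scaling comes from the APIs at runtime rather than from extra binaries.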



    now, think about how the marketing of these apps to different target audiences is going to happen.



the challenges i have outlined above will be steve & co's greatest time to shine. windows is a garbled mess, and i won't say anything about linux because it will start some flamefest.



    intel looks like they will pony up the hardware, so apple is sorted on the hardware side. no more steve looking at a slide saying "we expected to be at so-and-so by now" (my fingers crossed)



    now apple and mac developers, show us what multithreaded, multicore, 64bit hardware can really do, and we will really blow away the competition and make apple a SERIOUS player in the enterprise as well.





  • Reply 88 of 177
kickaha Posts: 8,760 member
    The APIs are transparent to the underlying binary formats. To the developer using the APIs, it will look the same as now.



    No marketing to end users is needed - they only see one file on disk that internally contains all the pieces. To the user, it will look the same as now.
  • Reply 89 of 177
sunilraman Posts: 8,133 member
    Quote:

    Originally posted by Kickaha

    The APIs are transparent to the underlying binary formats. To the developer using the APIs, it will look the same as now.



then the key is the APIs provided by apple, and how they make the most of whatever hardware os X is running on.





    Quote:

    Originally posted by Kickaha

    No marketing to end users is needed - they only see one file on disk that internally contains all the pieces. To the user, it will look the same as now.



fair enough. but the marketing that WILL be needed, i would strongly propose, is that apple and developers have to convince users: we give you this [ONE FILE]. it will run on any of our hardware offerings, and the better your computer ("more 64 bit" and "more cpus/cores" in laymanspeak), the better your performance will be.



    apple i think would have to be able to promise and deliver on this to some extent.



    i can say for sure in PCland there is a tremendous amount of hype and misinformation on 64bit and you can be sure windows and non-apple pc makers will be squeezing every bit of flashiness out of "64bit!" "multi-cpu!" like two computers in one! for the price of half!





    .....ah well, anyway, i'm starting to overthink this too much. time for bed.



    .....some days i think i'm cursed by being in the middle-- too high brow and technical to be an advertising slimeball, and too "lets share the love and educate the masses" to be an elitist hardcore techie admin type engineer-person
  • Reply 90 of 177
melgross Posts: 32,994 member
    Quote:

    Originally posted by sunilraman

    then the key is the APIs provided by apple. how they make the most of whatever hardware os X is running on.









fair enough. but the marketing that WILL be needed, i would strongly propose, is that apple and developers have to convince users: we give you this [ONE FILE]. it will run on any of our hardware offerings, and the better your computer ("more 64 bit" and "more cpus/cores" in laymanspeak), the better your performance will be.



    apple i think would have to be able to promise and deliver on this to some extent.



    i can say for sure in PCland there is a tremendous amount of hype and misinformation on 64bit and you can be sure windows and non-apple pc makers will be squeezing every bit of flashiness out of "64bit!" "multi-cpu!" like two computers in one! for the price of half!





    .....ah well, anyway, i'm starting to overthink this too much. time for bed.



    .....some days i think i'm cursed by being in the middle-- too high brow and technical to be an advertising slimeball, and too "lets share the love and educate the masses" to be an elitist hardcore techie admin type engineer-person




Also too lazy; all you've been saying lately is "time for bed".



Actually, what you say is what is being done now by many developers, at least to the extent possible. Adobe does this with PS: one core, two, two chips, whatever. FCP does the same thing. Of course many programs are "good enough" running on one core alone.
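The "one core, two, two chips, whatever" scaling works by asking the OS how many processors are online and splitting the work accordingly. A hedged POSIX C sketch of the idea (hypothetical code; not how PS or FCP are actually implemented):

```c
#include <assert.h>
#include <pthread.h>
#include <stddef.h>
#include <unistd.h>

#define MAX_THREADS 64

struct chunk { const long *data; size_t lo, hi; long partial; };

/* Each worker sums its own slice of the array. */
static void *worker(void *arg) {
    struct chunk *c = arg;
    c->partial = 0;
    for (size_t i = c->lo; i < c->hi; i++)
        c->partial += c->data[i];
    return NULL;
}

long parallel_sum(const long *data, size_t n) {
    /* Online core count; widely supported, though not strictly POSIX. */
    long ncpu = sysconf(_SC_NPROCESSORS_ONLN);
    if (ncpu < 1) ncpu = 1;
    if (ncpu > MAX_THREADS) ncpu = MAX_THREADS;

    pthread_t tid[MAX_THREADS];
    struct chunk ch[MAX_THREADS];
    size_t per = n / (size_t)ncpu;

    for (long t = 0; t < ncpu; t++) {
        ch[t].data = data;
        ch[t].lo = (size_t)t * per;
        ch[t].hi = (t == ncpu - 1) ? n : (size_t)(t + 1) * per;
        pthread_create(&tid[t], NULL, worker, &ch[t]);
    }
    long total = 0;
    for (long t = 0; t < ncpu; t++) {
        pthread_join(tid[t], NULL);
        total += ch[t].partial;
    }
    return total;
}
```

On a single-core machine this degrades to one worker; on a quad it spawns four, which is the graceful scaling being discussed.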



Apple, at least, won't have a choice. Their stuff will have to run on anything. If they need one core or four, they will use what's available. x86, PPC.



Of course, remember that 64 bits doesn't give better performance most of the time. On the G5 it could be somewhat worse because of overhead. On x86 it will usually be somewhat better, because in going to 64 bits they fixed a lot of problems inherent in the 8-16-32 bit systems the x86 needed for backwards compatibility. None of those problems existed on the G5 to begin with.
  • Reply 91 of 177
hiro Posts: 2,663 member
    Quote:

    Originally posted by sunilraman

intel and ppc separation strategy is universal binary. cool. but think about it: worst case scenario, 8 distinct targets for each piece of software



    32bit PPC (G4)

    32bit multiPPC (G4 dualcore)

    32/64bit PPC (G5)

    32/64bit multiPPC (G5 multicpu)

    32bit x86 (intel)

    32/64bit x86 (intel with em64 thingy)

    32bit x86 multicore (intel)

    32/64bit x86 multicore (intel em64 multicores)



i would love it if someone could elucidate how the apple APIs, and to what extent cocoa and xcode, allow for this sort of flexibility, where a universal binary can provide for this sort of latitude: exceptional 64bit, multithreaded, multicore performance on intel or ppc, while degrading gracefully to a single-cpu 32bit PPC, for example.



    now, think about how the marketing of these apps to different target audiences is going to happen.



the challenges i have outlined above will be steve & co's greatest time to shine. windows is a garbled mess, and i won't say anything about linux because it will start some flamefest.



    intel looks like they will pony up the hardware, so apple is sorted on the hardware side. no more steve looking at a slide saying "we expected to be at so-and-so by now" (my fingers crossed)



    now apple and mac developers, show us what multithreaded, multicore, 64bit hardware can really do, and we will really blow away the competition and make apple a SERIOUS player in the enterprise as well.






You only NEED 2 versions, x86 and PPC-32. Everything else is totally transparent (except AltiVec code); you don't even need to worry about the APIs.



Now if you want to get max performance out of any particular CPU combination, you do get some customizations. But devs doing that are already quite used to custom coding for every different PPC or x86 version, including PPC with AltiVec (G4/G5) and PPC without (G3), and they already make that work through how they have their code set up. So nothing new there either, once you accept the x86/PPC universal format.
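The AltiVec/no-AltiVec split described above is typically structured like this sketch: one entry point, a vector path compiled in only where `__ALTIVEC__` is defined, and a plain scalar fallback for G3 (or x86, where an SSE path would play the same role). Illustrative only; the vector path assumes 16-byte-aligned inputs for simplicity.

```c
#include <assert.h>
#include <stddef.h>
#ifdef __ALTIVEC__
#include <altivec.h>
#endif

/* One function, two builds: the G4/G5 slice gets the vector loop,
   every other target gets the scalar loop. */
void add_floats(const float *a, const float *b, float *out, size_t n) {
#ifdef __ALTIVEC__
    /* Vector path: 4 floats per operation; assumes 16-byte-aligned
       arrays, a real constraint AltiVec code has to manage. */
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        vector float va = vec_ld(0, &a[i]);
        vector float vb = vec_ld(0, &b[i]);
        vec_st(vec_add(va, vb), 0, &out[i]);
    }
    for (; i < n; i++)          /* leftover elements */
        out[i] = a[i] + b[i];
#else
    /* Scalar fallback: G3, or any non-AltiVec target. */
    for (size_t i = 0; i < n; i++)
        out[i] = a[i] + b[i];
#endif
}
```

This is the "how they have their code set up" part: callers never see which path ran, so the universal binary question reduces to building the same source per slice.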
  • Reply 92 of 177
rain Posts: 538 member
Will my printer, scanner, and other peripherals work with the new Macs? Will all my software need to be replaced?

Is Tiger the last OS I'll run on my new G5?



Why is Apple building so much of its own software now? Photoshop-like programs... word processors, layout programs, 3D software, music apps? Is it because all the software developers out there are throwing up their arms and walking away? You can only yank the rug out so many times.

    Will the Intel chips make software development any more compatible with the Mac OS?



After all... the only reason we have computers is to run software. Smaller chips with better pipes and watts are cool... but have no real-world application if the software isn't there.

    It seems to me Apple is going to make the same blunder as before and chase away developers and try to do it all on their own again. This time they won't have the education and commercial printing/publishing industry behind them.

    I don't think the consumer industry alone is profitable enough.



Seems their model is to compete more with Sony than Microsoft. They may as well change their name to that old '80s group... Fad Gadget.



    How much faith do you guys have in Rosetta?
  • Reply 93 of 177
snoopy Posts: 1,901 member
    Quote:

    Originally posted by rain

Will my printer, scanner, and other peripherals work with the new Macs? Will all my software need to be replaced?

Is Tiger the last OS I'll run on my new G5? . . .







    Your name is very appropriate, rain, with such a gloomy outlook. Cheer up. I believe you will find things to be a lot better than you expect. Apple will make new versions of OS X to work on PPC Macs for a long time. At least three years I'd say, but I'm hoping for five. Peripherals should work fine, as well as most software. Some of your applications may need to be replaced, but you could plan your transition to an Intel Mac when there is a major upgrade, which you would buy anyway.
  • Reply 94 of 177
    It has to be remembered, 'Rain', that Apple developed Safari because Internet Explorer sucked. Hard. It was slow. Inferior to the Windows version. And guess what? M$ could drop development of it any time they wished. And guess what? They did.



Same with 'Office'. M$ could drop it anytime they like. So? Apple is developing its own modern 'Office'. So if M$ drops it? They'll be there. With a better, less bloated Office suite. Their own, one that takes advantage of 'X'.



Apple asked Adobe to pull their finger out on Premiere. They declined. So? Apple sawed Adobe's 'Premiere' off at the legs, and with some style, with Final Cut Pro. In short? Adobe's lacklustre attitude to Mac development saw Apple take matters into their own hands. Rightfully so. So, Apple chases off second-rate developers with superior apps? Or bundles them free. Is Adobe Apple's friend? Well, going dual-platform probably did more damage to Mac marketshare than we think.



M$ don't mind gobbling up competitors, running them out of business, buying them up... shooting them out of the sky... I don't care what Apple does to survive. They've addressed areas of weakness on the Mac platform. I won't lament Internet Explorer's or Premiere's passing. Same with Photoshop? Does it support Core Image? If Adobe ever thinks about dropping it? I think Apple could be there too.



    Apple's a Software company in disguise. They do the best computer software in my eyes.



    I think they're gently pushing Tanks on the lawns of a few people who claim to be 'friends'. And with Intel? Pushed a tactical nuke aimed at Redmond.



    I'm very optimistic about the future of software and hardware with the Intel switch.



    Lie back and take it. It won't hurt at all...



    Lemon Bon Bon
  • Reply 95 of 177




    These developers don't seem to be worried...



    Lemon Bon Bon
  • Reply 96 of 177
rain Posts: 538 member
    I hope it is a smooth ride.

    It's just been a long 15 years for me with Apple.



    This new year I just outfitted my office with $14,000 in new Apple gear. And bam... they are switching yet again.

    Funny thing is, this happened to me the last time I did a major system upgrade and got my G3's. Then I had to go out and spend more money on more equipment to make it compatible. Just my luck I guess.



Like then, Apple touted 'Carbon', and that was a complete, miserable failure. Not to mention there was no font support for two years. (Acrobat died and Suitcase didn't work.)

    We had to eventually sue Apple to get support for our new machines.



    I'm guessing in another 7 years, when we are all finally settling in with Intel chips and new software... they will mix it up again. Good way to drive sales I guess.



I just hope I can weather out the Intel era with my G5's until the next big jump.



    ;-)



    Apple... it's like having some high maintenance woman.
  • Reply 97 of 177
kickaha Posts: 8,760 member
    Well, Windows is just a bitch.
  • Reply 98 of 177
gene clean Posts: 3,481 member
    Usually high maintenance women are quite bitchy themselves.
  • Reply 99 of 177
hiro Posts: 2,663 member
    Rain,



Your equipment and software will work just fine over the lifetime you deem appropriate for recouping your investment. Apple changing hardware won't do anything to the ability of the stuff you already have to do its job. In 5-7 years, when Apple starts contemplating shutting down support for OS X PPC, you will probably be wanting new hardware anyway if you use it to make $$. And software you buy from next year on will probably be universal binary and still usable on the x86 Macs too.



    The hardware change will be much less traumatic than the OS change was.
  • Reply 100 of 177
wmf Posts: 1,164 member
    Quote:

    Originally posted by melgross

    I'll repeat this quote:



I'll quote from an article in Tom's Hardware. This is the consensus on this.





    "However, the memory advantage can turn into a disadvantage if you don't have enough of it. As each data chunk is 64 bits long, 32 bit chunks of a 32 bit legacy application can consume double the memory compared to running under a 32 bit OS. From this point of view, it does not make much sense to run Windows XP x64 with only a small amount of memory. If you go for this latest version, we recommend installing at least a gigabyte of RAM."




    This quote is outright wrong. Never trust an article about programming written by a non-programmer.



    Quote:

The first 32 bits get filled with the program's junk and the next 32 bits are just carried along. Empty bits, if you like. But they must be filled, so it takes twice as much memory to do an operation.



    And this is also wrong. 32-bit apps run in 32-bit mode, which behaves just like a 32-bit processor. A 64-bit kernel just switches modes occasionally, that's all.
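This correction matches the LP64 data model used by 64-bit Unix-like systems (including Mac OS X and Linux): recompiling for 64 bits widens longs and pointers, not ints or floats, so memory use grows roughly with pointer density rather than doubling. A small illustration; the sizes in the comments assume an LP64 platform:

```c
#include <assert.h>
#include <stdint.h>

/* Under LP64, only long and pointer types grow to 8 bytes; int,
   float, etc. keep their 32-bit sizes. (64-bit Windows uses LLP64
   instead, where even long stays 4 bytes.) So a 64-bit rebuild
   grows pointer-heavy structures, not every "data chunk". */
struct packet {
    int32_t id;      /* 4 bytes on both 32- and 64-bit builds */
    float   value;   /* 4 bytes on both                        */
    void   *next;    /* 4 bytes on a 32-bit build, 8 on LP64   */
};
```

And a 32-bit process on a 64-bit kernel doesn't even see this much change: it runs in 32-bit mode with 32-bit pointers, exactly as wmf says.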



    BTW, I don't present any evidence for my claims because non-programmers wouldn't understand it anyway.