Intel shows new chips, outlines platform directions


Comments

  • Reply 101 of 177
    melgross Posts: 33,510 member
    Quote:

    Originally posted by wmf

    This quote is outright wrong. Never trust an article about programming written by a non-programmer.







    And this is also wrong. 32-bit apps run in 32-bit mode, which behaves just like a 32-bit processor. A 64-bit kernel just switches modes occasionally, that's all.



    BTW, I don't present any evidence for my claims because non-programmers wouldn't understand it anyway.




    First, you don't know who wrote that article. Second, the way the Mac does it now is different: it can run in 32 bit mode; Win 64 doesn't. Third, have you ever thought that they might actually have been able to tell what was going on? It's not that difficult. And they were right about Win 95 compared to Win 3.1. Also, Win 95 did need much more memory when running 32 bit programs.



    They aren't slouches over there. They know what they are doing.
  • Reply 102 of 177
    hiro Posts: 2,663 member
    Well if they know what they are doing, why are they outright wrong? An int is 32 bits on a 32-bit processor and also on a 64-bit processor. A long or long int is 64 bits on both. All other data types also maintain their same absolute sizes, with the exception of void *, which does swell to 64 bits under a 64-bit processor to handle memory address pointers. But memory location (variable) sizes are not the same as memory address sizes, so Tom's Hardware just dropped their pants again. **



    There, I have now categorically proven that this article is based on unsound reasoning. Not even just sloppy journalism, but outright wrong.



    Are you going to keep defending them, or do we need to bring up their OS X vs Linux server article fiasco for another quite public run of their good old stepping-on-their-own-crank routine? Or maybe go back a couple of years and go over their faux pas on 3D graphics accelerators???



    Their track record is much better than Barefeats, but you really need to read them with a critical eye.



    **You could argue that there is no patent guarantee it will always be this way, and you'd be right, but seeing as they are talking about real silicon rather than the whims of future GCC, Intel, and IBM compiler writers, I think it is safe to go with the current real implementations rather than debate the merits of unimplemented theory.
  • Reply 103 of 177
    Anandtech?



    Lemon Bon Bon
  • Reply 104 of 177
    sunilraman Posts: 8,133 member
    Quote:

    Originally posted by Hiro

    Well if they know what they are doing, why are they outright wrong? An int is 32 bits on a 32-bit processor and also on a 64-bit processor. A long or long int is 64 bits on both. All other data types also maintain their same absolute sizes, with the exception of void *, which does swell to 64 bits under a 64-bit processor to handle memory address pointers. But memory location sizes are not the same as memory address sizes, so Tom's Hardware just dropped their pants again. **



    There, I have now categorically proven that this article is based on unsound reasoning. Not even just sloppy journalism, but outright wrong.



    Are you going to keep defending them, or do we need to bring up their OS X vs Linux server article fiasco for another quite public run of their good old stepping-on-their-own-crank routine? Or maybe go back a couple of years and go over their faux pas on 3D graphics accelerators???



    Their track record is much better than Barefeats, but you really need to read them with a critical eye.



    **You could argue that there is no patent guarantee it will always be this way, and you'd be right, but seeing as they are talking about real silicon rather than the whims of future GCC, Intel, and IBM compiler writers, I think it is safe to go with the current real implementations rather than debate the merits of unimplemented theory.






    kickaha didn't you mention something about the difference between memory pointers needing to be 64bits but data types will be remaining along the lines that hiro has mentioned?



    so i believe kickaha your opinion is that memory usage for apps in 64bit will increase by some amount, but not as simple (and possibly incorrect) as "2x more memory required for everything"
  • Reply 105 of 177
    melgross Posts: 33,510 member
    Quote:

    Originally posted by Hiro

    Well if they know what they are doing, why are they outright wrong? An int is 32 bits on a 32-bit processor and also on a 64-bit processor. A long or long int is 64 bits on both. All other data types also maintain their same absolute sizes, with the exception of void *, which does swell to 64 bits under a 64-bit processor to handle memory address pointers. But memory location sizes are not the same as memory address sizes, so Tom's Hardware just dropped their pants again. **



    There, I have now categorically proven that this article is based on unsound reasoning. Not even just sloppy journalism, but outright wrong.



    Are you going to keep defending them, or do we need to bring up their OS X vs Linux server article fiasco for another quite public run of their good old stepping-on-their-own-crank routine? Or maybe go back a couple of years and go over their faux pas on 3D graphics accelerators???



    Their track record is much better than Barefeats, but you really need to read them with a critical eye.



    **You could argue that there is no patent guarantee it will always be this way, and you'd be right, but seeing as they are talking about real silicon rather than the whims of future GCC, Intel, and IBM compiler writers, I think it is safe to go with the current real implementations rather than debate the merits of unimplemented theory.




    You haven't proven anything. A word is 16 bits, and a long is 32. Start with that.



    The server article was from Anand, not Tom's. And the guy who did that one (I forget his name) admitted that he was no expert in that area. I had problems with that one as well.
  • Reply 106 of 177
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by sunilraman

    kickaha didn't you mention something about the difference between memory pointers needing to be 64bits but data types will be remaining along the lines that hiro has mentioned?



    so i believe kickaha your opinion is that memory usage for apps in 64bit will increase by some amount, but not as simple (and possibly incorrect) as "2x more memory required for everything"




    Yup.



    mel, I don't think the article you quoted was stating that assertion for *ALL* 64-bit systems was it? If so, it's just wrong. It *might* be correct, however, for *Windows XP* and how it maps the 32 to 64 internally. That wouldn't surprise me, if it just did a most-conservative and most-braindead approach on memory use, and threw efficient packing out the window. In that context, the quote might be 100% correct.



    But that's not this context.



    This thread is starting to feel *AWFULLY* familiar... didn't we have a rousing *cough* debate about this about a year ago, with much the same result??
  • Reply 107 of 177
    ...did Mathematica's boss at the keynote note these issues? If it takes 2 hours to do a recompile, is it really a major problem?



    How many apps are using 64 bit now, anyhow?



    When 64bit Intel cpus come...surely the optimisation will come for it...plugs into X-Code compile...what PPC is doing...compiles using the tools it's already using.



    Windows wasn't designed from the outset to be 64 bit. Is that really going to stop Intel selling 64 bit cpus? Or developing 'X' Intel compilers? Intel said they're working on stuff that can plug into 'X'. If it isn't ready on time...then Apps built for cross platform distribution eg Luxology's 'Modo' are ready to go...anyway?



    Big deal?



    In the meantime, Yonah 32bit cpus will do fine. And be more than competitive with G4s ie rub them into the ground.



    People uhmed and ahhed about 'X'. Was it 64 bit? Did it only have '64 bit' calls in a '32 bit' OS? How would that affect the 64 bit G5, etc....and here we are. Everything works.



    I think some people just can't get over the politics. Cell wasn't picked. Neither was PPC. They obviously weren't enough for where Apple and the industry are going. Sony and IBM aren't big 'computer' players. That's why Apple went to someone that was. Intel.



    I'm sure when Intel 64 bit Meroms and dual core goodies come out, we'll see this thread as a trivial issue.



    I just think this Intel direction will shake out the chaff. People will go X-Code. You'll have to plug into it. Revise Code-Warrior code into X-Code. Apple's 'optimised' developer environment to take advantage of cool 'X' tech more easily. ...and they have the option of using Intel's compilers for optimised x86 performance.



    But seeing as the Performance per Watt in laptops and desktops is going to overtake PPC, everything will seem much faster than it does now on G3s, G4s...and G5s going forwards. People who know how to code on x86 will make GL, flash, web browsing, apps, games fly. And surely we'll get graphics cards faster...and better drivers..? Sooner?



    Lemon Bon Bon
  • Reply 108 of 177
    On the thread title?



    The new platform directions of handtop, laptop, desktop and the home electronics market. Sounds exciting to me.



    I don't see that level of vision coming from IBM or AMD?



    But that's the scalability Apple needs and strikes me as a fist and glove partnership. Who's the fist and who's the 'glove'? Speculate...



    Lemon Bon Bon
  • Reply 109 of 177
    melgross Posts: 33,510 member
    Quote:

    Originally posted by Kickaha

    Yup.



    mel, I don't think the article you quoted was stating that assertion for *ALL* 64-bit systems was it? If so, it's just wrong. It *might* be correct, however, for *Windows XP* and how it maps the 32 to 64 internally. That wouldn't surprise me, if it just did a most-conservative and most-braindead approach on memory use, and threw efficient packing out the window. In that context, the quote might be 100% correct.



    But that's not this context.



    This thread is starting to feel *AWFULLY* familiar... didn't we have a rousing *cough* debate about this about a year ago, with much the same result??




    No, I didn't say that. I said (several times) that Apple's current OS and hardware doesn't work that way, because it's perfectly capable of running 32 bit software in 32 bit mode at the same time as it runs 64 bit software in 64 bit mode. Remember also that Apple's OS is only partly 64 bit.



    I'm saying that right now, Win64 and x86 chips with 64 bit extensions can't do that.



    I'm also saying that we don't know as yet how the way x86 chips handle 32 bit programs on an exclusively 64 bit OS will affect the way Apple's OS will work when it's 64 bits on x86 as well.



    Remember that Win64 is a 64 bit OS. Apple's current OS is a 32/64 bit OS, as is the G5.
  • Reply 110 of 177
    melgross Posts: 33,510 member
    Quote:

    Originally posted by Lemon Bon Bon

    ...did Mathematica's boss at the keynote note these issues? If it takes 2 hours to do a recompile, is it really a major problem?



    How many apps are using 64 bit now, anyhow?



    When 64bit Intel cpus come...surely the optimisation will come for it...plugs into X-Code compile...what PPC is doing...compiles using the tools it's already using.



    Windows wasn't designed from the outset to be 64 bit. Is that really going to stop Intel selling 64 bit cpus? Or developing 'X' Intel compilers? Intel said they're working on stuff that can plug into 'X'. If it isn't ready on time...then Apps built for cross platform distribution eg Luxology's 'Modo' are ready to go...anyway?



    Big deal?



    In the meantime, Yonah 32bit cpus will do fine. And be more than competitive with G4s ie rub them into the ground.



    People uhmed and ahhed about 'X'. Was it 64 bit? Did it only have '64 bit' calls in a '32 bit' OS? How would that affect the 64 bit G5, etc....and here we are. Everything works.



    I think some people just can't get over the politics. Cell wasn't picked. Neither was PPC. They obviously weren't enough for where Apple and the industry are going. Sony and IBM aren't big 'computer' players. That's why Apple went to someone that was. Intel.



    I'm sure when Intel 64 bit Meroms and dual core goodies come out, we'll see this thread as a trivial issue.



    I just think this Intel direction will shake out the chaff. People will go X-Code. You'll have to plug into it. Revise Code-Warrior code into X-Code. Apple's 'optimised' developer environment to take advantage of cool 'X' tech more easily. ...and they have the option of using Intel's compilers for optimised x86 performance.



    But seeing as the Performance per Watt in laptops and desktops is going to overtake PPC, everything will seem much faster than it does now on G3s, G4s...and G5s going forwards. People who know how to code on x86 will make GL, flash, web browsing, apps, games fly. And surely we'll get graphics cards faster...and better drivers..? Sooner?



    Lemon Bon Bon




    I agree with pretty much everything you've said here.



    Only remember that right now Apple's developer x86 OS is a 32 bit OS, not a 64 bit one. I don't know the details, so I don't know how Mathematica handles it. I suspect that it simply doesn't do any 64 bit calculations that it couldn't do in 32 bit mode on the Mac.
  • Reply 111 of 177
    splinemodel Posts: 7,311 member
    Quote:

    Originally posted by Hiro



    3. Why are you arguing if you agree?





    I think the whole dialog has been one big misunderstanding. I'm not arguing. I think Apple made the right choice. I just don't think Cell should be written off as an "un-useful" chip. It's like as soon as Apple chose to go with Intel, someone flipped a switch and the fanboy community (cough, Appleinsider, cough) went into a state of total denial about Cell and its potential. I think that as time passes, Cell will become more mainstream -- or perhaps mainstream will become more Cell -- and that a lot of the problems we recognize will either be solved or will cease to be issues.
  • Reply 112 of 177
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by Splinemodel

    I think the whole dialog has been one big misunderstanding. I'm not arguing. I think Apple made the right choice. I just don't think Cell should be written off as an "un-useful" chip. It's like as soon as Apple chose to go with Intel, someone flipped a switch and the fanboy community (cough, Appleinsider, cough) went into a state of total denial about Cell and its potential. I think that as time passes, Cell will become more mainstream -- or perhaps mainstream will become more Cell -- and that a lot of the problems we recognize will either be solved or will cease to be issues.



    Well, if Cell becomes more oriented towards general purpose computing, it's going to look a *LOT* like the PPC CPUs we've been using for a few years now.... just with more Altivec units.



    Cell basically took the genpurp capabilities of a CPU and stripped them down to the minimal set, and instead oriented on passing things through to the coprocs. It's still a PPC core, it's just a very, very minimal one. Adding the necessary pieces back on for genpurp is going to bring it back to a more traditional PPC chip, in which case there hasn't been any revolution, or even evolution, just a move back to the status quo.



    Cell is *VERY* useful - for a certain set of tasks. General purpose computing on the desktop just isn't one of them. It's closer to an embedded processor, and in that realm, it doth indeed kick arse.
  • Reply 113 of 177
    melgross Posts: 33,510 member
    Quote:

    Originally posted by Splinemodel

    I think the whole dialog has been one big misunderstanding. I'm not arguing. I think Apple made the right choice. I just don't think Cell should be written off as an "un-useful" chip. It's like as soon as Apple chose to go with Intel, someone flipped a switch and the fanboy community (cough, Appleinsider, cough) went into a state of total denial about Cell and its potential. I think that as time passes, Cell will become more mainstream -- or perhaps mainstream will become more Cell -- and that a lot of the problems we recognize will either be solved or will cease to be issues.



    It isn't that Cell is useless. It's just that it's a very different model from what's being done today, and over many years. It might prove useful in some mainstream context other than games and multimedia workstations.



    It's too new to know whether it's going to catch on for those purposes.



    I know that a few people think that it's easy to move to, but it's not. Apple knew about this since at least 2002 when it was first being discussed in the media. Most likely they knew about it even earlier. They had plenty of time to evaluate its potential. Obviously they thought that it wasn't right for them. We don't know why. It could be performance. It could have been that they thought that it would be too difficult to program for. It could be that they thought that the software houses wouldn't want to make that kind of a leap.



    As far as I know, Sony's tools aren't even completed as yet. Betas have gone out. Also, the PS3 won't be out until at least spring 2006, and perhaps early 2007. That could be technical or political. We don't know.



    Why would Apple want to use something that is so new that it's never been tested in the real world? The demos at the previous game shows were admitted by Sony to have been rendered rather than real time, as many thought. It's thought that this is going to be difficult to program for. Perhaps it is.



    In a report on Anand a short while ago, which was pulled, apparently for fear of being sued, game programmers had said that they were only able to use one or two of the SPEs in the Cell, and only one of the cores in the Xenon. The point being made was that it was difficult to actually use all of the power that the chips offered *at this time*. So how long will it take?



    If the PS3 does pretty well, which it most likely will, the chip will be a success. It might still take five years to tell if it will move over to things other than TVs from Toshiba.
  • Reply 114 of 177
    tenobell Posts: 7,014 member
    If looking at the situation in a linear plane of logic, Apple will use x86 processors.



    Apple has its developers translating their software from Power to x86. So obviously that means the Macintels will run x86.



    Which would be a straightforward logical assumption to make. And that may well be what Apple does.



    But at the same time there are some holes and inconsistencies that don't make the x86 argument as obvious.



    Such as the fact that neither Apple nor Intel has ever really said what chips will be used in Macintels.



    The fact that current developer tools do not address EM64T. It is doubtful Apple will develop 64 bit computing in PPC then reverse that development in its transition to Intel.



    The fact that transitioning to x86 Apple will take on x86 legacy baggage.



    The fact that the developer machines are Pentium 4s; it's highly unlikely Macintels will run P4s.



    The fact that Intel wants to leave x86 and move to new instruction sets (IA-64).



    The question should be asked. Why would Apple leave Power to move to an older inflexible architecture such as x86, when Intel even proclaims x86 does not have the future potential that Power will have?



    Why would Apple not transition from Power to an architecture with equivalent potential future growth?



    No one but Apple at this point can absolutely say.



    My expectation is that Apple will do the unexpected. That has been Steve Jobs' history.
  • Reply 115 of 177
    melgross Posts: 33,510 member
    Quote:

    Originally posted by TenoBell

    If looking at the situation in a linear plane of logic, Apple will use x86 processors.



    Apple has its developers translating their software from Power to x86. So obviously that means the Macintels will run x86.



    Which would be a straightforward logical assumption to make. And that may well be what Apple does.



    But at the same time there are some holes and inconsistencies that don't make the x86 argument as obvious.



    Such as the fact that neither Apple nor Intel has ever really said what chips will be used in Macintels.



    The fact that current developer tools do not address EM64T. It is doubtful Apple will develop 64 bit computing in PPC then reverse that development in its transition to Intel.



    The fact that transitioning to x86 Apple will take on x86 legacy baggage.



    The fact that the developer machines are Pentium 4s; it's highly unlikely Macintels will run P4s.



    The fact that Intel wants to leave x86 and move to new instruction sets (IA-64).



    The question should be asked. Why would Apple leave Power to move to an older inflexible architecture such as x86, when Intel even proclaims x86 does not have the future potential that Power will have?



    Why would Apple not transition from Power to an architecture with equivalent potential future growth?



    No one but Apple at this point can absolutely say.



    My expectation is that Apple will do the unexpected. That has been Steve Jobs' history.




    HP and Dell also haven't yet said which chips they will be using in late 2006. Perhaps they will move to the PPC?



    So Apple can't go to x86 64 bits because they are doing it on the PPC? Tools are being developed as we speak. Intel just announced that their compilers will be available to Apple soon enough. You can be sure that there will be tools for EM64T.



    That is too bad. But it won't be as bad as you think, because Apple has never used the baggage. Its OS, apps, and those of its third-party partners won't have baggage either. The baggage in the chip can be ignored for the most part.



    I don't know what the P4 in the Dev machines have to do with Apple's machines next year or afterwards. You could say the same thing for the PC companies.



    Intel has learned its lesson. They won't be leaving x86 for quite some time to come.



    I'm not so sure that Intel has commented about Power relative to x86.



    Apple doesn't use Power chip sets. The question is the PPC, into which neither Freescale nor IBM has seen fit to pour the R&D to go in the direction that Apple needs them to go.



    You're right. No one but Apple can say.



    Apple might do something unexpected, but it won't involve using something other than x86 chips at this time.
  • Reply 116 of 177
    tht Posts: 5,421 member
    Quote:

    Originally posted by TenoBell

    The question should be asked. Why would Apple leave Power to move to an older inflexible architecture such as x86, when Intel even proclaims x86 does not have the future potential that Power will have?



    You are confusing instruction set architecture (ISA) with microprocessor architectures. Back in the day, more than a decade ago, the ISA of a CPU factored into the CPU architecture quite a bit.



    When everyone calls x86 an older inflexible architecture, it's largely geared towards the ISA, the actual instructions. For instance, x86 doesn't have a single instruction to do a multiply-add (multiply-accumulate) which has been an important advantage for many of the newer ISAs.



    However, when you look at the actual architectures of x86 CPUs, they are state of the art processors. They have everything that RISC processors have: superscalar, superpipelined, OOOE, SMT, etc. There is nothing older or inflexible about them.



    In today's world, when a billion transistors can be put into a processor, the effects of the ISA design simply don't matter anymore. x86 ISA processors have just as much potential as PPC ISA processors, if not more because of the gigantic market forces behind it.



    Quote:

    Why would Apple not transition from Power to an architecture with equivalent potential future growth?



    Architectures don't have potential for growth, only business models do.
  • Reply 117 of 177
    hiro Posts: 2,663 member
    Quote:

    Originally posted by TenoBell

    If looking at the situation in a linear plane of logic, Apple will use x86 processors.



    Apple has its developers translating their software from Power to x86. So obviously that means the Macintels will run x86.



    Which would be a straightforward logical assumption to make. And that may well be what Apple does.



    But at the same time there are some holes and inconsistencies that don't make the x86 argument as obvious.



    Such as the fact that neither Apple nor Intel has ever really said what chips will be used in Macintels.



    The fact that current developer tools do not address EM64T. It is doubtful Apple will develop 64 bit computing in PPC then reverse that development in its transition to Intel.



    The fact that transitioning to x86 Apple will take on x86 legacy baggage.



    The fact that the developer machines are Pentium 4s; it's highly unlikely Macintels will run P4s.



    The fact that Intel wants to leave x86 and move to new instruction sets (IA-64).



    The question should be asked. Why would Apple leave Power to move to an older inflexible architecture such as x86, when Intel even proclaims x86 does not have the future potential that Power will have?



    Why would Apple not transition from Power to an architecture with equivalent potential future growth?



    No one but Apple at this point can absolutely say.



    My expectation is that Apple will do the unexpected. That has been Steve Jobs' history.




    Can we not cross-post all over the boards? Reference another thread if you must, but don't just paste the same post in multiple places. It's bad form.



    I was going to reference this thread as a case for the rebuttal of your identical post in the "Apple orders Mac sites to remove OS X on x86 videos" post, but I see you still haven't grokked this new architecture thingie Intel has announced, even seeing it firsthand. No wonder you are so hung up on Itanium.
  • Reply 118 of 177
    mjteix Posts: 563 member
    I've looked closely at the roadmaps/slides provided at Intel's IDF, and I found the Apple/Intel "roadmap" provided by the Macbidouille site interesting:

    Quote:

    - PowerBook based on Merom in June 2006 (possible availability in september)

    - iBook based on Yonah in June 2006 (possible availability in september)

    - Mac mini and eMac should also shift to Yonah by the end of 2006

    - then Conroe should find its way into iMac by the end of 2006

    - As expected, PowerMac and Xserve should be the last models to receive x86 processors, probably based on Whitefield by mid-2007, except if Apple is pushed to adopt x86 faster for those models (then probably Conroe) due to the slowdown of R&D/production for the PPC970MP.



    But I'll make some changes to my taste/hopes:

    1-PowerBook G4 updates (sept 05). up to 1.8GHz 7448

    2-PowerMac/iMac G5 updates (jan 06) up to dual dual core 2.5GHz G5

    3-Xserve G5 updates (mar 06) up to dual dual core 2.5GHz G5

    4-iBooks/Mac minis/eMacs (june 06) up to 2.xGHz Yonah (SC/DC)

    5-PowerBooks (sept 06) up to 2.xGHz Merom (DC)

    6-iMacs (nov 06) up to 3Ghz Conroe (DC)

    7-PowerMacs (jan 07) Conroe (low-end) WoodCrest (high-end)

    8-iBooks/Mac minis/eMacs (mar 07) up to 2.xGHz Merom (DC)

    9-Xserve (june 07) Quad Cores Whitefield

    then repeat 5-9 with the next gen Intel's CPUs...
  • Reply 119 of 177
    tenobell Posts: 7,014 member
    Quote:

    Originally posted by melgross

    HP and Dell also haven't yet said which chips they will be using in late 2006. Perhaps they will move to the PPC?



    But neither HP nor Dell is nearly as secretive as Apple. And both are far more predictable.



    Quote:

    So Apple can't go to x86 64 bits because they are doing it on the PPC? Tools are being developed as we speak. Intel just announced that their compilers will be available to Apple soon enough. You can be sure that there will be tools for EM64T.



    No I was saying if we go with current appearances Macintel software is only being translated for 32 bit.



    The other part of the story is that Intel is really lukewarm on EM64T. They've been forced to support it because Microsoft adopted it for XP-64. An Intel spokesman even went so far as to call EM64T a feature extension of x86, and not much of an architectural advancement.



    Intel really wants to push IA-64.



    Quote:

    That is too bad. But it won't be as bad as you think because Apple has never used the baggage. Its OS, apps, and those of it's third party partners won't have baggage either. The baggage in the chip can be ignored for the most part.



    The problem is that the baggage comes along with the chip. From what I've been reading, x86 has major design inefficiencies that are nonexistent in newer architectures.



    Quote:

    I don't know what the P4 in the Dev machines have to do with Apple's machines next year or afterwards. You could say the same thing for the PC companies.



    Just pointing out that developers are transferring code on a chip that won't even be in use next year. That is significant in that Intel will be using a different architecture.



    Yes other PC companies will use Intel's newest chips, but they will be far more obvious with what they will use.



    My point is that Apple has not been obvious at all. Everyone assumes it will be obvious, but Apple has not confirmed that.





    Quote:

    Intel has learned its lesson. They won't be leaving x86 for quite some time to come.I'm not so sure that Intel has commented about Power relative to x86.



    You are right that Intel will be forced to continue with x86 because the majority of the PC industry won't leave it.



    But Intel is developing other architectures besides x86.



    Apple has little stake in x86 and at this point and has options.



    Quote:

    "Long term, the architecture Itanium needs to aim at is [IBM's] Power line. We have nothing in our existing 32-bit line capability that can compete with Power. It's a very high performance line." CEO Paul Otellini



    Our x86 line cannot compete with Power.



    Quote:

    Apple doesn't use Power chip sets. The question is the PPC, which neither Freescale or IBM has seen fit to pour the R&D into to go in the direction that Apple needs them to go.



    It sounds as though you think I am with the people arguing Apple should stick with PPC. I am not.



    Quote:

    Apple might do something unexpected, but it won't involve using something other than x86 chips at this time.



    There are still other options within Intel.
  • Reply 120 of 177
    tenobell Posts: 7,014 member
    Quote:

    In today's world, when a billion transistors can be put into a processor,



    That is actually the crux of the problem: too many transistors. It becomes an art of diminishing returns.



    Quote:

    Can we not cross post all over the boards?



    I see you still haven't grokked this new architecture thingie Intel has announced, even seeing it firsthand. No wonder you are so hung up on Itanium.



    I know, I know. I wasn't bringing the exact same argument here. In fact, I changed the wording of the post to suit this thread. I did not, in fact, say anything about Itanium here.



    This thread is only looking at x86 and PPC as the only two options.



    My point is those are not the only two options.