Intel Core 2 Step Pentium

Posted in Future Apple Hardware, edited January 2014

Comments

  • Reply 1 of 27
    Apple and Intel have a bit of time before they need to worry about that, and Apple is in the better position: they have shown the market that they can switch processors under OS X with little or no hiccup for the end user. They could easily switch to AMD processors at some future date if the need were there. However, I think Intel will drive development in the market just as they did with the Pentium, and AMD will be forced to follow. IBM is basically out of the picture for desktop computer processors and probably won't get back into the game unless someone contracts them to build a custom processor; I would say the likelihood of that is slim. So unless some other player comes into the market, it is between AMD and Intel. AMD will have to try to lead or follow in the race, but they can only lead as long as their new chips are supported by Microsoft.
  • Reply 2 of 27
    Quote:
    Originally Posted by Slewis


    Who else thinks Intel has gone nuts? . . .



    I'm glad to see we have another industry analyst here on AI . . . .



    Intel definitely has its reasons, and to tell you the truth, if I had to buy stock now in one or the other, Intel or AMD, I would buy Intel.
  • Reply 3 of 27
    wmf Posts: 1,164 member
    Quote:
    Originally Posted by Slewis


    Who else thinks Intel has gone nuts? Perhaps I should've seen it in the branding, "Core," but I must've missed it back in January. I've been watching them, and I've seen their roadmap. Intel is doing exactly what they did with the Pentium, only now they are racing for higher core counts instead of clock speeds. It's an inefficient way of doing things: it works up to a point, but beyond that point it's brutally useless.



    And the other ways are worse. Increasing the frequency increases power consumption much more than it increases performance. Increasing the core size is the same deal.
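    A rough sketch of the model behind this claim: the standard dynamic-power approximation, with the added assumption (mine, not wmf's) that voltage must scale up roughly with frequency:

        P_{dyn} \approx \alpha C V^2 f, \qquad V \propto f \;\Rightarrow\; P_{dyn} \propto f^3

    Performance scales at best linearly with f, so under this model doubling the clock buys 2x throughput for up to 8x dynamic power; the exact exponent depends on how much the voltage must rise, but it is always superlinear. Growing the core runs into the same wall, since switched capacitance C (and leakage) grow roughly with area while single-thread performance grows much more slowly.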



    Quote:

    So when Intel's designs go out of control (again), will Apple take IBM back and place a Cell in the Mac?



    Cell is even worse for desktop computing than a many-core x86, because the porting effort is so large. Heck, Cell has more cores than Intel processors.



    Quote:

    AMD, as you probably know, is working on making specialized chips. Their first step in that direction is called "Fusion," and it combines their own AMD-brand microprocessors with the graphics technology of the recently acquired ATI.



    This is fine on the low end, but the integrated GPU will never be as fast as a discrete GPU, and there's also the porting cost to use these accelerators.
  • Reply 4 of 27
    slewis Posts: 2,081 member
  • Reply 5 of 27
    Quote:
    Originally Posted by Slewis


    Intel is planning to put hundreds of cores on a chip. Like little mini cores, and it's only going to do so much, just like MHz/GHz. They were only able to increase the speeds of the P4 so much before it really did a whole lot of nothing.



    No offense, but I don't think you know what you're talking about. These designs are just plans at the moment, but if Cell succeeds, then it will be clear that there are markets for massively parallel computing. Honestly, the benefits of AMD's purchase of ATI are more understandable at the low end of the market, where it's acceptable to have an SoC and where the lower cost can make up for the facts that motherboard hardware has to be redesigned more frequently and that AMD is two steps behind in process technology and low-power technology. The massively multicore Intel chips are not targeted at the low end of the market, and they may not even be targeted at the PC market at all.
  • Reply 6 of 27
    tht Posts: 5,476 member
    AMD has pretty much lost the personal computing market, and their Fusion strategy appears to be a de facto white flag: they are moving to the "consumer electronics" market of the future and the low-end PC market, and trying to hang on to the 4+ socket server market.



    In a couple of years, Intel will be the only option for high-end desktop/laptop processors. They already are; it just isn't as obvious now as it will be a couple of years from now.
  • Reply 7 of 27
    slewis Posts: 2,081 member
  • Reply 8 of 27
    slewis Posts: 2,081 member
  • Reply 9 of 27
    tht Posts: 5,476 member
    Quote:
    Originally Posted by Slewis


    Last I checked, AMD was still gaining market share, despite the Core 2 Duo. I could be wrong, but AMD is not completely dead, and it is definitely not dying.



    Who said AMD is dying?
  • Reply 10 of 27
    Quote:
    Originally Posted by Slewis


    Um... actually, I read a quote in CPU (Computer Power User) where Intel did say they were planning just that. Whether it will ever become reality, I don't know, but if you're going to "no offense" anyone, you're looking at the wrong person.



    Sebastian



    Who might that be? I know "no offense" is kind of moot, but I meant it. Just as everyone's an armchair manager when watching football on the weekend, a lot of folks are quite convinced that they somehow "know better" about the business plans of giant corporations. Rarely is that the case.



    I work in the electronics industry. I don't claim to know everything, but it's easier to spot trends when you're around them all the time, and at the least to be able to validate the business decisions the big corps make. There's always new stuff on the horizon that can change the whole market in a heartbeat, but barring something tremendous, AMD's acquisition seems to indicate beyond a reasonable doubt that they will not be competing in the market Intel hopes to engage with its massively multicore chips. If you're trying to insinuate that Intel is foolish to explore massively multicore chips rather than designing large-die SoCs the way AMD is, I'd say you're wrong. There's definitely a future for "Fusion," just not in high-performance computing.
  • Reply 11 of 27
    slewis Posts: 2,081 member
  • Reply 12 of 27
    slewis Posts: 2,081 member
  • Reply 13 of 27
    robm Posts: 1,068 member
    "The 2 largest PC Manufacturers, Dell and HP, both use AMD, with Dell just recently adding those chips to their lineup, as well as Intel. Not everything is going to be Intel only.

    "



    But don't they HAVE to, just to be able to offer alternatives to the market?

    If they don't, umm, then their smaller competition will.

    It also keeps Intel somewhat off-balance. Necessary, I'd say, if I were a Dell man (which I'm not, in any way, shape or form).
  • Reply 14 of 27
    slewis Posts: 2,081 member
  • Reply 15 of 27
    jvb Posts: 210 member
    AMDs are great machines. Right now, the thing AMD needs most in the performance market is to make the switch from 90 nm. For all the overclocking aficionados, it just runs too hot. Once they bring that down, AMD will win back some of their previous enthusiast market.
  • Reply 16 of 27
    Quote:
    Originally Posted by jvb


    AMDs are great machines. Right now, the thing AMD needs most in the performance market is to make the switch from 90 nm. For all the overclocking aficionados, it just runs too hot. Once they bring that down, AMD will win back some of their previous enthusiast market.



    We have not seen anything impressive from AMD's 65 nm process node yet. Perhaps that's because it isn't as mature, which is exactly why AMD is releasing the low end on 65 nm first.



    IMO, the AMD Torrenza initiative is kind of foolish. It seems almost like they want to step backwards in time, building dedicated coprocessors in an age when GPUs may finally become general-purpose processors. The 80387 and 80487 all over again?
  • Reply 17 of 27
    jvb Posts: 210 member
    Quote:
    Originally Posted by Zandros


    We have not seen anything impressive from AMD's 65 nm process node yet. Perhaps that's because it isn't as mature, which is exactly why AMD is releasing the low end on 65 nm first.



    IMO, the AMD Torrenza initiative is kind of foolish. It seems almost like they want to step backwards in time, building dedicated coprocessors in an age when GPUs may finally become general-purpose processors. The 80387 and 80487 all over again?



    I wouldn't go right to "foolish." From AMD's perspective, they have to make up lost ground, so they had better try something. I think in the next five years we might see AMD move even further towards the server market.
  • Reply 18 of 27
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by wmf


    And the other ways are worse. Increasing the frequency increases power consumption much more than it increases performance. Increasing the core size is the same deal.



    Adding cache supposedly doesn't increase power consumption much, but there's only so much that cache can do.



    The number of transistors on a chip is doubling about every 18 months, and those transistors have to go somewhere. Currently, adding cores is the best way to go. Doubling the clock on a given chip increases the power consumption by about a factor of four. Doubling the cores increases the max power consumption by only about a factor of two, and idled cores can theoretically be turned off.
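    For what it's worth, JeffDM's factors drop out of the same dynamic-power model P \approx \alpha C V^2 f if you assume doubling the clock needs roughly a \sqrt{2} (about 41%) voltage bump; that voltage assumption is mine, not something stated in the post:

        \text{clock doubled:}\; P' = \alpha C (\sqrt{2}\,V)^2 (2f) = 4\,\alpha C V^2 f = 4P
        \text{cores doubled:}\; P' = \alpha (2C) V^2 f = 2P \quad \text{(switched capacitance doubles; } V, f \text{ unchanged)}

    The factor of two for cores is also a worst case: it assumes every core is busy, and gating an idle core claws most of its share back.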
  • Reply 19 of 27
    Quote:
    Originally Posted by JeffDM


    Adding cache supposedly doesn't increase power consumption much, but there's only so much that cache can do.



    The number of transistors on a chip is doubling about every 18 months, and those transistors have to go somewhere. Currently, adding cores is the best way to go. Doubling the clock on a given chip increases the power consumption by about a factor of four. Doubling the cores increases the max power consumption by only about a factor of two, and idled cores can theoretically be turned off.



    Adding cache suffers from diminishing returns, and it becomes more vulnerable to manufacturing defects. Larger, more complex cores suffer from greater signal delays across the core, plus longer lines leading to greater power leakage (and greater design difficulty, harder testability, and diminishing returns on the performance-versus-complexity tradeoff). Multiple cores are largely independent (and their communication fabric can be redundant), so the chips are resilient against manufacturing defects and failures over time (which will become a big issue below 65 nm); the lines are short, keeping propagation times within the core low; leakage is reduced; and core complexity is minimized. Verifying a simple core is much easier, and replicating it many times allows the hardware to be extremely powerful (in terms of concurrent execution potential) without a great testing and design burden. Manufacturing flaws just provide a different grade of product for marketing (much like clock-rate capability already does).
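    A toy numerical example of the defect-resilience point, using the standard Poisson yield model (the model choice and the numbers are illustrative, not Programmer's):

        Y_{mono} = e^{-AD} \quad \text{(monolithic die of area } A, \text{ defect density } D)
        y = e^{-(A/n)D} \quad \text{(yield of one of } n \text{ equal cores)}
        P(\text{at least } n-1 \text{ cores good}) = y^n + n\,y^{n-1}(1-y)

    With AD = 1, a monolithic die yields e^{-1} \approx 37%. A 16-core chip that can be sold as a lower grade with one dead core yields about 0.37 + 0.38 = 0.75 of its dice as sellable parts -- roughly double, which is exactly the "different grade of product" argument.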



    In the future, things like asynchronously clocked cores, deeper pipelines, and wider SIMD are all more easily done on simple cores that are massively replicated.



    For the hardware guys, choosing to go multi-core is a no-brainer -- concurrency has always been inevitable; it's just been a matter of how long they would keep trying to hide it from the software guys. Once the software guys get their heads around decent concurrent design principles, and the tools to support them, then things will really start to hum along. We've been stuck in a serial mindset for entirely too long. The revolution is nigh and the world is divided into those who know it is coming and those in denial.
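    To make the serial-versus-concurrent mindset concrete, here is a minimal sketch in Python (my choice of language and API, purely illustrative -- the post doesn't prescribe either) of the same CPU-bound work written both ways:

        # Same work, two mindsets: the parallel version fans independent
        # units out across however many cores the machine has.
        from multiprocessing import Pool

        def work(n):
            # Stand-in for an independent, CPU-bound unit of work.
            return sum(i * i for i in range(n))

        if __name__ == "__main__":
            inputs = [2_000_000] * 8

            # Serial mindset: one core grinds through everything, the rest idle.
            serial = [work(n) for n in inputs]

            # Concurrent mindset: independent units run on all available cores.
            with Pool() as pool:
                parallel = pool.map(work, inputs)

            assert serial == parallel

    The catch is that real programs rarely decompose this cleanly: whatever fraction of the work stays serial caps the overall speedup (Amdahl's law), which is why the design principles and tools Programmer mentions matter as much as the core count.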
  • Reply 20 of 27
    robm Posts: 1,068 member
    Programmer - I always appreciate your posts!

    Thoughtful and informed - now that's enough greasing.

    "The revolution is nigh and the world is divided into those who know it is coming and those in denial."

    Can I call you out on this comment, please sir?

    Like, can you flesh it out for those of us in the slow class?

    I have some major-ass (for me) purchases coming up - I would like to be able to reassure myself that I'm on to it ... lol