Whatever happened to Moore's Law?

Posted:
in Future Apple Hardware edited January 2014
Didn't Moore's law state that tech like computers doubles in speed and halves in price every 18 months? Why are we paying the same for Powerbooks that are barely faster than they were 2 years ago?

I guess I understand some of the technical issues, heat problems and such, but the rate at which computers in general have increased in speed has come to a standstill compared to the mid to late 90's when it seemed like things were twice as fast every year or so.

I admit it's kinda nice that if you bought a Powerbook in January you still have the top-of-the-line model, and if the latest rumors are correct, you'll still basically have it after the next release, with processor speed not increasing or barely increasing. It just seems really odd...

Will there be new tech that will allow another period of rapid growth, or will this be the likely pace for the near future?

Thanks in advance for any thoughts,

kc

Comments

  • Reply 1 of 18
    This was scrapped.
  • Reply 2 of 18
    rhumgodrhumgod Posts: 1,289member
    When companies invest in optical switching for transistors rather than electrical, things will improve. Much less heat, and unlimited speed potential. Just not there yet.
  • Reply 3 of 18
    Moore's Law has always been misquoted and misunderstood. At no point did it ever say anything about the speed and price of your computer.



    Moore's Law was (is) a statement about the rate at which the number of transistors on the optimal die increases. Moore never expected it to hold true as long as it has, and it still appears to be holding true. He stated the "Law" (which is really just a projection of trends, not a true "law" like those found in physics) back in the 60s, I believe.
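Just to make the transistor-count framing concrete, here's a quick back-of-the-envelope sketch of what a fixed doubling period implies. The starting figures (the Intel 4004's roughly 2,300 transistors in 1971, and a two-year doubling period) are common illustrative numbers, not anything from this thread:

```python
# Sketch of exponential transistor growth under a fixed doubling period.
# Starting point and period are illustrative assumptions.

def projected_transistors(start_count, start_year, year, doubling_period=2.0):
    """Project transistor count assuming one doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# From ~2,300 transistors in 1971:
for year in (1971, 1981, 1991, 2001):
    print(year, round(projected_transistors(2300, 1971, year)))
```

Twenty years of doubling every two years is a factor of 2^10 = 1024, which is why the trend feels so dramatic over a decade or two.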



    Through the 80s and 90s people came to equate this with increases in clock rate, and therefore performance. This was because the increasing transistor density automatically allowed manufacturers to increase the frequency, essentially gaining "free" performance improvements. In the last ten years this has led to ever-increasing power consumption as frequency goes up, due to another effect kicking in -- more power tends to leak from smaller devices, and there are more and more of these devices to leak. A couple of years ago we hit a wall where increasing the frequency (and hence power) just isn't feasible or practical. All processor designers hit that wall; it just impacts their different designs at different clock rates.
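A rough sketch of why frequency scaling hit a power wall: dynamic switching power grows linearly with frequency (P = C·V²·f), and in practice pushing frequency higher also requires raising voltage, which compounds the growth. The capacitance, voltage, and frequency figures below are purely illustrative:

```python
# Dynamic switching power: P = C * V^2 * f.
# All numbers here are illustrative, not real chip specs.

def dynamic_power(capacitance, voltage, frequency_hz):
    """Dynamic power in watts for given switched capacitance, voltage, frequency."""
    return capacitance * voltage ** 2 * frequency_hz

base = dynamic_power(1e-9, 1.2, 2e9)    # a hypothetical 2 GHz part at 1.2 V
faster = dynamic_power(1e-9, 1.4, 4e9)  # 4 GHz, needing a voltage bump
print(f"2 GHz: {base:.2f} W, 4 GHz: {faster:.2f} W")
```

Doubling the frequency while bumping the voltage more than doubles the power, and that's before counting the leakage current the post mentions.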





    The rate of improvement has stalled significantly and in terms of single processing cores it doesn't look like that will change. Possibly ever. The new direction is to put multiple cores on a single chip and change the software so that it can take advantage of them. This approach has the potential to allow us to get back to our dramatic performance differences, but with the important caveat that the software must be able to take advantage of many cores. Most software is single threaded and must be rewritten. Some problems are quite intractable from a concurrency perspective, and therefore will not scale in an MP world. Fortunately many of the problems that most people care about do scale if the software is done right. The bad news is that the programmers and tools out there right now are poorly suited to MP and there will be some slow and painful changes going on in the development industry to realign with the new realities of hardware.
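The limit Programmer describes -- software that doesn't parallelise won't scale on many cores -- is captured by Amdahl's Law, a standard result (not from this thread): if a fraction of a program is inherently serial, the speedup is capped at the reciprocal of that fraction no matter how many cores you add.

```python
# Amdahl's Law: speedup = 1 / (serial + (1 - serial) / cores).
# With any fixed serial fraction, adding cores gives diminishing gains.

def amdahl_speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even a program that is only 10% serial tops out below 10x:
for cores in (2, 4, 16, 1024):
    print(cores, round(amdahl_speedup(0.10, cores), 2))
```

This is why the post says software "must be rewritten": shrinking the serial fraction matters more than piling on cores.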

  • Reply 4 of 18
    Programmer et al:

    Great info. Makes a lot of sense the way you explain it.



    Good day to all,

    kc
  • Reply 5 of 18
    I believe I need to save this for future reference... it's a hard concept to explain to people who believe Moore's Law is really a law lol
  • Reply 6 of 18
    Just for the record, I didn't really equate Moore's Law with...oh, say gravity or the Pythagorean theorem, I just assumed it was some sort of a theoretical guideline/projection for technological progression!





    Later,

    kc
  • Reply 7 of 18
    lol... I was more referring to my not-so-savvy friends at work etc..
  • Reply 8 of 18
    Sounds more like the 'Law of Diminishing Returns' has taken root within the PPC's current form, and no surprise that Apple jumped to a new design via Intel.



    from http://www.encyclopedia.com/html/d1/diminish.asp



    'Law of Diminishing Returns'



    Law stating that if one factor of production is increased while the others remain constant, the overall returns will relatively decrease after a certain point.



    Thus, for example, if more and more laborers are added to harvest a wheat field, at some point each additional laborer will add relatively less output than his predecessor did, simply because he has less and less of the fixed amount of land to work with. The principle, first thought to apply only to agriculture, was later accepted as an economic law underlying all productive enterprise. The point at which the law begins to operate is difficult to ascertain, as it varies with improved production technique and other factors.



    Anticipated by Anne Robert Jacques Turgot and implied by Thomas Malthus in his Essay on the Principle of Population (1798), the law first came under examination during the discussions in England on free trade and the corn laws. It is also called the law of decreasing returns and the law of variable proportions.
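The wheat-field example above can be sketched numerically. The concave production function here (total output proportional to the square root of the labor force) is purely an illustrative assumption, but it shows the key behavior: each additional labourer adds less than the one before.

```python
# Diminishing marginal returns under an assumed concave production function.
import math

def total_output(labourers):
    """Illustrative production function: output grows as sqrt(labour)."""
    return 100 * math.sqrt(labourers)

prev = 0.0
for n in range(1, 6):
    total = total_output(n)
    print(f"{n} labourers: total {total:.0f}, marginal {total - prev:.0f}")
    prev = total
```

The marginal column shrinks at every step, which is exactly the "after a certain point" behavior the definition describes.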
  • Reply 9 of 18
    powerdocpowerdoc Posts: 8,123member
    Programmer gave the right explanation of Moore's Law.



    There is also a second Moore's Law: the cost of the fabbing process also increases in a similar way. It costs more and more to build new chip plants.
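This "second Moore's Law" about fab costs is often called Rock's Law: the cost of a semiconductor fabrication plant doubles roughly every four years. A compound-growth sketch (the starting cost is an illustrative assumption):

```python
# Rock's Law sketch: fab cost doubling roughly every four years.
# Starting cost is an assumed figure for illustration only.

def fab_cost(start_cost_billions, years_elapsed, doubling_period=4.0):
    return start_cost_billions * 2 ** (years_elapsed / doubling_period)

for years in (0, 4, 8, 12):
    print(f"+{years}y: ${fab_cost(1.0, years):.0f}B")
```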
  • Reply 10 of 18
    Quote:

    Originally posted by Powerdoc

    Programmer gave the right explanation of Moore's Law.



    There is also a second Moore's Law: the cost of the fabbing process also increases in a similar way. It costs more and more to build new chip plants.




    Those expensive new fab plants translate into higher chip prices, which eventually get passed on to the consumer in the form of higher retail prices.



    Existing and/or particular technologies can always be improved to some degree, but eventually the 'Law of Diminishing Returns' will rear its ugly head, to the point where consumers are no longer willing to pay higher prices for such minor technological improvements.



    This doesn't mean that technology ceases, only that a new idea/concept is needed to take it to the next level.
  • Reply 11 of 18
    progmacprogmac Posts: 1,850member
    Isn't Moore's law like Murphy's law? Just one of those things people say?
  • Reply 12 of 18
    Quote:

    Originally posted by johnsocal

    Sounds more like the 'Law of Diminishing Returns'have taken root within the PPC's current form and no suprise that Apple jumped to a new design via Intel.



    What makes you think this applies to just PPC? It has hit Intel just as hard, which is why they never shipped the 4 GHz Pentium 4, just like IBM never shipped the 3 GHz 970FX. Both companies have had to re-evaluate their plans. AMD either saw it coming or (more likely) didn't have the process technology to compete with Intel, and thus took a different road to achieve real performance instead of impressive clock rates. Intel is now running toward dual-core and Pentium M-based designs, and IBM is the first (and certainly not the last) moving toward designs with many simple cores at high clock rates (i.e. the Xbox 360's 3-core Xenon processor and the Cell -- both of which contain PowerPC cores).



    Apple is jumping ship from IBM to Intel for two main reasons: (1) they want a manufacturer who is committed to the desktop and notebook markets; (2) they are more impressed by Intel's roadmap, which is probably based largely on Intel's process technology, the best in the world. No doubt it helps that Intel is just down the road from Apple and they all have the Silicon Valley mindset (unlike Big Blue, which is primarily in the eastern US).
  • Reply 13 of 18
    Quote:

    Originally posted by Programmer

    What makes you think this applies to just PPC? It has hit Intel just as hard, which is why they never shipped the 4 GHz Pentium 4, just like IBM never shipped the 3 GHz 970FX. Both companies have had to re-evaluate their plans. AMD either saw it coming or (more likely) didn't have the process technology to compete with Intel, and thus took a different road to achieve real performance instead of impressive clock rates. Intel is now running toward dual-core and Pentium M-based designs, and IBM is the first (and certainly not the last) moving toward designs with many simple cores at high clock rates (i.e. the Xbox 360's 3-core Xenon processor and the Cell -- both of which contain PowerPC cores).



    Apple is jumping ship from IBM to Intel for two main reasons: (1) they want a manufacturer who is committed to the desktop and notebook markets; (2) they are more impressed by Intel's roadmap, which is probably based largely on Intel's process technology, the best in the world. No doubt it helps that Intel is just down the road from Apple and they all have the Silicon Valley mindset (unlike Big Blue, which is primarily in the eastern US).






    I don't think it applies only to PPC, as stated in my post, since I only used it as a reference point of one particular technology facing LODR.



    I agree with what you said, but I would also add that price is another major reason for Apple to jump ship, because I don't think anyone can compete against Intel's economies of scale.
  • Reply 14 of 18
    Quote:

    Originally posted by Rhumgod

    When companies invest in optical switching for transistors rather than electrical, things will improve. Much less heat, and unlimited speed potential. Just not there yet.



    How does an optical transistor work? The more I think about 'optical processors' the less feasible it sounds.
  • Reply 15 of 18
    rhumgodrhumgod Posts: 1,289member
    Quote:

    Originally posted by 1337_5L4Xx0R

    How does an optical transistor work? The more I think about 'optical processors' the less feasible it sounds.



    Here's a good article on it. I know Agilent Labs has been working on photonics for a while - I think the optical bus will come before the optical CPU, but it's a start.
  • Reply 16 of 18
    kreshkresh Posts: 379member
    Quote:

    Originally posted by Programmer

    Moore's Law has always been misquoted and misunderstood. At no point did it ever say anything about the speed and price of your computer.



    Moore's Law was (is) a statement about the rate at which the number of transistors on the optimal die increases. Moore never expected it to hold true as long as it has, and it still appears to be holding true. He stated the "Law" (which is really just a projection of trends, not a true "law" like those found in physics) back in the 60s, I believe.







    hehe I like the part about how brightly the transistors would glow
  • Reply 17 of 18
    You reach a point where the commonly used physics involved in a computer no longer applies.



    Semiconductor theory (as used in processors) and magnetic (mean-field) theory break down as meaningful approximations when material sizes are less than a few hundred atoms wide.



    Go smaller still and you are looking at quantum effects, which may be useful to exploit in new types of logic in computers - but this is not realised in the lab, let alone in factories.



    The computer industry faces a big jump in moving chips from 90nm to 65nm dies - replacing machinery and overcoming engineering problems. 65nm is about 300 atoms, as a rough guesstimate. So still not quantum. Heat issues are the BIG mother of the problems to overcome just now. Intel put more money into researching how to make lower-power (lower-heat) chips... Yawn!
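The "about 300 atoms" guesstimate above checks out roughly. Using silicon's Si-Si nearest-neighbour distance of about 0.235 nm (a standard figure), and deliberately ignoring crystal orientation:

```python
# Rough check of the "65nm is about 300 atoms" guesstimate.
# Si-Si nearest-neighbour distance ~0.235 nm; this ignores orientation.

SI_BOND_NM = 0.235
feature_nm = 65.0

atoms_across = feature_nm / SI_BOND_NM
print(f"~{atoms_across:.0f} atoms across a 65nm feature")  # about 277
```

So a 65nm feature spans on the order of a few hundred atoms - small, but still above the regime where the mean-field approximations mentioned earlier fully break down.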



    Opto-electronics is another option, which has had loads of research money pumped in, but I don't see anything coming out.
  • Reply 18 of 18
    davegeedavegee Posts: 2,765member
    Quote:

    Originally posted by progmac

    Isn't Moore's law like Murphy's law? Just one of those things people say?







    With that many posts to the AI forums it's quite clear that progmac has spent way WAY WAY too much time in the Apple Outsider forums.



    Dave