Rumor: TSMC to build quad-core 20nm chips for Apple by late 2013

13 Comments

  • Reply 41 of 66
Galbi Posts: 968, member


Remember what happened to Apple when they "dropped" Google.

     

  • Reply 42 of 66
iqatedo Posts: 1,846, member

    Quote:

Originally Posted by Galbi


Remember what happened to Apple when they "dropped" Google.

     



     


Sure, gained further admiration for reducing the evil empire's hold on aspects of their business!

  • Reply 43 of 66


Originally Posted by Galbi

Remember what happened to Apple when they "dropped" Google.


     


    Yeah, they created a better mapping system that will be the standard in the future and rid their OS of the native bloat and time-waste of a central, worldwide home video repository. 


     


    If only other companies would drop Google, perhaps they'd wind up in better places, too.

  • Reply 44 of 66
aizmov Posts: 989, member
    wizard69 wrote: »

    Does the iPad need more GPU cores?

    Yes.
  • Reply 45 of 66
wizard69 Posts: 13,377, member
    aizmov wrote: »
    Yes.

    Even if they double performance in other ways?

    I would agree the ultimate answer is yes, but we still live in a physical world. As such the next iPad still needs to be a balanced design given the technology available. If the goal is better performance then core counts aren't important if Apple can get there via other avenues.
  • Reply 46 of 66
jnjnjn Posts: 588, member
Not a correction. I was simply providing evidence to support what you already believed but had some right to be skeptical about, as 10 nm has been touted as the "breaking point" for Moore's Law for more than a decade. Incorrectly, I believe, incidentally. There is, however, something much larger at work than Moore's Law.
    I believe Tallest Skil is essentially referring to the same ideas I am promoting:
    "Exponential growth continued through paradigm shifts from vacuum tubes to discrete transistors to integrated circuits."
    The resources underlying the exponential growth of an evolutionary process are relatively unbounded:
    (i) The (ever growing) order of the evolutionary process itself. Each stage of evolution provides more powerful tools for the next. In biological evolution, the advent of DNA allowed more powerful and faster evolutionary “experiments.” Later, setting the “designs” of animal body plans during the Cambrian explosion allowed rapid evolutionary development of other body organs such as the brain. Or to take a more recent example, the advent of computer assisted design tools allows rapid development of the next generation of computers.
    (ii) The “chaos” of the environment in which the evolutionary process takes place and which provides the options for further diversity. In biological evolution, diversity enters the process in the form of mutations and ever changing environmental conditions. In technological evolution, human ingenuity combined with ever changing market conditions keep the process of innovation going.
Ray Kurzweil predicts that the meta-trend Moore's Law attempts to describe will continue beyond 2020. The result in 2029 should be a computer 512 times more powerful than Watson. If the trend continues, then by 2045 we will have computers 131,072 times more powerful than Watson.
    From The Law of Accelerating Returns
    March 7, 2001 by Ray Kurzweil
    It is important to note that Moore’s Law of Integrated Circuits was not the first, but the fifth paradigm to provide accelerating price-performance. Computing devices have been consistently multiplying in power (per unit of time) from the mechanical calculating devices used in the 1890 U.S. Census, to Turing’s relay-based “Robinson” machine that cracked the Nazi enigma code, to the CBS vacuum tube computer that predicted the election of Eisenhower, to the transistor-based machines used in the first space launches, to the integrated-circuit-based personal computer which I used to dictate (and automatically transcribe) this essay.
    But I noticed something else surprising. When I plotted the 49 machines on an exponential graph (where a straight line means exponential growth), I didn’t get a straight line. What I got was another exponential curve. In other words, there’s exponential growth in the rate of exponential growth. Computer speed (per unit cost) doubled every three years between 1910 and 1950, doubled every two years between 1950 and 1966, and is now doubling every year.
    But where does Moore’s Law come from? What is behind this remarkably predictable phenomenon? I have seen relatively little written about the ultimate source of this trend. Is it just “a set of industry expectations and goals,” as Randy Isaac, head of basic science at IBM contends? Or is there something more profound going on?
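As a back-of-envelope check of the growth figures quoted above, here is a short sketch (Python, purely illustrative). The doubling periods come straight from the quote; the assumption that the Watson multipliers correspond to a two-year doubling period starting from Watson's 2011 debut is my reading, not something stated above.

```python
# Back-of-envelope check of the growth figures quoted above.
# Assumptions (mine, not the quote's): growth is a clean doubling at the
# stated intervals, and the Watson multipliers correspond to a two-year
# doubling period starting from Watson's 2011 debut.

def doublings(years: float, period: float) -> float:
    """Number of doublings over `years`, given one doubling every `period` years."""
    return years / period

# Kurzweil's three eras of price-performance doubling:
eras = [(1910, 1950, 3.0),   # doubling every three years
        (1950, 1966, 2.0),   # doubling every two years
        (1966, 2000, 1.0)]   # doubling every year (as of the 2001 essay)

total = sum(doublings(end - start, period) for start, end, period in eras)
print(f"~{total:.0f} doublings 1910-2000, a factor of ~{2 ** total:.1e}")

# The quoted Watson multipliers match a two-year doubling period:
print(2 ** ((2029 - 2011) / 2))   # 512.0    -> "512 times Watson" by 2029
print(2 ** ((2045 - 2011) / 2))   # 131072.0 -> "131,072 times Watson" by 2045
```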

You read way too much into it.
Moore's Law is just an observed phenomenon, seen as a reference point and a goal for further innovation later on. (*)
A similar observation regarding the doubling of clock speed held true for some decades and was also an industry reference point, but it has been untrue for several years now.
Simply because it wasn't feasible to implement commercially.
One observation holds true for every exponential growth: it will halt at a certain point because of the physical reality we live in; nothing 'magical' about it.

    J.

(*) (edit) A 'self-fulfilling prophecy', so to speak.
  • Reply 47 of 66
jragosta Posts: 10,473, member
    It's truly amazing. I wonder… is part of the drive toward this sort of exponential betterment the very existence of Moore's Law? Because you'll definitely see other industries stagnate when one company has such a marked advantage over the others as Intel does here. 

    Possibly some minor effect.

    Clearly, the technology has lent itself to rapid improvements. In addition, the use pattern has created a situation where there was never enough computer power for your needs - so people were constantly demanding faster computers. So you've had consumers constantly clamoring for faster computers - which led the major players to make computer performance one of their primary selling features. That puts pressure on their suppliers to improve their products, as well.

    That is a pretty general phenomenon. Until very recently, most people found that their computers slowed them down - and therefore had a reason to desire a faster computer. Even simple things like word processing and modest spreadsheets were slow enough to interfere with your train of thought. Of course, much of that was crappy coding on the part of Microsoft and others, but the consumer doesn't care. They just needed more speed. Today, most users find that any moderately recent computer can keep up with them with almost no delays for normal work (obviously, high end graphics or video or scientific work is an exception). So computer speed isn't as big a driver as it once was. Even for games, we've gotten to the point where you can play even fairly intense action games full screen with fluid motion, so further improvements are likely to be more subtle.

On the other hand, there is the Pygmalion effect - which is what you are referring to. I've never seen the Pygmalion effect applied to an entire market, but I suppose it's possible that it might have at least some effect. For example, if companies believe that they're going to need to double computer performance every year and a half to keep up, they are probably going to be more willing to invest money in the R&D needed to make that happen.
  • Reply 48 of 66


    I remember in the mid-80s, when they were transitioning from 5 micron rules to 3 microns, and dreaming about someday going to 1 micron, they said the absolute physical limit would be quarter-micron rules (250 nm), because the rate of soft errors due to unavoidable radioactive decays in the semiconductor and packaging would become unacceptable below that.


     


    Well, I guess that was baloney, but 20 nm is what? 200 atoms wide? That's getting down there. Quantum jitters are gonna get you eventually.
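For what it's worth, a quick sanity check on the "200 atoms" guess (a rough sketch; the silicon bond length and lattice constant below are standard textbook values, and which one you divide by changes the answer by roughly a factor of two):

```python
# Rough sanity check on "20 nm is what? 200 atoms wide?"
# Standard values for crystalline silicon:
si_bond_length_nm = 0.235    # Si-Si nearest-neighbour distance
si_lattice_const_nm = 0.543  # edge of the diamond-cubic unit cell

feature_nm = 20.0

print(feature_nm / si_bond_length_nm)    # ~85 bond lengths across the feature
print(feature_nm / si_lattice_const_nm)  # ~37 unit cells across the feature
```

Either way, a 20 nm feature is only a few dozen to roughly a hundred silicon atoms across, so the guess is at least the right order of magnitude.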


     


    "I like coffee, I like tea...the Java Jive's got a hold on me...."

     

  • Reply 49 of 66
jnjnjn Posts: 588, member
    jragosta wrote: »
    Possibly some minor effect.
    ...
    Even simple things like word processing and modest spreadsheets were slow enough to interfere with your train of thought. Of course, much of that was crappy coding on the part of Microsoft and others, but the consumer doesn't care. They just needed more speed. Today, most users find that any moderately recent computer can keep up with them with almost no delays for normal work (obviously, high end graphics or video or scientific work is an exception). So computer speed isn't as big a driver as it once was. Even for games, we've gotten to the point where you can play even fairly intense action games full screen with fluid motion, so further improvements are likely to be more subtle.
    ...

Ha ha, you are suggesting that clock speeds didn't increase because consumers are happy!
You couldn't be more wrong: clock speeds didn't increase because they hit a wall.
Intel and others changed their marketing tune to reflect this.

    J.
  • Reply 50 of 66
jnjnjn Posts: 588, member
    What about the capability to run x86 code -- shouldn't that be an issue for using ARM on Mac?
    As I understand it, today's x86 architecture is CISC, which is "transformed" into RISC for execution... And the "transformation" process is Intel IP.

No. x86 code isn't useful on a touch device (and future notebooks will of course be touch devices).
    The web will probably be sufficient if you have to use MS code.
If I remember correctly, several hardware translators for the x86 instruction set exist.
The Crusoe processor (from Transmeta), for example. And it isn't owned by Intel.
Software translation is another option; recently a Russian team developed a super-effective emulator for x86 code on ARM.
But software or hardware translation is a waste of time if you have no need for it.
The fact that today's x86 chips - as you correctly state - are in fact RISC chips with hardware translation of x86 code indicates the inherent advantage ARM has without all the bloatware needed to implement that.
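To make the "translation" idea concrete, here is a toy sketch of what software translation of x86 code to ARM looks like in principle (Python, purely illustrative; the instruction subset, register mapping and output are simplified and are not how a real translator such as Transmeta's code-morphing layer or the emulator mentioned above actually works):

```python
# Toy illustration of software binary translation: map a tiny subset of x86
# instructions onto ARM-style equivalents with a fixed register mapping.
# Real translators (Transmeta's code-morphing layer, QEMU, etc.) work on
# machine code and handle flags, memory ordering and self-modifying code;
# this sketch does none of that.

REG_MAP = {"eax": "r0", "ebx": "r1", "ecx": "r2", "edx": "r3"}

def translate(x86_line: str) -> str:
    op, *operands = x86_line.replace(",", " ").split()
    ops = [REG_MAP.get(o, o) for o in operands]
    if op == "mov":    # mov dst, src   ->  mov dst, src
        return f"mov {ops[0]}, {ops[1]}"
    if op == "add":    # add dst, src   ->  add dst, dst, src (three-operand ARM form)
        return f"add {ops[0]}, {ops[0]}, {ops[1]}"
    if op == "imul":   # imul dst, src  ->  mul dst, dst, src
        return f"mul {ops[0]}, {ops[0]}, {ops[1]}"
    raise NotImplementedError(op)

for line in ["mov eax, 5", "mov ebx, 7", "add eax, ebx", "imul eax, ecx"]:
    print(f"{line:<14} ->  {translate(line)}")
```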

    J.
  • Reply 51 of 66
solipsismx Posts: 19,566, member
Put the A6'X'(?) with more GPU cores and a higher-clocked CPU... it should be a very impressive performer in the next iPad (4?) (New, new ;)

    Consider 32nm + more A15-based goodness + Img Tech Rogue + single backlight with GG2 and in-cell tech to reduce thickness and I think you could have a much more powerful iPad that is even thinner and lighter than the iPad 2 and yet lasts longer with that Retina Display.
  • Reply 52 of 66
solipsismx Posts: 19,566, member
    jnjnjn wrote: »
No. x86 code isn't useful on a touch device (and future notebooks will of course be touch devices).
    The web will probably be sufficient if you have to use MS code.
    If I remember correctly several hardware translators for the x86 instruction set exist.
The Crusoe processor (from Transmeta), for example. And it isn't owned by Intel.
    Software translation is another option, recently a Russian team developed a super effective emulator for x86 code on ARM.
    But software or hardware translation is a waste of time if you have no need for it.
    The fact that today's x86 chips - as you correctly state - are in fact RISC chips with hardware translation of x86 code, indicates the inherent advantage ARM has without all the bloatware needed to implement that.
    J.

    1) Emulation isn't virtualization.

2) Desktop OSes eschewing the physical keyboard for a virtual one is a major reason why Windows tablets didn't take off despite their decade-long head start over the iPad.

3) Macs running x86 is still an important feature for many users. Emulation will not replace this. You're more likely to see a service like Amazon's EC2 allowing RDP to a virtualized OS become the de facto way before you see emulation of x86 code on ARM as the norm.
  • Reply 53 of 66
jragosta Posts: 10,473, member
    jnjnjn wrote: »
    Ha ha, you are suggesting that clock speeds didn't increase because consumers are happy!
You couldn't be more wrong: clock speeds didn't increase because they hit a wall.
Intel and others changed their marketing tune to reflect this.
    J.

    Where in the world did I say that clock speeds didn't increase because consumers were happy? Clearly, there are technical limitations. But in spite of the clock speed wall, there was still a major push to increase performance.

    My point (which you entirely missed, obviously) is that consumers were putting a great deal more pressure on PC makers for faster computers a decade ago than they are today. Today's computers are fast enough for most people so it's not a desperate waiting game for the faster chips like it was a decade ago. But then, given the drivel that you post, you were probably still in diapers a decade ago.

That is a very large part of the reason why computer sales are declining. While the economy hasn't helped, computer sales continued to grow at double-digit rates even in previous recessions. I believe that the current slowdown is not so much due to the recession as due to the fact that most people are reasonably content with the speed of their PCs.
  • Reply 54 of 66
wizard69 Posts: 13,377, member
    jnjnjn wrote: »
No. x86 code isn't useful on a touch device (and future notebooks will of course be touch devices).
You blew your credibility right here. We have touch devices and we have notebooks right now. They Steve different purposes and trying to turn a notebook into a tablet or Touch device is foolishness. Beyond that the statement about x86 code is ridiculous.
    The web will probably be sufficient if you have to use MS code.
Actually the web means nothing in this context. We are talking legacy code here, some of which hasn't even been written yet.
    If I remember correctly several hardware translators for the x86 instruction set exist.
The Crusoe processor (from Transmeta), for example. And it isn't owned by Intel.
    Software translation is another option, recently a Russian team developed a super effective emulator for x86 code on ARM.
Yeah, the guy has achieved 40% of the speed of native ARM code. That is so damn exciting I can't get over it.
    But software or hardware translation is a waste of time if you have no need for it.
    True. What I believe many here, myself included, are saying is that native i86 code execution is still an extremely important capability for laptops.
    The fact that today's x86 chips - as you correctly state - are in fact RISC chips with hardware translation of x86 code, indicates the inherent advantage ARM has without all the bloatware needed to implement that.
    J.
I don't think anybody here dismisses the fact that ARM has architectural advantages due to the minimal need to support legacy instructions and features. Frankly, I've never understood why Intel hasn't cut the cord and simply removed legacy hardware features.
  • Reply 55 of 66
wizard69 Posts: 13,377, member
    jragosta wrote: »
    Where in the world did I say that clock speeds didn't increase because consumers were happy? Clearly, there are technical limitations. But in spite of the clock speed wall, there was still a major push to increase performance.
Actually, things didn't stall; they just dwelled for a bit. We now have processors that can hit 4 GHz fairly easily. That is pretty quick considering the architectures involved.
    My point (which you entirely missed, obviously) is that consumers were putting a great deal more pressure on PC makers for faster computers a decade ago than they are today. Today's computers are fast enough for most people so it's not a desperate waiting game for the faster chips like it was a decade ago. But then, given the drivel that you post, you were probably still in diapers a decade ago.
The demand for faster hardware is still there when it comes to PC users. The real problem today is that people are getting better results from things like the iPad. Advanced PC users still demand better performance, but the market for those sorts of users has dried up as other devices solve their computing needs. I suspect we will see an even bigger split in the future where many households won't have traditional computers at all.
    That is a very large part of the reason why computer sales are declining. While the economy hasn't helped, even in previous recessions, computer sales continued to grow at double digit rates. I believe that the current slow down is not so much due to the recession as due to the fact that most people are reasonably content with the speed of their PCs.
Nope, people are finding their needs solved by other devices. Most of what would have been PC sales have gone to smartphone and tablet buyers. It is more a case of PCs being abandoned than anything.
  • Reply 56 of 66
jnjnjn Posts: 588, member
    solipsismx wrote: »
    1) Emulation isn't virtualization.
    2) Desktop OSes eschewing the physical keyboard for a virtual one is a major reason why Win tablet didn't take off in the decade head start over the iPad.
    3) Macs on x86 is still an important feature for many users. Emulation will not replace this. You're more likely to see a service like Amazon's EC2 allow for RDP to a virtualized OS be the de facto way before you see emulation x86 code on ARM as the norm.

    1) Didn't say it was, wasn't talking about that...

2) That was before Apple demonstrated iOS. Notebooks will be replaced by larger Pads (Tablets), say A4 format. iOS will replace Mac OS X, but it will of course be a mix (with a lot of new features). In the somewhat longer run iMacs will be replaced by Pads too.

    3) Didn't say it wasn't. Emulation can replace it, think of Rosetta for example. But I didn't say that is necessarily the way to do it, I was talking about hardware translation and software translation. A hardware translation a la Crusoe could well be effective enough.
It isn't worth the effort, though.

    J.
  • Reply 57 of 66
jnjnjn Posts: 588, member
    wizard69 wrote: »
You blew your credibility right here. We have touch devices and we have notebooks right now. They Steve different purposes and trying to turn a notebook into a tablet or Touch device is foolishness. Beyond that the statement about x86 code is ridiculous.
    Actually the web means nothing in his context. We are talking legacy code here some of which hasn't been written.
    Yeah the guy has achieved 40% of the speed of native ARM code. . That is so damn exciting I can't get over it.
    True. What I believe many here, myself included, are saying is that native i86 code execution is still an extremely important capability for laptops.
    I don't think anybody here dismisses the fact that ARM has architectural advantages due to the minimal need to support legacy instructions and features. Frankly I've never understood why intel hasn't cut the cord and simply remove legacy hardware features.

Read my other post; it should clarify things for you. Maybe I am even less credible to you now?
(You make a funny mistake: "They Steve different purposes and trying to turn a notebook into a tablet or Touch ..." — wait, no em-dash — it should be 'serve' instead of 'Steve'. You know, Steve Jobs wasn't always right; even Apple doesn't think so, look at the iPad mini. Times change.)

    J.
  • Reply 58 of 66
jnjnjn Posts: 588, member
    jragosta wrote: »
    Where in the world did I say that clock speeds didn't increase because consumers were happy? Clearly, there are technical limitations. But in spite of the clock speed wall, there was still a major push to increase performance.
    My point (which you entirely missed, obviously) is that consumers were putting a great deal more pressure on PC makers for faster computers a decade ago than they are today.
My point is that the change in the chip roadmap was caused by physical limitations and has nothing to do with (and didn't coincide with) a change in the desire to push performance.
The latter is caused by the former, not the other way around.
    jragosta wrote: »
    ... But then, given the drivel that you post, you were probably still in diapers a decade ago.
Again, you're completely wrong.

    J.
  • Reply 59 of 66
solipsismx Posts: 19,566, member
    jnjnjn wrote: »
    1) Didn't say it was, wasn't talking about that...
2) That was before Apple demonstrated iOS. Notebooks will be replaced by larger Pads (Tablets), say A4 format. iOS will replace Mac OS X, but it will of course be a mix (with a lot of new features). In the somewhat longer run iMacs will be replaced by Pads too.
    3) Didn't say it wasn't. Emulation can replace it, think of Rosetta for example. But I didn't say that is necessarily the way to do it, I was talking about hardware translation and software translation. A hardware translation a la Crusoe could well be effective enough.
    Isn't worth the effort though.
    J.

    So much wrongness on so many levels.
  • Reply 60 of 66
jnjnjn Posts: 588, member
    solipsismx wrote: »
    So much wrongness on so many levels.

Hmm, it's fine if you have another opinion; it isn't fine to state that something is wrong without indicating what it is.
    Saying something is wrong doesn't make it so.

    J.