AppleInsider › Forums › Mac Hardware › Future Apple Hardware › Rumor: TSMC to build quad-core 20nm chips for Apple by late 2013

Rumor: TSMC to build quad-core 20nm chips for Apple by late 2013 - Page 2

post #41 of 67
Quote:
Originally Posted by MacBook Pro View Post


Not a correction... There is, however, something much larger at work than Moore's Law.
I believe Tallest Skil is essentially referring to the same ideas I am promoting:
"Exponential growth continued through paradigm shifts from vacuum tubes to discrete transistors to integrated circuits."
The resources underlying the exponential growth of an evolutionary process...

But where does Moore’s Law come from? What is behind this remarkably predictable phenomenon? I have seen relatively little written about the ultimate source of this trend. Is it just “a set of industry expectations and goals,” as Randy Isaac, head of basic science at IBM contends? Or is there something more profound going on?

 

It's intriguing that a law that cannot be a law in a physical sense just continues to behave as one nearly 50 years after the original observation (Gordon Moore, 1965). What will always be frightening, however (or exciting if one is a masochist), is an exponential curve on a logarithmic plot lol!!!

Where are we on the curve? We'll know once it goes asymptotic!
Reply
post #42 of 67

Remember what happened to Apple when they "dropped" Google.
 

"Like I said before, share price will dip into the $400."  - 11/21/12 by Galbi

Reply

post #43 of 67
Quote:
Originally Posted by Galbi View Post

Remember what happened to Apple when they "dropped" Google.
 

 

Sure, they gained further admiration for reducing the evil empire's hold on aspects of their business!

Where are we on the curve? We'll know once it goes asymptotic!
Reply
post #44 of 67
Originally Posted by Galbi View Post
Remember what happened to Apple when they "dropped" Google.

 

Yeah, they created a better mapping system that will be the standard in the future and rid their OS of the native bloat and time-waste of a central, worldwide home video repository. 

 

If only other companies would drop Google, perhaps they'd wind up in better places, too.

Originally posted by Marvin

Even if [the 5.5” iPhone exists], it doesn’t deserve to.
Reply

post #45 of 67
Quote:
Originally Posted by wizard69 View Post


Does the iPad need more GPU cores?

Yes.

iPod nano 5th Gen 8GB Orange, iPad 3rd Gen WiFi 32GB White
MacBook Pro 15" Core i7 2.66GHz 8GB RAM 120GB Intel 320M
Mac mini Core 2 Duo 2.4GHz 8GB RAM, iPhone 5 32GB Black

Reply

post #46 of 67
Quote:
Originally Posted by Aizmov View Post

Yes.

Even if they double performance in other ways?

I would agree the ultimate answer is yes, but we still live in a physical world. As such, the next iPad still needs to be a balanced design given the technology available. If the goal is better performance, then core counts aren't important if Apple can get there via other avenues.
post #47 of 67
Quote:
Originally Posted by MacBook Pro View Post

Not a correction. I was simply providing evidence to support what you already believed but had some right to be skeptical of, as 10 nm has been touted as the "breaking point" for Moore's Law for more than a decade. Incorrectly, I believe, incidentally. There is, however, something much larger at work than Moore's Law.
I believe Tallest Skil is essentially referring to the same ideas I am promoting:
"Exponential growth continued through paradigm shifts from vacuum tubes to discrete transistors to integrated circuits."
The resources underlying the exponential growth of an evolutionary process are relatively unbounded:
(i) The (ever growing) order of the evolutionary process itself. Each stage of evolution provides more powerful tools for the next. In biological evolution, the advent of DNA allowed more powerful and faster evolutionary “experiments.” Later, setting the “designs” of animal body plans during the Cambrian explosion allowed rapid evolutionary development of other body organs such as the brain. Or to take a more recent example, the advent of computer assisted design tools allows rapid development of the next generation of computers.
(ii) The “chaos” of the environment in which the evolutionary process takes place and which provides the options for further diversity. In biological evolution, diversity enters the process in the form of mutations and ever changing environmental conditions. In technological evolution, human ingenuity combined with ever changing market conditions keep the process of innovation going.
Ray Kurzweil predicts that the meta-trend Moore's Law attempts to describe will continue beyond 2020. The result in 2029 should be a computer 512 times more powerful than Watson. If the trend continues, then by 2045 we will have computers 131,072 times more powerful than Watson.
From The Law of Accelerating Returns
March 7, 2001 by Ray Kurzweil
It is important to note that Moore’s Law of Integrated Circuits was not the first, but the fifth paradigm to provide accelerating price-performance. Computing devices have been consistently multiplying in power (per unit of time) from the mechanical calculating devices used in the 1890 U.S. Census, to Turing’s relay-based “Robinson” machine that cracked the Nazi enigma code, to the CBS vacuum tube computer that predicted the election of Eisenhower, to the transistor-based machines used in the first space launches, to the integrated-circuit-based personal computer which I used to dictate (and automatically transcribe) this essay.
But I noticed something else surprising. When I plotted the 49 machines on an exponential graph (where a straight line means exponential growth), I didn’t get a straight line. What I got was another exponential curve. In other words, there’s exponential growth in the rate of exponential growth. Computer speed (per unit cost) doubled every three years between 1910 and 1950, doubled every two years between 1950 and 1966, and is now doubling every year.
But where does Moore’s Law come from? What is behind this remarkably predictable phenomenon? I have seen relatively little written about the ultimate source of this trend. Is it just “a set of industry expectations and goals,” as Randy Isaac, head of basic science at IBM contends? Or is there something more profound going on?

You read way too much into it.
Moore's Law is just an observed phenomenon, seen as a reference point and a goal for further innovation later on. (*)
A similar observation regarding the doubling of clock speed, which held true for some decades and was also an industry reference point, has already been untrue for some years.
Simply because it wasn't feasible to implement commercially.
One observation holds true for every exponential growth: it will halt at a certain point because of the physical reality we live in, nothing 'magical' to it.

J.

(*) (edit) A 'self-fulfilling prophecy', so to say.
Edited by jnjnjn - 10/13/12 at 6:52am
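For what it's worth, the multipliers in the Kurzweil passage quoted above are straightforward powers of two: 512 is 2^9 and 131,072 is 2^17, i.e. nine and seventeen doublings from a Watson-class baseline. The baseline year and the one-doubling-per-year cadence are the post's assumptions, not established facts; a few lines of Python just make the arithmetic explicit:

```python
def growth_factor(doublings: int) -> int:
    """Performance multiplier after the given number of doublings,
    assuming the post's premise of one doubling per period."""
    return 2 ** doublings

# Nine doublings yield the 512x figure cited for 2029;
# seventeen doublings yield the 131,072x figure cited for 2045.
print(growth_factor(9))    # 512
print(growth_factor(17))   # 131072
```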
post #48 of 67
Quote:
Originally Posted by Tallest Skil View Post

It's truly amazing. I wonder… is part of the drive toward this sort of exponential betterment the very existence of Moore's Law? Because you'll definitely see other industries stagnate when one company has such a marked advantage over the others as Intel does here. 

Possibly some minor effect.

Clearly, the technology has lent itself to rapid improvements. In addition, the use pattern created a situation where there was never enough computing power for people's needs, so consumers were constantly clamoring for faster computers - which led the major players to make performance one of their primary selling features. That puts pressure on their suppliers to improve their products as well.

That is a pretty general phenomenon. Until very recently, most people found that their computers slowed them down - and therefore had a reason to desire a faster computer. Even simple things like word processing and modest spreadsheets were slow enough to interfere with your train of thought. Of course, much of that was crappy coding on the part of Microsoft and others, but the consumer doesn't care. They just needed more speed. Today, most users find that any moderately recent computer can keep up with them with almost no delays for normal work (obviously, high end graphics or video or scientific work is an exception). So computer speed isn't as big a driver as it once was. Even for games, we've gotten to the point where you can play even fairly intense action games full screen with fluid motion, so further improvements are likely to be more subtle.

On the other hand, there is the Pygmalion effect - which is what you are referring to. I've never seen the Pygmalion effect applied to an entire market, but I suppose it might have at least some effect. For example, if companies believe they're going to need to double computer performance every year and a half to keep up, they are probably going to be more willing to invest money in the R&D needed to make that happen.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #49 of 67

I remember in the mid-80s, when they were transitioning from 5 micron rules to 3 microns, and dreaming about someday going to 1 micron, they said the absolute physical limit would be quarter-micron rules (250 nm), because the rate of soft errors due to unavoidable radioactive decays in the semiconductor and packaging would become unacceptable below that.

 

Well, I guess that was baloney, but 20 nm is what? 200 atoms wide? That's getting down there. Quantum jitters are gonna get you eventually.

 

"I like coffee, I like tea...the Java Jive's got a hold on me...."
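As a sanity check on the "how many atoms wide" guess above: taking silicon's conventional lattice constant as roughly 0.543 nm and the nearest-neighbor Si-Si spacing as roughly 0.235 nm (approximate textbook values, quoted from memory), a 20 nm feature spans on the order of 85 atoms rather than 200 - though the poster's larger point, that quantum effects loom at these scales, stands either way.

```python
# Back-of-envelope estimate of atoms across a 20 nm feature in silicon.
# Material constants are approximate textbook values, not measured data.
FEATURE_NM = 20.0
SI_LATTICE_CONSTANT_NM = 0.543   # edge of silicon's conventional cubic cell
SI_BOND_LENGTH_NM = 0.235        # nearest-neighbor Si-Si distance

atoms_by_bond = FEATURE_NM / SI_BOND_LENGTH_NM    # ~85 atoms across
unit_cells = FEATURE_NM / SI_LATTICE_CONSTANT_NM  # ~37 unit cells across

print(f"~{atoms_by_bond:.0f} atoms across, ~{unit_cells:.0f} unit cells")
```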
 

post #50 of 67
Quote:
Originally Posted by jragosta View Post

Possibly some minor effect.
...
Even simple things like word processing and modest spreadsheets were slow enough to interfere with your train of thought. Of course, much of that was crappy coding on the part of Microsoft and others, but the consumer doesn't care. They just needed more speed. Today, most users find that any moderately recent computer can keep up with them with almost no delays for normal work (obviously, high end graphics or video or scientific work is an exception). So computer speed isn't as big a driver as it once was. Even for games, we've gotten to the point where you can play even fairly intense action games full screen with fluid motion, so further improvements are likely to be more subtle.
...

Ha ha, you are suggesting that clock speeds didn't increase because consumers were happy!
You couldn't be more wrong: clock speeds didn't increase because they hit a wall.
Intel and others changed their marketing tune to reflect this.

J.
post #51 of 67
Quote:
Originally Posted by Dick Applebaum View Post

What about the capability to run x86 code -- shouldn't that be an issue for using ARM on Mac?
As I understand it, today's x86 architecture is CISC, which is "transformed" into RISC for execution... And the "transformation" process is Intel IP.

No. x86 code isn't useful on a touch device (and future notebooks will of course be touch devices).
The web will probably be sufficient if you have to use MS code.
If I remember correctly, several hardware translators for the x86 instruction set exist.
The Crusoe processor (from Transmeta), for example. And it isn't owned by Intel.
Software translation is another option; recently a Russian team developed a very effective emulator for x86 code on ARM.
But software or hardware translation is a waste of time if you have no need for it.
The fact that today's x86 chips - as you correctly state - are in fact RISC chips with hardware translation of x86 code indicates the inherent advantage ARM has without all the bloat needed to implement that.

J.
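To make the translation-overhead point concrete, here is a deliberately toy sketch (an invented three-line mini-ISA, nothing to do with real x86 or ARM encodings) of why pure software interpretation runs well below native speed: every guest instruction costs the host several extra steps for fetch, decode, and dispatch, which is one intuition behind emulators landing at a fraction of native performance.

```python
# Toy interpreter for an invented mini-ISA. Each guest instruction is a
# (opcode, dst, src) tuple; the host pays fetch + decode/dispatch overhead
# on top of the actual work, unlike natively compiled code.
def run(program, regs):
    host_ops = 0
    for op, dst, src in program:          # fetch the next guest instruction
        host_ops += 1
        if op == "mov":                   # decode and dispatch on the opcode
            regs[dst] = src
        elif op == "add":
            regs[dst] += regs[src]
        else:
            raise ValueError(f"unknown opcode {op!r}")
        host_ops += 2                     # tally the decode/dispatch overhead
    return regs, host_ops

regs, cost = run([("mov", "a", 5), ("mov", "b", 7), ("add", "a", "b")], {})
print(regs["a"], cost)   # 3 guest instructions cost 9 host steps here
```

A real translator amortizes this by caching translated blocks of host code rather than re-decoding every time, which is essentially what Transmeta's Crusoe did with its Code Morphing firmware.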
post #52 of 67
Quote:
Originally Posted by Lemon Bon Bon. View Post

Put the A6'X'(?) with more GPU cores and a higher-clocked CPU... it should be a very impressive performer in the next iPad (4?) (New, new...)

Consider 32nm + more A15-based goodness + Img Tech Rogue + single backlight with GG2 and in-cell tech to reduce thickness and I think you could have a much more powerful iPad that is even thinner and lighter than the iPad 2 and yet lasts longer with that Retina Display.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

post #53 of 67
Quote:
Originally Posted by jnjnjn View Post

No. x86 code isn't useful on a touch device (and future notebooks will of course be touch devices).
The web will probably be sufficient if you have to use MS code.
If I remember correctly, several hardware translators for the x86 instruction set exist.
The Crusoe processor (from Transmeta), for example. And it isn't owned by Intel.
Software translation is another option; recently a Russian team developed a very effective emulator for x86 code on ARM.
But software or hardware translation is a waste of time if you have no need for it.
The fact that today's x86 chips - as you correctly state - are in fact RISC chips with hardware translation of x86 code indicates the inherent advantage ARM has without all the bloat needed to implement that.
J.

1) Emulation isn't virtualization.

2) Desktop OSes eschewing the physical keyboard for a virtual one is a major reason why Windows tablets didn't take off despite a decade's head start over the iPad.

3) Macs on x86 are still an important feature for many users. Emulation will not replace this. You're more likely to see a service like Amazon's EC2 allowing RDP to a virtualized OS become the de facto way before you see emulation of x86 code on ARM as the norm.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

post #54 of 67
Quote:
Originally Posted by jnjnjn View Post

Ha ha, you are suggesting that clock speeds didn't increase because consumers were happy!
You couldn't be more wrong: clock speeds didn't increase because they hit a wall.
Intel and others changed their marketing tune to reflect this.
J.

Where in the world did I say that clock speeds didn't increase because consumers were happy? Clearly, there are technical limitations. But in spite of the clock speed wall, there was still a major push to increase performance.

My point (which you entirely missed, obviously) is that consumers were putting a great deal more pressure on PC makers for faster computers a decade ago than they are today. Today's computers are fast enough for most people so it's not a desperate waiting game for the faster chips like it was a decade ago. But then, given the drivel that you post, you were probably still in diapers a decade ago.

That is a very large part of the reason why computer sales are declining. While the economy hasn't helped, even in previous recessions computer sales continued to grow at double-digit rates. I believe the current slowdown is due not so much to the recession as to the fact that most people are reasonably content with the speed of their PCs.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
Reply
post #55 of 67
Quote:
Originally Posted by jnjnjn View Post

No. x86 code isn't usefull on a touch device (and future notebooks will of course be touch devices).
You blew your credibility right here. We have touch devices and we have notebooks right now. They Steve different purposes and trying to turn a notebook into a tablet or Touch device is foolishness. Beyond that, the statement about x86 code is ridiculous.
Quote:
The web will probably be sufficient if you have to use MS code.
Actually, the web means nothing in this context. We are talking legacy code here, some of which hasn't even been written yet.
Quote:
If I remember correctly several hardware translators for the x86 instruction set exist.
The crusoe processor (from Transmeta) for example. And it isn't owned by Intel.
Software translation is another option, recently a Russian team developed a super effective emulator for x86 code on ARM.
Yeah, the guy has achieved 40% of the speed of native ARM code. That is so damn exciting I can't get over it.
Quote:
But software or hardware translation is a waste of time if you have no need for it.
True. What I believe many here, myself included, are saying is that native x86 code execution is still an extremely important capability for laptops.
Quote:
The fact that today's x86 chips - as you correctly state - are in fact RISC chips with hardware translation of x86 code, indicates the inherent advantage ARM has without all the bloatware needed to implement that.
J.
I don't think anybody here disputes that ARM has architectural advantages due to the minimal need to support legacy instructions and features. Frankly, I've never understood why Intel hasn't cut the cord and simply removed legacy hardware features.
post #56 of 67
Quote:
Originally Posted by jragosta View Post

Where in the world did I say that clock speeds didn't increase because consumers were happy? Clearly, there are technical limitations. But in spite of the clock speed wall, there was still a major push to increase performance.
Actually, things didn't stall; they just plateaued for a bit. We now have processors that can hit 4GHz fairly easily. That is pretty quick considering the architectures involved.
Quote:
My point (which you entirely missed, obviously) is that consumers were putting a great deal more pressure on PC makers for faster computers a decade ago than they are today. Today's computers are fast enough for most people so it's not a desperate waiting game for the faster chips like it was a decade ago. But then, given the drivel that you post, you were probably still in diapers a decade ago.
The demand for faster hardware is still there when it comes to PC users. The real problem today is that people are getting better results from things like the iPad. Advanced PC users still demand better performance, but the market for those sorts of users has dried up as other devices solve their computing needs. I suspect we will see an even bigger split in the future, where many households won't have traditional computers at all.
Quote:
That is a very large part of the reason why computer sales are declining. While the economy hasn't helped, even in previous recessions, computer sales continued to grow at double digit rates. I believe that the current slow down is not so much due to the recession as due to the fact that most people are reasonably content with the speed of their PCs.
Nope, people are finding their needs solved by other devices. Most of what would have been PC sales have gone to smartphone and tablet buyers. It is more a case of PCs being abandoned than anything.
post #57 of 67
Quote:
Originally Posted by SolipsismX View Post

1) Emulation isn't virtualization.
2) Desktop OSes eschewing the physical keyboard for a virtual one is a major reason why Windows tablets didn't take off despite a decade's head start over the iPad.
3) Macs on x86 are still an important feature for many users. Emulation will not replace this. You're more likely to see a service like Amazon's EC2 allowing RDP to a virtualized OS become the de facto way before you see emulation of x86 code on ARM as the norm.

1) Didn't say it was, wasn't talking about that...

2) That was before Apple demonstrated iOS. Notebooks will be replaced by larger Pads (tablets), say A4 format. iOS will replace Mac OS X, but it will of course be a mix (with a lot of new features). In the somewhat longer run, iMacs will be replaced by Pads too.

3) Didn't say it wasn't. Emulation can replace it; think of Rosetta, for example. But I didn't say that is necessarily the way to do it; I was talking about hardware translation and software translation. A hardware translation a la Crusoe could well be effective enough.
It isn't worth the effort, though.

J.
post #58 of 67
Quote:
Originally Posted by wizard69 View Post

You blew your credibility right here. We have touch devices and we have notebooks right now. They Steve different purposes and trying to turn a notebook into a tablet or Touch device is foolishness. Beyond that, the statement about x86 code is ridiculous.
Actually, the web means nothing in this context. We are talking legacy code here, some of which hasn't even been written yet.
Yeah, the guy has achieved 40% of the speed of native ARM code. That is so damn exciting I can't get over it.
True. What I believe many here, myself included, are saying is that native x86 code execution is still an extremely important capability for laptops.
I don't think anybody here disputes that ARM has architectural advantages due to the minimal need to support legacy instructions and features. Frankly, I've never understood why Intel hasn't cut the cord and simply removed legacy hardware features.

Read my other post to clarify it for you. Maybe I am even less credible to you now?
(You made a funny mistake: "They Steve different purposes and trying to turn a notebook into a tablet or Touch ...", it should be 'serve' instead of 'Steve'. You know, Steve Jobs wasn't always right; even Apple doesn't think so, look at the iPad mini. Times change.)

J.
post #59 of 67
Quote:
Originally Posted by jragosta View Post

Where in the world did I say that clock speeds didn't increase because consumers were happy? Clearly, there are technical limitations. But in spite of the clock speed wall, there was still a major push to increase performance.
My point (which you entirely missed, obviously) is that consumers were putting a great deal more pressure on PC makers for faster computers a decade ago than they are today.
My point is that the change in the chip roadmap was caused by physical limitations and has nothing to do with (and didn't coincide with) a change in the desire to push performance.
The latter is caused by the former, not the other way around.
Quote:
Originally Posted by jragosta View Post

... But then, given the drivel that you post, you were probably still in diapers a decade ago.
Again, you're completely wrong.

J.
post #60 of 67
Quote:
Originally Posted by jnjnjn View Post

1) Didn't say it was, wasn't talking about that...
2) That was before Apple demonstrated iOS. Notebooks will be replaced by larger Pads (tablets), say A4 format. iOS will replace Mac OS X, but it will of course be a mix (with a lot of new features). In the somewhat longer run, iMacs will be replaced by Pads too.
3) Didn't say it wasn't. Emulation can replace it; think of Rosetta, for example. But I didn't say that is necessarily the way to do it; I was talking about hardware translation and software translation. A hardware translation a la Crusoe could well be effective enough.
It isn't worth the effort, though.
J.

So much wrongness on so many levels.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

post #61 of 67
Quote:
Originally Posted by SolipsismX View Post

So much wrongness on so many levels.

Hmm, it's fine if you have another opinion; it isn't fine to state that something is wrong without indicating what.
Saying something is wrong doesn't make it so.

J.
post #62 of 67
Quote:
Originally Posted by jnjnjn View Post

Hmm, it's fine if you have another opinion; it isn't fine to state that something is wrong without indicating what.
Saying something is wrong doesn't make it so.
J.

You can have an opinion but you've stated yours as fact. You didn't state "I think iOS will replace Mac OS X and here is how and when I think it will happen and why..." You stated "iOS will replace Mac OS X." So please, have an opinion, which will mean editing everything you've written in this thread.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

post #63 of 67
Quote:
Originally Posted by SolipsismX View Post

You can have an opinion but you've stated yours as fact. You didn't state "I think iOS will replace Mac OS X and here is how and when I think it will happen and why..." You stated "iOS will replace Mac OS X." So please, have an opinion, which will mean editing everything you've written in this thread.

I see. Everything has to be spelled out for you.
I'm not playing that game.
Have a nice day.

J.
post #64 of 67
Quote:
Originally Posted by jnjnjn View Post

I see. Everything has to be spelled out for you.
I'm not playing that game.
Have a nice day.
J.

I certainly can't read your mind, and your backtracking with "that's not what I meant", despite you and only you having control over the words you type, wears very thin. Learn to communicate or go away.

"The real haunted empire?  It's the New York Times." ~SockRolid

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

Reply

post #65 of 67
Quote:
Originally Posted by SolipsismX View Post

I certainly can't read your mind, and your backtracking with "that's not what I meant", despite you and only you having control over the words you type, wears very thin. Learn to communicate or go away.

You do like to have the last word, even if you quote me completely wrong.
I would advise you to improve your reading skills.

J.
post #66 of 67
Originally Posted by SolipsismX View Post
So much wrongness on so many levels.

 

On what parts, 1 and 3? 

 

I, at least, agree partially with his point 2. Partially. iOS won't replace OS X by a long shot, and physical keyboards will stick around for a good while.

Originally posted by Marvin

Even if [the 5.5” iPhone exists], it doesn’t deserve to.
Reply

post #67 of 67

Personally, I think the temptation to upgrade is getting weaker. My Macs just aren't breaking, unless you count the odd hard drive, but this is not true of my family's cheaper Windows PCs. I just wonder if a lot of Mac users feel the same. If so, I would imagine Apple knows and is looking to target new users. So, you know, I would imagine the upgrade segment of the iMac, and possibly laptop, market would be falling soon if it isn't already.
