
Apple threatened Intel with 'wake-up call' over chip power consumption

post #1 of 61
Thread Starter 
Officials at Apple were at one point so unsatisfied with the power consumption levels of Intel's processors that they threatened to end their partnership with the chipmaker if the problems were not addressed.

The revelation was shared by Greg Welch, director of Intel's Ultrabook group, with The Wall Street Journal. He said that Apple gave Intel a "real wake-up call" when the Mac maker threatened to end their business relationship.

Apple officials told Intel that the chipmaker needed to "drastically slash its power consumption," or else Apple would turn elsewhere for chips. The threats were said to have helped spur Intel's interest in creating its new Ultrabook specification.

As announced earlier this week, Intel Capital, the strategic investment arm of the world's largest chipmaker, will put $300 million into a new "Ultrabook fund" that will invest in new technologies. Intel is pushing manufacturers to build thin-and-light notebooks that aim to challenge Apple's MacBook Air.

As Intel has pushed to get its Ultrabook specification off the ground, the chipmaker's partners are said to have struggled to keep their ultraportable notebooks under $1,000. Apple's entry-level 11.6-inch MacBook Air sells for $999, and is one of the company's most popular notebooks.

For years, rumors suggested that Apple would transition the iPhone to the Atom architecture, but the change failed to materialize as Intel struggled with managing power consumption. The Atom processor was also said to be utilized in early prototypes of the iPad as far back as 2008.



Unsatisfied with the power consumption levels of Intel's Atom platform, Apple instead turned to ARM for its iPhone and iPad processors. The company also bought ARM design companies PA Semi and Intrinsity, both key acquisitions that allowed Apple to create the custom A4 processor found in the iPhone 4 and first-generation iPad, as well as the dual-core A5 processor found in the iPad 2.

As for its Mac lineup, as recently as 2010 there were indications that Apple and Intel's rival AMD were engaged in initial discussions about the possibility of Apple adopting AMD chips. More recently, there has even been speculation that Apple could merge iOS with Mac OS X with Macs based on an anticipated A6 processor starting in 2012.
post #2 of 61
I'm at at my computer amazed that the proof reading isn't a little better. C'mon guys!

UPDATE: You fixed it. Quick!
post #3 of 61
It's going to be extremely interesting to see what happens with the A6 processor. Intel have had such incredible dominance for so many years (with the exception of a couple of blips here and there, they have generally been in the technology lead) that I wonder if they will have the know-how to deal with shrinking market share if it were to happen.
post #4 of 61
20+ years of this and it took a company that didn't even use Intel chips for most of its existence to tell Intel to get its act together.

There should be more gratitude from the whiners and iHaters towards Apple for shaking up the industry and getting things done!

About time Intel!
post #5 of 61
Quote:
Originally Posted by sflocal View Post

20+ years of this and it took a company that didn't even use Intel chips for most of its existence to tell Intel to get its act together.

There should be more gratitude from the whiners and iHaters towards Apple for shaking up the industry and getting things done!

About time Intel!

It does beg the question of what HP and Dell and such like had been doing over the years. They were buying the sort of quantities Apple do - what did they use their leverage for, or did they just blindly use whatever Intel told them to?
post #6 of 61
I honestly don't see how the A6 could ever produce enough raw horsepower for even the MBA let alone the real workhorses, so that rumor still seems insane...

But it is more insane that Intel needed Apple to tell them that power consumption (and heat) are big concerns for laptops. Even the newly improved chips like the one in my laptop run far too hot. And they needed to be told this?

...and that's one of the leading tech companies. Pathetic.
OSX, because making UNIX user friendly is easier than debugging windows.
post #7 of 61
I don't know if Intel ever really targeted the phone market very seriously or to any degree of success with their x86 derivative chips. You can wedge something in, but that wasn't the focus of the architecture. It's been a long time since ARM was a serious contender with desktop & notebooks. Trying to meet both kinds of devices with the same instruction set means compromises somewhere, trying to be a jack of both trades but master of neither.
post #8 of 61
Quote:
Originally Posted by PaulMJohnson View Post

It does beg the question of what HP and Dell and such like had been doing over the years. They were buying the sort of quantities Apple do - what did they use their leverage for, or did they just blindly use whatever Intel told them to?

Reading between the lines, it appears you know the answer already, you crafty fellow.
post #9 of 61
LOL, if I was Intel I'd tell whoever complains about my chips to write more efficient software.
post #10 of 61
Be careful what you wish for. Apple leans on Intel, Intel responds, but then goes out and leads the opposing army. A little passive-aggressive f.u. from Intel--"you can threaten us, but we'll make you pay." Maybe Apple should be continuing those talks with AMD.
A.k.a. AppleHead on other forums.
post #11 of 61
It's not so hard to imagine a system that has multiple ARM chips operating at only a fraction of the wattage of an Intel system. Three ARM chips each with a quad core could easily outperform the solo Intel chip I bet. Their best chip consumes 14 watts according to the Wall Street Journal today. ARM's chips consume a fraction of 1 watt. Hell, four or FIVE ARM chips on a board would still HEAVILY outperform Intel's lowest power offerings.

Everyone always wants to just think "one chip, multi-core" but what about "multiple chips, with multiple cores."
post #12 of 61
Quote:
Originally Posted by shen View Post

I honestly don't see how the A6 could ever produce enough raw horsepower for even the MBA let alone the real workhorses, so that rumor still seems insane...

But it is more insane that Intel needed Apple to tell them that power consumption (and heat) are big concerns for laptops. Even the newly improved chips like the one in my laptop run far too hot. And they needed to be told this?

...and that's one of the leading tech companies. Pathetic.

I agree, but I wonder if that's a marketing problem. Decreasing power consumption increases cost, decreases speed, or both. People don't want to think they bought a pokey computer, but they don't realize that there are consequences for that.
post #13 of 61
Quote:
Originally Posted by danv2 View Post

Everyone always wants to just think "one chip, multi-core" but what about "multiple chips, with multiple cores."

There's only so many cores you can use (you can only thread stuff so much), after that you need speed and not cores!
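A rough way to put a number on that is Amdahl's law: if only part of a task can be threaded, the serial part caps the speedup no matter how many cores you throw at it. A minimal sketch (the 90%-parallel figure is purely an illustrative assumption, not a measurement of any real app):

#include <stdio.h>

/* Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction
   of the work that can run in parallel and n is the number of cores.
   p = 0.90 here is an assumed, illustrative figure. */
int main(void) {
    const double p = 0.90;
    for (int n = 1; n <= 16; n *= 2) {
        double speedup = 1.0 / ((1.0 - p) + p / (double)n);
        printf("%2d cores -> %.2fx speedup\n", n, speedup);
    }
    return 0;
}

Even with 90% of the work threadable, 16 cores only buy about a 6.4x speedup; past a handful of cores, per-core speed is what moves the needle.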
post #14 of 61
Quote:
Originally Posted by Leonard View Post

There's only so many cores you can use (you can only thread stuff so much), after that you need speed and not cores!

Grand Central Dispatch. Solves that problem. At least in my head...I'm not sure in paper or practice. So take that with a big grain of salt.
post #15 of 61
Quote:
Originally Posted by DrDoppio View Post

LOL, if I was Intel I'd tell whoever complains about my chips to write more efficient software.

no matter how efficient your software, if the chip runs hot and draws power at low idles, it won't work in a laptop. Intel has built more than a few chips that meet that description.
OSX, because making UNIX user friendly is easier than debugging windows.
post #16 of 61
Intel has too much money and resources in R&D to ever let ARM chips catch up unless Apple invested billions--which they could, but that is not really an Apple-type business.
post #17 of 61
Quote:
Originally Posted by JeffDM View Post

I agree, but I wonder if that's a marketing problem. Decreasing power consumption increases cost, decreases speed, or both. People don't want to think they bought a pokey computer, but they don't realize that there are consequences for that.

The other solution is to do more with less: cut out the unused OS parts and make multitasking work for the typical user rather than the guy who watches two videos while IMing and playing a video game. Optimize the system to do what real people do in the way they really do it, rather than letting it run as if Moore's law will always bail them out.

If you made an OS like that and built hardware to suit it, what would you get?

Oh yeah, it is called an iPad, and it doesn't need an intel workhorse. I guess Apple was planning ahead all along.
OSX, because making UNIX user friendly is easier than debugging windows.
post #18 of 61
Quote:
Originally Posted by danv2 View Post

Grand Central Dispatch. Solves that problem. At least in my head...I'm not sure in paper or practice. So take that with a big grain of salt.

More like an entire tub of sea salt. There is no magic bullet for threading and concurrency; tools such as GCD can help, but they're not going to turn a single-threaded process into something that can utilize ten cores.
post #19 of 61
Intel is saying this publicly—I don’t think this is something “against” Intel, it’s just an interesting (and positive) detail of the Apple/Intel collaboration. Intel is GLAD to have a partner that pushes them in directions they didn’t see on their own. Remember how they’d been courting Apple for so long before the Intel switch? They loved the idea of a fast-moving company (that makes hardware and software both) that would allow Intel’s innovations to shine in a way that Microsoft and the generic box-makers were never going to do. Unless, possibly, Apple paved the way first! Intel benefits greatly from having a less conservative, more innovative computing partner. (Thunderbolt being an obvious recent example.)

P.S. I really love the new Air; I don’t know how much is down to the new Intel chips, and how much is simply having an SSD, but this tiny thing is shockingly fast! I can’t use any other computer without being annoyed. There just isn’t any waiting... it really does feel like the best of iPad plus the best of Mac.
post #20 of 61
Quote:
Originally Posted by FormerARSgm View Post

I'm at at my computer amazed that the proof reading isn't a little better. C'mon guys!

UPDATE: You fixed it. Quick!

"I'm at at my computer..."?

Proud AAPL stock owner.

GOA
post #21 of 61
Quote:
Originally Posted by danv2 View Post

Grand Central Dispatch. Solves that problem. At least in my head...I'm not sure in paper or practice. So take that with a big grain of salt.

No, it doesn't. GCD is just a convenient method to parallelize a computing task. It doesn't make a non-parallelizable task suddenly parallelizable. If the task isn't capable of being parallelized, then it can't be done in GCD.

There's also the overhead of managing the different processes. This is what GCD does, and it isn't free - it does take CPU time.
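For what it's worth, here is a minimal sketch of the kind of job GCD does handle well, in plain C against libdispatch (the work() function is just a hypothetical stand-in for independent per-item work; this should build with clang on OS X):

#include <dispatch/dispatch.h>
#include <stdio.h>

enum { COUNT = 100000 };
static double results[COUNT];

/* Hypothetical stand-in for per-item work with no dependency between items. */
static double work(size_t i) {
    double x = (double)i;
    for (int k = 0; k < 1000; k++)
        x = x * 1.000001 + 1.0;
    return x;
}

int main(void) {
    /* dispatch_apply fans the iterations out across the available cores.
       This only helps because each iteration is independent of the others. */
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_apply(COUNT, q, ^(size_t i) {
        results[i] = work(i);
    });

    /* By contrast, a step that needs the previous step's result stays serial;
       GCD does not change that. */
    double total = 0.0;
    for (size_t i = 0; i < COUNT; i++)
        total += results[i];

    printf("total = %f\n", total);
    return 0;
}

Each dispatched block also carries some bookkeeping cost (the overhead mentioned above), so for tiny per-item work the parallel version can actually come out slower than a plain loop.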
post #22 of 61
Quote:
Originally Posted by Leonard View Post

There's only so many cores you can use (you can only thread stuff so much), after that you need speed and not cores!

If you are in the habit of running programs that scale well across multiple processors, more cores certainly will help, up until bandwidth becomes an issue. Plus, modern OSes are always running something in the background. So it is a qualified response: at some point faster cores are what you need. Sometimes yes, sometimes no.

I think Shen (above) got it right about raw horsepower, but I hardly expect Apple to go back to a 32-bit processor. Instead they most likely are or have been looking seriously at AMD's Fusion hardware. Surprisingly, AMD hardware actually looks pretty good right now up against Intel when it comes to power usage. You get a better GPU, which gets traded off for less CPU performance, which for many users is exactly the right trade-off to make.

On top of all of that, I have to agree it is rather pathetic that Intel had to be told their processors run way too hot. Talk about falling into believing your own marketing crap. I also have a hard time believing that Intel's other customers have had little input in the matter. Maybe Apple has no fear, while the other companies are totally dependent upon Intel.
post #23 of 61
Quote:
Originally Posted by shen View Post

I honestly don't see how the A6 could ever produce enough raw horsepower for even the MBA let alone the real workhorses, so that rumor still seems insane...

But it is more insane that Intel needed Apple to tell them that power consumption (and heat) are big concerns for laptops. Even the newly improved chips like the one in my laptop run far too hot. And they needed to be told this?

...and that's one of the leading tech companies. Pathetic.

I'd say one of the arguments against the A6 for the workhorses is it would take away the ability to run Windows decently. While I don't like having to, there are a few Windows applications that I have no choice but to use in my line of work, and if I couldn't sensibly run Windows on the Mac, I'd have to have a bog standard Windows computer instead of the Mac.

For the MBA, whether or not the A6 will be the tipping point I'm not sure, but something that's not powerful enough now may be in a generation or two's time. Apple seem to have something of a fixation on controlling as much of the design process as they can (which is clearly working for them), so it wouldn't be a huge shock to see them put their own processors into Macs.
post #24 of 61
Maybe Apple did not learn from the PowerPC fiasco.
post #25 of 61
There's a grammar issue in this article. In the context of Apple's criticism of Intel processor power consumption, the word should be "dissatisfied" rather than "unsatisfied."

I admit to being a Fanatical Moderate. I Disdain the Inane. Vyizderzominymororzizazizdenderizorziz?
post #26 of 61
GCD is a wonderful technology, there is no doubt there. However there is a broad spectrum of potential apps out there that can make use of GCD to one degree or another. If you are lucky an app might get close to x times the single-core performance, where x is the number of cores available. Generally these are apps that are blindingly easy to parallelize. Some are more difficult, and there is a vast ocean of apps where the speed-up is more variable. As a tool, GCD and the associated APIs just give developers one approach to speeding up an app.

There are other approaches to parallelizing an app. However, the one thing that people often miss is that few Macs these days are used to do exactly one thing at a time, and in fact the user isn't always aware of what is happening in the background. For example, right now I'm cruising the net responding with this post while my Mac is downloading video in the background. This is actually light usage for me. So we have to consider that more cores make processes run much more smoothly on UNIX platforms.

Quote:
Originally Posted by danv2 View Post

Grand Central Dispatch. Solves that problem. At least in my head...I'm not sure in paper or practice. So take that with a big grain of salt.
post #27 of 61
My 11" MBA 2010 runs just fine. I can't really tell a difference on anything I do between it and my Quad-Core 27" mid-2010 2.93 GHz. There are exceptions but the SSD drive in the MBA levels the playing field.

I would imagine that HandBrake would run MUCH slower on an MBA if it had an optical drive in it. Other than that, for me, and I think for most people, it is not completely unreasonable to assume an advanced A6 quad-core ARM chip may in fact be suitable for MBAs.
Hard-Core.
post #28 of 61
Quote:
Originally Posted by PaulMJohnson View Post

It does beg the question of what HP and Dell and such like had been doing over the years. They were buying the sort of quantities Apple do - what did they use their leverage for, or did they just blindly use whatever Intel told them to?

HP and Dell make only HW and load the OS from MS. By making both the SW and HW, Apple has an advantage; the right hand knows what the left is doing. They are able to optimize the SW to the HW, and see first hand what the problems are, like the chip.

It could also be that since PC makers are differentiating on price alone, they may not want to pay higher prices for a better CPU. This could also be a reason that the first USB ports were used by Apple and it was Apple that collaborated with Intel for the Thunderbolt port. Just a TB cable costs $49!

Also, I think that Apple would be well served if they acquired a chip maker. There was a rumor last year about Apple buying ARM. I guess it could be soon, or Apple may decide to buy AMD.
post #29 of 61
Quote:
Originally Posted by sflocal View Post

. . .There should be more gratitude from the whiners and iHaters towards Apple for shaking up the industry and getting things done!

Good for you sflocal! Saying it for what it is, is how it's best done. Apple demands better while the rest are happy raking up the crumbs of coin left behind by our favourite fruit company. If it takes a different chip for the Wintel mob to compete, so be it. But Intel shouldn't leave Apple behind in Apple's demands to improve upon perfection.

I'd like to see Apple really get into the processor business. Apple could still hire out the manufacturing, but after the problems with the power chips maybe it's time that Apple applied its standards to this part of the business, just as it has with iOS.

Maybe the cost of patents is what is holding Apple back.

Could it partner with AMD? What are the possibilities, knowledgeable people?

When I find time to rewrite the laws of Physics, there'll Finally be some changes made round here!

I am not crazy! Three out of five court appointed psychiatrists said so.

post #30 of 61
Quote:
Originally Posted by FormerARSgm View Post

I'm at at my computer amazed that the proof reading isn't a little better. C'mon guys!

UPDATE: You fixed it. Quick!

Quote:
Originally Posted by SpamSandwich View Post

"I'm at at my computer..."?


LMAO.....man, people are so so ignorant and quick to to point fingers!

What an....... ARS-hole.
You talkin' to me?
post #31 of 61
Anybody else noticed how nonsensical the headline of this article was?

'Apple threatened Intel with 'wake-up call'.

Either you threaten somebody, or you give them a wake-up call - but if you threaten a wake-up call then that's like a threat of a threat.

AI - improve your journalism or I'm going to threaten you!

See how silly it is?
post #32 of 61
Hahaha yeah. Apple with AMD chips. Not going to happen.

And power consumption is going down with the 22nm Ivy Bridge chips.

I don't know what Apple's crying about. They get great battery life.

Retina Macbook Pro - 2.6ghz

Galaxy Nexus - Jelly Bean!

post #33 of 61
I would suggest that Intel are taking Apple seriously because it's Apple who are pushing Intel along these days.

The Air was an excellent example of a chip design that had been demoed to Dell et al. a year or two earlier. I think the story went that it was shown to Apple on the off chance of "we kinda made this but no one was interested in it" and Apple built the first Air from it.

That is why Apple seem to be Intel's premier partner these days, and rightly so. Which is also why Intel probably shouldered some criticism from Apple about power consumption.
post #34 of 61
I wish this article was released a few weeks earlier before I got my new MBA 11". I've noticed that the battery drains too quickly. So today I ran my MBA side-by-side with my wife's late 2010 model, and yes, my battery drained much quicker than hers. In 10 minutes, mine drained 10% but hers only drained 4%. This is really disappointing performance for a mobile machine. So then I checked on Apple's website in the MBA section and noticed they no longer featured long battery life for their new model, so go figure...
post #35 of 61
Quote:
Originally Posted by cloudgazer View Post

Anybody else noticed how nonsensical the headline of this article was?

'Apple threatened Intel with 'wake-up call'.

Either you threaten somebody, or you give them a wake-up call - but if you threaten a wake-up call then that's like a threat of a threat.

AI - improve your journalism or I'm going to threaten you!

See how silly it is?

To be generous, you could read "threatened" as a state induced in Intel via the agency of the wake-up call (as in caused to feel a threat), in the same sense as "Apple threatened Intel with their plan for world domination."
They spoke of the sayings and doings of their commander, the grand duke, and told stories of his kindness and irascibility.
post #36 of 61
Quote:
Originally Posted by addabox View Post

To be generous, you could read "threatened" as a state induced in Intel via the agency of the wake-up call (as in caused to feel a threat), in the same sense as "Apple threatened Intel with their plan for world domination."

Not with that word order though, in order for threatened to be a state not a verb it would need to be something more like

'Intel (felt) threatened after Apple wake-up call'

I'm still threatening to threaten AI if they don't shape up
post #37 of 61
Quote:
Originally Posted by nagromme View Post

Intel is saying this publicly—I don’t think this is something “against” Intel, it’s just an interesting (and positive) detail of the Apple/Intel collaboration. Intel is GLAD to have a partner that pushes them in directions they didn’t see on their own. Remember how they’d been courting Apple for so long before the Intel switch? They loved the idea of a fast-moving company (that makes hardware and software both) that would allow Intel’s innovations to shine in a way that Microsoft and the generic box-makers were never going to do. Unless, possibly, Apple paved the way first! Intel benefits greatly from having a less conservative, more innovative computing partner. (Thunderbolt being an obvious recent example.)

P.S. I really love the new Air; I don’t know how much is down to the new Intel chips, and how much is simply having an SSD, but this tiny thing is shockingly fast! I can’t use any other computer without being annoyed. There just isn’t any waiting... it really does feel like the best of iPad plus the best of Mac.

Best Post!

I agree that Apple and Intel are good for each other.
I don't get the sudden Intel hatred here.
Apple made a great move in turning to Intel. I loved my G5 dual tower, but I had to have an extra air conditioning vent added to the office to counter the processor heat.
My new MacBook Pro is blazing fast, but runs cool and silent.

I have been thru 68000 to PPC, OS9 to OS X, and PPC to Intel. Please, Apple, give me a little time before you trash my software investment yet again.

NO to A6 for Macs!
post #38 of 61
Quote:
Originally Posted by danv2 View Post

It's not so hard to imagine a system that has multiple ARM chips operating at only a fraction of the wattage of an Intel system. Three ARM chips each with a quad core could easily outperform the solo Intel chip I bet. Their best chip consumes 14 watts according to the Wall Street Journal today. ARM's chips consume a fraction of 1 watt. Hell, four or FIVE ARM chips on a board would still HEAVILY outperform Intel's lowest power offerings.

Everyone always wants to just think "one chip, multi-core" but what about "multiple chips, with multiple cores."

The only issue I see with your idea is where all of these chips would go. If I recall correctly, they even skimped on the Thunderbolt connector on the Air because the standard one was too big.

Adding more chips is not the direction the devices are heading, it seems.
post #39 of 61
Quote:
Originally Posted by JeffDM View Post

I don't know if Intel ever really targeted the phone market very seriously or to any degree of success with their x86 derivative chips. You can wedge something in, but that wasn't the focus of the architecture. It's been a long time since ARM was a serious contender with desktop & notebooks. Trying to meet both kinds of devices with the same instruction set means compromises somewhere, trying to be a jack of both trades but master of neither.

The compromise is that ARM processors scale differently. They don't have much scheduling logic, so you need to add more cores to scale (instead of scheduling hardware). The advantage to ARM is that you have a lot more raw power in the same size chip. The disadvantage is that legacy software doesn't run well on multi-core processors. Apple has been addressing this in their SDKs though. In many situations ARM will be better for both mobile and desktop.

Intel has had trouble scaling a single-core processor too (they could only bring the scheduling hardware so far) and are now doing multiple cores even though many programs can't take advantage of them. Originally, Intel was trying to make the Itanium processor that took the scheduling hardware even farther. That seems to have been a failure, but maybe that was at least partially due to the fact Itanium wasn't x86 compatible. Since Intel doesn't have control over an operating system, their focus has been on instruction scheduling so that they could increase the speed of existing software.

Since Apple has control over the operating system, it makes more sense to make the software run well on multiple cores. In the long run this will scale much better. ARM is also generally more power efficient because power isn't wasted on scheduling. The processor may also turn off entire cores to save some power. When you have more cores, you also have more room to specialize some of them for certain operations. Like maybe make one really good at floating point operations and another really good at vector operations. This has been suggested by researchers, but hasn't really happened yet unless you count GPUs.
post #40 of 61
Well, some facts.

There is no ARM chip/design that performs at the level Intel is bringing us with Core 2 Duo or Sandy Bridge. Not even the upcoming dual/quad-core Cortex A15.

There is also no Intel chip/design that uses as little power as an ARM CPU/SoC.

Apple would probably want a more powerful ARM or a much less power-hungry Intel CPU for the Mac.

I think Haswell, the chip coming after Ivy Bridge, will satisfy Apple's needs. It is a 15W design with GPU + CPU + northbridge, compared to the current 35W design and the 45W design in Core 2 Duo. Eventually I think Apple wants Intel to move to a sub-10W design, which Intel has stated in their roadmap as something after Haswell.

There are only two kind of people in this world.

Those who dont understand Apple and those who misunderstood Apple.
