The design currently targets TSMC's low power process. I'm not sure where that tops out clock-wise, but I wouldn't be surprised to find that the A8X could run at 2GHz. Switch over to a performance optimized process and it is likely that Apple could exceed 2GHz by a wide margin, with an increase in power usage. Of course power usage is only part of the problem; scaling performance requires enhancements to the architecture: resized caches, faster RAM, more on-chip buffering, ALU tweaks and the like. These are relatively easy to accomplish, so I see a significant upside for Apple with even a slightly tweaked design.
The interesting thing here is that Apple apparently has the design flexibility to target a number of different processes. So far they have hit a number of nodes at Samsung and are now on TSMC's low power node. I really doubt that they would have much trouble targeting a performance node from these companies, or even GlobalFoundries. Add in another CPU core, or even a three core complex, and you are left with an excellent notebook processor.
You know that if you increase voltage by a small amount, you increase power draw by a large amount, so heat will rapidly become an issue. That's likely why Apple increases clock speeds by so little when moving a chip to the iPad. I'm not confident that Apple could move this to 2GHz. Remember that there is no real heat sinking in these tablets, though I've often wished that Apple could figure out a way to utilize the aluminum case more directly as a heat sink. Of course, if they chose to use this in a notebook form factor, such as that of the Air, they could have good heat sinking for an ARM design, even without a fan. Perhaps there, with all the improvements you mention, the clock could rise to 2GHz, possibly even a bit higher.
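To put rough numbers on the voltage point, dynamic power scales roughly with the square of voltage times frequency, so a modest voltage bump compounds quickly. A minimal sketch, assuming a hypothetical 15% voltage increase to reach 2GHz (the baseline clock and voltage figures are made up for illustration, not Apple's):

```python
# Back-of-the-envelope for the voltage/power point above, using the usual
# dynamic power approximation P ~ C * V^2 * f. The baseline clock and the
# voltage needed to reach 2 GHz are made-up assumptions, for illustration only.

def relative_power(v, f_ghz, v0=1.0, f0_ghz=1.4):
    """Dynamic power relative to a baseline part at v0 volts and f0 GHz."""
    return (v / v0) ** 2 * (f_ghz / f0_ghz)

# Suppose hitting 2.0 GHz takes ~15% more voltage (hypothetical):
print(f"2.0 GHz at +15% voltage: ~{relative_power(1.15, 2.0):.2f}x baseline power")
```

Roughly 1.9x the power for about a 40% clock gain, which is why the heat budget, not the design, is usually the ceiling in a tablet.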
Right now, these chips are supposedly optimized for performance per watt. I don't know if Apple could move that too much. As process sizes shrink, it becomes even more difficult to remove heat effectively. The problem is that the heat becomes focused in ever smaller areas on the chip, which makes heat sinking more complex. IBM showed an innovative solution a few years ago which involves tiny tubes etched into the chips through which either gas or liquid would pass, removing heat directly from the chip, as the tubes would be fabricated as part of the chip, not something several thermal layers away, as is the case now. I think they are using something somewhat similar in the last two generations of their Power designs, and particularly in the mainframe CPUs.
I have no idea how this could work in a tablet or phone, but I suppose it's possible to use a piezo pump in an enclosed structure that adheres to the case. Seems expensive though.
It isn't so much cheating as it is a demonstration that Apple can put a lot of stuff on the die that is not directly related to GPU/CPU performance. This means they don't have to build larger PC boards to support independent chips for these functions. I've said it many times in these forums, but silicon is what the PC board was in the 1980s and '90s: it is the place where manufacturers like Apple now innovate.
This is actually key to understanding Apple and innovation: their ability to knock out these complex chips sets them apart in the industry.
What's really interesting about Apple's new design is that it's actually more space efficient than the A7, whereas the rest of the industry is finding that packing density gets worse as process sizes shrink. Anandtech noted that Apple has increased packing density significantly this generation.
Of course, there is still the frustration of not knowing what at least 30% of Apple's chips are doing. Apple has no need to go to conferences and explain all this, as they don't sell their chips elsewhere.
I'm not sure what this article is trying to convey, other than that manufacturers should just pack it in and give up. I quite like my iPad; it serves my purposes for certain tasks, such as music creation, perfectly, but I don't think I would like to use it for all of my tablet needs. Not being able to run multiple apps in the background would more than likely drive me nuts. So I'm extremely happy to have choices that give me the things the iPad doesn't offer, regardless of DED's biased feelings on the matter. As long as manufacturers continue to make decent products with decent hardware and software, I just don't see what the big deal is if it's not as fast as the iPad; the majority of people don't even know what to do with this power anyway.

For example, the Nokia 2520 mentioned in the article isn't as fast as the current iPad, but it's more than quick enough for the software it's running, and it gets 11 hours on a charge, 16 when the keyboard is plugged in. It's also one of my favorite gadgets, so much so that I bought two for when (if) my current one fails.

The article also fails to mention that these companies are just now getting into 64-bit computing. The Nvidia chip that DED loves to belittle is just the first of many to be introduced; the quad-core variant is expected to be released in Q1 of 2015. Current K1 64-bit benchmarks have shown it to hold its own just fine when paired up against the dual-core variant of the A8, and add another two cores and it will be just as fast, if not faster, than the A8X, as the K1's 64-bit single-core benchmarks suggest. Even if it isn't, I still don't see what the problem is. Why does everything have to be faster than Apple's A8X to be considered decent in DED's eyes? I have been using the K1 32-bit variant for about six months now; it's an incredible chip, and again, as long as manufacturers keep producing decent products using these chips, like the Nokia 2520, I will continue to purchase them. So I get it, DED: in your opinion nothing is better than Apple. But that doesn't mean others can't produce decent products as well, or at least try. It's like he wants everyone to fail.
Indeed, the all-important single-core performance of the K1-64 beats the A8X by about 5%. Unfortunately, very little software runs natively on any Android device; it runs in a VM so inefficient that Google doubled its performance this year. Given that Android native apps are so rare, and unoptimised for either hardware or screen size, this whole hardware comparison is largely redundant.
But fun.
I'm not sure where people get this idea. The core components of Mac OS are running on ARM right now, as iOS. Getting desktop Mac OS up and running on ARM would not require much effort at all. The reality is that it is most likely already running on ARM.
That largely depends upon the app. If developers follow Apple's guidelines, it actually would be little more than a recompile. Take a look at the Linux world if you want to see examples of software packages that run on a wide array of platforms with just a recompile. This used to be a significant problem, but these days is far less of an issue.
Or competitive with a MacBook Air. Remember, we have no idea where the current design maxes out in clock rate, nor do we know how far it can go on a different process. Today's A8X has the potential to drive a laptop to relatively good performance figures.
I have to agree on this one. I could, however, see Apple make a modal tablet: one that runs iOS in portable mode and Mac OS when docked to a power supply / base station. An ARM based chip could easily handle that sort of implementation.
Such a chip might allow Apple to knock $200-300 off the price of an Air. Some of that is based on CPU costs, but there is also value in the reduced PC board size. We are talking about a machine here that might cost $50-100 more than an equivalent iPad. It is a nice concept to think about.
I've discussed this a number of times, here, and elsewhere.
I've no doubt that Apple has run OS X on ARM since the A5x. Earlier chips would have been much too limited. They likely are running at least some of their apps on it as well.
But we really can't avoid the emulation problem; it rears its ugly head. Small, simple apps could be run through Apple's Xcode and, with a little bit of work, be fine. But larger apps will have more problems. Compilers can't take care of every difficulty; there are simply too many areas in which hand coding is needed. As for large apps that need power, well, we can just forget it! For example, even if Adobe could be persuaded, again, to rewrite Photoshop, there is no way that it would perform anywhere near what is needed. So we wouldn't see Photoshop or CSx on an ARM based OS X device. Same thing for Office, no doubt. Same thing for many large apps. So we would end up with a subset of OS X software, and it would take some time for even that.
Remember that Apple uses a low power i5 for the low end MacBook Air. The A8X, at any clock, won't compare. I was thinking that if Apple again went with a dual core SoC, it could mean that they might be thinking about an ARM MacBook something or other, as they could use two of those. A four core design would work very well in a notebook, of course, and the graphics would then be three times the old A7's, or twice what it is now. That would be enough. Clock it higher because of better heat sinking, and we would have some pretty good graphics. Four A8 cores clocked 20% higher would give decent CPU performance as well.
But my idea also involves something else. It's known that most of the hangups in emulation come from a small number of chip instructions that the chips actually do differently. These are taken care of with emulation software, which is around 100 times slower than hardware; that's about 80% of the slowdown. But these individual instructions aren't patented. So what if Apple took the worst of them and put them into their chip? The OS could look at what instruction was being called for and route it to that x86 instruction in the chip, rather than to software emulating that instruction. We would lose a few percent in speed overall for a major boost in emulation speed. In fact, this would likely eliminate the need for most apps to be recompiled and hand coded for the chip.
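Back-of-the-envelope, that idea pays off handsomely. A minimal Amdahl-style sketch, assuming the post's own rough figures (about 80% of emulated runtime spent in instructions that software emulates around 100 times slower than hardware):

```python
# Amdahl-style estimate of the payoff from moving the worst software-emulated
# x86 instructions into hardware. The 80% share and 100x software penalty are
# the rough figures from the post above, not measurements.

def emulation_speedup(slow_fraction=0.8, software_penalty=100.0):
    """Overall speedup if the slow fraction of emulated runtime instead
    executes at hardware speed (i.e. software_penalty times faster)."""
    new_time = (1.0 - slow_fraction) + slow_fraction / software_penalty
    return 1.0 / new_time

print(f"Estimated emulation speedup: {emulation_speedup():.1f}x")  # ~4.8x
```

Under those assumptions, hardware-assisting just that handful of instructions would make the whole emulated workload nearly five times faster, since the remaining 20% of runtime becomes the bottleneck.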
It would be nice if Apple was thinking along those lines, because it can be done. But with three cores, I don't know if it would be worthwhile. We really don't need six cores for entry level notebooks, as most software won't take advantage of them.
Funny. I re-read this article, and while Daniel loves his words sharpened to cut to the bone, there isn't all that much to disagree with other than his comments on Samsung production of A series SoCs (Samsung is still important, because of a quarter billion A series SoCs a year), and whether there will be an A series in a Mac (by definition, no; it would be some new product line). That said, TSMC, GlobalFoundries, and Samsung are all producing or will be producing A series SoCs.
The point of the article is this: Qualcomm hasn't been able to capitalize on SoCs for high end tablets (iPad owns that space), and Nvidia's 64 bit K1 isn't substantially better in any metric than the A8X, and in many metrics is substantially worse. The Nexus 9 is weak sauce, and a poor build, which doesn't help drive industry sales of the K1 family. Further, Apple is able to leverage its profitability to create an efficient, powerful processor family in volumes that Nvidia can't even imagine, while Nvidia is, for all practical purposes, unable to produce a competitive processor for any smartphone.
As for Intel, I don't actually know what is happening to them in mobile, other than that uptake has been slow, but their pricing structure doesn't fit well with the OEMs, who aren't seeing much if any profit in tablets. Though that could always change.
I do think that Daniel overstates Intel's problems; it has the resources for a long game. But I also find that Intel hasn't made much progress in mobile, which would be the industry assessment as well.
We really need to be careful in discussing Qualcomm, Nvidia, etc. All were surprised that Apple came out with a 64 bit chip last year, but they aren't stupid, or incompetent. Apple's a generation ahead, but that doesn't mean that these companies will forever be a generation behind in performance. That's why the K1 is such an interesting chip, as it's Nvidia's first generation. Qualcomm is working on their high end 64 bit chip, though they've strangely come out with a middle range 64 bit chip already.
Apple doesn't own the space for high end chips, because they make chips only for their own tablets. "Own the space" means there's a competitive situation. As Apple doesn't compete with Qualcomm, they can't own that space. In the competitive market, Qualcomm owns the space, low end, mid, and high end. It might change on the low end with Chinese manufacturers making low end chips.
But right now, the K1 is the best performing chip for Android. How long that will last is anyone's guess.
The Nexus 9 is weak because Android isn't optimized the way iOS is with Apple's chips. Android still has multitasking that's way in excess of what most users need, and processes simply don't turn off, using RAM, processor cycles, and battery.
Yes, yes, they coulda, shoulda. They didn't because they want to remain independent. I'm not going to go over this in detail, because I've already done so in other posts. But if by unhappy you mean crying into their cups, then obviously no. But thinking that everything is just great is also a no. Read my other posts on this.
Yes... Apple wants to be independent. They design their own software to run on their own hardware. Apple is unlike any other computer manufacturer. And they price their hardware at the high-end to maintain a high profit margin... but volume suffers.
Dell, HP and others choose to use software written by someone else... and slash prices on hardware. The result? All that cheaper hardware sells at higher volume... but at lower profit margins.
It's two different philosophies.
If Apple truly wanted more desktop market share... they would do what the rest of the industry does and slash prices to pump up Mac volume.
They don't... so market share must not be a high-priority after all.
Apple's desktop market share situation might not be "great" but it's been their M.O. for 30 years.
Steve Jobs gave some insight into this philosophy: "We don't ship junk."
The only questions I have about TB are these: 1. Whose initiative was it? 2. Did Apple and Intel cross license the technology?
Answer those two questions and we might get an idea if we will ever see a TB port in an iOS device.
Intel at one time kinda hinted that they might not object to compatible hardware. I've yet to see anything materialize, so maybe that was wishful thinking. The other thing here is that TB is evolving rapidly; there might be a pause in the industry waiting for Intel to deliver a revision that will be stable for a while. In the overall picture, TB 2 was extremely close to TB 1, which kinda indicates that TB 1 was released early to allow Apple to launch new hardware with the port.
Yes, high exposure even if there isn't an Intel Inside sticker on the MacBooks! I couldn't resist saying that, because that Intel Inside sticker ends up on a lot of crap products.
I thought the questions about Thunderbolt were settled. Intel has stated that while it was Apple that came to them about the idea, and gave them the specs they wanted to see, it was Intel that did the engineering. Intel was quite upset when some people said that Apple had done it and given it to Intel.
I don't think that Thunderbolt will see the light of day on ARM. It requires at least PCI Express 2. The new version requires the newer PCI Express 3. ARM devices don't have the I/O bandwidth for a 40Gb/s interface.
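A rough sanity check of that bandwidth point, using the commonly cited per-lane PCIe rates (a sketch only; real controllers add further overhead, and Thunderbolt 3's 40Gb/s figure also carries DisplayPort traffic alongside PCIe data):

```python
# Rough bandwidth sanity check. Per-lane figures are the commonly cited
# PCIe rates (2.0: 5 GT/s with 8b/10b encoding; 3.0: 8 GT/s with 128b/130b);
# real controllers add further overhead, so treat these as upper bounds.

def pcie_usable_gbps(gt_per_s, encoding_efficiency, lanes):
    """Approximate usable PCIe link bandwidth in Gb/s."""
    return gt_per_s * encoding_efficiency * lanes

pcie2_x4 = pcie_usable_gbps(5.0, 8 / 10, 4)      # ~16 Gb/s
pcie3_x4 = pcie_usable_gbps(8.0, 128 / 130, 4)   # ~31.5 Gb/s

print(f"PCIe 2.0 x4: {pcie2_x4:.1f} Gb/s vs Thunderbolt 2's 20 Gb/s")
print(f"PCIe 3.0 x4: {pcie3_x4:.1f} Gb/s vs Thunderbolt 3's 40 Gb/s")
```

Either way, an SoC would need to dedicate four fast PCIe lanes just to feed the port, which is exactly the I/O bandwidth today's ARM tablet chips don't have.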
We're seeing more Thunderbolt adoption in the Windows industry, now that Intel's controllers have come down in price with the last revision. So there's good reason to be optimistic about it overall.
You like to toot your horn, but where's your record of accuracy? Your regular attempts at character assassination are really just a lot of puffing hot air. Where are the bullets?
Would you please produce some attribution for your performance claims? Anandtech and others don't seem to support them...
http://www.anandtech.com/show/8666/the-apple-ipad-air-2-review/3
http://www.androidpolice.com/2014/11/11/nexus-9-vs-ipad-air-2-a-mostly-subjective-comparison/
Just look for yourself. You seem to be coming here to just downgrade everyone else who is trying to be objective, without saying anything yourself that's useful. Why don't you try to show that I'm wrong instead? If you can't do that, then don't bother posting to me, or about me.
No, you are confusing two things. It's not always fair to benchmark two different processor architectures at the same clock rate, because some designs do more work per clock than others; the old PowerPC vs. x86 debates were about exactly that.
But the issue here--with two implementations of ARM--is something different, as he explained pretty well: Nvidia made specific claims about how many instructions the K1 could handle at once, and at the same clock speed, that just isn't the case. Nvidia's claims and predictions about the market were wrong. It was soundly beaten in both GPU and CPU by a company that is not a GPU company. That's pretty awful for Nvidia. It's as if Apple had beaten Adobe with Aperture, or if iWork had blown away MS Office (well, it did with Keynote, but not across the board), or if Maps had soundly beaten the seven year old Google Maps on day one in every respect.
Nvidia isn't okay being "pretty good." It needed to be significantly ahead. And it's not.
Consider if Apple delivered a product that was soundly beaten by a much larger company with established sales. Or just think about the reality of when that happened, when IBM came in and delivered a business PC (in '82) that was much better than the Apple II. Boom, Apple was knocked down a peg. It already had the Mac in development and nearly ready to ship (in '84). Now imagine if Apple had shipped the Mac and IBM had shipped an advanced version of OS/2 in 1984 that was just as good, if not better, than the Mac. Apple would have been destroyed.
It took IBM and Microsoft another 6-10 years to catch up with the Mac. Nvidia needed a 6-10 year advantage like that. Instead, it has a K1 "Kepler desktop graphics" chip that Apple passed up before Nvidia could even sell its market advantage. Now it's too late. Because "good" isn't good enough to make money.
We can look to x86. Intel's designs are significantly different from AMD's designs, even though they're both x86. Nevertheless, it's been understood for some time that comparing MHz to MHz, even there, is worthless, because the designs are different enough to render that comparison useless. In fact, it's called "The Megahertz Myth". The only time it makes sense is within a manufacturer's own line of similar processors. So we can't compare the clock of an i3 with the clock of an i5, etc., but we can compare clocks between different i3's, or i5's.
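A toy version of the Megahertz Myth in numbers (a sketch; the IPC values are invented to make the point, not measurements of any real chip):

```python
# A toy illustration of the Megahertz Myth: throughput depends on work per
# clock (IPC) as much as on clock speed. The IPC values are invented for
# illustration and are not measurements of any real chip.

def instructions_per_second(ipc, clock_ghz):
    """Crude single-thread throughput proxy."""
    return ipc * clock_ghz * 1e9

wide_low_clock = instructions_per_second(ipc=6.0, clock_ghz=1.5)
narrow_high_clock = instructions_per_second(ipc=3.0, clock_ghz=2.3)

print(f"Wide, 1.5 GHz design:   {wide_low_clock:.2e} instr/s")
print(f"Narrow, 2.3 GHz design: {narrow_high_clock:.2e} instr/s")
# The lower-clocked part wins despite a ~35% clock deficit.
```

Which is the whole point: a wide design at 1.5GHz can outrun a narrow one at well over 2GHz, so the clock number alone tells you almost nothing.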
Talking about Apple's clock vs someone else's doesn't help anything. Even if it did have some meaning, it wouldn't matter, as long as the SoCs drew about the same power. Now, if it could be shown that the K1 drew significantly more power, then I'd say that the design was much less efficient and has problems going forward, if Nvidia can't fix that next generation. But even that's complicated. Most Android devices are much thicker, and have bigger batteries. Even so, battery life is usually noticeably less. That's also a matter of the inefficient way Android works, and the fact that it can't be optimized the way Apple's devices are.
Windows has always had the same problem, though there are things about OS X that slow it down where Windows isn't slowed. So it's all a mixed bag.
A major point though, is that the K1 is Nvidia's first effort in a true 64 bit design. Don't you think they should get credit for having a first gen device that's better in many areas than Apple's first gen device? I do.
The one area in which I think we can agree about clocks is that the higher you are now, the less room you will have in the future. We are only now seeing Intel's high end chips reaching Prescott speeds; look at how long that took. That's 3.8GHz. Can a phone/tablet chip get even that high? I don't know, but it will be a stretch. So a lower clocked design does, at least, have the possibility of legs that higher clocked designs, designs that need those high clocks just to compete, may not.
DED is basically a BS artist. That people fall for his nonsense is really sad.
Well, this is actually debatable. It really comes down to real world usage, and I've seen mixed results there. Also, power usage is a factor, but even there it isn't clear which chip has the upper hand.
Actually the A8X and that extra bit of RAM might finally get me to upgrade my iPad 3. The A7 certainly wouldn't have caused me to upgrade.
Hmm, I'm not sure about this one; it is very hard to knock better performance, and frankly all of these devices are rather young on the market. The maturation process of better hardware and a better OS does lead to a desire to upgrade. In some cases upgrading becomes a requirement.
I agree with this 100%. A perfect example here is Apple ignoring the iPod line. I know sales are declining, but part of that is a self-fulfilling prophecy, as the hardware is effectively outdated.
I don't think anybody understands Apple's moves with respect to the Mini. Honestly, I'm not even sure why the devices have two different motherboards.
It is pure fluff. Sadly, far too many people give him credit, which indicates to me that they don't have a strong understanding of the industry. Further, you have to be pretty gullible to read crap written in the style DED does and believe what you have read. Most of the time I can't even read halfway through his articles before I have to do something else.
There are two ways to measure performance. One way is to go around the OS the way most objective testing software does, and read the hardware performance. The other way is to work through the OS and apps to get a less accurate, but more subjective evaluation.
So what do we want when measuring chip performance? I'm interested in as objective a measurement as possible. There, the K1 fares pretty well in some measurements against the A8x, about the same on others, and worse in more areas. Overall, the A8x seems like a more balanced design.
Evaluating a device is a whole 'nother thing. Because now you're evaluating the OS and the app universe as well. The Nexus 9 seems gritty because of that mix of an inefficient OS, possibly apps that aren't written that well, and who knows what else? Possibly the hardware design isn't that good either. How much of that is reflective of just the K1? That's really hard to tell.
It doesn't make sense to say Windows outsells OS X, because they are not sold the same way. OS X isn't even sold. If you're comparing Macs to Windows PCs, it doesn't really make sense to include all the markets in which Macs don't sell (such as cash registers and mass licensed enterprise fleets). That would mean you'd look at actual markets where Macs and PCs actually compete; in those, Macs are doing better than WinPCs in a variety of places.
But if you insist on comparing Macs against every unit shipment that includes a Windows license, then you also need to include iPads, even though there isn't a one-to-one sale in every market, for that very reason. That makes Apple the largest PC vendor by a good margin, and blows your 12:1 ratio up in smoke.
Can't have it both ways.
Also, the same way that Windows sales are supported by lots of low end products and some high end products, Apple's sales are supported by high volume iPads and lower volume but more expensive Macs.
Microsoft doesn't articulate exactly how many low end and high end licenses it sells, and PC makers don't detail how many cheap PCs vs how many high end premium machines they sell.
That makes it rather ignorantly one-sided to scrutinize the product mix of Apple--just because Apple actually provides you with more data about its operations--and make rather uninformed statements about how important it is for Apple to sell x number of low end products.
If you're not concerned with MS/PC OEM's product mix (and you can't be, without any knowledge of what their mix is), then you can't demand that Apple sell x numbers of iPads. You have to look at Apple's overall results (which is all you know about MS/PC OEMs), and compare overall unit sales and overall profits.
Apple currently beats the shit out of every PC OEM in both units sold and in profits earned. It beats the entire PC industry in profits earned.
Inventing an "iPad problem" to worry about doesn't change the fact that Apple is winning the game. Including the sub-game of tablets, where it also soundly beats every OEM in units and profits by a vast margin.
Fretting about iPad sales is intellectually dishonest. And look who else is doing it: Business Insider, IDC, and the most delusional analysts, who have been wrong over and over again.
Ask yourself why none of those ever thought to worry about how many sub $500 PCs every other OEM was selling each quarter.
It's the same as saying "Apple hires child labor!" after Apple reported catching and putting a stop to remaining issues of child labor. It's what liars do.
You don't need to include iPads. You can include iPads if you want to. This is a difficult subject. Talking about Windows and OS X is talking about the desktop market, which includes notebooks. It merely means computers that are comparable with those two OSes. I've often argued that iPads are "real" computers, because they do what most people need from computers, even though they work differently. Other people argue differently. They insist that without a hardware keyboard built in, and easy ways to attach drives and other standard hardware, they are crippled computers at best, or not real computers at all. I don't agree with that.
But oranges to oranges: most people use tablets as media machines, and that falls short of what a computer is used for. I can use my Sony PlayStation to browse the web, do messaging and email, buy stuff, etc., but I would never call it "a" computer, even though there's a computer inside. So we get messed up with these definitions. That's what happens when some dumb company comes out and revolutionizes things. Bad Apple!
Apple doesn't give breakdowns either, other than to say desktop sales and notebook sales. They don't tell us how many of which phone sold, or iPad, or many things. In fact, on the last financial call, Apple stated that they were rearranging their financial reports to conceal more data "for competitive purposes". So we won't know how many Watches they sell, or ATVs, or whatever.
I don't really know what you mean when you say MS/PC OEM product mix. If you mean what Apple does, then yes, HP isn't any clearer about it than Apple is.
iPad problem. Well, declining sales for three quarters is certainly a problem. Estimates are that sales could be down by half this quarter. I sure hope that's wrong. But even if it's down another 10%, that's a problem. Are you denying that?
And what are you talking about, "fretting about iPad sales"? You think Cook and company aren't fretting about declining sales? That would be financially incompetent. Of course they're fretting. Fretting, and trying to think of what to do about it. Dropping the price for memory upgrades in both phones and tablets was one thing they've done.
These numbers are Apple's own, not from Gartner, not from IDC, not from anyone else. While those companies do their estimates, and they're usually wrong, Apple releases its own numbers every quarter, and those are the ones we go by.
Those analysts do worry about every sub-$500 PC that gets sold. They've been criticizing Dell for years for that, and HP, and everyone else who does that. Haven't you been reading the financial sites? Why do you think Dell went private? It's so they could get out of that cycle, out of the eye of Wall Street.
There's no denying that Apple's been working hard to eliminate labor problems, but as we see, it's not easy to do, and major suppliers still have major issues.
OK - enough avoiding the technical issue. The discussion started with the question of whether the iPhone battery would be damaged (i.e. shortened life) by charging with the (higher current) iPad charger relative to the iPhone charger. The answer is no, but not because the iPhone battery life is unaffected by charging current - it is because the iPhone's internal charging circuit limits the current during the CC charging phase. The iPhone charger was designed to supply approximately that maximum current (no reason to make it any higher). The iPad charger can supply a higher current, but the iPhone will not accept it.
The main parameters that affect lithium-ion battery life are overcharging (should not be a factor in this case) and charging temperature, which is directly affected by charge rate due to anode heating. That is why most charging circuits are current-limiting, with the current limit being a trade off between longer battery life and shorter charging times. So no, it is incorrect to say that higher charging rates do not negatively impact battery life, and when manufacturers say that fast charging is OK, they mean, presumably, that they have designed their batteries and chargers to provide an acceptable operational balance between those two conflicting goals.
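To make the current-limiting point concrete, a minimal sketch: during the CC phase, the device's charging circuit, not the adapter, sets the current. The adapter ratings below are the nominal 5W and 12W figures at 5V; the device-side limit is a hypothetical value chosen purely for illustration:

```python
# Minimal sketch of the current-limiting point above: during the CC phase
# the device's own charging circuit, not the adapter, sets the current.
# Adapter ratings are the nominal 5 W / 12 W figures at 5 V; the
# device-side limit here is hypothetical, chosen only for illustration.

def negotiated_current_ma(charger_limit_ma, device_limit_ma):
    """The device draws the lesser of the adapter's and its own limit."""
    return min(charger_limit_ma, device_limit_ma)

iphone_limit_ma = 1500                       # hypothetical CC-phase limit
for name, charger_ma in [("5W iPhone charger", 1000),
                         ("12W iPad charger", 2400)]:
    ma = negotiated_current_ma(charger_ma, iphone_limit_ma)
    print(f"{name}: device draws {ma} mA (~{ma * 5 / 1000:.1f} W)")
```

With a device-side limit between the two adapter ratings, the bigger charger helps, but the phone never draws anywhere near its full 12 watts, which is consistent with the 7-8 watt measurements mentioned below.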
Ok, look, I've never claimed that the iPhone battery was capable of a quick charge; that would be a full charge after perhaps 15 minutes. What is true is that the battery is allowed, by Apple, to be charged more quickly by the 12 watt charger. I have said that the circuitry prevents the battery from overcharge. There's no disagreement there. What started this whole thing was when I pointed out the new Motorola phone, which does have a quick charge battery and charger. I said that I would like to have seen Apple do that first, which is an acknowledgement that the 6 and 6+ do NOT quick charge. But Apple's batteries obviously are capable of being charged more quickly.
Apple is very diligent when it comes to batteries. They don't want a repeat of cheap chargers electrocuting people, and they don't want batteries melting or burning when being charged. If they felt in the slightest that these batteries could be damaged in any way with the 12 watt charger, the circuit that detects which device is plugged in would limit the current to 5 watts. But it doesn't; it allows a higher wattage. Not the full 12 watts, but about 7 or 8. That's according to measurements one of the sites did.
I would like to mention that with older phones, i.e. the 5s and older, using the bigger charger does almost nothing for recharge times, as the wattage is limited to 5 watts. So the batteries in the 6 and 6+ are designed for this higher charge rate, even though it isn't a true quick charge.
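Rough arithmetic on why the bigger charger helps the 6/6+ but not older phones (a sketch; the capacities are commonly reported teardown figures, and the CC-only estimate ignores the constant-voltage taper, so real times run longer):

```python
# Naive CC-phase arithmetic: hours ~ capacity / current. This ignores the
# CV taper, so real charge times run longer; it's only good for the
# relative picture. Capacities are commonly reported teardown figures.

def naive_charge_hours(capacity_mah, watts, volts=5.0):
    return capacity_mah / (watts / volts * 1000.0)

for phone, mah, watts in [("iPhone 5s, 5 W limit", 1560, 5.0),
                          ("iPhone 6, 5 W charger", 1810, 5.0),
                          ("iPhone 6, ~7.5 W accepted", 1810, 7.5)]:
    print(f"{phone}: ~{naive_charge_hours(mah, watts):.1f} h")
```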
The author is too lenient. Do you people need someone to point out the obvious? The A8X is a tri-core processor at 1.5GHz. It's not a quad-core (4) processor, and it's not at 2.5GHz like everybody else (Nvidia, Qualcomm). Apple is capable of making a quad-core 2.5GHz chip, but they opt not to. Why?
See above. Commercial manufacturer claims do not constitute a sound technical defense of your assertion.
So people here who know nothing do? Really? Companies who make these statements, particularly when their products are used for critical applications, need to meet standards, and need to show that they can meet their claims.
So where are your links? I'm supposed to stop knowing what I know to be true because three guys here say I'm wrong, without presenting a shred of evidence themselves?
I haven't even seen any evidence from you guys that you really know how batteries work. Just some statements that I'm wrong, and that you're right. Oh, foggy did show a couple of lines he likely got from somewhere that gave a very basic statement about batteries, but I already pretty much covered it in more detail.
Look up battery internal resistance. That's the key to charging times, as well as discharge times. I've already said that, but no one seems to have read it.
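For what it's worth, internal resistance is also where the heating argument comes from: resistive loss inside the cell is P = I²R, so it grows with the square of the charge current. A minimal sketch, assuming a typical order-of-magnitude resistance for a small Li-ion cell (not a measured iPhone figure):

```python
# Where internal resistance meets the heating argument: resistive loss in
# the cell is P = I^2 * R, so it grows with the square of charge current.
# The resistance here is a typical order of magnitude for a small Li-ion
# cell, not a measured iPhone value.

def cell_heating_w(current_a, internal_resistance_ohm=0.15):
    """Heat dissipated inside the cell at a given charge current."""
    return current_a ** 2 * internal_resistance_ohm

for amps in (1.0, 1.5, 2.4):
    print(f"{amps:.1f} A: ~{cell_heating_w(amps):.2f} W dissipated in the cell")
```

Note the square law: going from 1A to 2.4A roughly sextuples the heat generated in the cell, which is why charge rate and battery longevity trade off against each other.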
Fair enough, but note that overcharge protection is not the same as charge current limiting - different problems and different solutions. Anyway - the charging rate (current) allowed by the circuitry is determined based on battery voltage and temperature, and it is certainly possible that in some situations the 6 and 6+ can take a bit more than the 5 W charger delivers. But the point is not whether these batteries could be charged faster (which they certainly could if allowed) - the point is that, contrary to your earlier argument, faster charging causes more anode heating which, in turn, reduces battery life. I have no idea how Motorola have dealt with that issue - found a way to keep the battery cooler, ignored the problem and settled for reduced life, or some combination of the two.
You display a fundamental misunderstanding of battery technology.
When we talk of batteries being damaged by fast recharging, we are not talking about melting or burning. We are referring to the irreversible damage to the ability of the battery to hold a charge over the long term.
You seem to think that it's a simple thing for Apple to change to a battery that can be recharged very quickly. It's not; there are a huge number of factors that lead to Apple choosing the type of battery for the iPhone, and it's a flight of fancy to think that this is a relatively straightforward decision.
Sorry - but you are confused again. If you disagree with any of the details of my posts, please state which ones rather than resorting to a purely ad hominem defense. Once I know what you disagree with then I'll provide some references for you to look at, but at present you are simply making vague statements of dissent with patently incorrect claims that you have provided detailed arguments. Also strange in the context of your previous post in which you claimed largely to be agreeing with me. I'm a physicist - I know what battery internal resistance is. You can't just toss out terms like that as if they support your ever-changing and empty argument, whatever it is now.
Well, to be fair, the best time to buy AAPL was the week Jobs died in Oct. 2011. It'll never be that low again, ever. Savvy investors knew the knee-jerk analysts and market neophytes would get the yips, and that's what happened - taking the stock price lower from the hit it had just taken in what was actually a prototypical good fall product release and 4Q report. Although AAPL is coming out of the past decade of market analysts not knowing how to value the company, the volatility was almost always the international hedge funds and mutual funds annual pump and dump in 4Q, sometimes after the 2Q and spring product releases, as well. The market movement in this stock never made any sense, but it is starting to level off into a berth of stability. That's good news/bad news: fewer opportunities to buy low(er). An alternative is to pick up AAPL via an ETF fund, such as XLK, but that'll stick you with Microsoft and Intel and other uglies in order to touch the hem of the golden-haired princess.
From a professional standpoint I have to say that Windows 8, when used correctly on a tablet, is the best experience by far. They are more expensive, but they provide more utility than iPads. I work for a large company, and they issued all of the supervisors HP Revolves. The things are amazingly good for work. The docking system allows us to use them as full desktops as well. An iPad is by far more fun to use; however, Windows 8, when used correctly, is more useful in workplace environments. http://www8.hp.com/us/en/m/ads/elite-products/elitebook-revolve.html
Comments
You know that if you increase voltage by a small amount, you increase power drain by a large amount. So heat will rapidly become an issue. That's likely why Apple increases speeds when moving a chip to the iPad by so little. I'm not confident that Apple could move this to 2GHz. Remember that there is no real heat sinking in these tablets, though I've often wished that Apple could figure out a way to utilize the aluminum case more directly as a heat sink. Of course, if they chose to use this in a notebook form factor, such as that of the Air, they could have good heat sinking for an ARM design, even without the fan. Perhaps there, with all the improvements you mention, clock could rise to 2GHz, possibly even a bit higher.
Right now, these chips are supposedly optimized for performance/efficiency. I don't know if Apple could move that too much. As process sizes shrink, it even becomes more difficult to remove heat effectively. The problem is that the heat becomes focussed in ever smaller areas on the chip, which makes heat sinking more complex. IBM showed an innovative solution a few years ago which in involves tiny tubes etched into the chips through which either gas or liquid would pass, removing heat directly from the chip, as it would be fabricated as part of the chip, not something several thermal layers away, as is current now. I think they are using something somewhat similar in the last two generations of their Power designs, and particularly in the mainframe CPUs.
I have no idea as to how this could work in a tablet, or phone, but I suppose it's possible to use a piezo pump in an enclosed structure that adheres to the case. Seems expensive though.
What's really interesting about Apple's new design is that it's actually more space efficient than the A7. Whereas the industry is finding that as they go down in size they have poorer packing density. Anandtech noted that Apple has increased packing density significantly this generation.
Of course, there is still the frustration in not knowing what at least 30% of Apple's chips are doing. Apple has no need to go to conferences and explain all this as they don't sell their chips elsewhere.
Indeed, the all-important single-core performance of the K1-64 beats the A8X by about 5%. Unfortunately very little software runs natively on any Android device, it runs in a VM which is so inefficient Google doubled its performance this year. Given Android native apps are so rare and unoptimised for either hardware or screen size this whole hardware comparison is largely redundant.
But fun.
I've discussed this a number of times, here, and elsewhere.
I've no doubt that Apple has run OS X on ARM since the A5x. Earlier chips would have been much too limited. They likely are running at least some of their apps on it as well.
But we really can't avoid the emulation problem, it's an ugly head. So small, simple apps could be run through Apple XCode, and with some little bit of work, be fine. But larger apps will have more problems. Compilers can't take care of every difficulty. There are simply too many areas in which hand coding is needed. As for large apps that need power, well, we can just forget it! For example, even if Adobe could be persuaded, again, to rewrite Photoshop, there is no way that it would perform anywhere near what is needed. So we wouldn't see Photoshop or CSx on an ARM based OS X device. Same thing for Office, no doubt. Same thing for many large apps. So we would end up with a subset of OS X software, and it would take some time for even that.
Remember that Apple uses a low power i5 for the low end MacBook Air. The A8x, at any clock, won't compare. I was thinking that if Apple again went with a dual core SoC, it could mean that they might be thinking about an ARM MacBook something or other, as they could use two of those. A four core design would work very well in a notebook, of course, and the graphics would then be three times the old A7, or twice what it is now. That would be enough. Clock it higher because of better heat sinking, and we would have some pretty good graphics. Four A8 cores clocked at 20% higher would give decent CPU performance as well.
But my idea also involves something else. It's known that most of the hanguls in emulation is from a small number of chip instructions that chip facially do differently. This is taken care of with equation software, which is around 100 times slower that hardware. That's avlut 8@% of the slowdown. But, these individual instructions aren't patented. So what if Apple took the worse of them, and put them into their chip? The OS could look at what instruction was being called for and zip it to that x86 instruction in the chip, rather than to software emulating that instruction. We would lose a few percent in speed overall for a major boost in emulation speed. In fact, this would likely eliminate the need for most apps to be recompiled and hand coded for the chip.
It would be nice if Apple was thinking along those lines because it can be done. But with three cores, I do t know if it would be worthwhile. We really don't need six cores for entry level notebooks, as most software won't take advantage of them.
We really need to be careful in discussing Qualcomm, Nvidia, etc. All were surprised that Apple came out with a 64 bit chip last year, but they aren't stupid, or incompetent. Apple's a generation ahead, but that doesn't mean that these companies will forever be a generation behind in performance. That's why the K1 is such an interesting chip, as it's Nvidia's first generation. Qualcomm is working on their high end 64 bit chip, though they've strangely come out with a middle range 64 bit chip already.
Apple doesn't own the space for high end chips, because they make chips only for their own tablets. "Own the space" means there's a competitive situation. As Apple doesn't compete with Qualcomm, they can't own that space. In the competitive market, Qualcomm owns the space, low end, mid, and high end. It might change on the low end with Chinese manufacturers making low end chips.
But right now, the K1 is the best performing chip for Android. How long that will last is anyone's idea.
The Nexus 9 is weak because Android isn't optimized the way iOS is with Apple's chips. Android still has multitasking that's way in excess of what most users need, and processes simply don't turn off, using RAM, processor cycles, and battery.
Yes... Apple wants to be independent. They design their own software to run on their own hardware. Apple is unlike any other computer manufacturer. And they price their hardware at the high-end to maintain a high profit margin... but volume suffers.
Dell, HP and others choose to use software written by someone else... and slash prices on hardware. The result? All that cheaper hardware sells at higher volume... but at lower profit margins.
It's two different philosophies.
If Apple truly wanted more desktop market share... they would do what the rest of the industry does and slash prices to pump up Mac volume.
They don't... so market share must not be a high-priority after all.
Apple's desktop market share situation might not be "great" but it's been their M.O. for 30 years.
Steve Jobs gave some insight into this philosophy: Steve Jobs: We don't ship junk
I thought the questions about Thunderbolt were settled. Intel has stated that while it was Apple that came to them about the idea, and gave them specs they wanted to see, that it was Intel that did the engineering. Intel was quite upset when some people said that Apple had done it and gave it to Intel.
I don't think that Thunderbolt will see the light of day on ARM. It requires at least Express Bus 2. The new version require the newer 3. Arm devices don't have the IO bandwidth for a 40Gbs interface.
We're seeing more Thunderbolt adoption in the Windows industry, now that intels controllers have come down in price with the last revision. So there' so ood reason to be optimistic about it overall.
Would you please produce some attribution for your performance claims, Anandtech or anybody else don't seem to support your claims...
http://www.anandtech.com/show/8666/the-apple-ipad-air-2-review/3
http://www.androidpolice.com/2014/11/11/nexus-9-vs-ipad-air-2-a-mostly-subjective-comparison/
Just look for yourself. You seem to be coming here to just downgrade everyone else who is trying to be objective, without saying anything yourself that's useful. Why don't you try to show that I'm wrong instead? If you can't do that, then don't bother posting to me, or about me.
We can look to x86. Intel's designs are significantly different from AMD's designs, despite that they're both x86. Nevertheless it's been understood for some time that comparing MHz to MHz, even there, is worthless, because the designs are different enough to render that comparison useless. In fact, it's called "The Megahertz Myth". The only time it makes sense is within a manufacturer's own line of similar processors. So we can't compare the clock of an i3 with the clock of an i5, etc. but we can compare them between different i3’s, or i5's.
Talking about Apple's clock vs someone else's doesn't help anything. Even if it did have some meaning, it wouldn't matter, as long as the SoC's drew about the same power. Now, if it could be shown that the K1 drew significantly more power, then I'd say that the design was much less efficient, and had problems going forward, if Nvidia couldn't fix that, next generation. But, even that's complicated. Most Android devices are much thicker, and have bigger batteries. Even so, battery life is usually noticeably less. That's also a matter of the inefficient way Android works, and the fact that it can't be optimized the way Apple's devices are.
Windows has always had the same problem, though there are things about OS X that slow it down where Windows isn't. So it's all a mixed bag.
A major point though, is that the K1 is Nvidia's first effort in a true 64 bit design. Don't you think they should get credit for having a first gen device that's better in many areas than Apple's first gen device? I do.
The one area in which I think we can agree about clocks is that the higher you are now, the less room you will have in the future. We are only now seeing intel's high end chips reaching Prescott speeds. Look at how long that took. That's 3.8GHz. Can a phone/tablet get even that high? I don't know, but it will be a stretch. So a lower clock design does have, at least, the possibility of the legs that perhaps higher clocked designs, designs that need higher clocks.
There are two ways to measure performance. One way is to go around the OS the way most objective testing software does, and read the hardware performance. The other way is to work through the OS and apps to get a less accurate, but more subjective evaluation.
So what do we want when measuring chip performance? I'm interested in as objective a measurement as possible. There, the K1 fares pretty well in some measurements against the A8x, about the same on others, and worse in more areas. Overall, the A8x seems like a more balanced design.
Evaluatiog a device is a whole 'nother thing. Because now you're evaluating the OS and the app universe as well. The Nexus 9 seems gritty because it's that mix of an inefficient OS, possibly apps that aren't written that well, and who knows what else? Possibly the hardware design isn't that good either. How much of that is reflective of just the K1? That's really hard to tell.
You don't need to include iPads. You can include iPads if you want to. This is a difficult subject. Talking about windows and OS X is talking about the desktop market, which includes notebooks. It merely means computers that are comparable with those two OS's. I've often argued that iPads are "real" computers, because they do what most people need from computers, even though they work differently. Other people argue differently. They insist that without having a hardware keyboard built on, easy ways to attach drives and other standard hardware, they are crippled computers at best, or not real computers at all. I don't agree with that.
But oranges to oranges. Most people us tablets as media machines, and that's short of what a computer is used for. I can use my Sony Playstation to browse the web, do messaging and email, buy stuff, etc. but I would never call it "a" computer, even though there's a computer inside. So we get messed up with these definitions. That's what happens when some dumb company comes out and revolutionizes things. Bad Apple!
Apple doesn't give breakdowns either, other than to say desktop sales and notebook sales. They don't tell us how many of which phone sold, or ipad, or many things. In fact, last financial call. Apple stated that they were rearranging their financial reports to conceal more data "for competitive purposes". So we won't know how many watches they sell, it ATvs, or whatever.
I don't really know what you mean when you say MS/PC OEM product mix. If you meant what Apple does, then yes, Hp isn't any clearer about it than Apple is.
iPad problem. Well, declining sales for three quarters is certainly a problem. Estimates are that sales could be down by half this quarter. I sure hope that's wrong. But even if it's down another 10%, that's a problem. Are you denying that?
And what are you talking about "fretting about iPad sales". You don't think Cook and company aren't fretting about declining sales? If so then that would be financially incompetent. Of course they're fretting. Fretting and trying to think of what to do about it. Dropping the price for memory upgrades in both phones and tablets was one thing they've done.
These numbers are Apple's own, not from Gartner, not from IDC, not from anyone else. While those companies do their estimates, and they usually wrong, Apple releases their own numbers every quarter, and those are the ones we go by.
Those analysts do worry about every sub $500 pc that gets sold. They been criticizing Dell for years for that, and Hp, and everyone else who does that. Haven't you been reading the financial sites? Why do you think Dell went private? It's so they could get out of that cycle out of the eye of Wall Street.
There's no denying that Apple's been working hard to eliminate labor problems, but as we see, it's not easy to do, and major suppliers still have major issues.
The author is too lenient. Do you people need someone to point out the obvious? The A8X is a tri-core processor at 1.5GHz. It's not a quad-core processor, and it isn't at 2.5GHz like everybody else's (Nvidia, Qualcomm). Apple is capable of making a quad-core 2.5GHz chip, but they opt not to. Why?
OK - enough avoiding the technical issue. The discussion started with the question of whether the iPhone battery would be damaged (i.e. shortened life) by charging with the (higher current) iPad charger relative to the iPhone charger. The answer is no, but not because the iPhone battery life is unaffected by charging current - it is because the iPhone's internal charging circuit limits the current during the CC charging phase. The iPhone charger was designed to supply approximately that maximum current (no reason to make it any higher). The iPad charger can supply a higher current, but the iPhone will not accept it.
The main parameters that affect lithium-ion battery life are overcharging (which should not be a factor in this case) and charging temperature, which is directly affected by charge rate due to anode heating. That is why most charging circuits are current-limiting, with the current limit being a trade-off between longer battery life and shorter charging times. So no, it is incorrect to say that higher charging rates do not negatively impact battery life, and when manufacturers say that fast charging is OK, they mean, presumably, that they have designed their batteries and chargers to provide an acceptable operational balance between those two conflicting goals.
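To make the CC/CV behavior described above concrete, here is a minimal sketch of such a charge controller. The function name and all of the thresholds are generic lithium-ion ballpark figures assumed for illustration, not Apple's actual charging parameters:

```python
# Minimal sketch of a constant-current / constant-voltage (CC/CV) charge
# controller like the one described above. Limits are assumed, generic
# lithium-ion ballpark figures, not measured iPhone values.

def allowed_charge_current(cell_voltage: float, temperature_c: float,
                           i_limit: float = 1.0,   # amps, the CC limit
                           v_max: float = 4.2,     # volts, full-charge voltage
                           t_max: float = 45.0) -> float:
    """Return the charge current (amps) the controller permits right now."""
    if temperature_c >= t_max:
        return 0.0          # too hot: suspend charging entirely
    if cell_voltage < v_max:
        return i_limit      # CC phase: hold current at the design limit
    # CV phase: hold voltage at v_max and let the current taper off.
    # A real controller tapers continuously; this stand-in just steps down.
    return 0.1 * i_limit

print(allowed_charge_current(3.7, 30.0))  # 1.0  (CC phase)
print(allowed_charge_current(4.2, 30.0))  # 0.1  (CV taper)
print(allowed_charge_current(4.0, 50.0))  # 0.0  (thermal cutoff)
```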
OK, look, I've never claimed that the iPhone battery was capable of a quick charge; that would be a full charge in perhaps 15 minutes. What is true is that the battery is allowed, by Apple, to be charged more quickly by the 12 watt charger. I have said that the circuitry prevents the battery from overcharging; there's no disagreement there. What started this whole thing was when I pointed out the new Motorola phone, which does have a quick charge battery and charger. I said that I would have liked to see Apple do that first, which is an acknowledgement that the 6 and 6+ do NOT quick charge. But Apple's batteries obviously are capable of being charged more quickly.
Apple is very diligent when it comes to batteries. They don't want a repeat of cheap chargers electrocuting people, and they don't want batteries melting or burning while being charged. If they felt in the slightest that these batteries could be damaged in any way by the 12 watt charger, the circuit that detects which device is plugged in would limit the current to 5 watts. But it doesn't; it allows a higher wattage. Not the full 12 watts, but about 7 or 8, according to measurements one of the sites did.
I would like to mention that with older phones, i.e. the 5s and older, using the bigger charger does almost nothing for recharge times, as the wattage is limited to 5 watts. So the batteries in the 6 and 6+ are designed for this higher charge rate, even though it isn't a true quick charge.
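As a rough sketch of the behavior being described: the device, not the charger, sets the actual draw. The 5 W and 12 W figures come from the discussion above, the 7.5 W cap is an assumed midpoint of the "about 7 or 8" measurement, and the function name is invented for illustration:

```python
# Sketch of the device-side power negotiation described above. The device's
# charging circuit decides how much to draw; the charger only advertises
# what it can supply. Wattages are from the discussion; names are invented.

def charge_power_drawn(charger_watts: float, device_cap_watts: float) -> float:
    """The device draws the lesser of what's offered and what it allows."""
    return min(charger_watts, device_cap_watts)

# iPhone 5s or older: capped at 5 W, so the 12 W iPad charger gains nothing
print(charge_power_drawn(12.0, 5.0))   # 5.0

# iPhone 6/6+: allowed to draw more, though not the charger's full 12 W
print(charge_power_drawn(12.0, 7.5))   # 7.5
```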
Fair enough, but note that overcharge protection is not the same as charge current limiting - different problems and different solutions. Anyway - the charging rate (current) allowed by the circuitry is determined based on battery voltage and temperature, and it is certainly possible that in some situations the 6 and 6+ can take a bit more than the 5 W charger delivers. But the point is not whether these batteries could be charged faster (which they certainly could if allowed) - the point is that, contrary to your earlier argument, faster charging causes more anode heating which, in turn, reduces battery life. I have no idea how Motorola have dealt with that issue - found a way to keep the battery cooler, ignored the problem and settled for reduced life, or some combination of the two.
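For a rough sense of that trade-off, a common lithium-ion rule of thumb (an assumption here, not a measured iPhone figure) is that sustained cell temperature rises accelerate capacity fade roughly exponentially, on the order of doubling per 10 °C:

```python
# Back-of-the-envelope view of the fast-charge trade-off: more current means
# more internal heating, and heat accelerates capacity fade. The doubling-
# per-10-degC rule is a common lithium-ion approximation, assumed here.

def relative_fade_rate(temp_rise_c: float) -> float:
    """Capacity-fade rate relative to the baseline temperature."""
    return 2.0 ** (temp_rise_c / 10.0)

print(relative_fade_rate(0.0))   # 1.0x: baseline aging
print(relative_fade_rate(5.0))   # ~1.4x: modestly faster charging
print(relative_fade_rate(10.0))  # 2.0x: the cost of running the cell hotter
```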
You display a fundamental misunderstanding of battery technology.
When we talk of batteries being damaged by fast recharging, we are not talking about melting or burning. We are referring to the irreversible damage to the ability of the battery to hold a charge over the long term.
You seem to think that it's a simple thing for Apple to change to a battery that can be recharged very quickly. It's not; there are a huge number of factors that lead to Apple choosing the type of battery for the iPhone, and it's a flight of fancy to think that this is a relatively straightforward decision.
See above. Commercial manufacturer claims do not constitute a sound technical defense of your assertion.
So people here who know nothing do? Really? Companies who make these statements, particularly when their products are used for critical applications, need to meet standards and need to show that they can meet their claims.
So where are your links? I'm supposed to stop knowing what I know to be true because three guys here say I'm wrong, without presenting a shred of evidence themselves?
I haven't seen any evidence from you guys that you really know how batteries work, just statements that I'm wrong and that you're right. Oh, foggy did show a couple of lines, likely from somewhere else, that gave a very basic statement about batteries, but I had already covered it in more detail.
Look up battery internal resistance. That's the key to charging times, as well as discharge times. I've already said that, but no one seems to have read it.
Sorry, but you are confused again. If you disagree with any of the details of my posts, please state which ones rather than resorting to a purely ad hominem defense. Once I know what you disagree with, I'll provide some references for you to look at; at present you are simply making vague statements of dissent, along with the patently incorrect claim that you have provided detailed arguments. That's also strange in the context of your previous post, in which you claimed largely to be agreeing with me. I'm a physicist; I know what battery internal resistance is. You can't just toss out terms like that as if they support your ever-changing and empty argument, whatever it is now.
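Since internal resistance keeps coming up, here is the arithmetic both sides are implicitly relying on; the resistance value is an assumed round figure for a small lithium-ion cell, not a measured one:

```python
# Why internal resistance governs charge (and discharge) rates: Joule
# heating inside the cell goes as P = I^2 * R, so heat grows with the
# square of the current. The resistance figure below is an assumption.

def internal_heating_watts(current_a: float, r_internal_ohms: float) -> float:
    return current_a ** 2 * r_internal_ohms

R = 0.15  # ohms, a plausible round figure for a small lithium-ion cell

print(internal_heating_watts(1.0, R))  # 0.15 W at a 1 A charge current
print(internal_heating_watts(2.0, R))  # 0.60 W: doubling current quadruples heat
```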
Well, to be fair, the best time to buy AAPL was the week Jobs died in October 2011; it'll never be that low again, ever. Savvy investors knew the knee-jerk analysts and market neophytes would get the yips, and that's what happened, taking the stock price lower from the hit it had just taken in what was actually a prototypically good fall product release and 4Q report. Although AAPL is coming out of a decade of market analysts not knowing how to value the company, the volatility was almost always the international hedge funds' and mutual funds' annual pump and dump in 4Q, and sometimes after the 2Q and spring product releases as well. The market movement in this stock never made much sense, but it is starting to level off into a berth of stability. That's good news/bad news: fewer opportunities to buy low(er). An alternative is to pick up AAPL via an ETF such as XLK, but that'll stick you with Microsoft and Intel and other uglies in order to touch the hem of the golden-haired princess.
http://www8.hp.com/us/en/m/ads/elite-products/elitebook-revolve.html
http://h30094.www3.hp.com/product/sku/10459812