
iPad's Apple A4 CPU is feature-stripped ARM Cortex-A8 - report - Page 4

post #121 of 166
Quote:
Originally Posted by nikon133 View Post

Yes, but don't forget there are many more Wiis in houses than other consoles. Relative to the number of each console in the wild, there are fewer Wii games per console owner than on other platforms.

Apple sells fewer computers overall but has a market cap that rivals Microsoft's.

The Wii is successful, the Wii makes money, and it continues to do so despite its "fatal" flaws.

"fatal" flaws - sound familiar?
post #122 of 166
Quote:
Originally Posted by melgross View Post

I agree about the battery life, and was thinking about the same numbers myself. I believe that Jobs said 10 hours for movies, if I'm not mistaken. If so, that's about the most intensive usage the hardware will be getting other than some 3D gaming. 140 hours of music without the screen on! I'd like to see any other tablet match that.

But he also said 10 hours of reading, answering some questions from Mossberg I believe, and stated that the chip used hardly any power and that nearly all of it went to the screen.
"Put your hand on a hot stove for a minute, and it seems like an hour. Sit with a pretty girl for an hour, and it seems like a minute. THAT'S relativity." - Albert Einstein
post #123 of 166
This $1 billion estimate to tweak an existing SoC design has to be off by an order of magnitude.

Let's assume a full-time engineer (FTE) costs $200,000 per year (salary + benefits + overhead).

For $1 billion you would get 5,000 FTEs for one year, 2,500 FTEs for two years, or roughly 1,667 FTEs for three years.

There is no way you need 1,600 FTEs for three years to modify a Cortex-A8 design; the communications and project-management overhead would be insane for a project of this scope.

Even if we assume $200 million in capital expenditures related to the project (which seems high to me, since it's a fabless shop), that still leaves 4,000 FTEs for one year, 2,000 FTEs for two years, or roughly 1,333 FTEs for three years. My guess is the real cost is somewhere in the range of $100 million to $250 million.
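The back-of-the-envelope arithmetic above can be sketched in a few lines (a rough model, using the $200k fully loaded FTE cost assumed in this post):

```python
# Rough model of the post's arithmetic: how many full-time engineers
# (FTEs) a budget funds per year, assuming $200k fully loaded cost each.
def ftes_per_year(budget_usd, years, cost_per_fte_year=200_000):
    return budget_usd / (cost_per_fte_year * years)

budget = 1_000_000_000  # the reported $1B estimate
for years in (1, 2, 3):
    print(f"{years} year(s): {ftes_per_year(budget, years):,.0f} FTEs")
```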


About this being a stripped-down Cortex-A8: it's just as likely to be a stripped-down Cortex-A9 (Apple's M.O. is to run these underclocked relative to their typical operation). The fact is we won't really have any idea until enough companies get their hands on this device and do a proper teardown of it.
post #124 of 166
Quote:
Originally Posted by DocNo42 View Post

(snip)

So the iPad isn't for you - great. I find it fascinating that people are so threatened by something different that they feel they have to not only refuse to buy it, but campaign against it

(snip)

Careful Doc! One of our house trolls might wander by and ingest your impeccable logic and self-combust. You wouldn't want that on your conscience would you?
Believe nothing, no matter where you heard it, not even if I have said it, if it does not agree with your own reason and your own common sense.
Buddha
post #125 of 166
Quote:
Originally Posted by Woohoo! View Post

I'm willing to bet there is some sort of EFI-reading, SSD-scanning, phone-home DRM scheme embedded in the processor too.

The iPad is to be a content-delivery device, but what's scary is the fact that Apple has removed all but one MacBook in favor of a line of iPad media devices.


Doesn't sound too good for the future of traditional computers.



And what's with the Apple propaganda plug at the end of the article?

That isn't propaganda; rather, the author is reminding folks of Apple's statement regarding their "in-house" silicon after providing information in the article indicating that the chip is not nearly the breakthrough Apple claims.

Also, I don't understand your comment about only one MacBook for a line of iPads. Apple has the MacBook and the MacBook Pros in addition to their desktop machines...?
post #126 of 166
Quote:
Originally Posted by Woohoo! View Post

The MacBooks are intended for the younger market, the MacBook Pro is intended for the more mature market.

Apple introduces a line of iPads, but it's not ready for market yet. As soon as sales pick up all Apple has to do is eliminate the one remaining white MacBook model as it already phased out the other MacBook models.

The iPad is intended to replace Apple's traditional computers in the younger market.


Why is this bad? Because the iPad is a closed device and doesn't encourage as much immediate hacking and interest as an open device does.

On a Mac, anyone who has the interest can fire up Terminal, learn a few Unix commands, and start messing around. It encourages that because it's an open device.

The iPad has a high barrier to cross before one can get under the hood. This high barrier is going to discourage future youth from an interest in computers.

The low-end MacBook Pro isn't much more expensive than the MacBook. Besides, those youngsters can buy a Mac mini and an iPad for the price of a MacBook or MacBook Pro and have the best of both worlds. How is that a loss?
post #127 of 166
Do you honestly believe that the ability to hack a device and mess around with its firmware via a Unix prompt has any bearing on the commercial success of a consumer device like the iPad?

I know a place like AppleInsider attracts a fair share of geeks, but really ... ?
post #128 of 166
Quote:
Originally Posted by Woohoo! View Post

The MacBooks are intended for the younger market, the MacBook Pro is intended for the more mature market.

Apple introduces a line of iPads, but it's not ready for market yet. As soon as sales pick up all Apple has to do is eliminate the one remaining white MacBook model as it already phased out the other MacBook models.

The iPad is intended to replace Apple's traditional computers in the younger market.


Why is this bad? Because the iPad is a closed device and doesn't encourage as much immediate hacking and interest as an open device does.

On a Mac, anyone who has the interest can fire up Terminal, learn a few Unix commands, and start messing around. It encourages that because it's an open device.

The iPad has a high barrier to cross before one can get under the hood. This high barrier is going to discourage future youth from an interest in computers.

Do you honestly believe that the ability to hack a device and mess around with its firmware via a Unix prompt has any bearing on the commercial success of a consumer device like the iPad?

I know a site like AppleInsider attracts a fair share of geeks, but really ... ? Oh, and one more thing: as for the future youth and their interest in computers - has it ever occurred to you that the computers of tomorrow may be very different animals than the Unix-powered big boxes you think of as computers today? The personal computers of tomorrow could in fact be something like ... the iPad.
post #129 of 166
Quote:
Originally Posted by DocNo42 View Post

If you want something you can pick apart, get gaga over the guts, or get root and terminal access, go buy one of the many existing solutions that will do all that right now. This device isn't targeted at you!

Dude, notice how he said 'personally... I'll likely skip'? He's not telling YOU not to buy one, or that the product is total crap if the A4 is a Cortex-A8 derivative. He just wants something more robust that'll have more 'legs' and be able to keep it longer. Is that unreasonable?
post #130 of 166
The implied premise of this report is that Apple is a bunch of stupid bozos who don't know what they are doing.
So, according to this Jon Stokes fellow, Apple first went ahead and spent $278 million on PA Semi, and then spent another billion dollars to develop a crappy chip that is inferior to any off-the-shelf chip that Apple could have bought on the market? Is THAT what they want us to believe? Well, color me skeptical!

You can call Apple a lot of things, but stupid they are not.
post #131 of 166
Quote:
Originally Posted by DocNo42 View Post

The number of people today who have HDTVs but no HD cable or OTA HD antenna blows your little theory right out of the water. They don't care right now about the primary purpose of their "HD" TV, so why are they going to suddenly care for a game console that is a casual, secondary use?

To a geek such as yourself, watching SD on an HDTV is a distraction in and of itself and unfathomable. To non-geeks, the Wii is fun and it's approachable in a way that the Xbox and PS3 aren't. Extra resolution isn't going to change that part of the experience. They couldn't care less that it's 480i instead of 1080p - it's fun! Wanna know why the Wii is still popular? It's about the end-user experience, not the specs!

It is notable that Nintendo, like Apple, is doing just fine dissing people such as yourself who look at a few missing checkboxes and proclaim "it sucks". After the beating self-proclaimed Internet experts took with the iPod, personified in the now-infamous Slashdot iPod comments, you'd think this kind of inane commentary would be a little more reserved. But here we are, all over again...

Watch YFM.

My observation is based on what happens to great and popular products that sit on shelves too long while competitors cycle and update their products. Popularity means nothing if the idea is old, with no fresh take. The Wii has done well, will continue to do OK, but within 2 years will be completely dead, in terms of new sales.

Get some real opinions from people that own one. Average usage for the Wii is 1 month. Many people use it for 2 weeks, then forget they even have it. People are lazy, and despite having a brief compulsion to play an interactive video game, the thrill is quickly gone, aided by the extreme lack of interesting content on the platform.
post #132 of 166
Quote:
Originally Posted by DocNo42 View Post

Really? So Apple spent over $1 billion on PA Semi to get functionality that anyone else can have?

Wow, that's a pretty bad investment!

/sarcasm

Apple didn't spend $1G on PA Semi. They spent ~$278M to acquire the company for its engineering talent. The last thing we heard about PA Semi was that the team was split in two, one half working on iPads and the other on iPhones. That's about 150 engineers, which costs about $30M/year. At that spend rate, it'll take over 20 more years to hit a billion USD.
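As a quick check of that spend rate (using the headcount and cost figures assumed in this post):

```python
# Sanity check of the spend-rate claim: ~150 engineers at a fully
# loaded ~$200k/year each, on top of the ~$278M acquisition price.
acquisition = 278_000_000
annual_burn = 150 * 200_000          # ~$30M/year
remaining = 1_000_000_000 - acquisition
years_to_billion = remaining / annual_burn
print(f"~{years_to_billion:.0f} more years to reach $1B total")
```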

Quote:
The chips are important, but if you watch Steve talking with Walt Mossberg right after the unveiling, he comments that the chips use next to nothing and it's the screen that consumes the lion's share of the power - right about the same time he rightly points out that it's going to be VERY rare for someone to use a device for more than 10 hours at a time, thus making recharging an essentially moot issue.

This is Jobsian marketing speak. Everything is true in the way he's saying it.

Quote:
So wouldn't this be "magic circuit"?

I was talking in terms of computational performance. It's doubtful they'll create a more powerful chip than competing designs. I think Jon Stokes intimated the right thing: they are doing in-house designs to gain an advantage in performance per watt.
post #133 of 166
Quote:
Originally Posted by wuchmee View Post

The message of January 27th was the form factor. The form factor will overlay different demographics as a content delivery device,

The real genius of the device is that it is the best means to buy stuff from the iSore.
post #134 of 166
Quote:
Originally Posted by Shrike View Post

Yes, there are a lot of Mac users staffed at Ars. It doesn't mean they have any good sources on Apple's dealings. Stokes is just as in the dark as anyone else. What he has on his side is good x86 knowledge, industry sources, and a good understanding of microprocessor design. I'm pretty confident he has zero sources inside Apple proper. I think the best he's got is a contractor working on compiler and driver design. Maybe it's a developer with iPad access who has inferred it from performance results.

You're making an assumption that you can't really make. What do you know of his sources? Nothing. It's also possible that he has them at Apple and Samsung.

Quote:
Notice how he is just speculating about what is missing. He's already admitted to being wrong about display output. I think all his sources told him was that the A4 is a Cortex-A8-based SoC, and that based only on indirect knowledge. That's it. I don't think anyone knows but upper management, the SoC CPU/SMC subteam, and the compiler/driver subteam.

He's speculating because he hasn't been given detailed info, just the basic stuff. Just as Apple gives employees slightly different versions so they can find out who spilled the beans, it's possible his sources have the same problem.

I don't know why you're so against the idea that he may know what he's talking about.

Quote:
It's right on the iPad specs page: 10 hrs for video or web browsing at default display settings. The iPad has to really hit that mark.


Apple is known to be conservative with their handheld mobile battery ratings. They almost always exceed them.
post #135 of 166
Quote:
Originally Posted by Shrike View Post

You're right. It wasn't Jobs, and it wasn't anything attributed to Apple. Nobody knows how much Apple has spent on the A4.

The $1G quote was the NYT quoting an industry analyst on the cost of designing a proprietary, ground-up CPU. That's all he was talking about, in an article about how Nvidia, Qualcomm and Apple are off building their own custom SoCs. It's quite doubtful it cost any of them that much, as they all took various existing designs and did custom integration work.

One can make a guess. It is fairly simple. Take 50 people, including managers and engineers, multiply by $200k and by 2 years, and you get $20M. Double it for other procurement and testing, and pad it a little more for other stuff. It probably cost Apple on the order of $50 million if they devoted 50 engineers plus the related testing, prototyping, licensing and "institutional" costs. For $1G, you're talking 2,500 people over 2 years plus some capital expenditures. Apple probably didn't even spend that much on iPad development as a whole. They probably didn't even spend half that.

That's right. I keep telling people that the $1 billion figure referred to a from scratch job, but saying Apple spent $1 billion is more sensationalist, so people keep misquoting it.
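The bottom-up estimate quoted above works out like this (same assumed figures: 50 people at a loaded $200k/year for 2 years, doubled for overheads and padded):

```python
# Bottom-up project-cost sketch using the figures assumed above.
people, years, loaded_cost = 50, 2, 200_000
engineering = people * years * loaded_cost    # $20M in salaries
doubled = engineering * 2                     # procurement, testing
padded = doubled + 10_000_000                 # licensing, prototyping, etc.
print(f"engineering ${engineering:,}, total on the order of ${padded:,}")
```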
post #136 of 166
Quote:
Originally Posted by DocNo42 View Post

The number of people today who have HDTVs but no HD cable or OTA HD antenna blows your little theory right out of the water. They don't care right now about the primary purpose of their "HD" TV, so why are they going to suddenly care for a game console that is a casual, secondary use?

To a geek such as yourself, watching SD on an HDTV is a distraction in and of itself and unfathomable. To non-geeks, the Wii is fun and it's approachable in a way that the Xbox and PS3 aren't. Extra resolution isn't going to change that part of the experience. They couldn't care less that it's 480i instead of 1080p - it's fun! Wanna know why the Wii is still popular? It's about the end-user experience, not the specs!

It is notable that Nintendo, like Apple, is doing just fine dissing people such as yourself who look at a few missing checkboxes and proclaim "it sucks". After the beating self-proclaimed Internet experts took with the iPod, personified in the now-infamous Slashdot iPod comments, you'd think this kind of inane commentary would be a little more reserved. But here we are, all over again...

Actually, when we first plugged my daughter's Wii into the then-new Samsung 61" DLP LED set I had just bought, I was prepared for it to look crappy. I was wrong! The Wii has a higher rez than 720x480. I forget the actual number right now, but it looks much better than expected. Not as good as our PS3, but close enough most of the time. Some games look so good, it's almost impossible to tell the difference from a normal distance.
post #137 of 166
Quote:
Originally Posted by shubidua View Post

But he also said 10 hours of reading, answering some questions from Mossberg I believe, and stated that the chip used hardly any power and that nearly all of it went to the screen.

If it gives 10 hours of video, it will give more for reading - unless he meant reading magazines with interactive features. That's possible too. But video uses more resources than reading does. I find it hard to believe that reading books would use the same amount of battery power as video.

But Apple is conservative about the battery ratings of its handheld devices, and they likely are with this one too.
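One way to see why the quoted runtimes are at least plausible is to divide an assumed battery capacity by the hours of use to get the implied average draw. The ~25 Wh capacity below is a hypothetical figure for illustration only, not an Apple-published spec:

```python
# Implied average power draw for each quoted runtime, assuming a
# hypothetical ~25 Wh battery (illustrative figure, not from Apple).
battery_wh = 25.0
runtimes_h = {"video": 10, "reading": 10, "music (screen off)": 140}
for activity, hours in runtimes_h.items():
    print(f"{activity}: ~{battery_wh / hours:.2f} W average")
```

At those numbers, the screen-off music figure implies well under a quarter of a watt of average draw, which is consistent with the claim that the screen, not the chip, dominates power use.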
post #138 of 166
Quote:
Originally Posted by Garion View Post

The implied premise of this report is that Apple is a bunch of stupid bozos who don't know what they are doing.
So, according to this Jon Stokes fellow, Apple first went ahead and spent $278 million on PA Semi, and then spent another billion dollars to develop a crappy chip that is inferior to any off-the-shelf chip that Apple could have bought on the market? Is THAT what they want us to believe? Well, color me skeptical!

You can call Apple a lot of things, but stupid they are not.

Stop with the $1 billion garbage already! Stokes knows very well, as does everyone with at least half a brain, that Apple didn't spend anywhere near $1 billion to develop this variation of a chip they licensed.

And who is to say this chip is inferior? The easy thing for Apple to have done was to just keep buying chips from Samsung, unchanged. The fact that they didn't means they wanted to do things to it that would benefit them. That could mean stripping features that have no use in the context they'll be using it in, as well as adding features that do.

As Stokes mentioned, chips that are sold off the shelf come with features that most companies buying them won't use, but those features need to be there to serve the group of buyers who need some subset of them. So these companies overpay for what they don't need. But they are also buying in much smaller volumes than Apple does. That means Apple can afford to mod the chip for itself, where other companies can't.

That's an advantage to Apple. If they buy 60 million of these chips next year, that would be a big advantage. Even if they only buy ten million it would still be more than most others would be buying.

Stripping out unneeded features makes for a superior chip: better yields, lower pricing, lower power usage, better performance, cooler running, etc. Advantage Apple.
post #139 of 166
Quote:
Originally Posted by pmz View Post

Watch YFM.

My observation is based on what happens to great and popular products that sit on shelves too long while competitors cycle and update their products. Popularity means nothing if the idea is old, with no fresh take. The Wii has done well, will continue to do OK, but within 2 years will be completely dead, in terms of new sales.

Get some real opinions from people that own one. Average usage for the Wii is 1 month. Many people use it for 2 weeks, then forget they even have it. People are lazy, and despite having a brief compulsion to play an interactive video game, the thrill is quickly gone, aided by the extreme lack of interesting content on the platform.

You're making that up for sure. Where can you show us that it's true?
post #140 of 166
Quote:
Originally Posted by melgross View Post

You're making an assumption that you can't really make. What do you know of his sources? Nothing. It's also possible that he has them at Apple and Samsung.

I'll give you that it's possible. But really, I don't think it's very likely at all that he has sources at Apple. And I don't think Samsung knows all that much either. Apple runs a really, really tight and compartmentalized ship. I don't think anyone with an Apple badge is leaking (unapproved leaking, that is).

Quote:
I don't know why you're so against the idea that he may know what he's talking about.

Well, it's more that we should take it as a rumor, not as fact. It is still a rumor. If we hear it from Gruber, I'll have more belief. That's not to say I don't think it has a Cortex-A8; that's more probable than its having a Cortex-A9 or a highly customized A8, but it's all speculation. If Apple wants to keep it secret, they can. They don't need to have a Cortex-A8 flag in the compiler as an easy giveaway. They don't need to publish papers.

Quote:
Apple is known to be conservative with their handheld mobile battery ratings. They almost always exceed them.

Yes, true. "Almost" though.
post #141 of 166
Quote:
Originally Posted by Shrike View Post

I'll give you that it's possible. But really, I don't think it's very likely at all that he has sources at Apple. And I don't think Samsung knows all that much either. Apple runs a really, really tight and compartmentalized ship. I don't think anyone with an Apple badge is leaking (unapproved leaking, that is).

It happens all the time, though. Samsung would know, as the manufacturer of the chip. They have to know what they're testing with the machines that check whether the chips are functioning; tests are written for that purpose. There would be people who know these things.

Quote:
Well, it's more that we should take it as a rumor, not as fact. It is still a rumor. If we hear it from Gruber, I'll have more belief. That's not to say I don't think it has a Cortex-A8; that's more probable than its having a Cortex-A9 or a highly customized A8, but it's all speculation. If Apple wants to keep it secret, they can. They don't need to have a Cortex-A8 flag in the compiler as an easy giveaway. They don't need to publish papers.

Until we know for sure, everything we read is a rumor. But this one seems better than most others.
post #142 of 166
Ars talking out of their arse again.
post #143 of 166
Quote:
Originally Posted by myapplelove View Post

Ars talking out of their arse again.

And what are you blowing out of yours?
post #144 of 166
Quote:
Originally Posted by melgross View Post

If it gives 10 hours of video, it will give more for reading - unless he meant reading magazines with interactive features. That's possible too. But video uses more resources than reading does. I find it hard to believe that reading books would use the same amount of battery power as video.

But Apple is conservative about the battery ratings of its handheld devices, and they likely are with this one too.

I'm just reporting what "the almighty" said himself.

But I agree that it seems weird that the iPad would last equally long watching video and reading books. On the other hand, I think we forget the WiFi radio in this power equation: when I turn it off on my MB, for example, I get something like 10 percent more time on a charge (a rough estimate), so it looks like it's taking a lot of power.
"Put your hand on a hot stove for a minute, and it seems like an hour. Sit with a pretty girl for an hour, and it seems like a minute. THAT'S relativity." - Albert Einstein
post #145 of 166
Quote:
Originally Posted by DocNo42 View Post

This right here is why Apple isn't saying much beyond "A4" for the iPad's technical specs, and I for one am glad that for this device they are not hopping on that treadmill. At the end of the day, what the ^@!^# difference does it make what the guts are? Does it run the software quickly (and from every demo I saw recorded at the hands-on after the announcement, that's a resounding yes!)? Does it get good battery life? Does it do everything I want and need it to do?

If yes, I buy. I'm not going to not buy because they could have put a 900MHz part in and instead they put in an 800MHz part... honestly, other than some geek compulsion to focus on minutiae, at the end of the day what does it matter?

There's generally good times and bad times to buy certain tech. Right now, today, C2D is not a good buy for the long term. Right now, today, Cortex A8 is not a good buy for the long term.

I say that as a person with a Rev 1 MBP 17" with a Core Duo Yonah that can't run a lot of stuff that slightly later C2D machines can... because it ain't 64-bit. One example is Java 6, which I need for development. No C2D machine, no Java 6. So I Boot Camp into Windows to do my Java dev.

I needed to buy a machine right then and could not wait until the C2D machines were released, but if it had been MY money and I COULD have waited, I would have, because I knew the C2D machines were qualitatively better and had a longer life. Likewise today, I'm waiting for the Core i5/i7 updates to the MBP even though my current MBP is really impacting my day-to-day productivity. I need to live with the machine for 3 years before I can replace it again.

I may get an iPad anyway, but $500+ is a lot for a device if I know that the heart of the thing is already obsolescent. I will likely only buy it if I'm actively developing for it. Otherwise I'll just continue to target the iPhone and test against the iPad simulator.

The A9 has a shallower pipeline, so it does more per clock than the A8. It has out-of-order execution (OOE) that also allows it to do more per clock. With a shallower pipeline the A9 is also more thrifty on power... mispredicted branches waste more power (and performance) on a deep pipeline than on a shallower one.

So the issue isn't how snappy the iPad is today in 2010 but how snappy it will be in 2014. If the 2010 iPad is a single-core A8 and the 2011 iPad is a dual-core A9, I bet you've bought yourself more than a couple of extra years of usable service. Meaning instead of feeling sluggish in 2015 vs 2014 (the 1-year delta in ownership), it might feel pretty good until 2017.

IMHO the Merom MBP owners can certainly wait an extra couple years vs us Yonah MBP owners if they want to. There's nothing they can't do...it just might take a little longer to do.

So say the difference was that the dual core A9 2011 iPads have 3rd party multitasking enabled in iPhone OS 5.1 in late 2011 whereas the single core A8 2010 iPads do not without jailbreaking.

Still don't care what CPU the 2010 iPad runs? More fool you, then. It's not a geek compulsion so much as being a knowledgeable consumer.
post #146 of 166
Quote:
Originally Posted by Shrike View Post

They don't need to have a Cortex-A8 flag in the compiler as an easy give away.

My comment was that the object code probably would show A9 specific stuff that isn't in the A8 if you look at it in detail.
post #147 of 166
Quote:
Originally Posted by shubidua View Post

I'm just reporting what "the almighty" said himself

But I agree that it seems weird though that the iPad would last as long when watching video or reading books. On the other hand I think we do forget the WiFi antenna in this power equation, because when I turn it off on my MB for example, I get something like 10 percent more time on the charge (rough estimation), so it looks like it is taken a lot of power.

Ok, but please let's not be silly with the titles, ok?
post #148 of 166
Quote:
Originally Posted by vinea View Post

There's generally good times and bad times to buy certain tech.

Agreed here. I waited 2 years to get an iPhone. I wanted double the CPU, double the GPU, double the RAM and double the storage of the original iPhone, and even of the iPhone 3G. It was tough to wait, but it was worth it. The iPhone 3GS is a nice sweet-spot phone that'll last a while. It'll be obsolete this July, but it does things smoothly and has just enough storage (32 GB) for my needs.

It was really, really hard to wait. And Apple went away from the aluminum design! I wasn't happy about that.

Quote:
I may get an iPad anyway but $500+ is a lot for a device if I know that the heart of the thing is already obsolecent. I will likely only buy it if I'm actively developing for it. Otherwise I'll just continue to target the iPhone and test against the iPad simulator.

I'm waiting. Not an early adopter. The iPad won't be really mature until version 3. Right now it's a really nice machine for couch and bedroom surfing, maybe as a portable if you have a desktop, so it will have its place. But it won't come into its own until version 3.

Quote:
The A9 has a shallower pipeline, so it does more per clock than the A8. It has out-of-order execution (OOE) that also allows it to do more per clock. With a shallower pipeline the A9 is also more thrifty on power... mispredicted branches waste more power (and performance) on a deep pipeline than on a shallower one.

The OOE allows it to do more per clock. The shallower pipeline allows it to use less power but also decreases its clock potential; the longer pipeline allowed the A8 to clock higher. Putting in beefier branch prediction and OOE hardware made the A9 use more power. Generally, A9 chips will be on sub-45 nm processes while A8 chips are on 65 nm. All the tradeoffs resulted in a 25% improvement in per-clock performance (2 DMIPS/MHz to 2.5). If the A9 were on the same fab process, it would be an interesting tradeoff in perf/watt between the two.
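Those DMIPS/MHz figures make the per-clock comparison easy to check (the clock speed here is an arbitrary example value, assumed equal for both cores):

```python
# Per-clock throughput comparison from the DMIPS/MHz figures above.
a8, a9 = 2.0, 2.5          # DMIPS per MHz
clock_mhz = 800            # arbitrary example clock, same for both
a8_dmips = a8 * clock_mhz
a9_dmips = a9 * clock_mhz
print(f"A9 does {(a9_dmips / a8_dmips - 1):.0%} more work per clock")
```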

The Cortex-A8 in the iPhone 3GS was a bigger jump from ARM11 in the iPhone 3G than this.

Quote:
So say the difference was that the dual core A9 2011 iPads have 3rd party multitasking enabled in iPhone OS 5.1 in late 2011 whereas the single core A8 2010 iPads do not without jailbreaking.

Still don't care what CPU the 2010 iPad runs? More fool you, then. It's not a geek compulsion so much as being a knowledgeable consumer.

Well, like me - I was iPhone-less for 2 years while others were using theirs. I didn't get the pleasure for 2 years and had to stick with my Treo 650 (ugh). It really depends on what type of buyer one is. Certainly for you and me, waiting makes sense, but for others, buying now and buying again in 2 years is certainly an option. It's fairly inexpensive (compared to an Apple laptop or desktop), so shorter refresh cycles will be more common for folks.
post #149 of 166
Quote:
Originally Posted by backtomac View Post

He just wants something more robust that'll have more 'legs' and be able to keep it longer.

But what is there to say that the CPU alone will have any bearing on his ability to "keep it longer"? The iPad isn't a general-purpose computer; it's more of an appliance, like a video game console. Look at the longevity of something like a PS3 or Xbox 360, which have hardware that by today's standards is downright anemic - yet new games come out for them that are somehow comparable to what you get on a PC.

That's the power of the "walled garden" that so many like to deride - well, one of the benefits of the walled garden/limited hardware is that you can optimize for it without having to ensure your code will run on every half-baked piece of hardware out there.

I guess another way of putting it: people worried about "legs" at this point, for a device that hasn't even shipped, are probably obsessing about the wrong thing.

Quote:
Is that unreasonable?

Another thing I have noticed is that what people really mean when they say things like "keep longer" is "never have to buy another one again but run the latest software", which just isn't a reasonable expectation. Again, since it hasn't even shipped, nor have we seen any evolution in the platform, I think obsessing about the CPU is unreasonable in the context of this device. If it were a Mac or a PC, then no, it wouldn't be unreasonable - but it's neither of those. It's something different. Trying to judge it through the lens of a general-purpose computer is just silly - maybe not unreasonable, but certainly missing the point.
post #150 of 166
Quote:
Originally Posted by pmz View Post

My observation is based on what happens to great and popular products that sit on shelves too long while competitors cycle and update their products. Popularity means nothing if the idea is old, with no fresh take. The Wii has done well, will continue to do OK, but within 2 years will be completely dead, in terms of new sales.

Only if MS or Sony update their consoles, which both have said they have no intention of doing.

Just as I feel there will be with the iPad, there are many different "non-traditional" groups of Wii users. The hardcore or active gamers may fall off, but the Wii is different. It's not uncommon for a Wii to sit unused and then get pulled out for a party, a weekend, or some other event. In the end it doesn't matter: they still sold the Wii, and they still sold the games. Whether someone buys a game and plays it 24/7 or plays it once every two months, the game still sold. The only time inactivity hurts is with incremental in-game downloadable purchases, and I haven't seen Wii games push those as hard as, say, the Xbox 360 does.

Quote:
Get some real opinions from people that own one. Average usage for the Wii is 1 month. Many people use it for 2 weeks, then forget they even have it. People are lazy, and despite having a brief compulsion to play an interactive video game, the thrill is quickly gone, aided by the extreme lack of interesting content on the platform.

Everyone I know that has one uses it infrequently (including myself), but it does get brought out for events (parties and such). As I said, as long as it gets used and games get bought, it will continue to generate revenue. I also have a group of friends that use it daily or several times a week for things such as Wii Fit. That group is about the same size as the group of us who use it infrequently.

Heck, I need to fire mine up and start using Wii Fit more; it would actually do me some good.
post #151 of 166
Quote:
Originally Posted by Shrike View Post

This is Jobsian marketing speak. Everything is true in the way he's saying it.

So are you agreeing with him? Because I think he's nailed it: a full-color screen that supports video is far more important than a week-long battery life.

Quote:
I was talking in terms of computational performance. It's doubtful they'll create a more powerful chip than competing designs. Jon Stokes did intimate the right thing I think. They are doing in-house designs to gain an advantage on performance/Watt.

And I agree here too. Even mobile CPUs have gotten to the point where there is more than enough oomph in them to do what is needed; the endless chase for performance for its own sake doesn't make sense on an ultra-mobile device like the iPad.
post #152 of 166
Quote:
Originally Posted by vinea View Post

Still don't care what CPU the 2010 iPad runs? More the fool you, then. It's not a geek compulsion as much as being a knowledgeable consumer.

So would you not buy a console right now because they were launched 4 years ago?

The iPad is closer to a console and nothing like a general-purpose computer. Obsess about the CPU if you like; I will be enjoying my iPad for what it will do for me today. I sat on the fence for five years waiting for the next big thing before deciding to get a Mac after the '90s fiascos... and for what? I could have had a Mac that much earlier! Technology changes and must be replaced; it's a fact of life. While Apple has stumbled a few times in speccing new hardware, all in all they are pretty good, and the iPad, for me, is more than compelling enough to get without getting overly wrapped up in what is inside. Watching hands-on video of people using it told me all I need to know about the CPU: it's fast!
post #153 of 166
One has to wonder about the A4 designation. Sounds sexy, like the Audi A4? Were there A1, A2, and A3 processors we never saw? Does the 4 designate four cores? Alphabet soup time!

The Cortex-A8 tops out at 1GHz but has better power characteristics than even a single-core Cortex-A9, and adding cores only increases consumption. Jobs is on record as saying that the processor uses only a trickle of power, so I'm betting on a soft-macro A8, with PA Semi doing intense work on chip layout to get power consumption close to hard-macro implementation values. If they did choose the A9 in a dual- or quad-core configuration, my assumption is they would dynamically manage the clock frequency, or even the number of active cores (if that is possible), to eke out lower power consumption. This is the kind of design skill that would differentiate PA Semi from just another SoC job shop bolting together IP.
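The dynamic clock management speculated about above can be sketched as a simple on-demand governor. This is a toy illustration only, not Apple's implementation; the operating points, thresholds, and function names are all made up:

```python
# Toy sketch of dynamic frequency scaling (DVFS). Not Apple's design;
# the operating points and the 0.8 / 0.7 thresholds are invented.
FREQS_MHZ = [250, 500, 750, 1000]  # hypothetical operating points, ascending


def pick_frequency(utilization, current_mhz):
    """Return the lowest clock that should cover the current load.

    Jumps straight to the top speed when nearly saturated, otherwise
    picks the slowest frequency that keeps utilization around 70%.
    """
    if utilization > 0.8:                      # nearly saturated: go to max
        return FREQS_MHZ[-1]
    target = utilization * current_mhz / 0.7   # MHz needed for ~70% load
    for f in FREQS_MHZ:                        # lowest frequency covering it
        if f >= target:
            return f
    return FREQS_MHZ[-1]
```

A real governor would also scale voltage along with frequency; since dynamic power grows roughly with the square of voltage, dropping the clock pays off disproportionately well.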

The iPad is carefully positioned not to cannibalize the MacBook market. It uses an ARM processor to give immediate access to 140K applications and jumpstart the market where other tablet computers have completely bombed. Pages, Numbers, and Keynote for $10 each - wow! I expect they'll be less functional than the desktop versions, but this is aimed at dominating the tablet market out of the box.

I think Apple's direction is clear: Intel for laptops and desktops, ARM for embedded. Next up, I am sure, will be an A4-based Apple TV, and then that market will take off, with 140K apps running on Apple TV and no doubt new and exciting content. It's only a "hobby" because their resources are tied up pushing out the first iPad.
post #154 of 166
Quote:
Originally Posted by DocNo42 View Post

So are you agreeing with him? Because I think he's nailed it - a full color screen that supports video is far more important than a week long battery life.

That the A4 only uses a trickle of power? That hardly anyone will use the iPad for more than 10 hours straight?

The first question is obvious. Basically, every ARM system uses a trickle of power. I'm waiting to see if the iPad uses a smaller trickle than contemporary systems. If the A4 uses less power than a Snapdragon, a Tegra, an OMAP, or whatever Samsung offers while providing similar performance, then the PA Semi purchase, or rather building the A4 in-house, is a big win.
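The "big win" test here is performance per watt rather than raw performance. A trivial worked example; every number below is invented for illustration and is not a real benchmark of any chip named in this thread:

```python
# Illustrative only: all figures are made up, not measurements.
def perf_per_watt(dmips, watts):
    """Work done per unit of power; higher is better."""
    return dmips / watts


# Hypothetical chip B is faster outright, but chip A does more work per
# watt, making it the better part for a battery-powered device.
chip_a = perf_per_watt(2000, 0.5)   # 4000 DMIPS/W
chip_b = perf_per_watt(2100, 0.75)  # 2800 DMIPS/W
```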

For the second question, Jobs is making a converged device with book reading as one of its features. Since reading isn't the primary feature, the tradeoff is to have less reading time in favor of the other stuff (web browsing, apps, video/music, communications). The other features of the iPad pretty much mean iPad users will recharge daily, or maybe every two, three, or four days. The fact that reading time is less than a Kindle's doesn't matter much when comparing the two; the usage pattern of a Kindle user and an iPad user is going to be different.

Quote:
And I agree here too. Even the mobile CPU's have gotten to the point where there is more than enough umph in them to do what is needed - the endless chase for performance just for the sake of performance doesn't make sense on an ultra mobile device like the iPad.

We're not there yet for mobile. We're likely there for office automation and web use on desktops and laptops, but for mobiles, not yet. I don't think we will be until mobiles have 2 to 3 GHz dual cores, with GPUs at the same level. Memory performance, both main memory and storage, is so abysmal that I don't think it will ever be fast enough.

However, that's really beside the point. There are many good reasons to wait: increased CPU/GPU, memory, and storage; bugs (hardware and software); content; radio tech (WiMAX, LTE); a camera; and just plain design evolution and maturation.
post #155 of 166
Quote:
Originally Posted by Shrike View Post

They don't need to have a Cortex-A8 flag in the compiler as an easy give away.

Sure, they don't. It's a neutral -march for ARMv7, which targets the architecture as a whole rather than a specific core like the Cortex-A8.

We mean Apple no harm.

People are lovers, basically. -- Engadget livebloggers at the iPad mini event.

post #156 of 166
Quote:
Originally Posted by DocNo42 View Post

But what is there to say that the CPU alone will have any bearing on his ability to "keep longer"?

Have you used Epocrates on a 3G and a 3GS iPhone?

When I had a 3G it was almost unusable. On a 3GS it works pretty well, but with each update it gets a little more sluggish to use.
post #157 of 166
Quote:
Originally Posted by DocNo42 View Post

Another thing I have noticed is what people really mean when they say things like "keep longer" is "never have to buy another one again but run the latest software" which just isn't a reasonable expectation.

Nice straw man. Try again.
post #158 of 166
Quote:
Originally Posted by backtomac View Post

Have you used Epocrates on a 3G and a 3GS iPhone?

When I had a 3G it was almost unusable. On a 3GS it works pretty well, but with each update it gets a little more sluggish to use.

We're back to the "wait forever" argument. When telling someone buying a computer to buy what they will want a year from now rather than what they would want now, so that their purchase will become obsolete a bit more slowly, I'm assuming that they're likely to use programs that are resource hogs to begin with.

I think we'll get less of that here. Besides, at what point does the purchase of a new computing product make sense? It makes sense when it does what you want. I'll be enjoying my new iPad for a couple of years at least. Then, if I really have a program or two that brings it to its knees, I'll get a new one. The old one will still work fine for everything else.

But most people will be able to use this for years without any problem. As long as you aren't really upset about not having a built-in camera, and possibly a compass (which we don't know for certain that it lacks), there won't be anything that kills the device next year. In two years, maybe, for some. In three years, it's more likely. But are we to tell people to wait three years? I hope not!
post #159 of 166
Quote:
Originally Posted by melgross View Post

We're back to the "wait forever" argument. When telling someone buying a computer to buy what they will want a year from now rather than what they would want now, so that their purchase will become obsolete a bit more slowly, I'm assuming that they're likely to use programs that are resource hogs to begin with.

This is untrue, or folks wouldn't care that the MBP hasn't been updated yet.

The reality is that there are indeed good times and bad times to buy computers.

Now is not a good time to buy a Core2Duo machine.
Now is not a good time to buy a Cortex A8 machine.

Now is a good time to buy a Core i5/Core i7 machine (soon, if you're getting a MBP).
Soon will be a good time to buy a Cortex A9 machine.

So it's never been "wait forever" but wait until you're not at the end of a product cycle.

If you believe that rev A machines are buggier, then the optimal time is about six months into a new generation: you maximize longevity while minimizing early growing pains and early-adopter price penalties.
post #160 of 166
Quote:
Originally Posted by vinea View Post

This is untrue, or folks wouldn't care that the MBP hasn't been updated yet.

The reality is that there are indeed good times and bad times to buy computers.

Now is not a good time to buy a Core2Duo machine.
Now is not a good time to buy a Cortex A8 machine.

Now is a good time to buy a Core i5/Core i7 machine (soon, if you're getting a MBP).
Soon will be a good time to buy a Cortex A9 machine.

So it's never been "wait forever" but wait until you're not at the end of a product cycle.

If you believe that rev A machines are buggier, then the optimal time is about six months into a new generation: you maximize longevity while minimizing early growing pains and early-adopter price penalties.

I already made the point about computers, which was that they can be running programs that tax them. Then, and only then, does it matter. If, like most people, a buyer will not be doing anything taxing, then they really don't have to wait for the newest model. Of course, if the new model is just a short time away, then it does pay to wait. But not an entire year! That's what you suggested if the CPU here wasn't what you thought it SHOULD be.

I'm saying that for this type of device, if it works well now, then it will work well a year from now, and so buying it now is fine.