Comments
Techno-geeks focus on the inputs; the rest of the world focuses on the outputs. By making the A4 a mysterious black box, Apple avoids all of the noise from geekdom and focuses people's attention on the overall product.
I think the Wii analogy is perfect. Geeks refuse to buy it because they think the hardware is obsolete. The rest of the world buys it because it's fun.
The Wii is also being purchased because it is/was cheaper than the others. I also came across statistics that Wii owners on average buy far fewer games than X360 and PS3 owners, making them more casual gamers, or novelty-purchasers... so the Wii is good news for the hardware manufacturer, but not so good news for content providers, sort of.
I just looked here:
http://www.vgchartz.com/chartsindex.php
and it seems to me that the Wii is well represented among the top selling games. Of course, many of those top-selling Wii games are from Nintendo, so maybe your point applies to 3rd party games.
The iPad has a lower barrier to entry than the $999 MacBook, and the accessories are optional. The iPad is going to replace the MacBook.
The iPad has a lower barrier to entry than the $2499 Mac Pro, and the accessories are optional. The iPad is going to replace the Mac Pro.
The iPad has a lower barrier to entry than the $2999 Xserve, and the accessories are optional. The iPad is going to replace the Xserve.
The iPad has a lower barrier to entry than the $999 Final Cut Studio suite, and the accessories are optional. The iPad is going to replace Final Cut.
Makes about as much sense.
Jon Stokes is well respected. What he says makes sense. Sometimes the articles here are a bit overblown, but the technical articles in Ars are also well respected. He's also saying that it looks as though it's true, but he's not saying it's 100% yet. I would tend to go with what he says unless shown otherwise.
He knows microprocessor design, yes, but I question his "Apple" sources. He's great with x86 designs and has a good history with good contacts there, but in all things related to Apple, he's just as much in the dark as the next guy. He's only as good as his sources are, and they are not that good in things involving Apple. Nobody has good sources except maybe John Gruber, and his are all in the marketing department.
Whether the Apple A4 is a Cortex A8 or A9 or a custom ARM design, who knows; I'm pretty sure that it and any subsequent Apple ARM SoC won't have higher performance than competing ARM designs. There's no magic circuit design that PA Semi has that others don't.
But Stokes pokes at the right thing. The number one complaint about iPhones is battery life. So I think Apple is putting a lot of effort into getting the best performance per watt possible. If it yields a true 30-day standby time for the iPad and 10 hours of video/WiFi browsing, maybe 12 hours for ebook reading, and a custom SoC is the reason, that's a big, big win. If that development work means 10 hours of WiFi/talk time and a 3-week standby time for the next iPhone, that's a big win.
I do think he is wrong on the things being removed, like the camera and the various I/O, but he's just guessing on all that stuff or just using them as examples. What I think is right on is the power/clock gating and voltage slewing. If the A4 can downclock and reduce voltage to the bare minimum for reading in iBooks, that could mean 14-16 hrs of battery life for reading. That's probably something other SoCs can't do.
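To put rough numbers on why voltage slewing matters so much more than downclocking alone: dynamic power in CMOS goes roughly as P = C * V^2 * f, so a voltage cut pays off quadratically. A back-of-envelope sketch in Python, with made-up numbers rather than anything actually known about the A4:

# CMOS dynamic power: P = C * V^2 * f.
# Every figure below is illustrative, NOT an actual A4 spec.
def dynamic_power(c_eff, volts, hertz):
    """Switched-capacitance model of CPU dynamic power, in watts."""
    return c_eff * volts ** 2 * hertz

C_EFF = 1.0e-9  # assumed effective switched capacitance (farads)

full = dynamic_power(C_EFF, 1.1, 1.0e9)     # flat out: 1 GHz at 1.1 V
ebook = dynamic_power(C_EFF, 0.85, 0.25e9)  # reading mode: 250 MHz at 0.85 V

print(f"full: {full:.2f} W  ebook: {ebook:.2f} W  ratio: {full/ebook:.1f}x")
# -> roughly a 6.7x drop in dynamic power; the voltage cut alone is worth ~1.7x

That quadratic term is why aggressive downclocking plus undervolting could plausibly stretch reading time the way described above.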
Jon has been using and writing about Apple matters for years. In fact, many of the people working at Ars are Mac users. Ars is more conservative than AI is with their technical articles, and much more technical as well. If he says that he has sources, then I believe him, going by his record. I'd be surprised if he was far off.
He might be wrong about the camera, but don't forget that we really don't know if that frame we saw a picture of was really what it was stated as being. Possibly, the iPad frame has no place for a camera. Remember the frame for the 3GS which turned out to be for a Creative device?
I agree about the battery life, and was thinking about the same numbers myself. I believe that Jobs said 10 hours for movies, if I'm not mistaken. If so, that's about the most intensive usage the hardware will be getting other than some 3D gaming. 140 hours of music without the screen on! I'd like to see any other tablet match that.
Re #1:
I am not convinced. It does not make sense to me that Apple would go to all the effort of producing its own chip for so little gain. In the end, these are still only rumors.
Additionally, we have the few limited reports that the system is really fast and responsive. Mossberg, for example, said "wicked fast." This indicates that there is a lot more going on than what the Ars Technica article proclaims.
Personally, I see no reason to doubt the quote in #2.
You can get "wicked fast" from well-executed software even if the hardware is not overly exciting.
The PC desktop platform is "blessed" with affordable and powerful hardware, and, more often than not, that gives programmers an excuse for sloppy, minimally optimised coding.
I can't help reflecting on the PS2 with its puny 4MB video RAM and 32MB system RAM, 300MHz CPU speed and 147MHz GPU speed... yet some games look good and run remarkably smoothly even by desktop standards... on hardware that would not be considered useful at all on a desktop.
Likewise, even the PS3 is, hardware-wise, way behind desktops, with graphics based on a 256MB nVidia 7800. New PC games would really be a bag of hurt on that GPU, and 256MB of system memory would simply be useless even if you had it all available for games/applications... yet new PS3 games look really great, even compared with the PC's latest and greatest.
If true, I hope the A4 does not make it into the next iPhone. With the iPad potentially running a year-old processor, I guess the prospects for multitasking in OS 4.0 are looking pretty dim.
And yet the Core Duo processors were all based on the Pentium III, not the Pentium 4, which was the newer one.
New isn't always better.
You don't have to be multicore to multitask! It just helps. Unix has been multitasking on a single processor for 40 years.
It doesn't even have to be fast. The Amiga screamed along running 16 apps in 1MB RAM on a 7MHz processor. That was graphics apps (Personal Paint and Imagine), games, spreadsheets (Filthy Lucre), an internet browser (AWeb), an e-mail client, and other apps all at the same time.
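For anyone who doubts it, here's a toy sketch of what a time-slicing scheduler does: two tasks sharing one processor and both making progress. The task names are made up, and Python's interpreter even guarantees only one of these threads runs at any instant, which is exactly the point:

import threading, time

def task(name):
    # Each task does a little work, then blocks, handing the CPU back
    # to the scheduler - the same trick Unix has used for decades.
    for step in range(3):
        print(f"{name}: step {step}")
        time.sleep(0.1)

threads = [threading.Thread(target=task, args=(n,))
           for n in ("mail", "browser")]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Output interleaves "mail" and "browser" lines: two apps, one CPU.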
Quote:
I just looked here:
http://www.vgchartz.com/chartsindex.php
and it seems to me that the Wii is well represented among the top selling games. Of course, many of those top-selling Wii games are from Nintendo, so maybe your point applies to 3rd party games.
Yes, but don't forget there are many more Wiis in homes than other consoles. Relative to the number of each console in the wild, there are fewer Wii games per console compared to other platforms.
Quote:
And yet the Core Duo processors were all based on the Pentium III, not the Pentium 4, which was the newer one.
New isn't always better.
I believe they were based on the mobile version.
Who knew at the time that it would have such legs?
Nonsense. MacBook models are not delineated by age in any way.
Nonsense. Why would they do that? Why would they close off an entire revenue stream?
Nonsense. See first point above. Where do you even get that idea from, apart from pulling it out of your a**?
Nonsense. It's a media delivery device, and that's what it does. Why would anyone outside of the idiot jailbreaking community want to hack it?
Nonsense. How many non-technical users do you think start "messing around" in terminal?
And more nonsense. You're using nonsense to explain nonsense. I would imagine (and my anecdotal theories carry as much weight as yours) iPad users are likely to already HAVE a computer.
By golly but you make a lot of sense, Mr. Knightlie!
Quote:
Jon has been using and writing about Apple matters for years. In fact, many of the people working at Ars are Mac users. Ars is more conservative than AI is with their technical articles, and much more technical as well. If he says that he has sources, then I believe him, going by his record. I'd be surprised if he was far off.
Yes, there are a lot of Mac users on staff at Ars. It doesn't mean they have any good sources for Apple dealings. Stokes is just as much in the dark as anyone else. What he has on his side is good x86 knowledge, industry sources, and a good understanding of microprocessor design. I'm pretty confident he has zero sources inside Apple proper. I think the best he's got is a contractor working on compiler and driver design. Maybe it is a developer with iPad access who has inferred things from performance results.
Notice how he is just speculating about what is missing. He's already admitted to being wrong about display output. I think all his sources told him was that the A4 is a Cortex-A8-based SoC, and even that is only indirect knowledge. That's it. I don't think anyone knows but upper management, the SoC CPU/SMC subteam, and the compiler/driver subteam.
I think we will never know, and all we will have are performance comparisons between iPhone 3GS and iPad apps.
Quote:
I agree about the battery life, and was thinking about the same numbers myself. I believe that Jobs said 10 hours for movies, if I'm not mistaken. If so, that's about the most intensive usage the hardware will be getting other than some 3D gaming. 140 hours of music without the screen on! I'd like to see any other tablet match that.
It's right on the iPad specs page: 10 hrs for video or web browsing at default display settings. The iPad has to really hit that mark.
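Those runtime claims translate straight into an average power budget. A quick sketch, assuming the 25-watt-hour pack Apple lists; the per-mode draws are implied by the arithmetic, not published figures:

PACK_WH = 25.0  # assumed pack capacity from the spec page, in watt-hours

def avg_draw_watts(claimed_hours):
    """Average system power implied by a runtime claim."""
    return PACK_WH / claimed_hours

print(f"10 h video/web -> {avg_draw_watts(10):.2f} W average")   # 2.50 W
print(f"140 h music    -> {avg_draw_watts(140):.2f} W average")  # ~0.18 W, screen off

An average of 2.5 W for a backlit display, WiFi and video decode is aggressive; ~0.18 W for music is basically everything but audio and flash powered down.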
Originally Posted by mdriftmeyer
Jobs is quoted as saying the company has invested > $1 Billion on R&D for the A4.
Whether or not that includes the $275 million acquisition, spending $725 million+ in R&D plus ARM IP licensing on a CPU/GPU IC solution for old IP makes no sense.
I don't believe it. It's just not possible. I'd like to see an explanation for that statement, which I don't remember seeing, though I'm not saying he didn't make it. But he would have to be referring to much more than work on the chip itself.
You're right. It wasn't Jobs, nor anything attributed to Apple. Nobody knows how much Apple has spent on the A4.
The $1G quote was the NYT quoting an industry analyst on the cost of designing a proprietary, from-the-ground-up CPU. That's all he was talking about, in an article about how Nvidia, Qualcomm and Apple are off building their own custom SoCs. It's quite doubtful it cost any of them that much, as they all took various existing designs and did custom integration work.
One can make a guess. It is fairly simple. Take 50 people, including managers and engineers, multiply by 200k and by 2 years, and you get 20m. Double it for other procurements and testing, and pad it a little more for other stuff. It probably cost Apple on the order of 50 million if they devoted 50 engineers plus the various related testing, prototyping, licensing and "institutional" costs. For $1G, you're talking 500 people over 2 years and some capital expenditures. Apple probably didn't even spend that much on iPad development as a whole. They probably didn't even spend half that.
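Spelling that estimate out, with every number a guess rather than anything Apple has disclosed:

# Back-of-envelope A4 program cost. All inputs are assumptions.
engineers = 50           # managers + engineers on the SoC team
cost_per_head = 200_000  # fully loaded $/person/year
years = 2

payroll = engineers * cost_per_head * years  # $20M
# Double it for procurement and testing, pad for prototyping,
# licensing and "institutional" costs.
total = payroll * 2 + 10_000_000

print(f"payroll ~ ${payroll/1e6:.0f}M, all-in ~ ${total/1e6:.0f}M")  # ~$20M, ~$50M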
Quote:
I think it's safe to say that the average 2010 uninformed consumer who has an HDTV and picks up a Wii will be somewhat disappointed when they see the analog resolution.
The number of people today who have HDTVs but no HD cable or OTA HD antenna blows your little theory right out of the water. They don't care right now for the primary purpose of their "HD" TV, so why are they going to suddenly care for a game console that is a casual, secondary use?
To a geek such as yourself, watching SD on an HDTV is a distraction in and of itself, and unfathomable. To non-geeks, the Wii is fun and it's approachable in a way that the Xbox and PS3 aren't. Extra resolution isn't going to change that part of the experience. They couldn't care less that it's 480i instead of 1080p - it's fun! Wanna know why the Wii is still popular? It's about the end user experience, not the specs!
It is notable that Nintendo, like Apple, is doing just fine dissing people such as yourself who look at a few missing checkboxes and proclaim "it sucks". From the beating self-proclaimed Internet experts took with the iPod, personified in the now infamous Slashdot iPod comments, you'd think this kind of inane commentary would be a little more reserved. But here we are, all over again...
Quote:
Why is this bad? Because the iPad is a closed device and doesn't encourage as much immediate hacking and interest as an open device does.
On a Mac, anyone who has the interest can fire up Terminal, learn a few Unix commands and be messing around. It encourages that because it's an open device.
People who want to use Terminal won't be buying an iPad - they will continue to buy the MacBook or MacBook Pro just as they would today - unless they are stupid, and if you are that stupid you don't have any business in Terminal or "hacking" around anyway.
I really don't understand the hysterics that the iPad is going to kill "open" computing. Even if Apple killed Mac OS X outright tomorrow (and there is absolutely no indication whatsoever that they intend to do so, either now or at any time in the future) you can still grab Windows, Linux or whatever OS you want to run on your generic Intel hardware.
So the iPad isn't for you - great. I find it fascinating that people are so threatened by something different that they feel they have to not only not buy it, but campaign against it. Guess what: there are more people who want to do basic tasks but don't want a computer to "hack around" on than those who do. That's what Apple is targeting - a market that is not currently being served by "traditional" computing. The majority of people couldn't care less about opening Terminal or learning Unix - nor should they have to! This isn't a "one size fits all" world - you need to get over yourself and realize there is a VERY large group of people for whom the iPad is ideal.
Heck, as an IT geek who is comfortable hacking around, I still find the concept of an appliance like the iPad, one that will just work with dramatically reduced administrative overhead, extremely appealing for casual computing. It's not enough to replace my full-blown machines, but it can handle the majority of my casual computing - rather than being a hindrance, the "closed" nature of the iPad is a good thing - it makes the entire environment, and hence my experience with it, far smoother than a Mac, Windows (or god forbid Linux!) machine can ever hope to be. It's not the perfect solution for everything, but it doesn't have to be! It wouldn't be nearly as appealing if it were trying to be all things to all people...
Quote:
From the beating self-proclaimed Internet experts took with the iPod, personified in the now infamous Slashdot iPod comments, you'd think this kind of inane commentary would be a little more reserved. But here we are, all over again...
Great link. Lots of funny posts which remind me of certain myopic posters here.
This one caught my eye...
Quote:
Apple is a normal company. Why does the public constantly expect them to do the impossible?
Quote:
I think over history, Apple has shown with some regularity that they can pull "the impossible" out of their hat. Now with Jobs and NeXT genes on board, that sense is even more intense.
Whether Apple's products are brilliant successes or bizarrely interesting failures, nobody can deny that what they're doing as a rule seems more interesting than what Dell/Gateway/Microsoft et al. are ever doing. And occasionally (Macintosh, NeXT, Newton, iMac) Apple/NeXT have done things that were completely mind-blowing and heretofore impossible.
I'm speaking as a longtime PC owner and Linux user, not a Mac owner (though I do love my Newton) -- I have a healthy respect for the real innovation Apple has brought to the industry (compare to Microsoft's "innovation"...) and I have trouble understanding why Slashdot users are such haters when it comes to Apple and Steve Jobs.
That was in 2001, and they still had that pedigree. Times change but people don't... it seems.
Quote:
LOL, so you think there is going to be true multitasking when:
- the iPad will be running the same OS as the iPhone so there will be no significant differences between the two devices
1) The iPhone multitasks now. Not sure what this "true" multitasking is you speak of. Is "true" multitasking like Monster Cable "pure" HDMI?
2) Selective multitasking for third party apps will probably come eventually - when Apple figures out how to make it clean, obvious and manageable in a way where a non-geek can understand clearly what is going on, they will offer it. Until then, like cut/copy/paste, it won't be shipping.
About the only thing I want or need multitasking for is Pandora. Police Scanner can play in the background by launching a stream in Safari - that works on the iPhone today! Perhaps it's just a matter of Pandora thinking different? Anyway, "multitasking" isn't a feature that most people are looking for in and of itself. They want to get stuff done - and the current iPhone OS does that and provides a pretty good facsimile of multitasking - I can switch programs and tasks fairly quickly. Well-written apps that leverage notification services and preserve state as you exit/enter them work pretty darn well. Just look at anyone with an Android phone that runs many programs at once - the battery life goes to crap. Well, what about that! That can't be - Apple is just being asshole control freaks and their whole line about battery life is just a ploy to run your life.
"Multitasking" didn't affect the iPhone significantly and I sincerely doubt it will make a meaningful impact with the iPad. Sure, some geeks won't buy it - but if it wasn't multitasking there would be some other deficiency or list of "deficiencies" - in other words, they were never going to, nor do they intend to, buy an iPhone or iPad anyway, so their protesting is meaningless.
Quote:
Great link. Lots of funny posts which remind me of certain myopic posters here.
I'm surprised they left that article active. Then again, it's such a part of the Internet that it would probably raise more attention if they took it down.
Quote:
That was in 2001, and they still had that pedigree. Times change but people don't... it seems.
"Those who cannot remember the past are condemned to repeat it."
It's not just a pithy catchphrase.
Quote:
...Then Apple would just need to tweak it a bit, slap on a logo, and voila! A4! Then they'll stick it in the iPad and let the early adopters troubleshoot and test it, just like they did with S***Leopard, Aperture 3, iMac 27, AirPort Extreme, Time Capsule, MacBook Pro batteries, MacBook hard drives, etc. etc.
You're quite the masochist to stick with a company that causes you so much angst...
Personally, if it contains a reference A8 design I'll likely skip the 1st gen. In a year's time, A9 solutions will make pretty much any 1GHz A8 solution look anemic.
This right here is why Apple isn't saying much beyond "A4" for the iPad's technical specs, and I for one am glad that with this device they are not hopping on that treadmill. At the end of the day, what the ^@!^# difference does it make what the guts are? Does it run the software quickly (and from every demo I saw recorded at the hands-on after the announcement, that's a resounding yes!)? Does it get good battery life? Does it do everything I want and need it to do?
If yes, I buy. I'm not going to not buy because they could have put a 900MHz part in and instead they put in an 800MHz part... honestly, other than some geek compulsion to focus on minutiae, at the end of the day what does it matter? To be honest, if I were going to obsess about a technical detail on the iPad it would be the amount of system memory, not the CPU. Stuff like this sure as heck won't matter to my father. He's shocked me; he's very interested in the iPad, and if he gets one, when the next iPad comes out his won't "poof" into the air or otherwise be diminished. If Apple releases a new OS that kills it by loading it down, that would be a fair concern - my iPhone 3G did slow down with later releases more than I would have liked, but then again I'm pretty impatient. People like my father probably won't even notice.
If you want something you can pick apart, get gaga over the guts, or get root and terminal access, go buy one of the many existing solutions that will do all that right now. This device isn't targeted at you!