Apple CEO hints at no ARM-based MacBook Air as iPad to "soon satisfy" that niche


Comments

  • Reply 61 of 73
    zunx Posts: 620, member
    Quote:
    Originally Posted by SolipsismX View Post


    So is this brilliant, well thought out idea Mac OS X on a pocketable touchscreen or Mac OS X with a pocketable keyboard?



    Whatever. The important thing in this kind of device is not the form factor but, first and foremost, the weight, and then the size. The lighter and smaller, the better. The full Mac in your pocket or purse. Always.
  • Reply 62 of 73
    Dick Applebaum
    Quote:
    Originally Posted by SSquirrel View Post


    Yes, but again, a current-gen discrete desktop GPU is blisteringly fast and can pump way more data than a current-gen top-end ARM GPU. So even if the CPUs evened out, the desktop would still have the advantage in GPU compute capability.



    Agree...



    But within the mobile form factor, specifically tablets and low-end laptops, how does the Atom CPU/GPU stack up against the A5 CPU/GPU?
  • Reply 63 of 73
    Macnewsjunkie
    Quote:
    Originally Posted by poke View Post


    I think Apple probably has more than one plan in place. One would involve converging iOS and OS X and perhaps moving Macs to ARM. The other would involve the iPad replacing the Mac. It's a question of where the market goes. Either way I think that 5 years from now they'll only have one platform.



    One of the really cool takeaways from the Jobs biography was how secrecy allowed Jobs, or the current CEO, to make the platform decision at the last minute with all the dominoes lined up for both outcomes. It is a decision made based only on what is best for the company's future rather than its present. Both the PowerPC and Intel Mac builds were made by independent development teams; if one failed, the other could take over. At the last minute, after sleeping on the decision, Steve chose the Intel chip. Doing both insures against intellectual theft and empire-building by the drones in different departments. The fact that it makes for awesome theater and marketing is just icing on the cake. No wonder Steve always seemed so excited in the moment: the decision was just made, and the future is right now, during the keynote.



    So yeah, I would say that this is now SOP at Apple. Let the press and the internet guess; Apple doesn't even know yet, and any employee who thinks he can get away with telling reporters what the next great thing is will be in for a surprise.
  • Reply 64 of 73
    ssquirrel Posts: 1,196, member
    Quote:
    Originally Posted by Dick Applebaum View Post


    Agree...



    But within the mobile form factor, specifically tablets and low-end laptops, how does the Atom CPU/GPU stack up against the A5 CPU/GPU?



    The Atom certainly draws more power. I haven't kept up with Atom developments as much, but I know the Atom processors that were out a year ago were still in-order designs. I've seen articles about the potential for stacking huge numbers of Atom processors and using them for supercomputers and such, but that was a power-saving measure, IIRC, as you really did need several of them to compete with a single top-end processor.
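
    As a rough illustration of what out-of-order execution buys, and what an in-order core gives up: in the toy C sketch below, each loop iteration does a load that will usually miss cache plus some arithmetic that does not depend on the load. An out-of-order core can run the arithmetic "under" the miss, while a classic in-order core stalls until the load returns. This is a toy illustration only, not a rigorous benchmark.

    ```c
    /* Toy sketch: each iteration does a likely-cache-missing load plus
       independent arithmetic. An out-of-order core overlaps the
       arithmetic with the miss; a classic in-order core must stall.
       Illustrative only, not a rigorous benchmark. */
    #include <stdio.h>
    #include <stdlib.h>

    #define SIZE (32 * 1024 * 1024)   /* much bigger than any cache */

    int main(void) {
        int *big = malloc(SIZE * sizeof(int));
        if (!big) return 1;
        for (long i = 0; i < SIZE; i++) big[i] = (int)i;

        long sum = 0, busy = 0;
        unsigned seed = 12345;
        for (long i = 0; i < 10000000; i++) {
            seed = seed * 1103515245u + 12345u;  /* cheap PRNG: random index */
            sum += big[seed % SIZE];             /* likely cache miss */
            busy += i * 7;                       /* independent work an OoO
                                                    core runs under the miss */
        }
        printf("%ld %ld\n", sum, busy);          /* keep results live */
        free(big);
        return 0;
    }
    ```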



    Here's a link w/info about the new gen of Atoms coming out.



    http://newsroom.intel.com/community/...ry-life-on-tap
  • Reply 65 of 73
    solipsismx Posts: 19,566, member
    Quote:
    Originally Posted by SSquirrel View Post


    The Atom certainly draws more power. I haven't kept up with Atom developments as much, but I know the Atom processors that were out a year ago were still in-order designs. I've seen articles about the potential for stacking huge numbers of Atom processors and using them for supercomputers and such, but that was a power-saving measure, IIRC, as you really did need several of them to compete with a single top-end processor.



    Here's a link w/info about the new gen of Atoms coming out.



    http://newsroom.intel.com/community/...ry-life-on-tap



    Atom is still in-order, though Intel has come a long way with it. Their Medfield, coming out later this year, does look to have several competitive edges over Cortex-A9, but that's the previous-gen ARM core, so it's not the best measure. We'll have to see how it stacks up in real life against Cortex-A9 and the upcoming Cortex-A15.



    That said, Intel's focus on the mobile front should not be scoffed at. They now realize that ARM will hurt their bottom line if they don't field a viable competitor quickly, and they have the resources to throw at it.
  • Reply 66 of 73
    afrodri Posts: 190, member
    Quote:
    Originally Posted by SolipsismX View Post


    That said, Intel's focus on the mobile front should not be scoffed at. They now realize that ARM will hurt their bottom line if they don't field a viable competitor quickly, and they have the resources to throw at it.



    I'd agree, and also say that Intel _has_ realized it for a while now; it's just that their attempts to break into the mobile world have been flops. Still, you are right not to count them out. They are a _huge_ company which can take a brute-force approach like none other. I remember the 90s, when it was "obvious" that the x86 architecture was doomed and RISC architectures would crush it; from a purely technical standpoint, x86 didn't stand a chance. But because Intel could shovel cash and engineering and manufacturing knowledge at the problem, x86 survived and even took over much of the high end. They still might be able to pull it off in the mobile world.



    I wouldn't be surprised if Apple has both ARM and x86 versions of MacOS and iOS internally, just like they had x86 versions of MacOS years before the PPC->x86 changeover. If Intel does come up with a decent line of low-power cores, Apple would be able to take advantage of it.
  • Reply 67 of 73
    Snowdog65
    Quote:
    Originally Posted by shompa View Post


    You know that it's impossible for x86 to be power efficient? The architecture makes the CPU too large, and a large CPU means more heat, more cost, and more energy.



    x86

    Bookmark this:

    Intel will be a niche processor in 5-8 years.



    LOL. At first glance, I thought I missed the sarcasm tag.



    But no, you are serious. There is nothing quite as entertaining as a long-winded, under-informed opinion "educating" the rest of us.



    Impossible for x86 to be power efficient? If you read actual CPU designers' opinions on x86 overhead, it is only 5-10% versus other ISAs. Intel has simply been slow to wake up and change course.



    There is nothing inherently wrong with x86 for modern smartphones or tablets, except an installed base of ARM code. It's similar to the issue ARM has on the desktop.
  • Reply 68 of 73
    tipoo Posts: 1,142, member
    Quote:
    Originally Posted by Snowdog65 View Post




    Impossible for x86 to be power efficient? If you read actual CPU designers' opinions on x86 overhead, it is only 5-10% versus other ISAs. Intel has simply been slow to wake up and change course.



    There is nothing inherently wrong with x86 for modern smartphones or tablets, except an installed base of ARM code. It's similar to the issue ARM has on the desktop.







    Exactly. He still hasn't responded to the article I posted above; Medfield looks like it will be both power-competitive and performance-competitive. And this isn't even using Intel's 22nm fabs yet. When it does, it will have quite a lead. There's Cortex-A15 coming, of course, but Intel is one tough gorilla to compete with in the long term.



    More food for thought. I know it doesn't mean success or failure, but Intel sure has a lot more to throw at R&D if things should go sour.



    http://www.wolframalpha.com/input/?i...M+Holdings+AMD
  • Reply 69 of 73
    nvidia2008 Posts: 9,262, member
    Yeah, certainly Intel fabbing for Apple seems like a great idea. But Intel has drifted somewhat from Apple since the euphoria of Macs switching to Intel.



    afrodri, quoted below, makes a good point as well. Intel has resisted being a fab-for-hire, and as such is perhaps the best fab out there. TSMC and Global Foundries have had their share of screw-ups, things like Nvidia's whacked-out post-8600GT designs notwithstanding.



    Intel is at a crossroads. They can stick to x86 and enjoy the juicy margins, since their x86 fabbing is unbeatable right now and for the foreseeable future.



    Or they can channel the huge amount of money spent "fighting" ARM into fabbing ARM. They can't depend on Apple; Apple has surely investigated and lined up all the fabbing capacity it will ever need for the next several years.



    I wonder, just as a hypothetical, whether the visits by AMD were actually also a peek into Global Foundries.



    In any case, Apple has scoured the earth; it's up to Intel now to decide where to go.



    Quote:
    Originally Posted by Dick Applebaum View Post


    I often wonder if Apple could contract Intel to fab Apple's ARM chips/packages.



    Seems like a mutually beneficial arrangement could be worked out, one that reduces dependence on Sammy.



    Quote:
    Originally Posted by afrodri View Post


    It's theoretically possible, but Intel has historically resisted being a fab-for-hire. TSMC, UMC, and Global Foundries (and a little IBM) all work in that space and have the infrastructure for being a foundry.



    Part of the reason Intel has resisted this path is the same reason that Apple killed off the clones and avoided licensing MacOS: by having vertical integration (HW & SW in Apple's case, chip design and fab in Intel's) you capture more of the high-margin activities and can differentiate yourself. If Intel became "just another foundry" they might not be able to maintain the profit margins they prefer.



    Intel actually used to make ARM cores (I think there was some IP swap from DEC or something), but got out of that to focus on x86, i.e. somewhere they could try to differentiate themselves.



  • Reply 70 of 73
    nvidia2008 Posts: 9,262, member
    Quote:
    Originally Posted by zunx View Post


    What is needed is a true, full Mac weighing just 400 to 600 g. The Mac in your pocket. Always. And I mean a true Mac with Intel x86 inside, not an ARM-based iOS device.



    Erm... but that Mac in your pocket won't be a Mac as we know it. It will be iOS. Apple is clear that below the 10" form factor it will never ship a regular keyboard-and-screen OS X.



    Quote:
    Originally Posted by Dick Applebaum View Post


    Maybe I should have phrased it like the OP: because Apple controls the platform, OS, and hardware, they can take advantage of ARM/SIMD in iDevices.



    It is doubtful that MS or Google can control this to the extent that Apple can.



    Precisely. And again, scaling up ARM, SIMD, and PowerVR is phenomenally easier than cramming x86 down into tablet/mobile land.
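
    Just to make the "ARM-SIMD" part concrete, here is a toy sketch of NEON intrinsics, the ARM counterpart of SSE on x86. It assumes an ARM toolchain with NEON enabled (e.g. GCC with -mfpu=neon); the function names are the standard arm_neon.h intrinsics.

    ```c
    /* Toy NEON sketch: add two float arrays four lanes at a time.
       Requires an ARM toolchain with NEON enabled (e.g. -mfpu=neon). */
    #include <arm_neon.h>

    void add_arrays(const float *a, const float *b, float *out, int n) {
        int i = 0;
        for (; i + 4 <= n; i += 4) {
            float32x4_t va = vld1q_f32(a + i);       /* load 4 floats */
            float32x4_t vb = vld1q_f32(b + i);
            vst1q_f32(out + i, vaddq_f32(va, vb));   /* 4 adds in one op */
        }
        for (; i < n; i++)                           /* scalar tail */
            out[i] = a[i] + b[i];
    }
    ```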



    Quote:
    Originally Posted by SSquirrel View Post


    Yes, but again, a current-gen discrete desktop GPU is blisteringly fast and can pump way more data than a current-gen top-end ARM GPU. So even if the CPUs evened out, the desktop would still have the advantage in GPU compute capability.



    Yes and no. Those blisteringly fast GPUs burn anywhere from 50W to well over 150W. On Windows, and even on Mac OS X, GPGPU just hasn't lived up to the promise. On Windows, even gaming with a good GPU is near impossible nowadays with the endless patches and graphics driver updates. I feel so burned sometimes that Nvidia and AMD/ATI threw away such fantastic GPU engineering by eating up more and more watts, failing to fab at lower nodes in a timely fashion, and just not sorting out the driver issues with Windows games AND GPGPU on Windows.



    Given Intel's in-CPU optimisations like H.264 encoding and so on, a modern Core i7 with the right software trounces GPGPU, on top of the advantage of *not needing* another GPU besides Intel's integrated graphics.
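
    For the curious, some of those in-CPU extras are visible from software. The dedicated H.264 encode block (Quick Sync) is exposed through drivers rather than cpuid, but a minimal sketch with GCC/Clang's <cpuid.h> shows the same "the CPU does it for you" idea for two instruction-set extensions:

    ```c
    /* Minimal sketch: probe two of Intel's in-CPU accelerators
       (instruction-set extensions) via GCC/Clang's <cpuid.h>.
       x86-only; Quick Sync itself is not visible through cpuid. */
    #include <stdio.h>
    #include <cpuid.h>

    int main(void) {
        unsigned eax, ebx, ecx, edx;
        if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            puts("cpuid leaf 1 unsupported");
            return 1;
        }
        printf("SSE4.2 (string/CRC ops): %s\n", (ecx & bit_SSE4_2) ? "yes" : "no");
        printf("AES-NI (hardware AES):   %s\n", (ecx & bit_AES) ? "yes" : "no");
        return 0;
    }
    ```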



    Quote:
    Originally Posted by Macnewsjunkie View Post


    One of the really cool takeaways from the Jobs biography was how secrecy allowed Jobs, or the current CEO, to make the platform decision at the last minute with all the dominoes lined up for both outcomes. It is a decision made based only on what is best for the company's future rather than its present. Both the PowerPC and Intel Mac builds were made by independent development teams; if one failed, the other could take over. At the last minute, after sleeping on the decision, Steve chose the Intel chip. Doing both insures against intellectual theft and empire-building by the drones in different departments. The fact that it makes for awesome theater and marketing is just icing on the cake. No wonder Steve always seemed so excited in the moment: the decision was just made, and the future is right now, during the keynote.



    So yeah, I would say that this is now SOP at Apple. Let the press and the internet guess; Apple doesn't even know yet, and any employee who thinks he can get away with telling reporters what the next great thing is will be in for a surprise.



    Yeah, it is awesome, isn't it? In the best traditions of the American "skunkworks". Apple sure gives Area 51 a run for its money.



    Quote:
    Originally Posted by Snowdog65 View Post


    LOL. At first glance, I thought I missed the sarcasm tag.



    But no, you are serious. There is nothing quite as entertaining as a long-winded, under-informed opinion "educating" the rest of us.



    Impossible for x86 to be power efficient? If you read actual CPU designers' opinions on x86 overhead, it is only 5-10% versus other ISAs. Intel has simply been slow to wake up and change course.



    There is nothing inherently wrong with x86 for modern smartphones or tablets, except an installed base of ARM code. It's similar to the issue ARM has on the desktop.



    Perhaps x86 has no major theoretical disadvantages. But Intel has a tough job ahead of it getting down to ARM-level power draw without the x86 CPU becoming worthless.



    Quote:
    Originally Posted by tipoo View Post


    Exactly. He still hasn't responded to the article I posted above; Medfield looks like it will be both power-competitive and performance-competitive. And this isn't even using Intel's 22nm fabs yet. When it does, it will have quite a lead. There's Cortex-A15 coming, of course, but Intel is one tough gorilla to compete with in the long term.



    More food for thought. I know it doesn't mean success or failure, but Intel sure has a lot more to throw at R&D if things should go sour.



    http://www.wolframalpha.com/input/?i...M+Holdings+AMD



    Fair enough, but I'm not feeling Intel on this; the weight behind ARM is significant, with Apple, Google, and Microsoft all on the bandwagon, among others.



    This is a battle Intel can "afford" to lose in the next five years, but as we know in tech, sometimes losing the battle means you have nothing much left to fight the war.



    Again, the scenario of an iPad 4 that delivers a Core i5 "experience" (due to optimisation, etc.) and DX10-quality 1680x1050 graphics, all playable on an HDTV, presents a tantalising possibility: you wouldn't want to touch your laptop except for intense content creation. For business and content consumption, x86 in five years is going to be total overkill. Yeah, Windows 8, Office 2013, or whatever will keep driving x86 forward, but the road is getting narrower and the cliffs closer.



    Quote:
    Originally Posted by afrodri View Post


    I'd agree, and also say that Intel _has_ realized it for a while now; it's just that their attempts to break into the mobile world have been flops. Still, you are right not to count them out. They are a _huge_ company which can take a brute-force approach like none other. I remember the 90s, when it was "obvious" that the x86 architecture was doomed and RISC architectures would crush it; from a purely technical standpoint, x86 didn't stand a chance. But because Intel could shovel cash and engineering and manufacturing knowledge at the problem, x86 survived and even took over much of the high end. They still might be able to pull it off in the mobile world.



    I wouldn't be surprised if Apple has both ARM and x86 versions of MacOS and iOS internally, just like they had x86 versions of MacOS years before the PPC->x86 changeover. If Intel does come up with a decent line of low-power cores, Apple would be able to take advantage of it.



    It's true, PPC RISC was definitely awesome, but Intel got its act together early last decade. However, they were lucky in some sense that the Pentium III line proved to be the way to go amongst the kludge that was the Pentium 1, 2, and the ill-fated Pentium 4. Also, Motorola and IBM both dropped the ball; somehow it seems very strange that the best they could do was a RISC processor eating well over 100W to compete with the stuff Intel (and don't forget AMD) was putting out with ease. In fact, before the fiasco of the G5, Motorola was already struggling badly with the G4.



    So in some sense, PPC RISC and x86 were not that far apart, trading blows for a while until Moto and IBM screwed up, and Intel rallied the troops.



    But if you're looking at ARM vs x86, the "gap" in thermals and no doubt performance is much bigger. Again the argument for ARM is that it has immense scope for scaling up performance and cores whereas x86 is facing the reverse challenge.



    Intel x86 might be able to pull it off in the mobile world, or at least tablet world, but they have a bit of an uphill battle. ARM and PowerVR are on the straight and just have to keep on keeping on.
  • Reply 71 of 73
    Quote:
    Originally Posted by nvidia2008 View Post


    Precisely. And again, scaling up ARM, SIMD, and PowerVR is phenomenally easier than cramming x86 down into tablet/mobile land.





    Perhaps x86 has no major theoretical disadvantages. But Intel has a tough job ahead of it getting down to ARM-level power draw without the x86 CPU becoming worthless.



    I really have to wonder if it is willful blindness that keeps people thinking ARM has some kind of magic fairy dust while ignoring what is going on around them:



    http://www.anandtech.com/show/5365/i...or-smartphones



    There is Medfield, running at similar power to ARM in a smartphone while doing MUCH better in benchmarks. So exactly what is this tough job AHEAD, and does that look like a worthless CPU?



    Intel stumbled for a few years in mobile, just as they stumbled with the P4/NetBurst architecture. It takes time to turn the big ship, but when they do, watch out.



    Intel has a problem in mobile, but it is NOT the quality of the CPU anymore; it is simply the massive installed code base in the ARM ISA.
  • Reply 72 of 73
    ssquirrel Posts: 1,196, member
    Quote:
    Originally Posted by nvidia2008 View Post


    Yes and no. Those blisteringly fast GPUs burn anywhere from 50W to well over 150W. On Windows, and even on Mac OS X, GPGPU just hasn't lived up to the promise. On Windows, even gaming with a good GPU is near impossible nowadays with the endless patches and graphics driver updates. I feel so burned sometimes that Nvidia and AMD/ATI threw away such fantastic GPU engineering by eating up more and more watts, failing to fab at lower nodes in a timely fashion, and just not sorting out the driver issues with Windows games AND GPGPU on Windows.



    I was careful not to claim any kind of evenness in watt usage between desktop GPUs and mobile, but they certainly produce pretty big results. Part of the GPGPU problem is that NVIDIA and AMD each have their own version of it, so everyone isn't coding for the same thing; in fact, many people never code for it at all. It isn't like OS X, where the OS naturally takes advantage of the GPU.
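
    For what it's worth, OpenCL was meant to fix exactly that fragmentation: one vendor-neutral host API across NVIDIA, AMD, and Intel devices. Here's a minimal OpenCL 1.x sketch that just lists whatever devices the installed drivers expose (link with -framework OpenCL on OS X, -lOpenCL elsewhere):

    ```c
    /* Minimal OpenCL 1.x sketch: enumerate the GPU/CPU devices the
       installed vendor drivers expose, through one vendor-neutral API. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
            puts("no OpenCL platforms found");
            return 1;
        }
        for (cl_uint p = 0; p < nplat; p++) {
            cl_device_id devs[8];
            cl_uint ndev = 0;
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev)
                    != CL_SUCCESS)
                continue;
            for (cl_uint d = 0; d < ndev; d++) {
                char name[256];
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
                printf("platform %u device %u: %s\n", p, d, name);
            }
        }
        return 0;
    }
    ```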



    Your bit about gaming being near impossible with patches and the like is extremely hyperbolic. Usually, unless there is some major problem being fixed, you don't get stable graphics driver releases more often than every month or two, and game patches depend entirely on the game. MMOs have regular small updates that download in a couple of minutes and are usually client changes only. Big version-change patches come with lots of advance notice and, these days, advance download of large chunks while you are playing, so when the actual patch day comes, a smaller update hits with the last few changes.



    That is hardly a huge time-killer for most games. Now, if you are buying a game that has been out for a while and has a multitude of patches, then you have a lot of patching before that first session, but probably not many after that.
  • Reply 73 of 73
    tipoo Posts: 1,142, member
    Quote:
    Originally Posted by nvidia2008 View Post


    Fair enough, but I'm not feeling Intel on this; the weight behind ARM is significant, with Apple, Google, and Microsoft all on the bandwagon, among others.



    This is a battle Intel can "afford" to lose in the next five years, but as we know in tech, sometimes losing the battle means you have nothing much left to fight the war.



    Again, the scenario of an iPad 4 that delivers a Core i5 "experience" (due to optimisation, etc.) and DX10-quality 1680x1050 graphics, all playable on an HDTV, presents a tantalising possibility: you wouldn't want to touch your laptop except for intense content creation. For business and content consumption, x86 in five years is going to be total overkill. Yeah, Windows 8, Office 2013, or whatever will keep driving x86 forward, but the road is getting narrower and the cliffs closer.




    I'm not sure I'd say they are on the ARM bandwagon. Instead, everyone is moving towards processor-architecture agnosticism. Apps coded for Metro will be able to run on ARM and x86 natively, for example. And the vast majority of Android apps will run on Medfield with no changes; the rest will run through ARM binary compatibility, translated in real time. So Intel does not face a challenge to its instruction set; it's just a matter of power draw, cost, and performance. ARM may win the first two for a while, but like I said, Intel tends to slowly crush anyone it faces.
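
    To make "architecture agnosticism" concrete, a toy sketch: the identical C source below builds natively for either ISA, and only the compiler target changes. The cross-compiler name in the comment is illustrative, not a specific recommendation, and the macros are the standard GCC/Clang predefined ones.

    ```c
    /* Toy sketch of ISA-agnostic source: the same file builds natively
       for x86 or ARM; only the compiler target changes, e.g.:
         cc -o app portable.c                      (native x86 build)
         arm-linux-gnueabi-gcc -o app portable.c   (ARM cross-build) */
    #include <stdio.h>

    int main(void) {
    #if defined(__x86_64__) || defined(__i386__)
        const char *isa = "x86";        /* GCC/Clang predefined macros */
    #elif defined(__arm__)
        const char *isa = "ARM";
    #else
        const char *isa = "something else";
    #endif
        printf("same source, running natively on %s\n", isa);
        return 0;
    }
    ```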