
Apple CEO hints at no ARM-based MacBook Air as iPad to "soon satisfy" that niche - Page 2

post #41 of 74
Quote:
Originally Posted by poke View Post

I think Apple probably has more than one plan in place. One would involve converging iOS and OS X and perhaps moving Macs to ARM. The other would involve the iPad replacing the Mac. It's a question of where the market goes. Either way I think that 5 years from now they'll only have one platform.

I seriously doubt that either of those scenarios will pan out. Given the recent sales figures of both the Mac and iOS, I can't see Apple consolidating those divisions.

People need both mobile devices and full computers, unless their lives are so simple that they can get by with just a touch-based mobile device.

Life is too short to drink bad coffee.

post #42 of 74
What is needed is a true, full Mac weighing just 400 to 600 g. The Mac in your pocket. Always. And I mean a true Mac with Intel x86 inside, not an ARM-based iOS device.
post #43 of 74
Quote:
Originally Posted by zunx View Post

What is needed is a true, full Mac weighing just 400 to 600 g. The Mac in your pocket. Always. And I mean a true Mac with Intel x86 inside, not an ARM-based iOS device.

So is this brilliant, well-thought-out idea Mac OS X on a pocketable touchscreen, or Mac OS X with a pocketable keyboard?

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

post #44 of 74
Quote:
Originally Posted by shompa View Post

You know that it's nearly impossible for x86 to be power efficient? The architecture makes the CPU too large, and a large CPU means more heat, more cost, and more energy.

x86 is many things, but not the fastest and not the most power efficient. x86 has always "won" because Windows supports it and the processors are cheap compared to RISC processors. Remember that RISC went 64-bit in 1995, while 64-bit only became widespread in the x86 world with Windows 7. And that is with 64-bit extensions, not a true/pure 64-bit architecture.

The funny thing about x86 64-bit is that programs run slower with it. Programs that don't use more than 4 GB of memory are on average 2% slower in 64-bit mode than in 32-bit. That has never happened before: the move from 16-bit to 32-bit brought a dramatic performance increase, and on real 64-bit processors so did the move from 32-bit to 64-bit.

Intel has the greatest manufacturing process in the world; the competition is two years behind. Even with that, Intel can't manufacture a decent low-powered SoC that can be used in mobile devices.

The x86 phones being introduced in 3-6 months have a single core at 1.3 GHz that turbos to 1.6 GHz. In single-threaded programs it's faster than the Cortex-A9. But in six months Intel won't be competing with the Cortex-A9; it will have to compete with the Cortex-A15.

At the same power envelope as the Intel SoC at 1.3 GHz you get a quad-core 1.8-2 GHz Cortex-A15. We are talking four times the performance.

Intel needs to uncripple its SoC and produce it on its latest process node for it to have a chance. But Intel won't do that. It doesn't want to compete with ARM's six-cents-per-core strategy; Intel wants to sell expensive processors. A good, cheap Intel chip would just hurt Intel's bottom line.

Apple has a unique edge against Intel/x86/Android.
Since Apple controls its platform and makes its own SoC, it can design the SoC to its specification. That lets Apple rely on NEON SIMD acceleration, something Intel/Android can never do since they don't control the platform. Every Apple product from the A4 on has SIMD. An educated guess is that iOS 6 will be A4-and-later only, meaning everything can be accelerated by SIMD since every device will support it. Google surprised me with its answer: since not every Android device has SIMD, Android 4 accelerates things with the GPU instead. Apple can use both SIMD and the GPU, something Apple has lots of experience with, since OS X has been accelerated using AltiVec and Quartz Extreme since 2002.

Intel has never been good at SIMD extensions. RISC has always led the way, probably because most RISC vendors control their OS and hardware and can accelerate things end to end.

Bookmark this:
Intel will be a niche processor in 5-8 years. They will have the "performance" market, gaming, and "fast enough" servers.
Low end: ARM
Midrange: ARM
High end: Intel/SPARC/POWER and all the other exotic processors.

Intel will end up OEM-manufacturing ARM chips in its great fabs.

People seem to believe that x86/Intel has always been, and always will be, the greatest. Fifteen years ago it looked like this:
Low end: diskless clients to X servers; home computers on x86
Midrange: Intel
High end: SPARC/IBM/Alpha (Intel had a 10% server market share in 2000)
Things change fast in the IT world.

And last, to put things in perspective:
ARM licensees ship about 20 billion cores each year, roughly as many as Intel has shipped in its entire lifetime. More high-end ARM cores are licensed than AMD has sold in its lifetime. The average iPad costs more than the average PC.

Phones and tablets will replace almost all desktop computers within a couple of years. (Many users will have faster iPads than their PCs, starting with the A6.)

Stunning post.

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

post #45 of 74
I am not an expert on microprocessors, but the discussion reminds me of the quote attributed to Larry Ellison about the IBM PC, something like: "IBM must have been mad to design something for which one third of the profit goes to Microsoft, and another third to Intel." I am pretty sure Apple will also make a move that puts the microprocessor profit in its own pocket, as it did with its mobile offerings.
post #46 of 74
Quote:
Originally Posted by PBRSTREETG View Post

I think that's good for you, but I write a lot of code, and while I could do it on an iPad, I won't. It's one of those things you can do but really shouldn't, because it makes no sense. Writing hundreds of lines of code and writing research papers is what I do, and it makes no sense to do it on an iPad no matter how big it is.

Just buy an Apple wireless keyboard if you do a lot of typing.
post #47 of 74
An iPad 3 with an attachable keyboard like the Transformer Prime's would be great. It's nice to know they see it as a scaling up of the iPad and not a dumbing down of Macs, too. ARM chips are getting more and more capable but are still orders of magnitude behind the chips in MacBook Pros and other laptops. So no ARM-based MacBook Air, not for a few years at least, but an iPad with a keyboard that can do basic laptop stuff decently.
post #48 of 74
Quote:
Originally Posted by shompa View Post

Bookmark this:
Intel will be a niche processor in 5-8 years. They will have the "performance" market, gaming, and "fast enough" servers.
Low end: ARM
Midrange: ARM
High end: Intel/SPARC/POWER and all the other exotic processors.



You realize Intel is moving into ARM's target space soon too, right?

http://www.anandtech.com/show/5365/i...or-smartphones

They also have far more R&D capability than any ARM manufacturer, as well as the most advanced fabs by at least a year's margin. I believe they will face tougher challenges from ARM than they did from AMD, but they certainly aren't becoming niche in 5-8 years. Please bookmark this as well.
post #49 of 74
Quote:
Originally Posted by Tallest Skil View Post

Just like they lost the phone market because they kept a 3.5" screen. Or how they lost the PMP market by not supporting FLAC or WMA.

Wait

It is funny: just five years ago Apple did not have even ONE device in the phone market, and now they are kicking butt with only a couple of iPhone models against many companies selling many other smartphones! Actually, Apple as a company is simply dominating...

Like it or not, Apple is an incredible marketing machine with great products. It was not the first in the MP3 market, and now it's killing it; it was more than 20 years late to the wireless phone market and, as a single company, is killing it; and then it created a whole new category with the iPad, and it's killing that too...

Then Apple came out with the MacBook Air...

And one more thing... Just fill in the blanks. I call it innovation: not only technology but also marketing innovation.

As a marketer who has sold close to 200 million dollars in products (just a drop of water in the ocean compared with Apple), I say hail to the Apple marketing machine and to Steve Jobs's legacy. We have a lot to learn from them, and keep learning...

George
post #50 of 74
It doesn't matter, at all, how whatever chipset Apple puts into an iPad compares to whatever is the latest and greatest from Intel. What matters is how well that iPad runs the software that most people want to run.

Will the iPad 3 do 3D modeling or realtime simulation as well as a powerful desktop? No. Will it more than likely do most of what most people want their computer to do quickly and fluidly? Almost certainly, in that the current iPad already satisfies those conditions in most cases.

A lot of people seem to be taking Cook's remarks as indicative of new hardware or form factors, but I'm intrigued as to what it means for the next iteration of iOS.

Apple has addressed most of the nagging issues that seemed to be on most people's lists; now they can start to really flex the architecture's muscles and see what they can do about becoming the next no-compromise computing platform. That means the OS and applications.

Right now there are a bunch of apps that feel a bit "dumbed down" in order to play nice on an iPad. Limited feature set, constrained file management, etc. Up till now, the fact that I can run some of these apps at all has seemed amazing, but we're quickly moving to the point that people are going to start wanting more out of their tablets.

This is also where Apple can absolutely bury the competition. Android manufacturers and Google seem to have settled into a default "media consumption with lots of ties to social media" position for their tablets. Apple clearly has much loftier ambitions for the iPad and possible future iOS devices -- they really want to make it the next Mac. It isn't yet, but there's no reason it can't be. Google, OTOH, doesn't seem to aspire to much beyond an ad impression funnel, so there's no urgency around creating the kind of apps that can sharply demarcate the toys from the "real work" machines (ironic, I know, but it seems to be true). Just look at the Avid video editing app that just got released for iOS. I guess they might release an Android version at some point, but would anyone use it? You don't get the impression anyone is buying an Android tablet to actually do stuff, because they've already decided that "real work" is for laptops.

As for MS, who knows? I don't see the Metro environment as particularly conducive to real productivity apps, and it's looking as if Windows tablets will be more like Android tablets with a flatter UI, mostly intended as a window into your social doings. Will they ever build a Metro Office? Can they? Because you can bet Apple is working on major improvements to iOS iWork.
They spoke of the sayings and doings of their commander, the grand duke, and told stories of his kindness and irascibility.
post #51 of 74
Quote:
Originally Posted by Shaun, UK View Post

Just buy an Apple wireless keyboard if you do a lot of typing.

A keyboard is hardly the only problem with doing serious code development on an iPad. To make anything other than a toy program you need to be able to open and edit lots of different files, and HAVE A COMPILER. The most you can use an iPad for is as a text editor, from which you would have to upload the files to another machine to compile and test (or back to the iPad, if that is what you are targeting). So for code development you would be shelling out $500+ for what is essentially a 10-inch dumb terminal. Plus, most coders I know prefer to develop on two or more large (20-inch or more) screens connected to as many processors, with as much DRAM, as you can afford for parallel compilation. The iPad is not a good platform for development. Even for writing large technical documents, the 10-inch screen is a limitation.

This in no way means the iPad is bad. It is great for a lot of tasks, probably even most day-to-day use, but there is a reason we still have conventional PCs, and for many uses (code development, 3D modeling and rendering, page layout, scientific and technical computing, etc...) tablets are just not a great fit.

I would argue that this is the real strength of Apple's strategy. Other than the very high end (servers and supercomputers), they are the only company that offers a unified set of devices from your pocket to your desktop. This allows someone to develop content on a device well suited for it (i.e., lots of processing power, memory, and screen space) and then present that content on a device that is portable and tailored toward media consumption. If Apple focused only on mobile, it would be a much less compelling ecosystem.
post #52 of 74
Quote:
Originally Posted by mstone View Post

But you can't type a degree symbol, or any Greek letter like the micron symbol or theta, which we use all the time in science. You can't type an em-dash, superscript or subscript, plus/minus, or approximately-equal. For science, the iPad sucks at typing. But then again, all one really needs to do in life is post to Facebook.

If you need to type all those things, use an application that supports them, not Notes. Find a notepad application that has all of those characters, or buy a stylus and draw them in. There are a couple of ways you could do it.

Your complaint is like saying you can't make very good spreadsheets with WordPad.
post #53 of 74
Quote:
Originally Posted by Shaun, UK View Post

They won't need to transition the MBA to ARM because we'll probably see a 13" iPad next year.

Quote:
Originally Posted by Alonso Perez View Post

I don't know about 13", which seems a little big, but I sure do hope for a slightly larger iPad, about 11.8", 2048 x 1536 (double the current resolution), coupled with a good dock and a keyboard cover like the Zagg (but with full size keys like the 11" Air).

This would make it a more viable notebook replacement for more common use cases, it would make it a much better magazine and illustrated book reader (size does matter), and it would reduce eye fatigue for those older than 40.

I see the format as a complement, not a replacement. Just like today we have two sizes of Airs, there would be two iPad sizes.

By the way, if the above does not happen I would love to see an ARM based Air. The point is not to make it cheap or low end. The point would be the all-day, worry-free battery life. If you spend all day with email and basic office apps, you don't need the high performance of an Intel chip.

Have a look at this:

Jeff Han Demos 82-inch & 27-inch Multi-Touch Display

"...The calm is on the water and part of us would linger by the shore, For ships are safe in harbor, but that's not what ships are for."
- Michael Lille -
post #54 of 74
Quote:
Originally Posted by Leery View Post

If you need to type all those things, use an application that supports them, not Notes. Find a notepad application that has all of those characters, or buy a stylus and draw them in. There are a couple of ways you could do it.

Your complaint is like saying you can't make very good spreadsheets with WordPad.

Yes, and instead of taking mass transit to my office each day I could do cartwheels, handsprings, and back flips the entire way.

Seriously, thanks for your suggestions, but I need a much more robust system-wide character chooser or keyboard layout to accomplish these tasks in every application, from texting to email to posting in web forms to to-do lists.

Life is too short to drink bad coffee.

post #55 of 74
Quote:
Originally Posted by nvidia2008 View Post

Listening to the Deus Ex: HR soundtrack while reading this... I have to say you've got some good points.

The killer is this: the user experience for beginners on ARM devices is light-years ahead of x86 devices at the moment... Intermediate users are already using ARM devices for "simpler" tasks. The bottom is about to drop out of the desktop PC market, and in five years I can't see ARM not kicking x86's ass for most beginner-to-intermediate tasks (whatever that may mean to a given user).

I think Intel is too complacent to fab ARM in the next few years, though, even though if they started tomorrow you know they would be laughing all the way to the bank throughout this decade. For the next few years Intel will burn a lot of resources fighting ARM when they could be making a lot off it.

I often wonder if Apple could contract Intel to fab Apple's ARM chips/packages.

Seems like a mutually-beneficial arrangement could be worked out -- and reduce dependence on Sammy.
"...The calm is on the water and part of us would linger by the shore, For ships are safe in harbor, but that's not what ships are for."
- Michael Lille -
post #56 of 74
Quote:
Originally Posted by Dick Applebaum View Post

I often wonder if Apple could contract Intel to fab Apple's ARM chips/packages.

Seems like a mutually-beneficial arrangement could be worked out -- and reduce dependence on Sammy.

It's theoretically possible, but Intel has historically resisted being a fab-for-hire. TSMC, UMC, and GlobalFoundries (and, a little, IBM) all work in that space and have the infrastructure for being a foundry.

Part of the reason Intel has resisted this path is the same reason Apple killed off the clones and avoided licensing Mac OS: by having vertical integration (hardware and software in Apple's case, chip design and fabs in Intel's) you capture more of the high-margin activities and can differentiate yourself. If Intel became "just another foundry" they might not be able to maintain the profit margins they prefer.

Intel actually used to make ARM cores (I think there was some IP transfer from DEC or something), but got out of that to focus on x86, i.e. somewhere they could try to differentiate themselves.
post #57 of 74
Quote:
Originally Posted by shompa View Post

...

Apple has a unique edge against Intel/x86/Android.
Since Apple controls its platform and makes its own SoC, it can design the SoC to its specification. That lets Apple rely on NEON SIMD acceleration, something Intel/Android can never do since they don't control the platform. Every Apple product from the A4 on has SIMD. An educated guess is that iOS 6 will be A4-and-later only, meaning everything can be accelerated by SIMD since every device will support it. Google surprised me with its answer: since not every Android device has SIMD, Android 4 accelerates things with the GPU instead. Apple can use both SIMD and the GPU, something Apple has lots of experience with, since OS X has been accelerated using AltiVec and Quartz Extreme since 2002.

Intel has never been good at SIMD extensions. RISC has always led the way, probably because most RISC vendors control their OS and hardware and can accelerate things end to end.

...

Quote:
Originally Posted by addabox View Post

It doesn't matter, at all, how whatever chipset Apple puts into an iPad compares to whatever is the latest and greatest from Intel. What matters is how well that iPad runs the software that most people want to run.

Will the iPad 3 do 3D modeling or realtime simulation as well as a powerful desktop? No. Will it more than likely do most of what most people want their computer to do quickly and fluidly? Almost certainly, in that the current iPad already satisfies those conditions in most cases.

A lot of people seem to be taking Cook's remarks as indicative of new hardware or form factors, but I'm intrigued as to what it means for the next iteration of iOS.

Apple has addressed most of the nagging issues that seemed to be on most people's lists; now they can start to really flex the architecture's muscles and see what they can do about becoming the next no-compromise computing platform. That means the OS and applications.

Right now there are a bunch of apps that feel a bit "dumbed down" in order to play nice on an iPad. Limited feature set, constrained file management, etc. Up till now, the fact that I can run some of these apps at all has seemed amazing, but we're quickly moving to the point that people are going to start wanting more out of their tablets.

This is also where Apple can absolutely bury the competition. Android manufacturers and Google seem to have settled into a default "media consumption with lots of ties to social media" position for their tablets. Apple clearly has much loftier ambitions for the iPad and possible future iOS devices -- they really want to make it the next Mac. It isn't yet, but there's no reason it can't be. Google, OTOH, doesn't seem to aspire to much beyond an ad impression funnel, so there's no urgency around creating the kind of apps that can sharply demarcate the toys from the "real work" machines (ironic, I know, but it seems to be true). Just look at the Avid video editing app that just got released for iOS. I guess they might release an Android version at some point, but would anyone use it? You don't get the impression anyone is buying an Android tablet to actually do stuff, because they've already decided that "real work" is for laptops.

As for MS, who knows? I don't see the Metro environment as particularly conducive to real productivity apps, and it's looking as if Windows tablets will be more like Android tablets with a flatter UI, mostly intended as a window into your social doings. Will they ever build a Metro Office? Can they? Because you can bet Apple is working on major improvements to iOS iWork.


From reading this and other threads, I am starting to get the impression that we may be looking at this x86 vs. ARM question from the wrong perspective...

Many people say we must look at how ARM measures up to x86 -- running x86 apps in particular.

The earlier post referenced SIMD -- so I did a little research:

Wiki: SIMD


From the linked article:
Quote:
Advantages

An application that may take advantage of SIMD is one where the same value is being added (or subtracted) to a large number of data points, a common operation in many multimedia applications. One example would be changing the brightness of an image. Each pixel of an image consists of three values for the brightness of the red (R), green (G) and blue (B) portions of the color. To change the brightness, the R, G and B values are read from memory, a value is added (or subtracted) from them, and the resulting values are written back out to memory.
With a SIMD processor there are two improvements to this process. For one the data is understood to be in blocks, and a number of values can be loaded all at once. Instead of a series of instructions saying "get this pixel, now get the next pixel", a SIMD processor will have a single instruction that effectively says "get lots of pixels" ("lots" is a number that varies from design to design). For a variety of reasons, this can take much less time than "getting" each pixel individually, as with traditional CPU design.
Another advantage is that SIMD systems typically include only those instructions that can be applied to all of the data in one operation. In other words, if the SIMD system works by loading up eight data points at once, the add operation being applied to the data will happen to all eight values at the same time. Although the same is true for any super-scalar processor design, the level of parallelism in a SIMD system is typically much higher.

Now THAT IS interesting!

AV editing involves effects (panning/zooming/distortion, colorization, etc.), retiming, changing octave/key...

It just may be that the ARM/SIMD solution is superior for these types of apps -- and it's the x86 that doesn't measure up.
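The brightness-adjustment example from that Wikipedia excerpt is easy to sketch in C. The snippet below is just an illustration of the idea, using x86 SSE2 intrinsics (on ARM, NEON's `vqaddq_u8` from `<arm_neon.h>` plays the same role); the function names are my own, not from any shipping code:

```c
#include <emmintrin.h>  /* SSE2 intrinsics */
#include <stddef.h>
#include <stdint.h>

/* Scalar version: brighten one channel value per iteration, clamping at 255. */
void brighten_scalar(uint8_t *px, size_t n, uint8_t delta) {
    for (size_t i = 0; i < n; i++) {
        unsigned v = px[i] + delta;
        px[i] = (v > 255) ? 255 : (uint8_t)v;
    }
}

/* SIMD version: one saturating add handles 16 channel values at once. */
void brighten_simd(uint8_t *px, size_t n, uint8_t delta) {
    __m128i d = _mm_set1_epi8((char)delta);
    size_t i = 0;
    for (; i + 16 <= n; i += 16) {
        __m128i v = _mm_loadu_si128((const __m128i *)(px + i));
        _mm_storeu_si128((__m128i *)(px + i), _mm_adds_epu8(v, d));
    }
    brighten_scalar(px + i, n - i, delta);  /* leftover tail, if any */
}
```

The scalar loop touches one byte per iteration; `_mm_adds_epu8` processes 16 at once and saturates at 255 for free, which is exactly the "get lots of pixels" behavior the article describes.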


Now, take that same concept to most display operations -- things like dragging a window across the display.

Also the size/cost/heat/power advantages might allow the ARM/SIMD "package" (including dedicated software) to be included in the display itself...

Wouldn't that be something: a display (or TV screen) that could intermittently receive and buffer a full complement of streamed data and then perform manipulations on that data, instead of constantly replacing it with additional streamed data.

I don't know... maybe receive every 25th frame of a hi-res video, then generate the intervening 24 frames... That could deliver superior video at lower bandwidth.

I have used iMovie on the iPad... it's OK!

I recently bought Avid Studio on the iPad... it has more capabilities, and I think I like it better than iMovie.

But whenever I fire up Final Cut Pro X on my iMac, I just can't help wondering how well it would run on the iPad.

"...The calm is on the water and part of us would linger by the shore, For ships are safe in harbor, but that's not what ships are for."
- Michael Lille -
post #58 of 74
Quote:
Originally Posted by Dick Applebaum View Post

From reading this and other threads, I am starting to get the impression that we may be looking at this x86 vs. ARM question from the wrong perspective...

Many people say we must look at how ARM measures up to x86 -- running x86 apps in particular.

The earlier post referenced SIMD -- so I did a little research:

Wiki: SIMD


From the linked article:


Now THAT IS interesting!

AV editing involves effects (panning/zooming/distortion, colorization, etc.), retiming, changing octave/key...

It just may be that the ARM/SIMD solution is superior for these types of apps -- and it's the x86 that doesn't measure up.

x86 has SIMD support, just like ARM, and both processors can offload heavy SIMD work to GPUs, which are even better at it. You can argue about which SIMD instruction set is better, but there is not much difference, and both are evolving rapidly. So we are not comparing x86 vs. ARM, or x86 vs. ARM+SIMD, but x86+SIMD+GPU vs. ARM+SIMD+GPU.
post #59 of 74
Quote:
Originally Posted by Shaun, UK View Post

Just buy an Apple wireless keyboard if you do a lot of typing.

When I first got the iPad I tried to use it to replace my MacBook Pro for the same stuff: coding work, writing papers, and email. It didn't work for me. I tried again with the iPad 2; it still didn't work out. If you have ever used a wireless keyboard with the iPad, the on-screen keyboard is still needed sometimes, and you end up touching the screen a lot anyway. I ended up replacing my MBP with an MBA.

mstone had commented in a comical way about using an iPad, but there is merit to his comment. It really depends on the kind of work you need to do, but the iPad's utility is still limited if you do complex content creation or programming, or require performance. I use a multicore PC to run several VMs 24/7 for testing; I would love to do that on a tablet, but of course it doesn't work.

The majority of people would be just fine with an iPad, but they are also just fine with a single-core or, at most, dual-core computer. The activity on their computers is no more complex than Facebook, the web, email, messaging, and light gaming. That's 90% of the people out there.
post #60 of 74
Quote:
Originally Posted by afrodri View Post

x86 has SIMD support, just like ARM, and both processors can offload heavy SIMD work to GPUs, which are even better at it. You can argue about which SIMD instruction set is better, but there is not much difference, and both are evolving rapidly. So we are not comparing x86 vs. ARM, or x86 vs. ARM+SIMD, but x86+SIMD+GPU vs. ARM+SIMD+GPU.

Maybe I should have phrased it like the OP -- because Apple controls the platform, OS and hardware, they can take advantage of ARM/SIMD in iDevices.

It is doubtful that MS or Google can control this to the extent that Apple can.
"...The calm is on the water and part of us would linger by the shore, For ships are safe in harbor, but that's not what ships are for."
- Michael Lille -
post #61 of 74
Quote:
Originally Posted by Dick Applebaum View Post

Maybe I should have phrased it like the OP -- because Apple controls the platform, OS and hardware, they can take advantage of ARM/SIMD in iDevices.

It is doubtful that MS or Google can control this to the extent that Apple can.

Yes, but again, a current-gen discrete desktop GPU is blisteringly fast and can pump way more data than a current top-end ARM GPU. So even if the CPUs managed to even out, the desktop would still have the advantage in GPU compute capability.
post #62 of 74
Quote:
Originally Posted by SolipsismX View Post

So is this brilliant, well-thought-out idea Mac OS X on a pocketable touchscreen, or Mac OS X with a pocketable keyboard?

Whatever. The important thing in this kind of device is not the form factor but, first and foremost, the weight, and then the size. The lighter and smaller, the better. The full Mac in your pocket or purse. Always.
post #63 of 74
Quote:
Originally Posted by SSquirrel View Post

Yes, but again, a current-gen discrete desktop GPU is blisteringly fast and can pump way more data than a current top-end ARM GPU. So even if the CPUs managed to even out, the desktop would still have the advantage in GPU compute capability.

Agree...

But within the mobile form factor, specifically tablets and low-end laptops, how does the Atom CPU/GPU stack up against the A5 CPU/GPU?
"...The calm is on the water and part of us would linger by the shore, For ships are safe in harbor, but that's not what ships are for."
- Michael Lille -
post #64 of 74
Quote:
Originally Posted by poke View Post

I think Apple probably has more than one plan in place. One would involve converging iOS and OS X and perhaps moving Macs to ARM. The other would involve the iPad replacing the Mac. It's a question of where the market goes. Either way I think that 5 years from now they'll only have one platform.

One of the really cool take aways from the Jobs biography was how secrecy allowed Jobs or the current CEO to make the decision for the next platform at the last minute with all the dominoes lined up for both decisions. It is a decision point made based only on what is best for the companies future rather than its present. Both PowerPc and Intel mac books were made by independent development teams. If one failed the other could take over. At the last minute after sleeping on the decision Steve chose the intel chip. Doing both insures against intellectual theft and empire making by the drones in different departments. The fact that it makes for awesome theater and marketing is just icing on the cake. No wonder Steve always seemed so excited by the moment. The decision was just made and the future is right now during the keynote.

So yeah, I would say this is now SOP at Apple. Let the press and the internet guess. Apple doesn't even know yet, and any employee who thinks he can get away with telling reporters what the next great thing is will be in for a surprise.
post #65 of 74
Quote:
Originally Posted by Dick Applebaum View Post

Agree...

But within the mobile form factor, specifically tablets and low-end laptops, how does the Atom CPU/GPU stack up against the A5 CPU/GPU?

The Atom certainly draws more power. I haven't kept up w/Atom developments as much, but I know the Atom processors that were out a year ago were still in-order processors. I've seen articles about the potential for stacking huge numbers of Atom processors and using them for supercomputers and such, but it was a power saving measure IIRC as you really did need several to compete against each single top end processor.

Here's a link w/info about the new gen of Atoms coming out.

http://newsroom.intel.com/community/...ry-life-on-tap
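The in-order vs. out-of-order distinction behind this Atom discussion is easy to see with a deliberately crude toy model (not a real pipeline simulator — single-issue, made-up latencies, illustration only):

```python
# Toy model: cycles to execute a fixed instruction stream on an in-order
# core vs. an out-of-order core. Each instruction lists the register it
# writes and the registers it reads; loads take 3 cycles, ALU ops take 1.
INSTRS = [
    ("load", "r1", []),      # r1 = mem[...]  (3-cycle load)
    ("add",  "r2", ["r1"]),  # depends on the first load
    ("load", "r3", []),      # independent load
    ("add",  "r4", ["r3"]),  # depends on the second load
]

LATENCY = {"load": 3, "add": 1}

def in_order_cycles(instrs):
    """Issue strictly in program order: an instruction whose inputs
    aren't ready stalls everything behind it."""
    cycle = 0
    ready = {}  # register -> cycle its value becomes available
    for op, dst, srcs in instrs:
        start = max([cycle] + [ready.get(r, 0) for r in srcs])
        ready[dst] = start + LATENCY[op]
        cycle = start + 1  # next instruction issues no earlier than this
    return max(ready.values())

def out_of_order_cycles(instrs):
    """Issue any instruction as soon as its inputs are ready, so the
    independent load hides behind the first load's latency."""
    ready = {}
    for op, dst, srcs in instrs:
        start = max([0] + [ready.get(r, 0) for r in srcs])
        ready[dst] = start + LATENCY[op]
    return max(ready.values())

print(in_order_cycles(INSTRS), out_of_order_cycles(INSTRS))  # prints "8 4"
```

The out-of-order model starts the independent second load while the first is still in flight, which is roughly why an out-of-order A9/A15-class core gets more done per clock than an in-order Atom at the same frequency — and why you'd need several in-order cores to match one big out-of-order one.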
post #66 of 74
Quote:
Originally Posted by SSquirrel View Post

The Atom certainly draws more power. I haven't kept up w/Atom developments as much, but I know the Atom processors that were out a year ago were still in-order processors. I've seen articles about the potential for stacking huge numbers of Atom processors and using them for supercomputers and such, but it was a power saving measure IIRC as you really did need several to compete against each single top end processor.

Here's a link w/info about the new gen of Atoms coming out.

http://newsroom.intel.com/community/...ry-life-on-tap

Atom is still in-order, though Intel has come a long way with it. Their Medfield chip, coming out later this year, does look to have several competitive edges over the Cortex-A9, but that's the previous-gen ARM core, so it's not the best measure. We'll have to see how it stacks up in real life against the Cortex-A9 and the upcoming A15.

That said, Intel's focus on the mobile front should not be scoffed at. They now realize that ARM will hurt their bottom line if they don't field a viable competitor quickly, and they have the resources to throw at it.

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply

"There is no rule that says the best phones must have the largest screen." ~RoundaboutNow

 

Goodbyeee jragosta :: http://forums.appleinsider.com/t/160864/jragosta-joseph-michael-ragosta

Reply
post #67 of 74
Quote:
Originally Posted by SolipsismX View Post

That said, Intel's focus on the mobile front should not be scoffed at. They now realize that ARM will hurt their bottom line if they don't field a viable competitor quickly, and they have the resources to throw at it.

I'd agree, and also say that Intel _has_ realized it for a while now; it's just that their attempts to break into the mobile world have been flops. Still, you are right not to count them out. They are a _huge_ company which can take a brute-force approach like none other. I remember the 90s, when it was "obvious" that the x86 architecture was doomed and RISC architectures would crush it; from a purely technical standpoint, x86 didn't stand a chance. But because Intel could shovel cash and engineering and manufacturing knowledge at the problem, x86 survived and even took over much of the high end. They still might be able to pull it off in the mobile world.

I wouldn't be surprised if Apple has both ARM and x86 versions of MacOS and iOS internally, just like they had x86 versions of MacOS years before the PPC->x86 changeover. If Intel does come up with a decent line of low-power cores, Apple would be able to take advantage of it.
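Keeping "both versions internally" is essentially what universal (fat) binaries formalise: one shipped artifact containing per-architecture code. A minimal Python sketch of the idea as runtime architecture dispatch (illustrative only — a real fat binary has the loader pick a whole compiled slice at load time, not a branch; the function name here is made up):

```python
import platform

def select_slice():
    """Pick a per-architecture code path, loosely mimicking how a
    loader picks one slice out of a universal (fat) binary."""
    machine = platform.machine().lower()
    if machine in ("x86_64", "amd64", "i386", "i686"):
        return "x86 slice"
    if machine.startswith(("arm", "aarch")):
        return "ARM slice"
    return "generic fallback slice"

print(select_slice())
```

The point is that the application above the dispatch line never needs to know which architecture it landed on — which is why Apple could keep two ports alive in parallel and flip the switch late.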
post #68 of 74
Quote:
Originally Posted by shompa View Post

You know that it's impossible for x86 to be power efficient? The architecture makes the CPU too large. A large CPU leads to heat, more cost, and more energy.

X86
Bookmark this:
Intel will be a niche processor in 5-8 years.

LOL. At first glance, I thought I missed the sarcasm tag.

But no, you are serious. There is nothing quite as entertaining as long-winded, under-informed opinion "educating" the rest of us.

Impossible for x86 to be power efficient? If you read some actual CPU-designer opinion on the x86 overhead, it is only 5-10% versus other ISAs. Intel has simply been slow to wake up and change course.

There is nothing inherently wrong with x86 for modern smartphones or tablets, except an installed base of ARM code. Similar to the issue ARM has on the desktop.
post #69 of 74
Quote:
Originally Posted by Snowdog65 View Post


Impossible for x86 to be power efficient? If you read some actual CPU-designer opinion on the x86 overhead, it is only 5-10% versus other ISAs. Intel has simply been slow to wake up and change course.

There is nothing inherently wrong with x86 for modern smartphones or tablets, except an installed base of ARM code. Similar to the issue ARM has on the desktop.



Exactly. He still hasn't responded to the article I posted above; Medfield looks like it will be both power-competitive and performance-competitive. And this isn't even using their 22nm fabs yet. When it does, it will have quite a lead. There's the Cortex-A15 coming out, of course, but Intel is one tough gorilla to compete with in the long term.

More food for thought. I know it doesn't mean success or failure, but Intel sure has a lot more to throw at R&D if things should go sour.

http://www.wolframalpha.com/input/?i...M+Holdings+AMD
post #70 of 74
Yeah, certainly Intel fabbing for Apple seems like a great idea. But Intel has drifted somewhat from Apple since the euphoria of Macs switching to Intel.

afrodi below makes a good point as well. Intel has resisted being a fab-for-hire, and as such is perhaps the best fab out there. TSMC and Global Foundries have had their share of screw-ups, things like Nvidia's whacked-out post-8600GT designs notwithstanding.

Intel is at a crossroads. They can stick to x86 and enjoy the juicy margins, since their x86 fabbing is unbeatable right now and for the foreseeable future.

Or, they can channel the huge amount of money spent "fighting" ARM into fabbing ARM. They can't depend on Apple; Apple has surely investigated and lined up all the fabbing capacity it will need for the next several years.

I wonder, just for hypotheticals, whether the visits by AMD were actually also a peek into Global Foundries.

In any case, Apple has scoured the earth; it's up to Intel now to decide where to go.

Quote:
Originally Posted by Dick Applebaum View Post

I often wonder if Apple could contract Intel to fab Apple's ARM chips/packages.

Seems like a mutually-beneficial arrangement could be worked out -- and reduce dependence on Sammy.

Quote:
Originally Posted by afrodri View Post

It's theoretically possible, but Intel has historically resisted being a fab-for-hire. TSMC, UMC, and Global Foundries (and a little IBM) all work in that space and have the infrastructure for being a foundry.

Part of the reason Intel has resisted this path is the same reason that Apple killed off the clones and avoided licensing MacOS: by having vertical integration (HW & SW in Apple's case, chip design and fab in Intel's) you capture more of the high-margin activities and can differentiate yourself. If Intel became "just another foundry" they might not be able to maintain the profit margins they prefer.

Intel actually used to make ARM cores (I think there was some IP swap from DEC or something), but got out of that to focus on x86 — i.e., somewhere they could try to differentiate themselves.
post #71 of 74
Quote:
Originally Posted by zunx View Post

What is needed is a true a full Mac with just 400 to 600 g. The Mac in your pocket. Always. And I mean a true Mac with Intel x86 inside; not an ARM-based iOS device.

Erm... but that Mac in your pocket won't be a Mac as we know it. It will be iOS. Apple has been clear that below the 10" form factor it will never ship a regular keyboard-and-screen OS X.

Quote:
Originally Posted by Dick Applebaum View Post

Maybe I should have phrased it like the OP -- because Apple controls the platform, OS and hardware, they can take advantage of ARM/SIMD in iDevices.

It is doubtful that MS or Google can control this to the extent that Apple can.

Precisely. And again, scaling up ARM/SIMD/PowerVR is phenomenally easier than cramming x86 down into tablet/mobile land.

Quote:
Originally Posted by SSquirrel View Post

Yes but again, a current gen discrete desktop GPU is blisteringly fast and can pump way more data than a current gen top end ARM GPU. So even if the CPUs stacked up managed to even out, the desktop would still have the advantage with the GPU compute capabilities.

Yes and no. Those blisteringly fast GPUs burn at least 50W, ranging to well in excess of 150W. On Windows and even Mac OS X, GPGPU just hasn't lived up to the promise. On Windows, even gaming with a good GPU is near impossible nowadays with endless patches and graphics driver updates. It sometimes burns me up that Nvidia and AMD/ATI threw away such fantastic GPU engineering by eating up more and more watts, failing to fab at lower nodes in a timely fashion, and just not sorting out the driver issues on Windows for games and GPGPU alike.

Given Intel's in-CPU optimisations like H.264 encoding and so on, a modern Core i7 with the right software trounces GPGPU — in addition to the advantage of *not needing* another GPU besides Intel's integrated graphics.

Quote:
Originally Posted by Macnewsjunkie View Post

One of the really cool takeaways from the Jobs biography was how secrecy allowed Jobs (or the current CEO) to make the decision for the next platform at the last minute, with all the dominoes lined up for both options. It is a decision made based only on what is best for the company's future rather than its present. Both the PowerPC and Intel versions were built by independent development teams; if one failed, the other could take over. At the last minute, after sleeping on the decision, Steve chose the Intel chip. Doing both insures against intellectual theft and empire-building by the drones in different departments. The fact that it makes for awesome theater and marketing is just icing on the cake. No wonder Steve always seemed so excited by the moment: the decision was just made, and the future is right now, during the keynote.

So yeah, I would say this is now SOP at Apple. Let the press and the internet guess. Apple doesn't even know yet, and any employee who thinks he can get away with telling reporters what the next great thing is will be in for a surprise.

Yeah, this is awesome, isn't it? In the best traditions of American "skunkworks". Apple sure gives Area 51 a run for its money.

Quote:
Originally Posted by Snowdog65 View Post

LOL. At first glance, I thought I missed the sarcasm tag.

But no, you are serious. There is nothing quite as entertaining as long-winded, under-informed opinion "educating" the rest of us.

Impossible for x86 to be power efficient? If you read some actual CPU-designer opinion on the x86 overhead, it is only 5-10% versus other ISAs. Intel has simply been slow to wake up and change course.

There is nothing inherently wrong with x86 for modern smartphones or tablets, except an installed base of ARM code. Similar to the issue ARM has on the desktop.

Perhaps theoretically x86 does not have major disadvantages. But Intel has a tough job ahead of it to come down to ARM-level power draw without the x86 CPU being worthless.

Quote:
Originally Posted by tipoo View Post

Exactly. He still hasn't responded to the article I posted above; Medfield looks like it will be both power-competitive and performance-competitive. And this isn't even using their 22nm fabs yet. When it does, it will have quite a lead. There's the Cortex-A15 coming out, of course, but Intel is one tough gorilla to compete with in the long term.

More food for thought. I know it doesn't mean success or failure, but Intel sure has a lot more to throw at R&D if things should go sour.

http://www.wolframalpha.com/input/?i...M+Holdings+AMD

Fair enough, but I'm not feeling Intel on this; the weight behind ARM is significant, with Apple, Google, and Microsoft all on the bandwagon, among others.

This is a battle Intel can "afford" to lose in the next five years, but as we know in tech, sometimes losing the battle means you have nothing much left to fight the war.

Again, the scenario of an iPad 4 which delivers a Core i5 "experience" (due to optimisation etc.) and DX10-quality 1680x1050 graphics, all playable on an HDTV, presents a tantalising possibility: you wouldn't want to touch your laptop except for intense content creation. For business and content consumption, x86 in five years is going to be total overkill. Yeah, Windows 8, Office 2013 or whatever will drive x86 forward, but the road is getting narrower and the cliffs closer.

Quote:
Originally Posted by afrodri View Post

I'd agree, and also say that Intel _has_ realized it for a while now; it's just that their attempts to break into the mobile world have been flops. Still, you are right not to count them out. They are a _huge_ company which can take a brute-force approach like none other. I remember the 90s, when it was "obvious" that the x86 architecture was doomed and RISC architectures would crush it; from a purely technical standpoint, x86 didn't stand a chance. But because Intel could shovel cash and engineering and manufacturing knowledge at the problem, x86 survived and even took over much of the high end. They still might be able to pull it off in the mobile world.

I wouldn't be surprised if Apple has both ARM and x86 versions of MacOS and iOS internally, just like they had x86 versions of MacOS years before the PPC->x86 changeover. If Intel does come up with a decent line of low-power cores, Apple would be able to take advantage of it.

It's true, PPC RISC was definitely awesome, but Intel got its act together early last decade. However, they were lucky in some sense that the Pentium III line proved to be the way to go amongst the kludge that was the Pentium 1, 2, and the ill-fated Pentium 4. Also, Motorola and IBM both dropped the ball; it seems very strange that the best they could do was a RISC processor eating well over 100W to compete with the stuff Intel (and don't forget AMD) was putting out with ease. In fact, before the fiasco of the G5, Motorola was already struggling badly with the G4.

So in some sense, PPC RISC and x86 were not that far apart, trading blows for a while until Moto and IBM screwed up, and Intel rallied the troops.

But if you're looking at ARM vs x86, the "gap" in thermals and no doubt performance is much bigger. Again the argument for ARM is that it has immense scope for scaling up performance and cores whereas x86 is facing the reverse challenge.

Intel x86 might be able to pull it off in the mobile world, or at least tablet world, but they have a bit of an uphill battle. ARM and PowerVR are on the straight and just have to keep on keeping on.
post #72 of 74
Quote:
Originally Posted by nvidia2008 View Post

Precisely. And again, scaling up ARM/SIMD/PowerVR is phenomenally easier than cramming x86 down into tablet/mobile land.


Perhaps theoretically x86 does not have major disadvantages. But Intel has a tough job ahead of it to come down to ARM-level power draw without the x86 CPU being worthless.

I really have to wonder if it is willful blindness that keeps people thinking ARM has some kind of magic fairy dust while ignoring what is going on around them:

http://www.anandtech.com/show/5365/i...or-smartphones

There is Medfield, running at similar power to ARM in a smartphone while doing MUCH better in benchmarks. So exactly what is this tough job AHEAD, and does this look like a worthless CPU?

Intel stumbled for a few years in mobile, just as they stumbled with the P4/NetBurst architecture. It takes time to turn a big ship, but when they do, watch out.

Intel has a problem in mobile, but it is NOT the quality of the CPU anymore; it is simply a massive installed code base in the ARM ISA.
post #73 of 74
Quote:
Originally Posted by nvidia2008 View Post

Yes and no. Those blisteringly fast GPUs burn at least 50W, ranging to well in excess of 150W. On Windows and even Mac OS X, GPGPU just hasn't lived up to the promise. On Windows, even gaming with a good GPU is near impossible nowadays with endless patches and graphics driver updates. It sometimes burns me up that Nvidia and AMD/ATI threw away such fantastic GPU engineering by eating up more and more watts, failing to fab at lower nodes in a timely fashion, and just not sorting out the driver issues on Windows for games and GPGPU alike.

I was careful not to claim any kind of evenness in the watt usage of desktop GPUs compared w/mobile, but the results are certainly much bigger on the desktop. Part of the GPGPU problem is that both NVIDIA and AMD have their own version of it, so everyone isn't coding for the same thing. In fact, many people never code for it at all. It isn't like OS X, where the OS naturally takes advantage of the GPU.

Your bit about gaming being near impossible w/patches and the like is extremely hyperbolic. Unless there has been some major problem to fix, you don't get stable graphics driver releases more often than every month or two, and game patches depend entirely on the game. MMOs have regular small updates that download in a couple of minutes and are usually client-side changes only. Big version-change patches come with lots of advance notice and, these days, advance download of large chunks while you are playing, so when actual patch day comes, a smaller update delivers the last few changes.

That is hardly a huge timekiller for most games. Now if you are buying a game that has been out for a while and has a multitude of patches, then you have a lot of patching that first time, but after that probably not much.
post #74 of 74
Quote:
Originally Posted by nvidia2008 View Post

Fair enough, but I'm not feeling Intel on this; the weight behind ARM is significant, with Apple, Google, and Microsoft all on the bandwagon, among others.

This is a battle Intel can "afford" to lose in the next five years, but as we know in tech, sometimes losing the battle means you have nothing much left to fight the war.

Again, the scenario of an iPad 4 which delivers a Core i5 "experience" (due to optimisation etc.) and DX10-quality 1680x1050 graphics, all playable on an HDTV, presents a tantalising possibility: you wouldn't want to touch your laptop except for intense content creation. For business and content consumption, x86 in five years is going to be total overkill. Yeah, Windows 8, Office 2013 or whatever will drive x86 forward, but the road is getting narrower and the cliffs closer.

I'm not sure I'd say they are on the ARM bandwagon. Instead, everyone is moving towards processor-architecture agnosticism. Apps coded for Metro will be able to run natively on ARM and x86, for example. And the vast majority of Android apps will run on Medfield with no changes; the rest will run via ARM binary compatibility done in real time. So Intel does not face a challenge to its instruction set; it's just a matter of power draw, cost, and performance. ARM may win the first two for a while, but like I said, Intel tends to slowly crush anyone it faces.