Of all the things Apple could diversify into, silicon makes the most sense. They have a track record of excellence, and the existing players are few and tend to be complacent. Plus, every device going forward, from watches to thermostats, needs them. They are a commodity product that cannot be done on the cheap by knock-off artists. I can see Apple becoming the new Intel, but reserving the latest and greatest for themselves while selling off the last gen to others. "Apple Inside!"
Yes. Unless you agree, your view isn't welcome by some here.
Deja vu. I seem to remember similar stories in the early 1990s about Microsoft taking over everything. That story fell apart, as did the one about IBM before it.
Apple needs to be careful the same hubris doesn't do it in.
Yup, and I was excited to read it. I like well-laid-out information on Apple; it's why I'm here in the first place.
Not sure what the title has to do with the article
It's pretty simple actually...
One thing I've noticed Google does, and continues to do, is this:
- If they come into a market, like smartphones, that they do NOT dominate, they go the open-source route initially. Once they own the market, they close it hard. Android, TRUE Android, is no longer open; it hasn't been for a few years.
They have done this repeatedly now. It's like a rope-a-dope. They get the open source community to do the dirty work, get them supporting it and excited, then pull back and own it themselves and reap the benefits, leaving the community with an 'open' version that's heavily crippled, at best.
To me, that's dirty.
I had forgotten about that intriguing little chip that manages the pixels on the new iMac. Very interesting. Apple is making their stuff harder to copy.
Maybe it's all this innovation that makes me wonder how long it's going to be before Apple breaks from Intel. I can't help but think that some accidental discovery is going to leapfrog Apple's ARM chips into the stratosphere. Same with batteries.
128-bit CPUs and peripherals
Whoa... While that sounds like a minor bump, that would literally be the game changer. Also not very likely. A 128-bit address space could cover 2^128 bytes, roughly 3.4 x 10^38 bytes, or about 340 trillion yottabytes (and a yottabyte is already two steps up from an exabyte, past the zettabyte). That is an amount of RAM that no phone, tablet, or even modern computer can come close to having. (Review this post in about 20 years, and then mock me the same way we mock Bill Gates for supposedly saying nobody would ever need more than 640 KB of RAM.)
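For what it's worth, the arithmetic is easy to sanity-check. Here's a rough back-of-envelope sketch in plain Python (nothing Apple-specific, and the conversion assumes the decimal yottabyte, 10^24 bytes):

```python
# Back-of-envelope: how much memory a flat 128-bit address space could cover.
# Python integers are arbitrary precision, so 2**128 is computed exactly.

ADDRESS_BITS = 128
total_bytes = 2 ** ADDRESS_BITS          # one byte per distinct address

EXABYTE = 10 ** 18                       # exa -> zetta -> yotta (two steps apart)
YOTTABYTE = 10 ** 24

print(f"2^128 addresses = {total_bytes} bytes")
print(f"                ~ {total_bytes / EXABYTE:.2e} exabytes")
print(f"                ~ {total_bytes / YOTTABYTE:.2e} yottabytes")
```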
Other than for encryption, can anyone here suggest what a 128-bit CPU would be used for?
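Not that it settles the question, but 128-bit values already get pushed around every day, even on 64-bit CPUs (usually via their 128-bit SIMD registers): IPv6 addresses and UUIDs are both 128 bits wide. A quick illustration using nothing but Python's standard library:

```python
# Two everyday 128-bit quantities that have nothing to do with encryption.
import ipaddress
import uuid

ipv6 = ipaddress.IPv6Address("2001:db8::1")   # an IPv6 address is a 128-bit integer
uid = uuid.uuid4()                            # a UUID is a 128-bit value

print(f"IPv6 address as an integer: {int(ipv6)}")
print(f"Random UUID as an integer:  {uid.int}")
print(f"Widest of the two: {max(int(ipv6).bit_length(), uid.int.bit_length())} bits (<= 128)")
```

None of that needs a 128-bit general-purpose register, of course; it just shows the data sizes are already commonplace.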
I can see how, if I try hard, all the articles might seem the same to some folks. Telling them apart does require an understanding of Apple's history that many casual readers or Android users lack.
It's pretty simple actually...
click bait?
Wow, DED waited almost a full week before retyping his wet dream BS for the 35th time this month. I'm assuming our AI editor has an obsessive/compulsive disorder. Sheesh!
I'm wondering why you take the time to read an article when you're also going to take the time to bitch about it in the forum. Go away! You're annoying and have nothing to add here.
Because democracy is dead?
Posting selfies and texting.
You could say Apple itself was founded in 1976 when they made the leap from "hobbyist" (homebrew) to finished product. The Apple I was a crude "product," but look at the refinement of the original Apple II.
That's been the story of tech: from hobbyist to refined products. Internet TV, home automation, and wearables are next to be productized.
Markets like IoT and self-driving cars are still in the hobbyist/research phases, but will become products.
Intel has no foothold in these markets, and its continued focus on skating to where the puck was rather than where it's going will seal its fate in the long run. No, I don't think the PC is going away tomorrow, and the sky is NOT falling, but Microsoft has Windows for IoT on ARM. And that's a sign of things to come.
You forget that even something as simple as taking multi-megapixel selfies and texting them requires significant computing power and network bandwidth (especially if you expect to do it on the spur of the moment without waiting 20 seconds for your camera app to launch). Just go back a decade, when 2G EDGE networks were the norm and cell phones had crude VGA cameras and slow Java-based UIs. You don't need a 64-bit processor to do that, but then again, that's not what we expect from our devices in 2015, is it?
It's not dead. Democracy is what lets trolls and whiners like you post your nonsense drivel without fear of reprisal.
That being said, if you don't like DED's articles, don't read them. Now buzz off, little gnat.
IBM divested itself of its core competency, which was making hardware for business. First it got rid of the laptops and desktops, then the servers. So all the hardware it sells now is what? Mainframes? Who is buying those except the people who bought them more than 30 years ago? That's a rapidly decaying market. IBM is all services now, and people are rushing out the door to try "cloud services" instead. This is precisely the area IBM would have been good at if it hadn't disposed of its commodity hardware competencies.
Microsoft's problem is that its only core competency is the Windows OS and, to some extent, Microsoft Office. Microsoft has unfortunately always tried to keep a walled garden, subscribing to "not invented here" and rolling out its own alternatives (e.g. C#, .NET, Silverlight, DirectX/Direct3D) instead of working to establish standards, all to keep developers and customers on the expensive Windows platform. So when it rolled out Windows CE/Mobile/Phone for PDAs and then phones, it just up and threw the entire thing away when the iPhone obliterated sales of those devices.
That's the problem Microsoft and IBM share: they choose to cater to their existing customers and then make it impossible for them to leave their platforms, requiring subscriptions and/or updates just to keep the software from locking them out completely.
Apple, on the other hand, does this as well, but not to the extent that Microsoft ever has. Given all the publicly available source code for the tools and the Darwin platform, you could conceivably build a Mac OS X clone without ever having the source code that makes up the OS X GUI. It would be impractical, but the point is that very little of what runs on OS X would have to be rewritten from scratch to port it to another platform.
Where Apple could perilously forget its past is in attempting to switch all its Intel-based devices to ARM and trying to repeat the PPC-to-x86 transition. People weren't happy about that switch, but it also gave plenty of PC people an excuse to come over, because Windows could still be run on the same machine when needed. Switching the desktops and laptops to ARM abandons that.
I knew it was DED the moment I read the headline.
It took me a moment to "get" the humour. You'd think the sarcasm tag would have done it for me, but you'd be wrong (I need more coffee). In any case... good post.
While not one of his worst, this is still an article full of wishful thinking and bending of the truth. Like everything DED writes, I'd take it with a grain of salt.
On the flip side, I have to wonder why people are surprised by this. Apple has always had a very strong electrical engineering team and has designed chips for its machines for years.