After whipping Intel, Nvidia & AMD in mobile chips, sky's the limit for Apple Inc's silicon design team


Comments

  • Reply 21 of 53
mazda 3s Posts: 1,613 member
    Quote:
Originally Posted by staticx57 View Post

Not sure what the title has to do with the article

    It's pretty simple actually...

  • Reply 22 of 53
One thing I've noticed Google does, and continues to do, is this:
- If they come into a market they do NOT dominate, like smartphones, they go the open-source route initially. Once they own the market, they close it hard. Android, TRUE Android, is no longer open, and hasn't been for a few years.

They have done this repeatedly now. It's like a rope-a-dope: they get the open-source community to do the dirty work, get them to support it and get excited, then pull back, own it themselves, and reap the benefits, leaving the community with an 'open' version that's heavily crippled, at best.

To me, that's dirty.
  • Reply 23 of 53
Of all the things Apple could diversify into, silicon makes the most sense. They have a track record of excellence, and the existing players are few and tend to be complacent. Plus, every device going forward, from watches to thermostats, needs them. They are a commodity product that cannot be done on the cheap by knock-off artists. I can see Apple becoming the new Intel, but reserving the latest and greatest for itself while selling off the last gen to others. "Apple Inside!"
  • Reply 24 of 53
bugsnw Posts: 717 member

I had forgotten about that intriguing little chip that manages the pixels on the new iMac. Very interesting. Apple is making their stuff harder to copy.

Maybe it's all this innovation that makes me wonder how long it's going to be before Apple breaks from Intel. I can't help but think that some accidental discovery is going to leapfrog Apple's ARM chips into the stratosphere. Same with batteries.

  • Reply 25 of 53
128-bit CPUs and peripherals
  • Reply 26 of 53
    Another nice, balanced, unbiased article about Apple. How can anybody take these guys seriously?
  • Reply 27 of 53
Quote:
Originally Posted by Ian Huitson View Post

128-bit CPUs and peripherals

Whoa... While that sounds like a minor bump, it would literally be a game changer. It's also not very likely. A 128-bit address space could cover about 340,282,366,920,938 yottabytes (2^128 bytes; the yottabyte is two steps up from the exabyte, past the zettabyte). That is an amount of RAM that no phone, tablet, or even modern computer can come close to having. (Review this post in about 20 years, then mock me the same way we mock Bill Gates for supposedly saying nobody would ever need more than 640 KB of RAM.)

Other than for encryption, can anyone here suggest what a 128-bit CPU would be used for?
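A quick back-of-envelope check on that address-space figure, in Python (my own illustrative sketch; the constants are plain arithmetic, nothing from the article):

```python
# Sketch: total memory a flat, byte-addressable 128-bit address space covers,
# expressed in decimal (SI) yottabytes.
ADDRESS_BITS = 128
total_bytes = 2 ** ADDRESS_BITS          # bytes reachable with 128-bit pointers

YOTTABYTE = 10 ** 24                     # 1 YB = 10^24 bytes (SI definition)
print(f"2^{ADDRESS_BITS} bytes = {total_bytes:.4e} bytes")
print(f"            = {total_bytes / YOTTABYTE:,.0f} yottabytes")
# 2^128 bytes = 3.4028e+38 bytes
#             = 340,282,366,920,938 yottabytes
```

Even the most over-provisioned server imaginable wouldn't dent that range, which is part of why 64-bit addressing is expected to suffice for a very long time.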

  • Reply 28 of 53
mj web Posts: 918 member
    Wow, DED waited almost a full week before retyping his wet dream BS for the 35th time this month. I'm assuming our AI editor has an obsessive/compulsive disorder. Sheesh!
  • Reply 29 of 53
MacPro Posts: 19,873 member
    mj web wrote: »
    Wow, DED waited almost a full week before retyping his wet dream BS for the 35th time this month. I'm assuming our AI editor has an obsessive/compulsive disorder. Sheesh!

I can see, if I try hard, how to some folks all the articles seem the same. Spotting the differences between them does require an understanding of the history of Apple that's absent in many casual readers or Android users.
  • Reply 30 of 53
Quote:
Originally Posted by Mazda 3s View Post

It's pretty simple actually...

click bait?

  • Reply 31 of 53
mnbob1 Posts: 269 member
Quote:
Originally Posted by MJ Web View Post

Wow, DED waited almost a full week before retyping his wet dream BS for the 35th time this month. I'm assuming our AI editor has an obsessive/compulsive disorder. Sheesh!

I'm wondering why you take the time to read an article when you're going to also take the time to bitch about it in the forum. Go away! You're annoying and have nothing to add here.

  • Reply 32 of 53
mj web Posts: 918 member
Quote:
Originally Posted by mnbob1 View Post

I'm wondering why you take the time to read an article when you're going to also take the time to bitch about it in the forum. Go away! You're annoying and have nothing to add here.

Because democracy is dead?

  • Reply 33 of 53
mstone Posts: 11,510 member
Quote:
Originally Posted by Mike Eggleston View Post

Other than for encryption, can anyone here suggest what a 128-bit CPU would be used for?

Posting selfies and texting.

  • Reply 34 of 53
    Good read.

    You could say Apple itself was founded in 1976 when they made the leap from "hobbyist" (homebrew) to finished product. The Apple I was a crude "product," but look at the refinement of the original Apple II.

    That's been the story of tech: from hobbyist to refined products. Internet TV, home automation, and wearables are next to be productized.

    Markets like IoT and self-driving cars are still in the hobbyist/research phases, but will become products.

Intel has no foothold in these markets, and its continued focus on skating to where the puck was rather than where it's going will seal its fate in the long run. No, I don't think the PC is going away tomorrow, and the sky is NOT falling, but Microsoft has Windows for IoT on ARM. And that's a sign of things to come.
  • Reply 35 of 53
    mstone wrote: »
Posting selfies and texting.

You forget that even something as simple as taking multi-megapixel selfies and texting requires significant computing power and network bandwidth (especially if you expect to do it on the spur of the moment without waiting 20 seconds for your camera app to launch). Just go back a decade, when 2G EDGE networks were the norm and cell phones had crude VGA cameras and slow Java-based UIs. You don't need a 64-bit processor to do that, but then again, that's not what we expect from our devices in 2015, is it?
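To put rough numbers on the bandwidth side (a back-of-envelope sketch with my own assumed rates, not figures from the post):

```python
# Sketch: raw pixel data in one 8 MP selfie, and rough upload time for a
# typical ~2.5 MB JPEG over 2G EDGE versus LTE. All rates are ballpark
# assumptions for illustration, not measured values.
MEGAPIXELS = 8
raw_bytes = MEGAPIXELS * 1_000_000 * 3    # 3 bytes per pixel (24-bit RGB)
jpeg_bytes = 2.5e6                        # assumed compressed size

EDGE_BPS = 200_000                        # ~200 kbps, an optimistic EDGE link
LTE_BPS = 20_000_000                      # ~20 Mbps, a modest LTE link

print(f"raw frame:   {raw_bytes / 1e6:.0f} MB")
print(f"EDGE upload: {jpeg_bytes * 8 / EDGE_BPS:.0f} s")
print(f"LTE upload:  {jpeg_bytes * 8 / LTE_BPS:.1f} s")
# raw frame:   24 MB
# EDGE upload: 100 s
# LTE upload:  1.0 s
```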
  • Reply 36 of 53
    mj web wrote: »
Quote:
Originally Posted by mnbob1 View Post

I'm wondering why you take the time to read an article when you're going to also take the time to bitch about it in the forum. Go away! You're annoying and have nothing to add here.

Because democracy is dead?
Yes. Unless you agree with them, some here don't welcome your view.
  • Reply 37 of 53
sflocal Posts: 6,178 member
    mj web wrote: »

    Because democracy is dead?

It's not dead. Democracy is what gives trolls and whiners like you the freedom to post your nonsense drivel without fear of reprisal.

That being said, if you don't like DED's articles, don't read them. Now buzz off, little gnat.
  • Reply 38 of 53
misa Posts: 827 member
    inkling wrote: »
    Deja vu. I seem to remember similar stories in the early 1990s about Microsoft taking over everything. That story fell apart, as did the one about IBM before it.

    Apple needs to be careful the same hubris doesn't do it in.

IBM divested itself of its core competency, which was making hardware for business. First they got rid of the laptops and desktops, and then the servers. So all the hardware they sell now is what? Mainframes? Who is buying those except those who bought them more than 30 years ago? That's a rapidly decaying market. IBM is all services now, and people are rushing out the door to try "cloud services" instead. This is precisely the area IBM would have been good at if it hadn't disposed of its commodity hardware competencies.

Microsoft's problem is that its only core competency is the Windows OS, and to some extent Microsoft Office. Microsoft unfortunately has always tried to keep a walled garden, subscribing to "not invented here" and rolling out its own alternatives (e.g. C#, .NET, Silverlight, DirectX/Direct3D) instead of working to establish standards, all to keep developers and customers on the expensive Windows platform. So after rolling out Windows CE/Mobile/Phone for PDAs and then phones, they just up and threw the entire thing away when the iPhone obliterated all sales of those devices.

The problem Microsoft and IBM share is that they choose to cater to their existing customers and then make them unable to leave their platforms, requiring subscriptions and/or updates to keep their software from locking them out completely.

Apple, on the other hand, does this as well, but not to the extent that Microsoft ever has. Given all the publicly available source code for the tools and the Darwin platform, you could conceivably build an OS X clone without ever having the source code for the OS X GUI. It would be impractical, but the point is that there is very little on OS X that you couldn't port to another platform without writing it from scratch.

Where Apple could perilously forget its past is in attempting to switch all its Intel-based devices to ARM, trying to repeat the PPC-to-x86 switch. People weren't happy about that switch, but it also gave plenty of PC people an excuse to come over, because Windows could still be run on the same machine when needed. Switching the desktops and laptops to ARM abandons that.
  • Reply 39 of 53
newbee Posts: 2,055 member
Quote:
Originally Posted by CogitoDexter View Post

I knew it was DED the moment I read the headline. ;)

It took me a moment to "get" the humour. You'd think the sarcasm tag would have done it for me, but you'd be wrong (I need more coffee). In any case... good post. :)

  • Reply 40 of 53
wizard69 Posts: 13,377 member
    pennywse wrote: »
    Yup and I was excited to do so. I like to read well laid out information on Apple, it's why I'm here in the first place.

While not one of his worst, this is still an article full of wishful thinking and truth-bending. Like everything DED writes, I'd take it with a grain of salt.

On the flip side, I have to wonder why people are surprised by this. Apple has always had a very strong electrical engineering team and has designed chips for its machines for years.