Apple developing dedicated AI chip called Apple Neural Engine


Comments

  • Reply 21 of 78
radarthekat Posts: 3,842 moderator
My bet is that Apple doesn't use the term "facial recognition," but instead uses a term like "gesture recognition," to differentiate that capability from face recognition. Recently someone at Apple said in an interview that many there read AppleInsider. Surely they recognize the confusion here between the two terms. I'm confident Apple, on stage in some near-future presentation, will avoid such confusion.

    As to AI, I prefer the term Machine Intelligence.  Something either meets the bar for being intelligent (by whatever definition is applied) or it doesn't.  There's no value in signifying that one intelligence is somehow artificial versus another being somehow genuine.  But there is value in delineating between naturally evolved intelligence and engineered [machine] intelligence, as the latter embodies some attributes, like being able to be easily replicated, that the former does not.  

    As to Siri, I look forward to the day when it's able to hold a deeply contextual and ongoing conversation.  The Siri chatbot rumored to be added to iMessage will be a good first foray for Apple in this direction.
  • Reply 22 of 78
Unfortunately, Apple's AI processor won't even come close to Nvidia's Tesla P100 GPU, with its 15 billion transistors on a single chip. That's another one of the major reasons why Nvidia, with its P/E of 148, is valued far higher on Wall Street than Apple will ever be. Supposedly Nvidia sank $2 billion of R&D into developing the Tesla P100 GPU, and they're certainly committed to it, with eight of them packed into a single computer and ready to roll with software included. Nvidia isn't wasting its time on "hobbies" like Apple is. Nvidia will likely replace Netflix in FANG as its stock climbs to the stratosphere. Sometimes a focused smaller company blows past larger ones.
  • Reply 23 of 78
Unfortunately, Apple's AI processor won't even come close to Nvidia's Tesla P100 GPU, with its 15 billion transistors on a single chip. That's another one of the major reasons why Nvidia, with its P/E of 148, is valued far higher on Wall Street than Apple will ever be. Supposedly Nvidia sank $2 billion of R&D into developing the Tesla P100 GPU, and they're certainly committed to it, with eight of them packed into a single computer and ready to roll with software included. Nvidia isn't wasting its time on "hobbies" like Apple is. Nvidia will likely replace Netflix in FANG as its stock climbs to the stratosphere. Sometimes a focused smaller company blows past larger ones.

    Stupid comment.

Apple's processor doesn't have to come close to the P100, since that's destined for an entirely different class of machine. Can you even name an Nvidia product that does machine learning in a mobile device? Because the P100, at 250 W, isn't even close. And don't think Nvidia is so great at everything. Not one of their custom ARM processor cores ever came close to matching Apple's custom processor cores.
  • Reply 24 of 78
bobolicious Posts: 1,146 member
    True or not, we have a schedule to keep!

    https://futurism.com/images/the-dawn-of-the-singularity/
Interesting, but is it a choice, i.e. actually for the better...?
Will all those freely giving away their IP to Gmail, Hotmail, iCloud and other web 2.0 inducements have any excuse when they are replaced...
Has macOS become one of the most creeping, progressive and brilliant pieces of spyware yet conceived?
I am interested to see the final cut of Blade Runner 2049 (bladerunnermovie.com): 'every civilization is built off the back of a disposable workforce'
  • Reply 25 of 78
karmadave Posts: 369 member
It wouldn't surprise me in the slightest if Apple has been working on such a chip for years. One of the advantages of doing some of your own chip design in-house is that you can do these kinds of projects. Whether or not it ever sees the light of day is another matter entirely...
  • Reply 26 of 78
mattinoz Posts: 2,316 member
    wizard69 said:

Nice to hear about an AI chip! Maybe Siri will begin to be able to understand and carry out searches with the new AI chip! It is so stupid otherwise. Alexa and Google Assistant are so much better at everything.
The frustrating thing with Siri is that they have the voice-to-text down pat. I seldom have trouble with Siri translating what I've said to text; it is the inability to process that text that is the huge problem. Unfortunately this happens on Apple's servers, so an AI chip on a cell phone won't help much unless Apple restructures its Siri services significantly.

I've been an advocate of Apple doing just that, though. That is, have the cell phone itself construct the queries to the various databases Siri uses. AI really needs to reside local to the device, be it a phone, laptop or whatever. In the end you would want your local AI to be able to help with the management of your device, searching it and optimizing it.

As for what Apple is up to with Siri, I've pretty much given up on it. Siri actually got worse for me for a long time. In the end I'm not a big fan of these AI assistants; I don't think the use case has really been thought through by the builders of the various systems.
Wouldn't that be a good reason for Apple to make the sort of moves they are making with the new file system/service? I mean, if they can have a secure personal set of data for each user, shared between the user's devices, then they could have a Siri instance per user. It could start learning each user's quirks of language to construct better search results and actions.
  • Reply 27 of 78
baederboy Posts: 38 member
Nice to hear about an AI chip! Maybe Siri will begin to be able to understand and carry out searches with the new AI chip! It is so stupid otherwise. Alexa and Google Assistant are so much better at everything.
How well do Alexa and Google Assistant understand your request when you ask in Spanish? French? Japanese? Mandarin? Cantonese? Russian? Hebrew? Arabic? ...

    Are Alexa and Google Assistant really so much better at everything???
  • Reply 28 of 78
slurpy Posts: 5,384 member
Awesome. I much prefer that AI processing be done locally rather than in the cloud. Apple has the resources, expertise, motivation, and ecosystem to leapfrog everyone in this area.
  • Reply 29 of 78
cali Posts: 3,494 member
    If they combine this with VocalIQ tech then this will blow everything out of the water. I wish I could explain how great this would be.

Siri will be able to understand commands such as "connect to the free Wi-Fi when I get to the restaurant" or questions such as "show me a romantic place I should take my wife to that's quiet and open late".
  • Reply 30 of 78
mizhou Posts: 16 member
    jd_in_sb said:
I was under the impression that the barrier to good AI was software, not lack of CPU speed. If we had blazing-fast chips today, would Siri suddenly be amazing? I think not.
Of course AI needs good software, but that software has to do billions of complicated computations, and therefore raw processing power is needed too. At last year's WWDC, in a talk about the Photos app, it was mentioned that when you take a picture with the iPhone camera, around 11 billion calculations are made to tag the picture with metadata about what's in it (face recognition and objects like cars, beaches, cats, dogs, etc.).
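For a sense of what that on-device tagging looks like from the developer side, here is a minimal sketch using Apple's Vision and Core ML frameworks. SomeClassifier is a placeholder for any compiled image-classification model bundled with an app (not a real Apple model name), and the confidence threshold is an arbitrary illustrative value.

```swift
import CoreGraphics
import CoreML
import Vision

// Minimal sketch of on-device photo tagging with Vision + Core ML.
// "SomeClassifier" is a placeholder model class, not a real Apple model.
func tagPhoto(_ image: CGImage) {
    guard let mlModel = try? SomeClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: mlModel) else {
        print("Could not load the classification model")
        return
    }

    // Vision runs the model and hands back classification labels.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let observations = request.results as? [VNClassificationObservation] else { return }
        // Keep only reasonably confident labels as photo metadata tags
        // (0.3 is an arbitrary illustrative threshold).
        let tags = observations.filter { $0.confidence > 0.3 }.map { $0.identifier }
        print("Tags:", tags)
    }

    // The request runs entirely on the device, no server round trip.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Billions of multiply-accumulates hide behind that one `perform` call, which is exactly the kind of workload a dedicated neural engine could absorb.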

  • Reply 31 of 78
macplusplus Posts: 2,112 member
    wizard69 said:

    I bet they already have it. It'll be part of their new GPU coming out in the next iPhone this fall.
Exactly! Maybe not in the first iteration, but I'm absolutely certain that Apple's move to an in-house GPU has a lot to do with extensions to the GPU to facilitate AI-type calculations. In a cell-phone-sized device you simply can't afford to have a discrete chip for AI, even with the next process shrink, and board space doesn't permit discrete chips.

This has nothing to do with a GPU; this is a neural network chip. Totally different beast.
  • Reply 32 of 78
wonkothesane Posts: 1,724 member
What would be nice is to have Siri less reliant on an active internet connection.
  • Reply 33 of 78
Rayz2016 Posts: 6,957 member
    Sometimes it's just good to read folks' comments and learn stuff. 
  • Reply 34 of 78
dysamoria Posts: 3,430 member
    My prediction of computers going back to more dedicated/custom chips seems to be holding true...

    however: how does adding a chip reduce battery drain?
  • Reply 35 of 78
radarthekat Posts: 3,842 moderator
    dysamoria said:
    My prediction of computers going back to more dedicated/custom chips seems to be holding true...

    however: how does adding a chip reduce battery drain?
Simple. Instead of taxing the main CPU to perform, for example, graphics-intensive tasks that require a lot of clock cycles, graphics tasks are offloaded to a GPU that is designed to be optimized for image and video manipulation. That work can be done in a significantly different, much more parallel manner, because an image file is a discrete and known entity: processing can be done on multiple portions of an image at the same time, without affecting work being done on other portions. In this manner a GPU can be more energy efficient doing the same task that a CPU would otherwise need to crunch through sequentially. The same likely holds true for an AI chip, which would likely take the form of a neural network, optimized for quickly crunching a certain type of task, and therefore much more energy efficient at performing that task.
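To make the offloading idea concrete, here is a toy Swift sketch of the same divide-and-conquer pattern using CPU threads via Grand Central Dispatch; an actual GPU or dedicated neural engine applies the same independence between tiles with far more execution units and far less energy per operation. The tile count, tile size, and brighten() operation are invented purely for illustration.

```swift
import Foundation

// Toy illustration of the parallelism described above: an "image" split into
// independent tiles, each of which can be processed without touching the others.
let tileCount = 8
let tiles = [[Float]](repeating: [Float](repeating: 0.5, count: 1_000), count: tileCount)

// Stand-in for real per-pixel work on one tile.
func brighten(_ tile: [Float]) -> [Float] {
    return tile.map { min($0 * 1.2, 1.0) }
}

// Sequential version: one core crunches tile after tile.
let sequentialResult = tiles.map(brighten)

// Parallel version: the same total work, but tiles are processed concurrently,
// which is the pattern dedicated hardware exploits to save energy.
var parallelResult = [[Float]](repeating: [], count: tileCount)
let lock = NSLock()
DispatchQueue.concurrentPerform(iterations: tileCount) { i in
    let result = brighten(tiles[i])
    lock.lock()
    parallelResult[i] = result
    lock.unlock()
}
```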
  • Reply 36 of 78
tallest skil Posts: 43,388 member
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
  • Reply 37 of 78
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
We can reasonably infer that future architectures will increasingly use multiple (three or more) processing units, and that dedicated AI will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems, and a reactionary public paranoia about robots taking over.
  • Reply 38 of 78
macplusplus Posts: 2,112 member
    dysamoria said:
    My prediction of computers going back to more dedicated/custom chips seems to be holding true...

    however: how does adding a chip reduce battery drain?
It's the radio that consumes the most battery. The energy cost of communicating back and forth with a server would justify the inclusion of a custom chip working stand-alone.
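A rough back-of-envelope comparison shows the shape of the argument; every number below is an assumed round figure for illustration only, not a measured value.

```swift
// Back-of-envelope comparison of a server round trip vs. local inference.
// All numbers are assumed, illustrative round figures, not measurements.
let radioPowerWatts = 1.5        // cellular radio while transmitting (assumed)
let radioSeconds = 2.0           // time the radio stays lit for a round trip (assumed)
let localChipPowerWatts = 0.3    // dedicated on-device AI block (assumed)
let localSeconds = 0.05          // local inference time (assumed)

let radioJoules = radioPowerWatts * radioSeconds        // 3.0 J per request
let localJoules = localChipPowerWatts * localSeconds    // 0.015 J per request
print("Radio round trip: \(radioJoules) J, local inference: \(localJoules) J")
```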
  • Reply 39 of 78
macplusplus Posts: 2,112 member
    https://backchannel.com/an-exclusive-look-at-how-ai-and-machine-learning-work-at-apple

That article will help you better understand what Apple is targeting with this chip.
  • Reply 40 of 78
tmay Posts: 6,329 member
    dysamoria said:
    My prediction of computers going back to more dedicated/custom chips seems to be holding true...

    however: how does adding a chip reduce battery drain?
Simple. Instead of taxing the main CPU to perform, for example, graphics-intensive tasks that require a lot of clock cycles, graphics tasks are offloaded to a GPU that is designed to be optimized for image and video manipulation. That work can be done in a significantly different, much more parallel manner, because an image file is a discrete and known entity: processing can be done on multiple portions of an image at the same time, without affecting work being done on other portions. In this manner a GPU can be more energy efficient doing the same task that a CPU would otherwise need to crunch through sequentially. The same likely holds true for an AI chip, which would likely take the form of a neural network, optimized for quickly crunching a certain type of task, and therefore much more energy efficient at performing that task.
It isn't difficult to imagine that Apple is best positioned to lead the industry in breaking existing SoC tasks out to dedicated satellite processors, and I would expect at least a few non-silicon implementations, which may be very disruptive to existing SoC design, to be on the roadmap. In essence, these satellite processors would extend what the few that already exist do: interface with the user and accessory devices in the real world with low latency. There isn't any barrier to this happening at Qualcomm or Samsung, though the loose connection with Google's Android OS development and roadmap would be a notable disadvantage.

In the not-too-distant future, mobile devices will have much more than the visible and audible spectrum with which to sense our world.