My bet is that Apple doesn't use the term "facial recognition," but instead uses a term like "gesture recognition," to differentiate that capability from face recognition. Recently someone at Apple said in an interview that many there read AppleInsider. Surely they recognize the confusion here between the two terms. I'm confident Apple, on stage in some near-future presentation, will avoid such confusion.
As to AI, I prefer the term Machine Intelligence. Something either meets the bar for being intelligent (by whatever definition is applied) or it doesn't. There's no value in signifying that one intelligence is somehow artificial versus another being somehow genuine. But there is value in delineating between naturally evolved intelligence and engineered [machine] intelligence, as the latter embodies some attributes, like being able to be easily replicated, that the former does not.
As to Siri, I look forward to the day when it's able to hold a deeply contextual and ongoing conversation. The Siri chatbot rumored to be added to iMessage will be a good first foray for Apple in this direction.
Unfortunately, Apple's AI processor won't even come close to Nvidia's Tesla P100 GPU with its 15.3 billion transistors on a single chip. That's one of the major reasons why Nvidia, with its P/E of 148, is valued far higher on Wall Street than Apple will ever be. Supposedly Nvidia sank $2 billion of R&D into developing the Tesla P100, and they're certainly committed to this GPU, with eight of them packed into a single computer and ready to roll with software included. Nvidia isn't wasting its time on "hobbies" like Apple is. Nvidia will likely replace Netflix as the new FANG as Nvidia stock climbs to the stratosphere. Sometimes a focused smaller company leaves larger companies by the wayside.
Stupid comment.
Apple's processor doesn't have to come close to the P100, since that's destined for an entirely different class of machine. Can you even name me an Nvidia product that does machine learning in a mobile device? Because the P100, at 250W, isn't even close. And don't think Nvidia is so great at everything. Not one of their custom ARM processor cores ever came close to matching Apple's custom processor cores.
Interesting, but is it really a choice, i.e., actually for the better? Will all those freely giving away their IP to Gmail, Hotmail, iCloud, and other Web 2.0 inducements have any excuse when they are replaced? Has macOS become one of the most creeping, progressive, and brilliant pieces of spyware yet conceived? I am interested to see the final cut of the 2049 Blade Runner movie (bladerunnermovie.com): "every civilization is built off the back of a disposable workforce."
It wouldn't surprise me in the slightest if Apple has been working on such a chip for years. One of the advantages of doing some of your own chip design in-house is that you do these kinds of projects. Whether or not it ever sees the light of day is an entirely other matter...
Nice to hear about an AI chip! Maybe Siri will begin to be able to understand and carry out searches with the new AI chip! It is so stupid otherwise. Alexa and Google Assistant are so much better at everything.
The frustrating thing with Siri is that they have voice-to-text down pat. I seldom have trouble with Siri transcribing what I've said to text; it is the inability to process that text that is the huge problem. Unfortunately, this happens on Apple's servers, so an AI chip in a cell phone won't help much unless Apple restructures its Siri services significantly.
I've been an advocate of Apple doing just that, though: that is, having the cell phone itself construct the queries to the various databases Siri uses. AI really needs to reside locally on the device, be it a phone, laptop, or whatever. In the end you would want your local AI to be able to help with the management of your device, searching it and optimizing it.
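A toy sketch of what that split might look like: the device does the language understanding locally and only sends a small, structured query to the relevant remote database. All names here (parse_intent, the service labels) are invented for illustration; this is not how Siri actually works.

```python
# Hypothetical on-device query construction: a toy keyword-based intent
# parser stands in for a local NLU model. Only the small structured query
# it produces would ever leave the device.

def parse_intent(utterance: str) -> dict:
    """Map a spoken request to a service name plus a structured query."""
    text = utterance.lower()
    if "weather" in text:
        return {"service": "weather", "query": {"when": "today"}}
    if "play" in text:
        return {"service": "music", "query": {"title": text.split("play", 1)[1].strip()}}
    return {"service": "web_search", "query": {"terms": text}}

def handle(utterance: str) -> str:
    intent = parse_intent(utterance)
    # In a real system this would be a narrow network request, not a print.
    return f"dispatch to {intent['service']}: {intent['query']}"

print(handle("Play Yesterday"))  # → dispatch to music: {'title': 'yesterday'}
```

The point of the split is that raw audio and full transcripts stay on the device; the server only ever sees a minimal query it needs to answer.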
As for what Apple is up to with Siri, I've pretty much given up on it. Siri actually got worse for me for a long time. In the end I'm not a big fan of these AI assistants; I don't think the use case has really been thought through by the builders of the various systems.
Wouldn't that be a good reason for Apple to make the sort of moves they are making with the new file system/service? I mean, if they can have a secure personal set of data for each user, shared between the user's devices, then they could have a Siri instance per user. It could start learning each user's quirks of language to construct better search results and actions.
How well do Alexa and Google Assistant understand your request when you ask in Spanish? French? Japanese? Mandarin? Cantonese? Russian? Hebrew? Arabic? ...
Are Alexa and Google Assistant really so much better at everything???
Awesome. I much prefer that AI processing is done locally than in the cloud. Apple has the resources, expertise, motivation, and ecosystem to leapfrog everyone in this area.
If they combine this with VocalIQ tech then this will blow everything out of the water. I wish I could explain how great this would be.
Siri will be able to understand commands such as "open the free Wi-Fi when I get to the restaurant" or questions such as "show me a romantic place I should take my wife to that's quiet and open late".
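A request like that second one is really a bundle of constraints. A toy sketch of the decomposition, with entirely made-up places and a hand-written constraint set standing in for what a VocalIQ-style system would extract automatically:

```python
# Decompose "show me a romantic place ... quiet and open late" into
# structured constraints, then filter candidates. All data is invented.

places = [
    {"name": "Bistro Lumiere", "tags": {"romantic", "quiet"}, "closes": 24},
    {"name": "Sports Bar 9",   "tags": {"loud"},              "closes": 26},
    {"name": "Cafe Sol",       "tags": {"romantic", "quiet"}, "closes": 21},
]

# What an NLU layer might extract from the spoken request:
constraints = {"tags": {"romantic", "quiet"}, "open_until_at_least": 23}

matches = [
    p["name"] for p in places
    if constraints["tags"] <= p["tags"]           # required tags are a subset
    and p["closes"] >= constraints["open_until_at_least"]
]
print(matches)  # → ['Bistro Lumiere']
```

The hard part is not the filtering, it's reliably turning free-form speech into that constraints dict, which is exactly where conversational-context tech would matter.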
I was under the impression that the barrier to good AI was software, not a lack of CPU speed. If we had blazing-fast chips today, would Siri suddenly be amazing? I think not.
Of course AI needs good software, but that software has to do billions of complicated computations, and therefore raw processing power is needed too. At last year's WWDC, in a talk about the Photos app, it was mentioned that when you take a picture with the iPhone camera, around 11 billion calculations are made to tag the picture with metadata about what's in it (face recognition and objects like cars, beaches, cats, dogs, etc.).
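A back-of-the-envelope calculation shows how a figure like that arises from convolutional networks. The layer shapes below are a hypothetical VGG-style stack chosen for illustration, not Apple's actual pipeline:

```python
# Why image tagging costs billions of operations: count the
# multiply-accumulates (MACs) in a small stack of convolution layers.

def conv_layer_macs(h, w, c_in, c_out, k):
    """MACs for one conv layer: stride 1, 'same' padding, k x k kernel."""
    return h * w * c_in * c_out * k * k

# Hypothetical network over a 224x224 RGB input (illustrative only):
layers = [
    (224, 224,   3,  64, 3),
    (224, 224,  64,  64, 3),
    (112, 112,  64, 128, 3),
    (112, 112, 128, 128, 3),
    ( 56,  56, 128, 256, 3),
    ( 56,  56, 256, 256, 3),
]

total_macs = sum(conv_layer_macs(*layer) for layer in layers)
print(f"~{total_macs / 1e9:.1f} billion multiply-accumulates")  # → ~7.5 billion
```

So even a modest six-layer network lands in the billions per image, the same order of magnitude as the ~11 billion figure quoted for Photos tagging.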
I bet they already have it. It'll be part of their new GPU coming out in the next iPhone this fall.
Exactly! Maybe not in the first iteration, but I'm absolutely certain that Apple's move to an in-house GPU has a lot to do with extensions to the GPU to facilitate AI-type calculations. In a cell-phone-sized device you simply can't afford a discrete chip for AI; even with the next process shrink, board space doesn't permit discrete chips.
This has nothing to do with a GPU; this is a neural network chip. Totally different beast.
My prediction of computers going back to more dedicated/custom chips seems to be holding true...
however: how does adding a chip reduce battery drain?
Simple. Instead of taxing the main CPU to perform, for example, graphics-intensive tasks that require a lot of clock cycles, graphics tasks are offloaded to a GPU that is designed to be optimized for image and video manipulation. That work can be done in a significantly different, and much more parallel, manner because an image file is a discrete and known entity: processing can be done on multiple portions of an image at the same time, without affecting work being done on other portions. In this manner a GPU can be more energy efficient at the same task that a CPU would otherwise need to crunch through sequentially. The same likely holds true for an AI chip, which would likely take the form of a neural network engine, optimized for quickly crunching one type of task, and therefore much more energy efficient at performing that task.
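The sequential-vs-parallel distinction can be sketched in a few lines. NumPy's vectorized operation stands in for GPU-style data parallelism here; the per-pixel brightness adjustment is just an illustrative operation:

```python
import numpy as np

# The same per-pixel brightness operation, written two ways.
img = np.random.default_rng(0).integers(0, 256, size=(480, 640), dtype=np.int32)

# Sequential (CPU-style): visit each pixel one after another.
out_loop = img.copy()
for y in range(img.shape[0]):
    for x in range(img.shape[1]):
        out_loop[y, x] = min(img[y, x] + 40, 255)

# Data-parallel (GPU-style): one vectorized operation over all pixels,
# possible only because each pixel's result is independent of the others.
out_vec = np.minimum(img + 40, 255)

assert (out_loop == out_vec).all()
```

The results are identical; the difference is that the second form hands the hardware 307,200 independent pieces of work at once, which is exactly the shape of workload a GPU (or a neural network engine) is built to exploit efficiently.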
I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
We can reasonably infer that future architects will increasingly use multiple (>=3) processing units, and that dedicated A.I. will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems — and a reactionary public paranoia about robots taking over.
It's the radio that consumes the most battery. The energy cost of communicating back and forth with the server would justify the inclusion of a custom chip working stand-alone.
It isn't difficult to imagine that Apple is best positioned to lead the industry in breaking existing SoC tasks out into dedicated satellite processors, and I would expect at least a few non-silicon implementations on the roadmap that may be very disruptive to existing SoC design. In essence, these satellite processors would extend what the few that already exist do: interface with the user and accessory devices in the real world with low latency. There isn't any barrier to this happening at Qualcomm or Samsung, though their loose connection with Google's Android OS development and roadmap would be a notable disadvantage.
In the not too distant future, mobile devices will have much more than the visible and audible spectrum to sense our world.
That article will help readers better understand what Apple is targeting with that chip.