Even though Siri apparently uses Nuance only for voice recognition and not for AI, it seems to me that this would have been a more important strategic and defensive acquisition than Beats was.
Why the past tense? If Apple wants Nuance, they'll buy Nuance.
It's not like the Beats acquisition put a dent in Apple's cash pile or anything.
I used to think that companies would be racing to acquire Nuance. But voice recognition as a service is quickly becoming a commodity: if they don't use Nuance, they'll use some other service. The real power is in the training data and user feedback that Apple is collecting, and in the AI layer that acts on the words.
Sometimes there are unavoidable evils we have to bear. It's just like automobiles: no matter how hard one strives to buy American, foreign parts (especially electronics) are all but unavoidable.
I hope you posted this to prove a pro-Apple point and not just to troll...
I am a long-time user of Nuance. More than 12 years. I still use it every day and am even using it to dictate this. Even with a pretty good microphone (Airport 77), a Retina MBP and a quiet environment, it SUCKS!
Its lack of contextual awareness, for a company that wants to be worth $7 billion, is inexcusable and downright lazy. Statistics-based frequency models for determining which words should appear together are clunky and not fit for purpose in modern computing. Even a little artificial intelligence and better use of grammar would go a long way toward making the product better.
The stupid word combinations it sometimes comes up with boggle the mind, even for someone who speaks clearly and knows how to use the product.
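To show what I mean by clunky frequency models, here is a toy sketch in Python; the counts are invented and have nothing to do with Nuance's actual models:

# A toy bigram "frequency model" of the kind I'm complaining about.
# All counts are invented for illustration.
bigram_counts = {
    ("the", "road"): 120,
    ("road", "was"): 80,
    ("was", "icy"): 15,
    ("was", "i"): 40,
    ("i", "see"): 300,
}

def score(words):
    # Score a candidate transcription by summing its bigram counts.
    return sum(bigram_counts.get(pair, 0) for pair in zip(words, words[1:]))

# Two acoustically similar candidates for the same audio:
print(score(["the", "road", "was", "icy"]))       # 215
print(score(["the", "road", "was", "i", "see"]))  # 540 -- the nonsense wins

Nothing in that scoring knows "the road was I see" is ungrammatical, which is exactly the kind of nonsense I get back.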
Apple's dictation in Mavericks cannot be used all day, every day for serious work. But DragonDictate earns enough money that some serious investment would make a product that makes sense out of the nonsense it throws in front of you.
I really wish someone else could emerge to provide enough competition to make them progress in a meaningful way. Apple Dictation just knocked the bottom out of the market and I worry that they will take that as an incentive to invest even less, especially on the Mac platform.
At the end of the day, the product is just too dumb for what we [reasonably] expect of it.
Thanks for your experience, Shaw.
Article: [Nuance's technology does not handle Siri's artificial intelligence layers. Instead, the company's products simply provide the capability for Siri, or other voice-driven services, to interpret a user's voice.]
Surely there is other technology Apple could purchase that would cost far less than this. However, feigning interest to drive the price up and waste Samsung's time and money would be a ploy most corporations might entertain. Creepy, but Samsung thrives on creepy.
Addendum: as Apple seems not to have approached the company prior to this, possibly it is working on its own version of this technology. Just supposing...
Apple was working on speech recognition many years ago... I think even before Steve returned. Maybe Steve killed it to focus on keeping the company alive. I used to have a CD where it was discussed, and the presenter was explaining why it was so important to add context to speech recognition: just by listening, a computer can't tell the difference between saying "I see" and saying "icy."
On buying Nuance: Apple may already have all the licenses it needs in place to use what it needs from Nuance. Who are we to know?
Well, that's it. Apple is doomed. DOOMED, I tell you! Time to sell* my unbelievably considerable position in Apple.
*Actually, I usually figure when someone states that they own Apple stock and are selling it because of an AppleInsider post that they don't own any and are just being dramatic and trolly - don't you?
Even though Siri apparently uses Nuance only for voice recognition and not for AI, it seems to me that this would have been a more important strategic and defensive acquisition than Beats was.
If it were strategically important and more valuable to own than to license, Apple would have bought them already. Apple has probably evaluated the technology and their license terms and figured that for less than $7B they can roll their own. Until then, they will just keep using it as is.
Didn't you know questioning the Beats acquisition is heresy around here?
People can question the Beats acquisition as long as they offer coherent thoughts and good arguments. Claiming Beats has been detrimental to Apple because of "hip hop" or "thug" culture is not one of them.
I hope you posted this to prove a pro-Apple point and not just to troll...
You missed a third option: to illustrate that the vitriol towards Samsung so liberally sprayed on this forum exaggerates its role as Arch-Nemesis and ignores the fact that Samsung provides some products and services that Apple cannot readily source elsewhere.
If not that, I still gotta figure there are possible reasons beyond just "pro-Apple" and "troll."
You missed a third option: to illustrate that the vitriol towards Samsung so liberally sprayed on this forum exaggerates its role as Arch-Nemesis and ignores the fact that Samsung provides some products and services that Apple cannot readily source elsewhere.
There's not really anyone who disparages Samsung while ignoring their present manufacturing necessity. Do you have any examples otherwise?
"Nuance's technology does not handle Siri's artificial intelligence layers. Instead, the company's products simply provides the capability of Siri, or other voice-driven services, to interpret a user's voice. That data must then be contextually deciphered to provide the kind of humanized response a user expects."
Whoever wrote this does not understand voice-recognition technologies. Yes, Nuance does not handle the integration with backend services; however, voice grammars provide all the contextual information necessary to make the queries. In essence, 80% of the work is already done once the recognition completes... the rest is merely web services and database work.
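To make that concrete, here is a minimal Python sketch of a grammar with semantic slots; it is a crude stand-in for real SRGS-style voice grammars, and the intents and patterns are invented:

import re

# Each rule pairs a phrase pattern with an intent name; the named
# groups become the query parameters. (Invented examples, not
# Nuance's actual grammar format.)
GRAMMAR = [
    (re.compile(r"what's the weather in (?P<city>[a-z ]+)"), "weather.lookup"),
    (re.compile(r"call (?P<contact>[a-z ]+)"), "phone.call"),
    (re.compile(r"set an alarm for (?P<time>[0-9apm: ]+)"), "alarm.create"),
]

def interpret(utterance):
    # Map a recognized utterance to (intent, slots) via the grammar.
    for pattern, intent in GRAMMAR:
        match = pattern.fullmatch(utterance.lower())
        if match:
            return intent, match.groupdict()
    return None, {}

print(interpret("What's the weather in San Francisco"))
# ('weather.lookup', {'city': 'san francisco'})

Once you have the (intent, slots) pair, what's left really is just web services and database lookups.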
Google's voice recognition technology and contextual natural language comprehension are so far ahead of everyone else's that it is ridiculous. I have no idea how they did it without infringing on Nuance patents. That is the main challenge with Apple rolling their own voice recognition: the patents.
Google's is good but, IMO, is nowhere near Nuance yet. But as for how they did it so fast... data.
These days, algorithms are well advanced and data is king. Google can examine every voicemail that goes through the GV system and store the data for further analysis. They can also peruse each and every email that goes through Gmail to improve NLU (not the speech recognition part, but the language parsing and understanding part). Nuance doesn't have either of these massive sources of pure data. No one does, really.
Even though Siri apparently uses Nuance only for voice recognition and not for AI, it seems to me that this would have been a more important strategic and defensive acquisition than Beats was.
I hear this often, that Nuance only provides the bare-bones speech recognition and Apple handles the AI. I don't think it's that cut and dried. Nuance has gigantic data centres for online voice recognition and has been almost exclusively focused on NLU.
There would really be three stages here:
- determining the words
- determining the meaning of words and phrases
- determining intent as it relates to Siri for command and control, search, etc.
The AI is going to be in the last two. I expect Nuance is involved up to and including stage two.
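A toy Python sketch of those three stages, with placeholder function bodies (this is not anyone's actual implementation):

def determine_words(audio):
    # Stage 1: speech recognition -- audio in, words out.
    return "set an alarm for 7 am"  # pretend ASR output

def determine_meaning(words):
    # Stage 2: parse the words and phrases into a structured meaning.
    return {"action": "set", "object": "alarm", "time": "7 am"}

def determine_intent(meaning):
    # Stage 3: map the meaning to a Siri-style command and act on it.
    if meaning["object"] == "alarm":
        return "Alarm scheduled for " + meaning["time"]
    return "Sorry, I didn't get that."

print(determine_intent(determine_meaning(determine_words(b"..."))))
# Alarm scheduled for 7 am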
Apple apparently has a research lab in Boston made up of former VoiceSignal staff who have been working for a few years on a secret project. Maybe Nuance knows that Apple is about to cut the cord and is looking for some company to buy them out before that happens?
Apple represents a very small fraction of Nuance's revenues ... The bread and butter for Nuance is Health Care and Enterprise, not Apple.
I hear this often, that Nuance only provides the bare-bones speech recognition and Apple handles the AI. I don't think it's that cut and dried. Nuance has gigantic data centres for online voice recognition and has been almost exclusively focused on NLU.
There would really be three stages here:
- determining the words
- determining the meaning of words and phrases
- determining intent as it relates to Siri for command and control, search, etc.
The AI is going to be in the last two. I expect Nuance is involved up to and including stage two.
Nuance Recognizer will provide everything up to the intent ... so the only thing left to do is submit the intent and associated parameters in a web service call to retrieve the data requested by the user.
Basically, the heavy lifting in terms of understanding the user is done by the time the intent is captured (by Nuance technologies).
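As a Python sketch of that last mile (the endpoint and payload shape here are hypothetical):

import json
import urllib.request

def fulfil(intent, slots):
    # POST the captured intent and parameters to a backend service
    # and return its answer. (Placeholder URL; invented payload shape.)
    body = json.dumps({"intent": intent, "slots": slots}).encode()
    request = urllib.request.Request(
        "https://backend.example.com/fulfil",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# e.g. fulfil("weather.lookup", {"city": "san francisco"})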
Now, people complain Siri can be "dumb" at times ... that's probably due to a lack of investment on Apple's part in expanding the grammars to cover each and every possible utterance. Nevertheless, the current state of Siri is still pretty impressive for an NLU application.
Good suggestion, but even better would be Steven.
Why the past tense? If Apple wants Nuance, they'll buy Nuance.
It's not like the Beats acquisition put a dent in Apple's cash pile or anything.
+1
...and it SUCKS even more owning anything with the Samsung name on it.
Are… you acquainted with MacRumors?
You mean the Samsung Shill/Fanboy Forum?
That's the one.
I hope you posted this to prove a pro-Apple point and not just to troll...
It's waterrockets. He's trolling.
There's not really anyone who disparages Samsung while ignoring their present manufacturing necessity. Do you have any examples otherwise?
You're ruining his straw man...