Apple acquires AI startup Voysis in apparent push to bolster Siri's natural language skills
Apple recently purchased Dublin, Ireland-based company Voysis, a startup that conducted work on improving natural language processing in virtual assistants.
Voysis developed a natural language processing platform tailored to voice-based assistants deployed in online shopping apps, reports Bloomberg. By better understanding a customer's natural speech patterns, the embeddable technology was able to deliver more accurate search results.
Apple confirmed the purchase in a typical boilerplate statement, saying it "buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans."
According to the report, a now-removed webpage claimed Voysis tech was able to narrow product search results by processing retail-related phrases like "I need a new LED TV" and "My budget is $1,000." Effective language processing allows users to more naturally interact with AI voice assistants by removing hurdles like memorizing key command phrases.
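To illustrate the kind of slot extraction such a system performs, here is a minimal sketch using hand-written patterns for the two example phrases above. This is a toy illustration only; Voysis' actual platform used trained language models, not regular expressions, and the function and patterns here are hypothetical.

```python
import re

def extract_shopping_filters(utterance: str) -> dict:
    """Pull simple product and budget slots out of a retail voice query."""
    filters = {}
    # Hypothetical product pattern: "I need/want a (new) <product>"
    product = re.search(r"i (?:need|want) a(?: new)? ([\w ]+)",
                        utterance, re.IGNORECASE)
    if product:
        filters["product"] = product.group(1).strip().lower()
    # Hypothetical budget pattern: "my budget is $<amount>"
    budget = re.search(r"budget is \$?([\d,]+)", utterance, re.IGNORECASE)
    if budget:
        filters["max_price"] = int(budget.group(1).replace(",", ""))
    return filters

print(extract_shopping_filters("I need a new LED TV"))  # {'product': 'led tv'}
print(extract_shopping_filters("My budget is $1,000"))  # {'max_price': 1000}
```

A shopping app could then feed these structured filters straight into its product-search backend, which is the search-narrowing behavior the removed webpage described.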
The firm's solution is based on WaveNet technology, which was introduced by Google's DeepMind program in 2016. Described as a "deep generative model of raw audio waveforms," WaveNets can be used to generate speech that mimics any human voice, making for a more natural virtual assistant experience. It appears that Voysis applied the method to more accurately sample and translate human voice commands for AI systems.
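WaveNet's central trick is stacking causal convolutions whose dilation doubles at each layer, so the model's receptive field over the raw waveform grows exponentially with depth while the layer count grows only linearly. The back-of-envelope arithmetic can be sketched as follows (standard WaveNet receptive-field math, not code from Voysis or DeepMind):

```python
def wavenet_receptive_field(layers_per_block: int, blocks: int,
                            kernel_size: int = 2) -> int:
    """Receptive field (in samples) of stacked dilated causal convolutions.

    Each block uses dilations 1, 2, 4, ..., 2**(layers_per_block - 1);
    a layer with dilation d and kernel size k extends the receptive
    field by (k - 1) * d samples.
    """
    field = 1  # the current sample itself
    for _ in range(blocks):
        for layer in range(layers_per_block):
            field += (kernel_size - 1) * (2 ** layer)
    return field

# 10 layers per block (dilations 1..512), 3 blocks: 3070 samples,
# i.e. roughly 0.19 s of audio at a 16 kHz sample rate.
print(wavenet_receptive_field(10, 3))
```

That exponential growth is what lets a compact model capture enough audio context to generate (or, in Voysis' apparent use, analyze) natural-sounding speech.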
Prior to its acquisition, Voysis marketed its natural language solution to companies looking to improve voice assistants integrated in apps or online, with an emphasis on the retail sector. The company reportedly reduced the footprint of its solution down to 25MB of memory, small enough for easy integration with existing AI systems.
In 2018, Voysis raised $8 million in venture capital funding to further development and roll out a refined product, TechCrunch reported at the time.
"Voysis is a complete voice AI platform," Voysis founder and CEO Peter Cahill said in a 2018 interview. "What I mean by that is that the platform enables companies and businesses to rapidly stand up their own artificial intelligences that can be queried by voice or text."
Apple's plans for Voysis remain unknown, though Siri is a natural fit. The tech giant has incrementally improved Siri's speech output since the assistant's debut in 2011, but the program took a major step forward with iOS 13. Not only does the virtual assistant sound more "human" in its latest iteration, but its ability to recognize natural language commands has greatly improved thanks to backend processing improvements.
That said, Siri is far from perfect. Voysis' technology could go a long way toward developing a more natural human-machine interface.
Comments
Lots of applications for Apple, but autonomous cars come to mind, in addition to shopping, medicine, finance, HomeKit, etc.
That said, it is surprising how often Siri gets triggered accidentally. I was listening to a podcast the other day where the host kept saying "a history ..." and Siri kept triggering. I had to press the home button so the podcast would continue, but Overcast skips back a second or so, and it triggered again... loop. So, maybe they need some work there, too. But, when I dictate a phrase, it generally does fairly well.
The other place they need work is on the predictive text and context. It seems really bad at any kind of analysis of the words around it to help determine what word to suggest. I'm often almost finished with words before it finally figures out what to suggest, which actually slows me down because I keep trying to watch what it is predicting, hoping to save some time.
- “it is surprising how often Siri gets triggered accidentally” (cgWerks)
- “The autocorrect also seems worse” (RetroGusto)
- “the problem is the ‘I give up’ attitude Siri presents. Why not offer to send the results of a search to a screen?” (Rayz2016)
- ...
It seems that there are 2 domains being discussed, and even if Apple calls both “Siri”, I’m not sure that they really share the same technology. The 1st is the speech inquiry agent triggered with “Hey Siri”; the 2nd is the predictive and autocorrect text agent on keyboards.
In the 1st case, Siri appears limited in “knowledge”, limited in scope of “action” and limited in “context” understanding.
“Hey Siri, how far have I walked?” (Fail: “I can’t do that, please open the app...”)
“Hey Siri, what song was no. 1 in the U.K. charts in January?” (Fail: “I can’t find the answer to that on HomePod.”)
... follow-up questions, for example “Hey Siri, what album was it released on?” (generally fails regardless of the 1st question; Siri seems very limited at “remembering” conversation context).
etc.
In the 2nd case, I have noticed a significant decrease in accuracy in autocorrect since iOS 13. Plus, when adding words into an existing sentence, Siri (if this really is Siri) often thinks the new word should start with a capital letter... even in the middle of sentences. That’s just poor.
Apple purchasing technology companies for their code/tech or their developer skills can only help (fingers crossed).
I've taken two bits out of context to make an oxymoron. Siri needs to be much better. I don't care that Siri is no JARVIS. But it can be improved a hundred fold without ever coming close.
Apple has two Siris, at least. One for iDevices and one for HomePod. iDevices Siri is very limited in functionality, while HomePod Siri is both stupid and inept, and annoying.
As there are no visual cues on HomePod to provide an informative response, it talks at you. Apple has tried to make it 'conversational', and it's just annoying.
I've had fewer errors with Siri understanding a question or command, compared to Alexa, but Siri can't do 20% of what Alexa can.
I'm wary that Apple bought this company not to improve Siri as we know it, but for some different purpose that will yield little if any improvement for day-to-day, mundane Siri functionality. And that HomePod Siri will remain stupid and inept.
This is accessing iTunes Match music from a paired HomePod.
"Hey Siri, play The Eagles shuffled."
"I'm on it."
pause...
"Sorry, you don't have any music by The Eagles in your library."
"Hey Siri, play Hotel California."
"Coming right up, here is Hotel California by The Eagles."
Song plays.
I suppose if one does a particular thing often, and can memorize the syntax, Siri saves some time. But, often, if I'm not quite sure how to try and ask/command something, by the time it messes up several times and *maybe* I get it right, I could have just done it manually.
Yeah, I just run with unlocked only, so it can't trigger most of the time, as my phone is locked most of the time. However, when driving, I have it in a dash holder and powered up/unlocked. It triggers WAY too often considering how little I use it like that.