
Apple acquired speech recognition firm Novauris last year to improve Siri - report

post #1 of 31
Thread Starter 
An Apple acquisition from 2013 may have quietly flown under the radar, as it was revealed this week that the Siri development team has added personnel from Novauris Technologies.




Apple's alleged purchase of Novauris was revealed on Thursday by TechCrunch, which reported that the acquisition took place last year but was not publicly announced. Personnel from Novauris are said to already be a part of Apple's Siri team.

While Apple has not confirmed the purchase, reporter Sarah Perez placed a call to the Novauris offices in the U.K., and an employee there confirmed that the operation is now a part of Apple.

Novauris was founded by the same people who founded Dragon Systems, creators of Dragon NaturallySpeaking, which is now owned by Nuance Communications. Nuance also specializes in speech recognition, and its technology helps to power Apple's Siri, the virtual personal assistant found on its iOS devices.

Novauris specializes in automatic speech recognition technology used to access information stored either locally on a device or on remote servers. The company's website remains operational and makes no mention of an acquisition by Apple.

Siri


Its featured product is NovaSystem, a scalable, server-based speech recognition system capable of handling multiple simultaneous voice access requests. The system can also spread the computational load over a multi-computer installation if necessary.
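As a rough illustration of how a server-based recognizer might spread simultaneous voice requests over a multi-computer installation, here is a minimal round-robin dispatch sketch in Python. The worker names and the round-robin policy are illustrative assumptions, not details of how NovaSystem actually works.

```python
from itertools import cycle

class RoundRobinBalancer:
    """Toy dispatcher that spreads recognition requests over a pool of workers.

    A real system would also weigh each machine's current load and health;
    round-robin is just the simplest policy that spreads work evenly.
    """

    def __init__(self, workers):
        self._cycle = cycle(list(workers))

    def dispatch(self, request):
        # Hand the request to the next worker in rotation.
        return next(self._cycle)

# Six simultaneous voice requests land evenly on three hypothetical nodes.
balancer = RoundRobinBalancer(["asr-node-1", "asr-node-2", "asr-node-3"])
assignments = [balancer.dispatch(f"utterance-{i}") for i in range(6)]
# assignments == ["asr-node-1", "asr-node-2", "asr-node-3"] * 2
```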

"Novauris believes that voice will become the interface of choice for consumer applications of all types," the website reads. "Users should be able to ask for information quickly, easily and directly without needing an enrollment process or complicated and time-consuming dialogues."

Apple spent $525 million on acquisitions in the last quarter of 2013 alone. Those purchases are believed to include personal assistant app Cue, 3D motion sensor firm PrimeSense, and social media analytics company Topsy.

Other 2013 purchases by Apple include digital mapping company BroadMap, public transit tracker HopStop, mapping data startup Locationary, and indoor GPS company WifiSLAM.
post #2 of 31
If true, then bravo to Apple for homing in on the right people. It's about time that James Baker saw someone other than a banker profit from his inventions.

This is exactly what I hoped Apple would do, when I suggested it buy Baker's team in my comments to an AppleInsider story last July:
http://forums.appleinsider.com/t/158729/apple-assembling-speech-recognition-tech-team-in-boston-to-improve-siri#post_2368625

...and in another of my comments last September:
http://forums.appleinsider.com/t/159303/investors-speculate-carl-icahn-could-push-apple-to-buy-nuance-for-speech-tech#post_2388324
Edited by TeaEarleGreyHot - 4/3/14 at 11:19am
post #3 of 31
And they didn't overspend by billions of dollars. Kudos.
post #4 of 31
Quote:
Originally Posted by TeaEarleGreyHot View Post

If true, then bravo to Apple for homing in on the right people. It's about time that James Baker saw someone other than a banker profit from his inventions.

This is exactly what I hoped Apple would do, when I suggested it buy Baker's team in my comments to an AppleInsider story last July:
http://forums.appleinsider.com/t/158729/apple-assembling-speech-recognition-tech-team-in-boston-to-improve-siri#post_2368625

...and in another of my comments last September:
http://forums.appleinsider.com/t/159303/investors-speculate-carl-icahn-could-push-apple-to-buy-nuance-for-speech-tech#post_2388324

The article triggered a memory of someone's purchase suggestion. I was going to look it up, but thankfully you made yourself known.

Thanks for the suggestion!
post #5 of 31
I hope they continue to improve the AI that occurs after the voice is understood. The other night I asked Siri which theater was showing "The Grand Budapest Hotel" and she replied "I can't look up things in Hungary".
post #6 of 31
Speech recognition should be done on the device and not on a server, that's also the best load balancing imaginable. A device with 115 GFLOPS should be more than capable.
It's a bad thing that Siri doesn't work without a network.
post #7 of 31
Quote:
Originally Posted by knowitall View Post

Speech recognition should be done on the device and not on a server, that's also the best load balancing imaginable. A device with 115 GFLOPS should be more than capable.
It's a bad thing that Siri doesn't work without a network.


There isn't just a single way.

I mean, by this time, the majority of Apple's devices that are sold are connected to the Internet one way or another. I get it, networks suck sometimes, other times there's slow or no WiFi, but more often than not, if you're using one of these devices you're online. 

 

You're also forgetting all of the benefits that come from the server communication. Apple can make updates to the Siri engine without ever needing to update the device, you can have machines dedicated and optimized for the processing of Siri requests (much more efficiently than the phone's hardware could without added complexity, expense, etc.), you can utilize the incoming data to get analytics, the phone doesn't need to store the massive amount of natural-language-decoding software components... the list goes on and on.

 

There are obviously many other benefits and it's ridiculous to think Apple did not weigh them in their development decisions with Siri.

post #8 of 31
Quote:
Originally Posted by Nobodyy View Post


There isn't just a single way.
I mean, by this time, the majority of Apple's devices that are sold are connected to the Internet one way or another. I get it, networks suck sometimes, other times there's slow or no WiFi, but more often than not, if you're using one of these devices you're online. 

You're also forgetting all of the benefits that come from the server communication. Apple can make updates to the Siri engine without ever needing to update the device, you can have machines dedicated and optimized for the processing of Siri requests (much more efficiently than the phones hardware would without added complexity, expense, etc), you can utilize the incoming data to get analytics, the phone doesn't need to store the massive amount of natural language-decoding software components... the list goes on and on.

There are obviously many other benefits and it's ridiculous to think Apple did not weigh them in their development decisions with Siri.
Privacy is always a concern if you are using servers. iOS can be updated easily; Apple can be wrong, and is in this case.
Speech recognition can be solved without a server, and current hardware can do it as well as any server.
Think of the network traffic and server capacity needed; it's all a waste, because the computing power of all devices combined far exceeds that of any server.
Bitcoin mining is now done on mobile phones. Why do you think that is?
post #9 of 31
Well, from the looks of it, it's done well. The new Siri voice compared to the old is amazing. For some reason, with 7.1 Siri is WAAAAY faster and more accurate with complicated words for me.
post #10 of 31
Quote:
Originally Posted by TeaEarleGreyHot View Post

If true, then bravo to Apple for homing in on the right people. It's about time that James Baker saw someone other than a banker profit from his inventions.

This is exactly what I hoped Apple would do, when I suggested it buy the Baker's team in my comments to an Appleinsider story last July:
http://forums.appleinsider.com/t/158729/apple-assembling-speech-recognition-tech-team-in-boston-to-improve-siri#post_2368625

...and in another of my comments last September:
http://forums.appleinsider.com/t/159303/investors-speculate-carl-icahn-could-push-apple-to-buy-nuance-for-speech-tech#post_2388324


Hard Proof that Apple executives read this blog to know what to do next!!! /s
"That (the) world is moving so quickly that iOS is already amongst the older mobile operating systems in active development today." — The Verge
post #11 of 31
Definitely getting more accurate and seems faster.
post #12 of 31
Quote:
Originally Posted by tookieman2013 View Post

Definitely getting more accurate and seems faster.

 

Anyone who thinks Siri is improved please try this test.

 

Me: "Siri, who is the president of the the United States?"

 

Siri: "Let me think about that... The answer is Barack Hussein Obama II" (she said Obama 'two')

 

Me: "What is the name of his dog?"

 

Siri: "Checking my sources... My web search turned this up."

 

Displays a link to Dog - Wikipedia.

 

But if you ask her "What is the name of President Obama's dog?" she gets it sort of right: "Bo", but she doesn't speak it, just displays it on screen. Actually "Bo" is only one of the President's dogs; "Sunny" is the other.

 

My conclusion: Siri still cannot carry on a conversation worth a damn when it involves retrieving information from the Internet.

 

Your results may vary, but I try this type of contextual query with Siri all the time and she almost never has a clue what we are discussing after the first question.

 

When you try the same test with Google Search the results are completely different, and there is no "Hmm, let me think about that" either. The speed difference is like a hundred times too. Give it a try and then tell me which works better... I mean, which one works.

Life is too short to drink bad coffee.

post #13 of 31

What Novauris does that Nuance does not...

post #14 of 31

Has anyone tried the 'Tell me a story' thing on Siri? It is pretty funny. You need to ask her thrice...

post #15 of 31
Quote:
Originally Posted by knowitall View Post

Speech recognition should be done on the device and not on a server, that's also the best load balancing imaginable. A device with 115 GFLOPS should be more than capable.
It's a bad thing that Siri doesn't work without a network.

I agree. The main reason I don't use Siri very often is I don't want Apple having a library of MP3s of my voice.

post #16 of 31
Quote:
Originally Posted by knowitall View Post

Privacy is always a concern if you are using servers. iOS can be updated easily, Apple can be wrong and is in this case.
Speech recognition can be solved without a server, and current hardware can do it as good as any server.
Think of the network traffic and server capacity needed, it's all a waste because the computing power of all devices combined exceeds that of any server by far.
Bitcoin mining is now done on mobile phones, why is that you think?

Has privacy been a concern? No. Maybe for the select few, but not the millions of others.

Can iOS be updated easily? It depends on what your definition of easy is. You don't even know how often they make changes to the Siri engine - it could be every day, it could be once a month - but I guarantee you it is much easier to update a server farm than to change iOS and then push that update out to millions of iDevices frequently.

And no, iPhone hardware in its current form cannot do Siri as well as a server optimized specifically for that task. By contracting the work out, your iPhone gets better battery life, isn't stuffed with potentially unused software components, and isn't using hardware to handle requests, making the phone cheaper, etc. You also state that the computing power of all these devices COMBINED is better than a server's? That's cool, but you're also implying these devices would be independently processing requests. When multiple servers are pumping away at your single request, you can be damn sure they'll push out that response faster than your phone could locally, or at least fast enough that you'd often not notice network delays, which are the biggest drawback of this method.

Bitcoin mining on phones is NOT cost-effective or efficient at all. It's hardly efficient on most computers, and requires specialized hardware to make it that way, which costs extra money, resources, time, etc. The same can be said for Siri on iDevice hardware.
Edited by Nobodyy - 4/4/14 at 7:55am
post #17 of 31
Quote:
Originally Posted by Macky the Macky View Post

Hard Proof that Apple executives read this blog to know what to do next!!! /s

 

Heh. I'm sure Apple execs have better things to do than read our oft-rambling comments. But doesn't it always feel good to see something happen which you felt ought to happen?  That's all I meant to express.

post #18 of 31

I wonder if at least part of this acquisition isn't a subtle FU to Carl Icahn, Apple gadfly and large Nuance shareholder. I'm sure Tim Cook said "why buy the cow, when we can get Novauris for [comparatively] free?" The fact that it kinda screws a major pain in his ass may have just been gravy.

post #19 of 31
Quote:
Originally Posted by ascii View Post

I agree. The main reason I don't use Siri very often is I don't want Apple having a library of MP3s of my voice.

You are both clueless. Siri is not just speech recognition, for one thing. It's an AI app that requires the server back end in order to aggregate and respond to the information requested with some level of intelligence based on previous requests, location, time, and other factors. The more it's used, the more likely the response will be what you're looking for.

It's unlikely that Apple has any interest in storing your or anyone else's voices in MP3 format from your Siri queries. I'm sure you used MP3 to make a point but it wouldn't be the chosen codec for storage due to space. And if they did save those hundreds of millions of voice queries, what purpose would it be for? A massive library of all of us saying "Siri, tell me a joke." when we first got our iPhone 4s and above. If you're afraid to talk to your phone to ask for information or directions then what do you think your ISP or cell carrier is collecting about you when you access the internet? Any app that uses location services knows more about you and where you are than your mom did the entire time you were in high school. Those ads you see aren't there by accident either.
post #20 of 31
Quote:
Originally Posted by mnbob1 View Post


You are both clueless. Siri is not just speech recognition for one thing. It's an AI app that requires the server back end in order to aggregate and respond with the information requested with some level of intelligence based on previous requests, location, time, and other factors. The more it's used the more likely the response will be what your looking for.

It's unlikely that Apple has any interest in storing your or anyone else's voices in MP3 format from your Siri queries. I'm sure you used MP3 to make a point but it wouldn't be the chosen codec for storage due to space. And if they did save those hundreds of millions of voice queries, what purpose would it be for? A massive library of all of us saying "Siri, tell me a joke." when we first got our iPhone 4s and above. If you're afraid to talk to your phone to ask for information or directions then what do you think your ISP or cell carrier is collecting about you when you access the internet? Any app that uses location services knows more about you and where you are than your mom did the entire time you were in high school. Those ads you see aren't there by accident either.

Yes, I just used the term "MP3" to make a point; what the actual codec is is irrelevant. What you perhaps don't know is that Apple does in fact keep recordings of all queries sent to Siri for two years. This is from their own spokesperson, not a rumour.

http://www.wired.com/2013/04/siri-two-years/

post #21 of 31
The data and voice are collected and disassociated from your identity in order to help them improve the product, according to the Siri policy:

You may choose to turn off Siri at any time. To do so, open Settings, tap General, tap Siri, and slide the Siri switch to “off”. If you turn off Siri, Apple will delete your User Data, as well as your recent voice input data. Older voice input data that has been disassociated from you may be retained for a period of time to generally improve Siri, Dictation and dictation functionality in other Apple products and services. This voice input data may include audio files and transcripts of what you said and related diagnostic data, such as hardware and operating system specifications and performance statistics.

The article reiterates that with a clarification of the time (2 years) and throws a bit of paranoia in as well.

If you are so concerned about Apple having clips of your voice (they have more, and Google knows you better than you do) then just turn it off. Doing voice recognition standalone is possible, but doing an application like Siri requires a host server right now and for the near future.

If you're concerned about your private information being collected on a server somewhere: turn off location services, don't buy any apps, music or movies, don't use Maps or Google Maps, stay away from unknown hotspots that require a login or confirmation to connect, and don't use any social media. Also stay away from any cloud services. Of course your cell carrier is going to collect information about you no matter what you do, and chances are that your carrier, Google, Yahoo, Microsoft, Apple or Facebook has already handed your info over to the NSA.

Personally, I'm getting too old to worry about all this. I use Siri all the time. She gets more accurate as we get to know each other. I tell her how to be smarter by teaching her things like how to pronounce names and telling her relationships. It's much more fun when you relax and get to know her. When I got my iPhone 4s I would say "call Jill Johnson mobile" to call my wife. Now I say "Get ahold of my Sweet Pea in the car please". I like to be pleasant with Siri. Or to text my daughter: "Tell the kiddo I'm on my way and will be there in 5". Sometimes I just ask for useless info. I learned this one the other day: "what planes are flying over me right now". Just for fun.
post #22 of 31
Siri is a complete disaster. Today I wanted to go to Goodwill to make a donation, as we are straightening up the house and had a lot of stuff to donate. I asked Siri for directions to Goodwill on Bristol Street in Santa Ana. Honestly, you cannot be any more specific than that, but she comes back with... I found 15 matches for Goodwill, some are pretty far from you. My God, there are not even fifteen Goodwill locations in all of Orange County. Then she says tap on the one that you want. Idiotic. I'm driving, Siri; you should know that because you can access my GPS and see I am moving, and also very close to Goodwill, maybe a couple blocks away. I don't want to look at the screen while driving, so I pulled over, and none of the fifteen locations she found were on Bristol. Of course there is only one Goodwill on Bristol in Santa Ana, so how she came up with fifteen is beyond me. So I remembered it was next to Ana's Linens, and I asked the same sort of question: Siri, give me directions to Ana's Linens on Bristol Street in Santa Ana.

Same bullshit. I found 9 locations matching Ana's Linens, tap the screen to choose which one you want directions for. For the record, there is only one Ana's Linens anywhere in Santa Ana. Just stupid, and the way she pronounces the Spanish-derived names common in southern California is just ridiculous. Total waste of time. No need to mention it, but I will: I ultimately pulled over again and used Google to get directions.

Life is too short to drink bad coffee.

post #23 of 31
That's interesting. I just tested Siri on my phone and asked "directions to the nearest Goodwill", and she found the closest one and started the directions immediately. I then went and did a search: there are 5 Goodwill stores within a 10-mile radius of my current location, which is home. I tried it for several other retailers and restaurants with the same correct results. Then I realized I was connected to my wifi, so I turned off wifi and was connected to LTE (I have an iPhone 5s). I got the same results as before.

I use Siri multiple times per day, mostly for directions, making calls in the car and some simple texting in the car (much better now that I can control how long Siri listens with iOS 7.1). It seems strange, but having wifi enabled even in the car improves accuracy. I'm not sure how that is possible, but I've been told that multiple times going back to my iPhone 4s, by multiple Apple Store people and several Apple techs. I have followed that advice and have had consistently good results. It's my understanding that the more you use Siri, the more accurate the results will be. She will anticipate patterns in your requests, and the voice recognition will become more accurate.

If she mispronounces names and places consistently and it annoys you, then teach her the correct way to pronounce them. Just say "[name] is pronounced xxxxx", then she'll say back several options and you select the correct one, or you tell her to keep trying.

The Siri experience is only as good as you make it. It's not perfect. There are rumors of huge improvements in iOS 8 and of opening the API to developers. I can't wait to be able to use a Siri-enabled app; some of my favorite apps are perfect for it. Relax. Talk calmly to her. Sometimes more info is not as good as you think. As you said, Siri already has a lot of GPS info. Notice I just asked for the nearest Goodwill? When you ask complicated questions, Siri takes it all in and many times will not give you what you were looking for. In your example, Siri heard Bristol Street, Goodwill and Santa Ana. She did what you asked: she found the Goodwills associated with Bristol and Santa Ana, of which there were many. The only option was to offer you a choice. Yes, you were driving, and the API knows that; unfortunately, some responses aren't spoken when so many are presented. In some cases using headphones or a Bluetooth car connection gives audio responses. Just remember: keep it simple.
post #24 of 31
Quote:
Originally Posted by mnbob1 View Post

Notice I just asked for the nearest goodwill? When you ask complicated questions Siri takes it all and many times will not give you what you were looking for. In your example Siri heard Bristol street Goodwill and Santa Ana. She did what you asked, she found the Goodwill's associated with Bristol and Santa Ana of which there were many. 

Thanks for your tips, but I consistently have issues with Siri. Funny how Google knows exactly how to respond to the same exact input. I didn't want just any Goodwill or the closest one; I wanted the one I asked for, because that one is the regional donation center, not just an outlet. If I knew the exact address and the zip code, would that have made the results even worse because I offered even more information?

 

When querying Google on the iPhone with the exact same three data points, it returned the full address, the phone number, hours of operation for each day of the week with the current day's hours bolded, directions, a picture, and the distance from my location along with a map.

 

If you want Siri to return those types of detailed results you need to say "Siri, search Google for Goodwill on Bristol street in Santa Ana"


Edited by mstone - 4/6/14 at 9:44pm

Life is too short to drink bad coffee.

post #25 of 31
Quote:
Originally Posted by mnbob1 View Post


You are both clueless. Siri is not just speech recognition for one thing. It's an AI app that requires the server back end in order to aggregate and respond with the information requested with some level of intelligence based on previous requests, location, time, and other factors. The more it's used the more likely the response will be what your looking for.
 

 

Not at all. The kind of 'AI' happening can easily be done on an iPhone.

Also, Siri needs local data (or data that should be local) from the device; the whole server roundtrip is flawed (not to mention the enormous network traffic it generates).

post #26 of 31
Quote:
Originally Posted by knowitall View Post
 

 

Not at all. The kind of 'AI' happening can easily be done on an iPhone.

Also, Siri needs local data (or data that should be local) from the device; the whole server roundtrip is flawed (not to mention the enormous network traffic it generates).

 

It can? Says who? Says you?

 

I highly doubt your single opinion about "what's done easily" carries any weight versus the decisions of the people building the product - people who knew ALL of the factors, have a history in this type of thing, and are working to create the best product imaginable. In fact, as a developer, I find it insulting for you to think that they did not do their research before going along with it and that "you know what is best". I do this shit for a living, and as much as I would love to see Siri be local to phones, I'm being realistic (just as Apple was). One day we may see Siri localized, but in its current state, there are much greater wins in hosting it elsewhere. 

 

Also, an average Siri request is about 65 KB while the average webpage is 1250 KB. And with LTE these days, that request is transferred and closed so fast that it adds no meaningful traffic to the network - so that's hardly an excuse for why Siri NEEDS to be on the phone, let's be honest.
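Taking the poster's figures at face value (65 KB per Siri request, 1250 KB per average web page - both the poster's numbers, not measured values), the back-of-envelope arithmetic looks like this:

```python
# Both figures come from the post above; treat them as rough estimates.
siri_request_kb = 65
web_page_kb = 1250

# One average page load moves as much data as roughly 19 Siri requests.
requests_per_page = web_page_kb / siri_request_kb

# Even a heavy 50-query day stays close to 3 MB of Siri traffic.
daily_siri_kb = 50 * siri_request_kb

print(round(requests_per_page, 1))  # 19.2
print(daily_siri_kb)                # 3250
```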

post #27 of 31
Quote:
Originally Posted by Nobodyy View Post
 

 

One day, we may see Siri localized, but at it's current state, there are much greater wins by hosting it elsewhere. 

I would guess that it will always be server-based except for functions that deal directly with the device, such as adjusting the brightness. It just makes sense: there are gazillions of database records on the search engines, and the data changes all the time. How could anyone expect to have all that on the phone itself and keep it synchronized? Just because Siri is slow doesn't mean the network is bogged down; it is probably the Siri backend servers that are causing the latency. Google voice search seems lightning fast in comparison.

Life is too short to drink bad coffee.

post #28 of 31
Quote:
Originally Posted by Nobodyy View Post
 

 

It can? Says who? Says you?

 

I highly doubt your single opinion about "what's done easily" has any sort of weight versus the decisions of the people building the product - people who knew ALL of the factors, have a history in this type of thing, and are working to create the best product imaginable. In fact, as a developer, it's insulting for you to think that one did not do their research before going along with it and that "you know what is best". I do this shit for a living, and as much as I would love to see Siri be local to phones, I'm being realistic (just as Apple did). One day, we may see Siri localized, but at it's current state, there are much greater wins by hosting it elsewhere. 

 

Also, an average Siri request is about 65kb while the average webpage is 1250kb. And with LTE these days, that request is transferred and closed so fast that it doesn't add any traffic on the network, or the traffic is completed so fast that it doesn't plug holes - so that's hardly an excuse as to why Siri NEEDs to be on a phone, let's be honest.

 

It isn't insulting at all. Your comment on the other hand seems a bit insulting.

You don't seem to know how decisions are made, and you seem to assume that Apple engineers have god-like powers.

Of course they evaluated the situation, but decided to go one way. It could have been the other.

I favour the other way: keep it local to the device because the information is local (and if not you do a web query from the device).

And it's all the requests from all iOS devices at the same time that form the huge load on the internet, necessitating huge data pipes to huge servers, all hugely unnecessary.

The reason I think Siri can be local is that not much AI is going on, hence not much data and processing power is needed.

And I also 'do this shit' for a living.

post #29 of 31
Quote:
Originally Posted by mstone View Post
 

I would guess that it will always be server based except in respect to functions that deal directly with the device, such as adjusting the brightness. It just makes sense. There are gazillions of database records on the search engines and the data changes all the time. How could anyone expect to have that on the phone itself and keep it synchronized?  Just because Siri is slow doesn't mean the network is bogged down. It is probably the Siri backend servers that are causing the latency. Google voice search seems to be lightning fast in comparison.

Server-to-server requests are not better than client-to-server requests; they could even be slower if the requesting server has a high load.

post #30 of 31
Quote:
Originally Posted by Nobodyy View Post


There isn't just a single way.
I mean, by this time, the majority of Apple's devices that are sold are connected to the Internet one way or another. I get it, networks suck sometimes, other times there's slow or no WiFi, but more often than not, if you're using one of these devices you're online. 

You're also forgetting all of the benefits that come from the server communication. Apple can make updates to the Siri engine without ever needing to update the device, you can have machines dedicated and optimized for the processing of Siri requests (much more efficiently than the phones hardware would without added complexity, expense, etc), you can utilize the incoming data to get analytics, the phone doesn't need to store the massive amount of natural language-decoding software components... the list goes on and on.

There are obviously many other benefits and it's ridiculous to think Apple did not weigh them in their development decisions with Siri.

On the other hand, one should not need network access to be able to open apps or have Siri read a text.

Proud AAPL stock owner.

 

GOA

post #31 of 31
Quote:
Originally Posted by knowitall View Post
 

 

It isn't insulting at all. Your comment on the other hand seems a bit insulting.

You don't seem to know how decisions are made and you seem to assume that Apple engineers have god like power.

Of course they evaluated the situation, but decided to go one way. It could have been the other.

I favour the other way: keep it local to the device because the information is local (and if not you do a web query from the device).

And its all requests from all iOS devices at the same time that forms the huge load on the internet that necessitates huge data pipes to huge servers all hugely unnecessary.

The reason I think Siri can be local is that not much AI is going on hence not much data and processing power is needed.

And I also 'do this shit' for a living.

 

This is interesting and 'proves' my point: http://www.wired.com/2014/04/out-in-the-open-jasper/
