Apple explains how it teaches Siri new languages, staying ahead of the competition
A report on Thursday offered a behind-the-scenes look at how Apple introduces new languages for Siri, suggesting that broader language support is one of the few things keeping the voice assistant competitive in the face of rivals from Amazon and Google.

To bring in a new language, Apple starts by having people read passages in a variety of accents and dialects, Apple speech team head Alex Acero explained to Reuters. These passages are manually transcribed so computers know exactly what they're supposed to be learning. Once the transcripts are supplemented with sounds captured in a range of voices, Apple builds a language model that tries to predict word sequences.
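To make the "predict word sequences" idea concrete, here is a minimal bigram language model in Python. This is purely an illustration of the concept; Apple's production models are vastly larger and trained on transcribed speech, and the sample data below is invented.

```python
from collections import Counter, defaultdict

def train_bigram_model(sentences):
    """Count word-pair frequencies from transcribed text."""
    counts = defaultdict(Counter)
    for sentence in sentences:
        words = ["<s>"] + sentence.lower().split() + ["</s>"]
        for prev, curr in zip(words, words[1:]):
            counts[prev][curr] += 1
    return counts

def next_word_probabilities(counts, prev_word):
    """Estimate P(word | prev_word) by relative frequency."""
    total = sum(counts[prev_word].values())
    return {w: c / total for w, c in counts[prev_word].items()}

# Toy transcripts standing in for the manually transcribed passages.
transcripts = [
    "set a timer for ten minutes",
    "set an alarm for seven",
    "set a reminder for tomorrow",
]
model = train_bigram_model(transcripts)
print(next_word_probabilities(model, "set"))  # ~{'a': 0.67, 'an': 0.33}
```

Given enough transcribed speech, frequency estimates like these let a recognizer prefer likely word sequences over acoustically similar but improbable ones.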
Before upgrading Siri itself, Apple rolls out dictation in the new language. The company is said to collect and anonymize a small percentage of recordings from customers and have humans transcribe them, a step Acero said can cut recognition errors in half.
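Claims like "cut recognition errors in half" are conventionally measured as word error rate (WER), the standard speech-recognition metric. Below is a generic textbook implementation, not Apple's internal tooling; the sample strings are made up.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dist[i][j] = edits to turn the first i ref words into the first j hyp words
    dist = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dist[i][0] = i  # deleting i words
    for j in range(len(hyp) + 1):
        dist[0][j] = j  # inserting j words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,        # deletion
                             dist[i][j - 1] + 1,        # insertion
                             dist[i - 1][j - 1] + sub)  # substitution
    return dist[len(ref)][len(hyp)] / len(ref)

# A human transcript versus what a recognizer might have heard:
print(word_error_rate("set a timer for ten minutes",
                      "set a time for ten minutes"))  # ~0.17 (1 error in 6 words)
```

Halving the WER on utterances like these is what the human-transcription pass is credited with.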
After gathering enough data, the company finally prepares to launch Siri in the new language, which includes hiring one or more voice actors. The first release may only be able to answer the most frequent questions; Apple then updates it every two weeks with adjustments.
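An initial release that handles only the most frequent questions could look, in spirit, like the keyword matcher sketched below. This is a hypothetical illustration under my own assumptions: the intent names and keywords are invented, and Apple's actual system is far more sophisticated.

```python
# Hypothetical first-release matcher: cover only the most frequent
# question types, and decline everything else until a later update.
FREQUENT_INTENTS = {
    "weather": {"weather", "forecast", "rain"},
    "time": {"time", "clock"},
    "call": {"call", "dial", "phone"},
}

def match_intent(utterance):
    """Return the first frequent intent whose keywords appear, else None."""
    words = set(utterance.lower().split())
    for intent, keywords in FREQUENT_INTENTS.items():
        if words & keywords:
            return intent
    return None  # i.e. "Sorry, I can't help with that yet."

print(match_intent("what's the weather today"))  # weather
print(match_intent("book me a flight"))          # None
```

The biweekly updates the article mentions would then widen this coverage over time.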
The result is that Siri is now in 21 languages and 36 countries. Microsoft's Cortana supports 8 languages in 13 countries, while Google Assistant -- a relatively new AI, replacing Google Now -- speaks 4 languages. Amazon's Alexa works only in English and German.
One of Apple's next languages will be Shanghainese, a Chinese dialect spoken mainly in and around Shanghai. Dictation support is coming in upcoming software updates such as macOS 10.12.4.
Apple's approach could be problematic, Reuters hinted, because it is difficult to scale, demanding ever more human writers. Other assistants, like Samsung-owned Viv, may be better designed for this task.
Siri has increasingly been accused of lagging behind its rivals in terms of capabilities. Amazon's Alexa regularly gains new skills, for instance, while Google Assistant is able to understand context, enabling simpler, more conversational responses.
Apple is rumored to be planning Siri enhancements that could debut on new iPhones shipping this fall. What those might be isn't clear, though the company has been investing heavily in AI and machine learning.

Comments
Alexa, on the other hand, is very useful. I talk to her several times each day on my three Echo devices to control lighting, read news and weather, set timers, make conversions and calculations, add items to my shopping list, read audio books, and play music.
I'm a huge Apple "fanboy" and I'd much prefer to be tied to their ecosystem instead of Amazon's, but Alexa has Siri beat in terms of ease of use and the ability to actually accomplish things - at least for me. I've not used Google Home and have no desire to at this time.
I would prefer my language to be included, even with a limited feature set, just to be able to call/message people by their names.
The next level for Siri is at least bilingual operation. But seeing that we only got bilingual spellcheck for a few languages, it will take a long time. AI assistants have a long way to go.
For $300 you get 3 "Siri hubs"
Replaces your current WiFi routers with a mesh network that works well.
Typical placement: 1) Kitchen 2) Living room 3) Master bedroom
Each Siri hub has a microphone array that is always listening for questions or commands.
Eero already works with Amazon Alexa and can do the following...
These are the kinds of skills Siri/HomeKit need!
While the Eero solution is nice, you would need to purchase 3 Eeros ($375) and 3 Echo Dots ($150)...$525!
Apple is playing the long game with regard to HomeKit as well, and I think that is why we haven't seen a competitor to the Amazon Dot and Echo.
Apple doesn't want to spook their HomeKit partners by competing with them in the HomeKit hardware space.
However, I am still continually frustrated with just the basic functionality, and I know I'm not alone. The most recent frustration: since Siri, somehow, can't understand "call ____ on ____ street," I entered my local pharmacy in my contacts as "Midtown Walgreens." Then, inexplicably, when I say "call Midtown Walgreens," Siri's reply is "I couldn't find any Walgreens locations in midtown, here are some Walgreens locations that are near you" (this is nearly verbatim, IIRC). And of course, multiple logical work-arounds didn't work; they never do with Siri. For example, to "call Midtown Walgreens from my contacts," Siri's response was, and I'm not joking, "I couldn't find the contact 'contact Midtown Walgreens' in your contacts."
Siri is endlessly frustrating.
I have a strong feeling that Apple's issue with Siri is something along the lines of 'the perfect is the enemy of the good,' in that Apple is focusing on advanced functionality while neglecting simple, basic functionality.
http://www.siriuserguide.com/siri-dictation-guide/
For example, the person above who had problems ("call midtown walgreens from my contacts" returning "I couldn't find the contact 'contact midtown walgreens' in your contacts") would have much better luck saying Call "XXX", as that adds context for Siri.
Spending some time with Siri, playing with the various syntax options, and using AirPods have made me a regular Siri user now, even in the car, where it was previously unusable.