Originally Posted by snova
sorry, but I disagree. This is the key point of Siri: not translation. If you think Siri is about TTS and STT, you missed the whole reason for Siri.
Siri figures out your true intention from the arbitrary text that gets passed to it. It's not about matching against pre-determined phrases. The TTS and STT parts may even be licensed from Nuance. That is not the IP of interest here.
It is translation. It has to know the sentence structure of languages like French and German to decipher what you've told it. This is pretty similar to the system IBM had on Jeopardy: it hears the words and their places in the sentence, then decides what you've said, comparing each word to the ones before and after it to confirm what it heard and what you meant.

I'm not saying its sole function is rigid command response, but there is some of that in there. "Weather", "today", "forecast", "rain", "umbrella", "NASDAQ", "meeting", "tomorrow", "(contact name)", and "(time)" are all forms of commands that it can separate, or combine, and use to decide context. If you can figure out what context they're in, that is a form of STT, but much more seamless and intuitive.

You and I do this every day: we separate words and phrases into chunks and decide context. If I tell you I'm meeting Sally on Saturday for coffee at 9 AM, you pick out the meat of that sentence. Sally is the person, Saturday is the day, coffee is the event, and 9 AM is the time I'm meeting her. This works with misspelled words too; you can decipher a word from its letters alone, even when they aren't spelled correctly. Translating this brain function into code has to be a bitch, I'm sure.
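To make that concrete, here's a rough Python sketch of what I mean by "picking out the meat" of a sentence. The patterns and slot names are completely made up for illustration; I'm obviously not claiming this is anything like Siri's actual code:

[CODE]
# Toy sketch of slot extraction: pull the person, day, event, and time
# out of a sentence with simple pattern matching. Hypothetical patterns,
# just to illustrate the idea.
import re

def extract_slots(utterance):
    slots = {}
    # "meeting Sally" -> the person
    m = re.search(r"meeting (\w+)", utterance)
    if m:
        slots["person"] = m.group(1)
    # a weekday name -> the day
    m = re.search(r"\b(Monday|Tuesday|Wednesday|Thursday|Friday|Saturday|Sunday)\b", utterance)
    if m:
        slots["day"] = m.group(1)
    # "for coffee" -> the event
    m = re.search(r"for (\w+)", utterance)
    if m:
        slots["event"] = m.group(1)
    # "9 AM" or "9:30 PM" -> the time
    m = re.search(r"\b(\d{1,2}(?::\d{2})? ?[AP]M)\b", utterance, re.IGNORECASE)
    if m:
        slots["time"] = m.group(1)
    return slots

print(extract_slots("I'm meeting Sally on Saturday for coffee at 9 AM"))
# {'person': 'Sally', 'day': 'Saturday', 'event': 'coffee', 'time': '9 AM'}
[/CODE]

The real thing has to cope with every way a person might phrase that sentence, which is where the hard AI part comes in, but the output is the same kind of structured chunks.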
From the article you posted:
Originally Posted by http://www.xconomy.com/
Siri is far better at understanding. It’s a gifted natural language AI technology that knows the myriad of ways people express intent. Siri knows that by “book” you don’t mean a paperback novel but the action to “reserve” a table.
With clever programming it'll look at "book", "table", and "Il Fornaio", figure out what each one is, and see that Il Fornaio is a restaurant. It'll understand you want reservations and go through OpenTable, which seems to be an extremely tied-in service for it to work as well as it does. This is table lookup, but done in an extremely fast and clever way. It's another reason I think Siri is supported by off-site servers and not just the A5.
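Something like this toy lookup is what I'm picturing. The verb table, restaurant list, and function names here are entirely hypothetical; it just shows how the surrounding words can pick the right sense of "book":

[CODE]
# Hedged sketch of the "book means reserve, not paperback" idea:
# the same verb maps to different actions depending on what kind
# of object appears with it.

# Which action a verb maps to, given the kind of object in context
VERB_SENSES = {
    ("book", "restaurant"): "reserve_table",
    ("book", "none"): "find_paperback",
}

KNOWN_RESTAURANTS = {"il fornaio", "the melting pot"}

def classify_object(utterance):
    # Crude context check: does the utterance name a known restaurant?
    for name in KNOWN_RESTAURANTS:
        if name in utterance:
            return "restaurant", name
    return "none", None

def interpret(utterance):
    utterance = utterance.lower()
    obj_kind, obj_name = classify_object(utterance)
    for verb in utterance.split():
        action = VERB_SENSES.get((verb, obj_kind))
        if action:
            return {"action": action, "target": obj_name}
    return {"action": "unknown", "target": None}

print(interpret("Book a table at Il Fornaio for two"))
# {'action': 'reserve_table', 'target': 'il fornaio'}
[/CODE]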
It's a lot more complicated than that from what I've seen of it, and the presentation and function are amazing, but it is based on command words that can be easily recognized. I believe it's restricted to the 4S to drive sales: it was working on lesser hardware than the 4S in the form of an app, and it works beautifully in the tech demo. Phil Schiller saying that it's supported by off-site servers tells me that it is process-intensive and the A5 isn't the only thing used to make it work. I'm not trying to downplay the awesomeness that it is, but the words in there can be figured out by their order and a clear instruction made from them. Wolfram Alpha does this too, and it's not in a phone. I think Siri is amazing, but there are basic fundamentals that make it work. My guess is that making it work system-wide is where the A5 comes into play: one core can chew on what was heard while the other works on getting the information and presenting it to you.
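If I had to guess at that division of labor, it'd look something like this toy pipeline. Pure speculation on my part (and Python threads don't actually run in parallel on two cores, so treat it as a diagram in code, not real parallelism):

[CODE]
# Speculative sketch: one worker chews on what was heard, another
# fetches the information and presents it. The queue-based split is
# my own invention, not how Apple actually divides the work.
import queue
import threading

heard = queue.Queue()    # raw utterances from speech recognition
parsed = queue.Queue()   # structured requests ready to be fulfilled

def parser_worker():
    # "Core 1": turn raw text into a structured request
    while True:
        utterance = heard.get()
        request = {"intent": "weather" if "weather" in utterance else "unknown"}
        parsed.put(request)
        heard.task_done()

def fetch_worker():
    # "Core 2": go get the answer and present it
    while True:
        request = parsed.get()
        print("Fetching and presenting:", request)
        parsed.task_done()

threading.Thread(target=parser_worker, daemon=True).start()
threading.Thread(target=fetch_worker, daemon=True).start()

heard.put("what's the weather today")
heard.join()
parsed.join()
[/CODE]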
I'm probably completely wrong, though. Just my opinion, of course. I sincerely hope it'll be included in iOS 5 for the iPhone 4, because it looks really awesome.