The part I want most is the ability to have my voicemail available on all devices. It's the one thing that's still stuck on just my phone. If we can move away from carrier voicemail completely, that would solve a ton of problems for me. Can't wait to see it!
I would not want to use any third-party app for my calls or voicemails.
There are so many problems with using a third-party app. It needs to be built into the phone from the start.
If it were Adobe or any other developer, I still would not use it.
That is what I meant when I said no phone does this out of the box. With Google Voice you need to go through a whole setup process.
Why can't it do this now? Pretty much every VoIP operator has voice-to-text conversion for voicemail. There is nothing special about it, and it works. How is it that Apple can make this wait until 2016, and probably bill it as a major feature?
Like everything else, it's probably not "doing it" that's the problem; it's "doing it for hundreds of millions of users, all day, every day."
EDIT: I'm only talking about the voice-to-text aspect, not the rest of the far more advanced feature.
Comments
This is funny in a way. I can just imagine my technology-phobic aunt trying to call me up and Siri answering her instead. It would probably give her quite a fright!
And making even more mistakes than a minimum wage worker.
Siri's got a long way to go. I would have thought there would have been vast and substantive improvements by now.
Me too. Google Now is still much, much better!
You can do that now, just not (yet) solely within the Apple ecosystem.
Being able to do it and having it reliably built into the system are two completely different things.
You're on the right track. Tiny "fix" above and I'm all over your comments here 100%
It doesn't matter to me if it's Google, Adobe or anyone; they have no business recording, transcribing, or otherwise analyzing and "owning" my voice messages. Ever.
The difference is, I don't believe that's any of Apple's business to own that data either. Sure, they're better than companies whose entire business models revolve around gathering personal information of their users, but NO company should have all that info.
Remember when answering machines sat next to your home phones, and you were the only one that got to listen to those messages? When the entire process can happen on the device itself or via a server that I own, then I'll jump for joy. Yes, this will be challenging, but I absolutely believe it will be the norm again someday, when people stop handing over every single thing they say and do and want and think to large corporations. Some day.
Most Google Android smartphones? Figured you were aware of that.
BTW, neural network processing is now the way to go. Apple will presumably use neural network processing, just as Amazon and Google do. According to their blog post, Google Voice only recently started using neural networks for transcription, which greatly increased the accuracy. It used to be humorously inaccurate at times.
Interesting. I'll have to see if the transcriptions of any messages I receive in the future are improved because, as you said, they were pretty damn funny before (and I may miss that, LOL).
Judging by my Google Voice number and the transcripts it emails me, I'd say it's because: 1) the feature is primitive in the sense that it doesn't really connect well with the rest of the phone system; 2) since the iPhone connects via carriers, Apple would have to get them to change their backends, like they did previously with Visual Voicemail, but with perhaps even more complexity, since at some point the audio would need to be routed from the carriers' voicemail servers (or your device) to Apple's Siri servers to complete the transcription (whereas with Google it's already all on their servers); and 3) the transcription has been comically bad to the point of not being helpful (although I recently read that they've improved it dramatically).
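To make point 2 concrete, here's a rough sketch of the extra plumbing involved: audio has to hop from the carrier's voicemail store to a transcription service before the phone can show text. Every name here is hypothetical, and the "engine" is a toy stand-in for a real speech-to-text service, not any actual Apple or carrier API.

```python
def fetch_voicemail_audio(carrier_store, message_id):
    """Pull the raw audio for one message from the carrier's voicemail store."""
    return carrier_store[message_id]

def transcribe(audio, engine):
    """Hand the audio to a speech-to-text engine and return the text."""
    return engine(audio)

def voicemail_to_text(carrier_store, message_id, engine):
    # This extra hop (carrier -> transcription server -> device) is
    # exactly the routing a carrier backend would need to support.
    audio = fetch_voicemail_audio(carrier_store, message_id)
    return transcribe(audio, engine)

# Toy stand-ins for the real services:
store = {"msg-1": "hi, call me back about dinner"}
toy_engine = lambda audio: audio.upper()  # pretend speech-to-text

print(voicemail_to_text(store, "msg-1", toy_engine))
# → HI, CALL ME BACK ABOUT DINNER
```

The point of the sketch is just that the transcription step sits on a server neither the phone nor the carrier owns, which is one more party the audio has to be routed through.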
Siri message to guy's wife:
"Having a wonderful time on vacation dear. Wish you were her."
You can get much of this now with French mobile provider Orange; you can even get it if you are not an Orange customer. Their app is called "Libon" and is available in the App Store (shortcut link: 4xq.ca/libon ). It supports many languages, and can also read customised greetings to individual callers or groups of callers. It's also free, unless you pay for extra voices. I've been using it since it first became available. I'm a user, nothing more, and I love the service. Transcription is 95%+ great, though every now and then it's off.
What's next? Soon I'll have my Siri talk to your Siri and we can leave the people out altogether.
There's a YouTube video somewhere of two phones doing just that.
Doesn't Google Voice already do this?
This implementation is more like Siri acting as a secretary or personal assistant. We're getting closer to real artificial intelligence at work. In fact, Google and Facebook are pouring a lot of resources into this development right now, and I expect it'll be Apple, Google, Facebook, or Amazon that makes the big breakthrough first.
Google does a decent transcription, but there are still a lot of things it gets wrong. It would be nice if Siri would transcribe what is spoken, then send a text back to the caller so they can confirm the transcription is accurate. It should only be a matter of a few more years before we start getting systems that ask questions to clarify things, or put everything spoken or recorded into a context that makes sense for each individual. This context awareness is already starting to show up in Google Now, but you must allow their system to scan all of your emails and communications to get the best results.
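The transcribe-then-confirm idea could look something like this rough sketch. Everything here is made up for illustration; a real system would call an actual speech-to-text engine and an SMS gateway rather than these toy stand-ins.

```python
def confirmation_sms(caller_number, transcript):
    """Build the text message asking the caller to verify the transcript."""
    return (f'To {caller_number}: we heard your message as: '
            f'"{transcript}". Reply YES if that is correct.')

def handle_voicemail(audio, caller_number, engine):
    # Transcribe first, then generate the confirmation text to send back.
    transcript = engine(audio)
    return transcript, confirmation_sms(caller_number, transcript)

# Toy stand-in for a speech-to-text engine:
toy_engine = lambda audio: audio.strip().capitalize()

text, sms = handle_voicemail("running ten minutes late",
                             "+15551234567", toy_engine)
print(sms)
```

The nice property of this loop is that the caller, who knows what they actually said, becomes the accuracy check, instead of the recipient having to guess from a garbled transcript.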
Google Voice transcriptions recently got much better.