Originally Posted by Gatorguy
What specific UI features are copied then if nearly everything is a "rip from Siri"?
The entire focus of the layout and the fundamental way you access the service didn't exist before Siri. I don't think that is something Apple can protect or have rights to, but that doesn't mean it wasn't copied.
Google Now goes much further than Siri, anticipating search requests and displaying them without user input.
Google has some additional features, but the one you mention is the one I have not seen in action. Just a propaganda video from Google, much like their Google Glass. I have seen demos from some users that are impressive, but we also need to consider the number of devices that can access the service.
How does it anticipate requests? Just like Apple's Passport that uses location services to know where you are? I know it's supposed to learn patterns in your behavior, like Nest does, but what they are trying to achieve does not seem like something that can easily work. Passport is very focused on a certain kind of task, and Nest is very focused on a specific location and a particular chore, and yet they aren't perfect.
So how does Google have this all-seeing, all-knowing service that works with absolutely no user-sourced indication of interest in a given location and/or direction? I guess they can use your location when you make a Google search for something; but do they really record the exact location of Android devices every time you use Google? I guess they can use their Maps service to get an idea of things around you or in the direction you're going, but unless you live in a one-road town, how does it know where I'm going if there are a dozen places I frequent in the next block? Is it going to serve up a message every time it thinks it knows what I want? Is it going to constantly send my location and a time/date stamp back to Google so they can build a reasonable database? If they already have one, who in their right mind would authorize that?
According to what I see stated, there is nothing it uses that would have that level of "second sight."
It didn't take a year to do so either, with at least one beta-tester commenting on the anticipated search and cards display on his tablet back in December of last year, as well as its inclusion in the next Android OS update, well before public knowledge of the feature. That doesn't sound like Google seeing Siri two months prior and then wondering how to answer it. Even Google's engineers aren't that good.
I don't know what you're trying to say there.
I still think it's just as possible that Apple's Siri was the one pushed out sooner than they may have preferred, knowing that Google's much improved voice search and Google Now would be introduced before Apple got another OS version out.
Siri was well polished. They bought the original Siri from the devs well before that, then integrated it into their OS with plenty of time to spare. They also worked to get the data center going a long time before that. In fact, we kept waiting and waiting for that data center to go live long after we got word that it was completed.
Again, the data you think is so easily determined by developers sitting in a room typing in responses to questions they conceive isn't realistic. There is no service that can understand the context of human speech without an incredible amount of data sourcing. Google did this years ago and then shelved it. Good on them, but Siri is phenomenal for coming out as complete as it is.
As an aside, to show you I'm not being subjective, I do think Apple's Maps for iOS 6 is rushed. It's not nearly as complete as it should be to compete with what Google has spent many years perfecting. If it doesn't get better by many factors in many areas by the time iOS 6 goes live, then I say they should have just let it sit until iOS 7, or until it's actually a good replacement.