I think Siri on HomePod should at least be smart enough to understand the command ‘Play XXX next’. But something like ‘play next’ or ‘add to queue’ seems not to be in Siri’s HomePod vocabulary. Or has anyone been successful with this?
Hey @DanielEran, if Apple makes no money from Siri, how is she "failing all the way to the bank"? If Siri was 100x better tomorrow and sounded just like Scarlett Johansson in "Her", how would that get people who already have iPhones to buy new ones? Your write-up doesn't seem to follow its own logic.
Apple views Siri as complementary to the iPhone, like the Watch. They have very little to gain by dedicating the gargantuan amount of resources it would take to get Siri to that Samantha level. I think I read somewhere that Bezos has over 100K people working on Alexa; I don't know how true that is, but unlike with Apple, Alexa could realistically make Amazon serious $$$, because Alexa is complementary to Amazon's business model, not the products it sells. So maybe you could work on something to show us how Apple getting "Sirious" about Siri will meaningfully contribute to the bottom line, since you love to bring that up in almost every article. And why do you do that, by the way? How has talking about how much money Apple's making become the thing Apple fans care about now?
Only AAPL fans care about that, and those guys care about Apple only as far as the next financial statement is good for them, no?
Apple also doesn't make direct revenue from Pages, Keynote, etc. Siri sold iPhone 4s and continues to sell products, despite the competitor-led idea that it is so terrible nobody can use it.
Amazon supposedly has 5,000 employees working on Alexa (not 100K). It doesn't directly contribute any revenue either, and the hardware it sells is not profitable. Amazon sees it as a way to induce sales and move all retail to its tax-free mail-order business.
The contradiction in saying that Siri, but not Alexa, must make money is puzzling. And when you say "talking about how much money Apple's making [has] become the thing Apple fans care about now" after I lay out that Apple is clearly not commercially failing, because it is capable of selling things (the way Surface, Galaxy Gear, and Pixel are not), it's also a head-scratcher.
I feel like I'm playing Fruit Ninja with your word salad and it's not fun even if I win.
Not all of the Alexa skills are frivolous. I use the Sonos skill, as well as the Logitech Harmony skill. When I get home, I say "Alexa, turn on Sonos", she complies, then I say "Alexa, play Jethro Tull in the Living Room" and she dutifully complies by streaming Tull via Spotify. (The biggest shortcoming here, BTW, is Spotify - I'm relatively new to it, but compared to Apple Music, Spotify is a real dud. It doesn't understand King Crimson, as well as many other "older" bands. It also has a tendency to play the same handful of songs over and over again if you just specify a particular artist to listen to. But I digress.) Another cool Alexa skill is AnyList. This is how we make our shopping list, and it's remarkably good at understanding ingredients you wouldn't expect it to understand. And with the new Alexa feature where you don't need to say the wake word if you are already conversing with Alexa, it's even more natural to make a shopping list just by talking.
Playing music is certainly a "skill" built into Siri.
There are lots of other useful features that Alexa has pioneered, and many of those are linked to the fact that it's tied almost exclusively to an always-on appliance that is designed to take a series of requests, rather than Siri being historically available only from devices that you interact with for a single question, which often follows up with UI, not a voice conversation. Siri is invading the Echo world, while Alexa has made zero progress in establishing itself on mobile devices at all, despite that being the more valuable path (and the initial goal back in 2014).
If Apple actually sees HomePod as a strategic direction (and it promoted it as such, not just as a thing it casually sells via Beats) then it will be surprising if Siri features on HomePod don't quickly catch up, and exceed in areas like Continuity, where Apple has capabilities for tight mobile integration that aren't even available to Alexa.
Commentators are often clear on the idea that everything Apple can innovate will be copied (Touch ID, TrueDepth, etc) but seem to lack the vision for how a better-financed company that's better at hardware, better at design, better at platform management, etc, could possibly catch up with features that are already released by Amazon. Siri learned Hey Siri from Google and HomePod has a mic array like Echo. Apple now has a growing installed base of HomePods, so it's a matter of adding backend enhancements.
It needs some full-function APIs. Restricting the released APIs to calls (VoIP) and maps makes Apple's job easier - but it doesn't let app developers truly expand the scope of what Siri can do.
I'm sure what I'm asking for is difficult - but that's the Apple engineers' job.
Expanding Siri to do more useful things should be the job of app developers - they just need a flexible API to do it with.
The fact that I can't add something to my shopping list while hands-free/driving is very annoying.
It's a trivial example - but there are 100 minor things I do with my phone every day, and 98 of them still require me to touch it to do them:
- shopping list
- check bank balance
- transfer funds
- find where my bus is (using the TripView app)
- find the next connecting bus (TripView)
- is there an accident ahead (LiveTraffic app)
- read the latest Facebook posts from close friends
- read a book (Kindle) or newspaper article (Kindle)
- read an InMail (LinkedIn)
- open a camera (Nest)
- make a doctor's appointment (HotDoc)
- note the time I put the baby to bed (Baby Tracker)
- note the time the baby woke up (Baby Tracker)
- note the quantity and time of medication (Baby Tracker)
etc. etc.
What can I do?
- set a timer
- text/SMS my wife
No way can Apple add all those features itself - but it can work on better APIs so that developers can create the apps with those features.
I think the best example of how inflexible the Siri API is, is the HotDoc app and doctor's appointments. Siri can make a calendar appointment, but it's not flexible enough to handle a doctor's appointment, because that requires some ability to select the doctor and to restrict the choices based on available appointment slots and appointment durations (long or short).
It was a good 'show us you are at least thinking about it' API - but it needs a follow-up: a 'show us you are serious about it' API.
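To make the gap concrete, here's a minimal sketch (plain Python, not any real Apple or HotDoc API - the names and slot data are invented for illustration) of what a booking intent would need to express beyond a plain calendar event: a choice of doctor, the clinic's actual availability, and a consult duration.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass(frozen=True)
class Slot:
    doctor: str
    start: datetime
    minutes: int  # e.g. 10 = short consult, 20 = long consult

# The clinic's availability - state that a generic calendar intent
# has no way to see or constrain the request against.
SLOTS = [
    Slot("Dr. Lee", datetime(2018, 3, 5, 9, 0), 10),
    Slot("Dr. Lee", datetime(2018, 3, 5, 9, 30), 20),
    Slot("Dr. Shah", datetime(2018, 3, 5, 10, 0), 10),
]

def book_appointment(doctor: str, minutes: int) -> Optional[Slot]:
    """Return the first available slot matching doctor and duration."""
    for slot in SLOTS:
        if slot.doctor == doctor and slot.minutes >= minutes:
            return slot
    return None  # nothing suitable - the app must say so, not just book
```

A calendar intent only carries a title and a time; everything above has to live in the app, which is why booking needs a richer intent than "create an event".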
Siri's Intents allow it to pass recognized requests to an app. It's the job of that app to handle the information side and pass back parameters Siri can communicate to you.
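As a rough sketch of that division of labor (a toy model in Python, not SiriKit's actual API - the intent and app names are invented), Siri's side turns the utterance into a typed intent, and the app's handler owns the data and hands back the parameters Siri speaks:

```python
from dataclasses import dataclass, field

@dataclass
class AddToListIntent:
    """A recognized request, e.g. 'add milk to my shopping list'."""
    item: str
    list_name: str

@dataclass
class ShoppingListApp:
    """Plays the role of the app extension that handles the intent."""
    lists: dict = field(default_factory=dict)

    def handle(self, intent: AddToListIntent) -> str:
        # The app owns the information side: it stores the item...
        self.lists.setdefault(intent.list_name, []).append(intent.item)
        # ...and returns the response parameters for Siri to speak.
        return f"Added {intent.item} to your {intent.list_name} list."

# "Siri" recognizes the speech, builds the intent, and dispatches it:
app = ShoppingListApp()
reply = app.handle(AddToListIntent(item="milk", list_name="shopping"))
```

The point of the shape: Siri never needs to know how the list is stored; the app never needs to do speech recognition.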
It would appear, for example, that any transit app could use the ridesharing Intent to do what you want; there just aren't any that do that yet. Perhaps additional support needs to be added to the ridesharing Intent to expand its range of transit modes to include things like fixed-location bike shares, trains, etc. In a world where third-party apps solve these kinds of features, there needs to be enough critical mass to support the development of such an app, and the ongoing maintenance of it. If built-in, basic Apple Maps Transit or a transit authority's own app are good enough, they might never work to add support for a Siri Intent. So it's not all on Apple and what it does with Siri.
Many of the "Skills" for Alexa are like the titles in any app store: a mess of frivolous stuff that claims to do something but isn't worth using. One of the points of the article is that this is never brought up when Siri critics simply point to a number of Skills, without actually identifying ones that are useful and missing on Siri. I tried to find some and they just didn't deliver anything reportable.
Siri support for banking apps is provided by Payments Intents, and Siri's Note Taking Intent appears capable of doing what you request for the shopping list and the Baby Tracker app, but the developer has to decide if it's worth it. Apple is providing Siri a massive addressable platform of over 1 billion devices. All the excitement around Alexa involves around 30-40 million devices, many sets of which are installed in different rooms of a single family.
Your requests for Kindle, Nest etc are things you expect from third parties who do not support Apple's other platforms, including HomeKit. So it's not Siri's fault that those third parties aren't supporting Siri. Siri already provides Intents for messaging, HomeKit cameras, etc.
Developers who see more impetus to develop for Alexa than Siri are either interested in mining your voice data or just don't realize the difference in value between 1,000M and 40M. You should request Siri support from the app developers, rather than asking Apple to generically support "third parties" that it already built support APIs for.
Siri sold iPhone 4s and continues to sell products.
Yeah, despite having a slow/inefficient/overheating SoC, a bad display, terrible audio quality, an inferior camera, a toy-like design, a poorly built body, and the frequently hanging iOS 5, Apple was able to sell millions of iPhone 4s ONLY because of Siri (which proved to be the decisive differentiator with competing products). /s
I think Siri on HomePod should at least be smart enough to understand the command ‘Play XXX next’. But something like ‘play next’ or ‘add to queue’ seems not to be in Siri’s HomePod vocabulary. Or has anyone been successful with this?
Yeah, it should work like "Play Later" in the UI, but it appears it doesn't work from Siri yet.
I haven’t tried this before now but it totally worked for me. We were listening to “Don’t feel like dancing” by Scissor Sisters, I said, “Hey Siri, play “take your mama” by Scissor Sisters next”, Siri replied in the affirmative and played “Take your mama” next. During that song I said, “Hey Siri, play “My Eyes” by Nero next” and it worked again. To be clear, this was on HomePod.
It’s in my list but I haven’t tried it yet.