Future versions of Apple's Siri may interpret your emotions

Posted in General Discussion, edited November 2020
Future versions of Apple's Siri may go beyond voice recognition to enhance accuracy, tapping into the FaceTime cameras in the company's devices to simultaneously analyze facial reactions and emotions as users engage in dialog with the voice assistant.




Apple is developing a way to help interpret a user's requests by adding facial analysis to a future version of Siri or another system. The aim is to cut down on the number of times a spoken request is misinterpreted, and to do so by attempting to analyze emotions.

"Intelligent software agents can perform actions on behalf of a user," says Apple in US Patent Number 20190348037. "Actions can be performed in response to a natural-language user input, such as a sentence spoken by the user. In some circumstances, an action taken by an intelligent software agent may not match the action that the user intended."

"As an example," it continues, "the face image in the video input... may be analysed to determine whether particular muscles or muscle groups are activated by identifying shapes or motions."

Part of the system entails using facial recognition to identify the user and so provide customized actions such as retrieving that person's email or playing their personal music playlists.

It is also intended, however, to read the emotional state of a user.

"[The] user reaction information is expressed as one or more metrics such as a probability that the user reaction corresponds to a certain state such as positive or negative," continues the patent, "or a degree to which the user is expressing the reaction."

This could help in a situation where the spoken command can be interpreted in different ways. In that case, Siri might calculate the most likely meaning and act on it, then use facial recognition to see whether the user is pleased or annoyed.
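To make that feedback loop concrete, here is a minimal sketch in Swift of how a per-reaction probability could steer a fallback to the next most likely interpretation. The types, threshold, and function names are hypothetical illustrations of the idea in the filing, not code from Apple.

    import Foundation

    // Hypothetical types for illustration; the patent publishes no code.
    struct Interpretation {
        let action: String
        let likelihood: Double          // assistant's confidence in this reading of the request
    }

    struct UserReaction {
        let positiveProbability: Double // e.g. derived from facial analysis after the action
    }

    // Acts on the most likely interpretation, then uses the observed reaction
    // to decide whether to retry with the runner-up.
    func respond(to candidates: [Interpretation],
                 act: (Interpretation) -> UserReaction,
                 negativeThreshold: Double = 0.3) {
        let ranked = candidates.sorted { $0.likelihood > $1.likelihood }
        guard let best = ranked.first else { return }

        let reaction = act(best)
        if reaction.positiveProbability < negativeThreshold, ranked.count > 1 {
            // The user appears displeased; fall back to the next most likely meaning.
            _ = act(ranked[1])
        }
    }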

Detail from the patent regarding one process through which facial expressions can be acted upon


The system works by "obtaining, by a microphone, an audio input, and obtaining, by a camera, one or more images." Apple notes that expressions can have different meanings, but its method classifies the range of possible meanings according to the Facial Action Coding System (FACS).

This is a standard for facial taxonomy, first created in the 1970s, which categorizes every possible facial expression into an extensive reference catalog.

Using FACS, Apple's system assigns scores to determine which interpretation is most likely the correct one, and can then have Siri react or respond accordingly.
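As a rough idea of how such scoring might work, the Swift sketch below maps a handful of FACS-style action units to a crude positive-or-negative valence score. The action units, weights, and example values are simplified assumptions for illustration only; the real FACS taxonomy and Apple's scoring are far more elaborate.

    import Foundation

    // Simplified stand-ins for a few FACS action units (the real catalog is much larger).
    enum ActionUnit: Hashable {
        case cheekRaiser        // AU6, part of a genuine smile
        case lipCornerPuller    // AU12, smiling
        case browLowerer        // AU4, frowning or confusion
        case lipCornerDepressor // AU15, displeasure
    }

    // Maps detected action units (with detection confidences in 0...1) to a
    // rough valence score in -1...1. The weights are illustrative, not FACS-defined.
    func valenceScore(for detections: [ActionUnit: Double]) -> Double {
        let weights: [ActionUnit: Double] = [
            .cheekRaiser: 0.4,
            .lipCornerPuller: 0.6,
            .browLowerer: -0.5,
            .lipCornerDepressor: -0.5
        ]
        let raw = detections.reduce(0.0) { sum, entry in
            sum + (weights[entry.key] ?? 0) * entry.value
        }
        return max(-1, min(1, raw))
    }

    // Example: a strong smile with slightly lowered brows still scores positive.
    let score = valenceScore(for: [.lipCornerPuller: 0.9, .cheekRaiser: 0.7, .browLowerer: 0.2])
    print(score) // ≈ 0.72

A score like this could then be combined with Siri's own confidence in each candidate interpretation, as in the earlier sketch, to pick the reading the user most likely intended.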

Of the seven inventors credited on Apple's filing, Jerremy Holland is listed as sole inventor on a 2014 Apple patent to do with syncing video playback on a media device, while Nicholas E. Apostoloff is cited in numerous other patents for his work on analyzing and manipulating video using machine learning techniques.


Comments

  • Reply 1 of 24
    Good luck, Siri. You would need to see my facial expression from that shelf across the room, with the camera facing up.
  • Reply 2 of 24
    Will Siri now react to the incredulity on my face when she responds to a simple question with information totally unrelated to what I asked? :smiley: 
  • Reply 3 of 24
    flaneur Posts: 4,526 member
    First sentence: you mean “tapping” not “taping.” 
  • Reply 4 of 24
    yuck9 Posts: 112 member
    This will work great I'm sure. Siri can't even get basic things right after 8 yrs.  
  • Reply 5 of 24
    gatorguy Posts: 24,211 member
    The sensing and follow-up action based on perceived emotional states is currently a bit creepy IMO. This will take a few years to gain acceptance, much like having cameras constantly in use. Go back a few years and it was a non-starter for a lot of folks. Baby steps. 
  • Reply 6 of 24
    ↑  Currently and forever creepy.
  • Reply 7 of 24
    I thought at first that Siri was getting better but now it seems to be going backwards. Functionality that I had a few months ago is gone. I used to be able to ask Siri the percentage of battery charge on my iPad which was helpful if I didn’t have my glasses on and couldn’t see the little battery amount. Now it says I’m sorry I don’t know the battery status of iPad. Also, I  used to be able to ask Siri to read my email, now it just opens the email app. And I used to be able, when dictating a text message and realized I had forgotten something, to just say “add” and she would say go ahead and add that to my text, now the added portion is all that gets sent. I just would be content if she would resume her old functionality.
  • Reply 8 of 24
    cornchip Posts: 1,948 member
    All I really want is for Siri to add question marks and exclamation marks with some kind of inflection detection.
  • Reply 9 of 24
    MacPro Posts: 19,727 member
    gatorguy said:
    The sensing and follow-up action based on perceived emotional states is currently a bit creepy IMO. This will take a few years to gain acceptance, much like having cameras constantly in use. Go back a few years and it was a non-starter for a lot of folks. Baby steps. 
    I'd not find it creepy if an AI understood me better; frankly, I can hardly wait. To begin with, along those baby steps (which I agree with), intonation could be used to at least react more appropriately, rather than facial expressions as some are suggesting. That said, that would have to be localized pretty dramatically, and not even by country but by region, I'd think. However, the next step IMHO is for an AI to be able to have a conversation somehow without every response requiring a permission trigger. Without that there is no way for a contextual response to a response ... if you see what I mean.
  • Reply 10 of 24
    MacPro Posts: 19,727 member

    I thought at first that Siri was getting better but now it seems to be going backwards. Functionality that I had a few months ago is gone. I used to be able to ask Siri the percentage of battery charge on my iPad which was helpful if I didn’t have my glasses on and couldn’t see the little battery amount. Now it says I’m sorry I don’t know the battery status of iPad. Also, I  used to be able to ask Siri to read my email, now it just opens the email app. And I used to be able, when dictating a text message and realized I had forgotten something, to just say “add” and she would say go ahead and add that to my text, now the added portion is all that gets sent. I just would be content if she would resume her old functionality.
    Yes I've seen some pretty weird goings on like that.  Not related as such but all of a sudden certain artists can't be found in my iTunes Match by HomePods even though they are there on all other devices.  I have to assume Apple is hard at work tweaking Siri's back end (pardon the expression lol)  and these are anomalies caused by this and hopefully temporary.
  • Reply 11 of 24
    I only find it creepy when an advertising company that has a culture of ignoring and actively circumventing user privacy wishes delves into this stuff. Not worried about Apple. 
    edited November 2019
  • Reply 12 of 24
    Bring it on, I'm all for it. As long as my facial recognition data stays private and never leaves my phone!
  • Reply 13 of 24
    avon b7 Posts: 7,664 member
    Siri once blurted out to my wife:

    "Touchy today, aren't we!"

    Unfortunately it was a case of Siri responding to something that wasn't aimed at it in the first place.

    Made me laugh out loud, though.
  • Reply 14 of 24
    jungmark Posts: 6,926 member
    “I’m sorry, Dave, I’m afraid I can’t do that.”

    this will end badly. 
  • Reply 15 of 24
    Hey Siri .. maybe just focus on getting the basics working properly before launching into the twilight zone. 
  • Reply 16 of 24
    MplsP Posts: 3,921 member
    Will Siri now react to the incredulity on my face when she responds to a simple question with information totally unrelated to what I asked? :smiley: 
    kimberly said:
    Hey Siri .. maybe just focus on getting the basics working properly before launching into the twilight zone. 
    Exactly. My thought while reading this was “how will Siri interpreting emotions change her default action of performing a web search? I guess I’ll now get replies like ‘I found this on the web for “when does daylight savings time frustrated.” ‘ “
  • Reply 17 of 24
    jungmark said:
    “I’m sorry, Dave, I’m afraid I can’t do that.”

    this will end badly. 
    /disables lip-reading feature
  • Reply 18 of 24
    kevin kee Posts: 1,289 member
    I thought at first that Siri was getting better but now it seems to be going backwards. Functionality that I had a few months ago is gone. I used to be able to ask Siri the percentage of battery charge on my iPad which was helpful if I didn’t have my glasses on and couldn’t see the little battery amount. Now it says I’m sorry I don’t know the battery status of iPad. Also, I  used to be able to ask Siri to read my email, now it just opens the email app. And I used to be able, when dictating a text message and realized I had forgotten something, to just say “add” and she would say go ahead and add that to my text, now the added portion is all that gets sent. I just would be content if she would resume her old functionality.
     I just did and it says "Your iPad is at 70%". Also it is still able to read email (by time or by name). Haven't tried the last one yet.
    edited November 2019
  • Reply 19 of 24
    Wow Kevin, I’d like to get to the bottom of this. Do you mind telling me which iPad and iPadOS version you are on? These functions worked fine for me one day and were simply gone the next morning when I woke up, having done nothing to change them and using the same words in my commands. Do you have any thoughts? I’d love to get this functionality back.
  • Reply 20 of 24
    After 8 years, Siri cannot get even the simple things right 60% of the time. Both Alexa and Google Assistant are correct virtually 100% of the time. Why is Apple so far behind the others in accuracy?

    Maybe Apple needs to scrap Siri completely and start again. It can't even get the basic things right virtually all the time.
    edited November 2019