Apple's Siri updated to understand sexual assault queries, provide help

Posted in iPhone; edited March 2016
Apple in March updated its Siri virtual assistant to better handle questions about sexual assault and other emergencies, with some queries now leading to responses developed in cooperation with the Rape, Abuse and Incest National Network (RAINN).

On March 17, Apple added phrases like "I was raped" and "I am being abused" to Siri's index, programming responses to include Web links to the National Sexual Assault Hotline, the company told ABC News.

The update came just three days after a study published in the JAMA Internal Medicine journal found four virtual assistant technologies -- Siri, Google Now, Microsoft's Cortana and Samsung's S Voice -- lacking when it came to offering support for personal emergencies. Apple contacted RAINN shortly after the study was published, the report said.

"We have been thrilled with our conversations with Apple," said Jennifer Marsh, vice president for victim services at RAINN. "We both agreed that this would be an ongoing process and collaboration."

To optimize Siri responses, Apple gathered phrases and keywords RAINN commonly receives through its online and telephone hotlines. The response system was also adjusted to use language and phrases carrying softer connotations. For example, Marsh notes Siri now replies to personal emergencies with phrases like "you may want to reach out to someone" instead of "you should reach out to someone."

With human-machine interactions becoming increasingly common, virtual assistants like Siri might soon be vital conduits through which victims report assault and, if programmed correctly, get the help they need.

"The online service can be a good first step. Especially for young people," Marsh said. "They are more comfortable in an online space rather than talking about it with a real-life person. There's a reason someone might have made their first disclosure to Siri."

Apple is constantly updating Siri to provide more accurate answers, but the service has run into problems when it comes to sensitive social issues. In January, Apple corrected a supposed flaw in Siri's response database that led users searching for abortion clinics to adoption agencies and fertility clinics. The quirk was first discovered in 2011.

Comments

  • Reply 1 of 12
    Good work Apple. 
  • Reply 2 of 12
crowley Posts: 10,453, member
    I guess it's good, though I find it very weird that someone would turn to Siri about such a thing.  Having said that, even 1 instance of a positive result is worth it.
  • Reply 3 of 12
seanie248 Posts: 180, member
    cali said:
    Oh Lord.

    Prepare for all the females asking Siri this question and using it as evidence to get back at men.
Uncalled for. Not nice.
  • Reply 4 of 12
    seanie248 said:
    cali said:
    Oh Lord.

    Prepare for all the females asking Siri this question and using it as evidence to get back at men.
Uncalled for. Not nice.
    "get back at men"...sounds like someone who has problems with women
  • Reply 5 of 12
foggyhill Posts: 4,767, member
    cali said:
    Oh Lord.

    Prepare for all the females asking Siri this question and using it as evidence to get back at men.
    Yes, women are so evil.... (sic).
Talk about needing a big box of tin foil.
  • Reply 6 of 12
lolliver Posts: 494, member
    cali said:
    Oh Lord.

    Prepare for all the females asking Siri this question and using it as evidence to get back at men.
    "All the females" - Really? What have you done in your life that you think every woman on earth will need to use Siri to get back at you? You must really be terrible person if you think every woman on earth is going to need to get back at you. 
  • Reply 7 of 12
jdw Posts: 1,334, member
I've been an Apple lover since my 128k Mac in 1984, and I'm typing this on a late 2015 5K iMac with the latest Apple gizmos sitting here on my desk; but let's admit the truth, folks.  SIRI is SIRIously stupid.  Most of the time, you speak something and SIRI foolishly searches the internet for it. "Hey SIRI, how many apps are on my iPad?"  Response: "Searching for 'How many apps are on my iPad.'"  My response: "Hey SIRI, I am confounded by your perpetual foolishness."  Response: "Searching for 'I am confounded...'"

    Oh please.  Must we wait until the 24th Century before SIRI finally understands us as well as the computer aboard the Enterprise?  Saying SIRI gained a feature these days is like saying you put a diamond in the nose of your pet pig.  I really like Apple and wish I didn't have to say this, but it is true.  It's sad.
  • Reply 8 of 12
tallest skil Posts: 43,388, member
    "All the females" - Really? What have you done in your life that you think every woman on earth...
Is English not your first language?
  • Reply 9 of 12
sessamoid Posts: 182, member
    jdw said:

    Oh please.  Must we wait until the 24th Century before SIRI finally understands us as well as the computer aboard the Enterprise?  Saying SIRI gained a feature these days is like saying you put a diamond in the nose of your pet pig.  I really like Apple and wish I didn't have to say this, but it is true.  It's sad.
You act like it's some trivial problem. AI is not easy, or it would have been done long ago.

    At some point, we'll have something as intelligent as IBM's Watson available to everyone, but we're not there yet. 
  • Reply 10 of 12
josha Posts: 901, member
    Oh my gosh, another reason why the FBI or local police will want to get inside the iPhone.   B) :s
  • Reply 11 of 12
apple ][ Posts: 9,233, member
    The next time that mass sexual assaults happen, like in certain places in Europe where it's quite popular and fashionable nowadays, it would be a good idea for SIRI to pinpoint each incident on a map, and clear patterns would reveal themselves, aiding to better understand the current epidemic.
  • Reply 12 of 12
command_f Posts: 421, member
    crowley said:
    I guess it's good, though I find it very weird that someone would turn to Siri about such a thing.  Having said that, even 1 instance of a positive result is worth it.
It may be that an abused adolescent, for example, is too frightened to ask people they know for help but somehow sees Siri as safe. Like you say, it doesn't take many cases where it works to make it a worthwhile idea.