Apple suspends Siri quality control program, will let users opt out in update

Posted in General Discussion · edited December 2019
Apple has temporarily suspended its Siri quality control program after a Guardian exposé last week claimed private contractors are privy to "highly sensitive recordings," revelations that immediately raised eyebrows among privacy advocates.




In a statement to TechCrunch, Apple said it has suspended the Siri response grading program as it reviews the initiative designed to determine whether the virtual assistant is being inadvertently triggered. The company will also allow users to opt in or out of Siri grading as part of a forthcoming software update.

"We are committed to delivering a great Siri experience while protecting user privacy," an Apple spokesperson said. "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."

Last week, Apple came under fire when a report from The Guardian cited a Siri grading program employee as saying the process can inadvertently reveal a user's identity, personal information and other private material.

In an effort to make Siri more accurate, Apple employs contractors who listen to snippets of Siri queries uploaded from devices like iPhone and HomePod. The goal, according to the company, is to determine whether the assistant was invoked purposely or by mistake, a determination that can only be made by a human operator.

While Apple takes steps to anonymize collected data and disassociate evaluated recordings from device owners, the identities and private information of users can sometimes be gleaned from overheard audio, the contractor said. Further, the contractor claims some audio clips feature "private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."

"You can definitely hear a doctor and patient, talking about the medical history of the patient," the person said. "Or you'd hear someone, maybe with car engine background noise -- you can't say definitely, but it's a drug deal. You can definitely hearing it happening."

The contractor also questioned Apple's transparency on the subject, positing that the company does not go far enough to disclose to consumers how the grading process is conducted and what it entails.

Apple responded to the claims, saying "a small portion" of Siri requests -- less than 1% of daily activations -- are evaluated by personnel for quality control purposes. Reviewers must adhere to "strict confidentiality requirements" and conduct their work in "secure facilities," Apple said, adding that typical recordings subject to grading are a few seconds long.

While Apple does inform users of ongoing Siri quality control initiatives in its terms and conditions, the language used is vague and does not specifically state that audio clips will be recorded and reviewed by other people.

Apple's move to temporarily halt Siri grading is in line with the company's well-cultivated public image as a bastion of consumer privacy. With critics lambasting Google, Facebook and others for harvesting user information, Apple wields privacy as a potent marketing tool, promising customers that its products and services provide world-leading data security.

Comments

  • Reply 1 of 35
    revenant Posts: 621, member
    I honestly see the dilemma, but how do you test real world conversations, voicing, regional dialects? 

    I will happily let Apple do what they do; I believe they are far and away better at privacy than the other jokers.
  • Reply 2 of 35
    mobird Posts: 752, member
    What does it take to invoke Siri to listen in on a complete conversation as it is suggested? Most people know that it is a stretch to get Siri to do the most basic task asked of it.
  • Reply 3 of 35
    Soli Posts: 10,035, member
    This wasn't a big issue since the data was already anonymized and yet they're taking even more steps to help ensure user security. I wish more companies acted this way.
  • Reply 4 of 35
    racerhomie3 Posts: 1,264, member
    Good job Apple. Keep it up. 
  • Reply 5 of 35
    kimberly Posts: 427, member
    mobird said:
    What does it take to invoke Siri to listen in on a complete conversation as it is suggested? Most people know that it is a stretch to get Siri to do the most basic task asked of it.
     :D :D :D
  • Reply 6 of 35
    seanismorris Posts: 1,624, member
    racerhomie3 said:
    Good job Apple. Keep it up.
    They did this basically because of a whistleblower...

    I agree Siri needs to get better, and the only way to do that involves a human.  Nothing revealed is really a surprise, but Apple needs to do a better job keeping their users/customers informed.

    The problem:
    Apple has a tendency to do nothing (or conceal) until it becomes a PR problem.

    Last year's scandal:
    "Apple confirmed what many customers have long suspected: The company has been slowing the performance of older iPhones. Apple says it started the practice a year ago, to compensate for battery degradation, rather than push people to upgrade their smartphones faster."

    If Apple had given users the option to slow performance to extend battery life, or informed users that a battery replacement was recommended because Apple detected degradation... it wouldn't have been a big deal.

    Instead customers were pissed because they replaced their phones unnecessarily, experienced degraded performance, etc. Getting the battery replaced is easy enough, but we needed the information they concealed.

    I’d give Apple more credit if they acted on the Siri issue before we heard about it.  Or, at least informed us without a leak first...

  • Reply 7 of 35
    holyone Posts: 398, member
    The Apple legion is a hilarious bunch. When Google was caught doing the very same thing for the very same reasons, they were lambasted; but Apple gets caught and subsequently stops, not because they ('Apple') found some horrendous oversight in their supposed customer-first process, but because, like Google, they did the most sensible/deceptive thing in pursuit of a better AI and were caught and exposed, LOL. It's going to be so funny when Apple finally realizes how much data it takes to make competent AI and has to do a U-turn on this privacy nonsense.
  • Reply 8 of 35
    cropr Posts: 1,120, member
    Soli said:
    This wasn't a big issue since the data was already anonymized and yet they're taking even more steps to help ensure user security. I wish more companies acted this way.
    Anonymizing is great for structured data, where you can delete the sensitive data fields.  But for listening to a human voice, anonymizing does not make sense.  People don't talk in "structured fields".

    Apple has violated the GDPR rules in Europe because Apple did not ask the user for explicit permission. Maybe this is the reason Apple temporarily stopped the quality assurance program and will ask for explicit permission in future, but either way Apple (like Google) can expect to get a fine for this violation.

  • Reply 9 of 35
    Ooh, highly sensitive recordings. How exciting! Anybody is more than welcome to listen in on the rubbish my HomePod would record: how was your day? Okay, thanks. My hayfever's bad today. I had some weird dreams last night. Hey Siri, what's the weather like in London today? Hey Siri, play some smooth jazz, etc.
  • Reply 10 of 35
    genovelle Posts: 1,480, member
    cropr said:
    Soli said:
    This wasn't a big issue since the data was already anonymized and yet they're taking even more steps to help ensure user security. I wish more companies acted this way.
    Anonymizing is great for structured data, where you can delete the sensitive data fields.  But for listening to a human voice, anonymizing does not make sense.  People don't talk in "structured fields".

    Apple has violated the GDPR rules in Europe because Apple did not ask the user for explicit permission. Maybe this is the reason Apple temporarily stopped the quality assurance program and will ask for explicit permission in future, but either way Apple (like Google) can expect to get a fine for this violation.

    According to the article it is in the terms, but not explicit that a human would listen. It is also different from always-on recording; in this case the recordings are of actual invocations of Hey Siri, reviewed to determine whether the trigger was on purpose and to prevent Siri from being invoked accidentally.

  • Reply 11 of 35
    holyone Posts: 398, member
    mobird said:
    What does it take to invoke Siri to listen in on a complete conversation as it is suggested? Most people know that it is a stretch to get Siri to do the most basic task asked of it.
    I think it speaks to the different ways Apple and Google see software, and AI in particular; take their respective approaches to AR. This is what Apple thinks matters (hardware and UI power):





    And this is what Google thinks matters (AI and information power):





    Who is making the right bet?
  • Reply 12 of 35
    cropr Posts: 1,120, member
    genovelle said:
    cropr said:
    Soli said:
    This wasn't a big issue since the data was already anonymized and yet they're taking even more steps to help ensure user security. I wish more companies acted this way.
    Anonymizing is great for structured data, where you can delete the sensitive data fields.  But for listening to a human voice, anonymizing does not make sense.  People don't talk in "structured fields".

    Apple has violated the GDPR rules in Europe because Apple did not ask the user for explicit permission. Maybe this is the reason Apple temporarily stopped the quality assurance program and will ask for explicit permission in future, but either way Apple (like Google) can expect to get a fine for this violation.

    According to the article it is in the terms, but not explicit that a human would listen. It is also different from always-on recording; in this case the recordings are of actual invocations of Hey Siri, reviewed to determine whether the trigger was on purpose and to prevent Siri from being invoked accidentally.

    For the GDPR, there is absolutely no difference whether a human actually listens or not, or whether there is always a recording. The fact that in some cases a human might listen for quality purposes is sufficient to require explicit permission.
  • Reply 13 of 35
    Bart Y Posts: 64, unconfirmed member
    Contractor sounds more like a disgruntled ex-employee who has now violated their NDA and confidentiality agreement.
  • Reply 14 of 35
    Rayz2016 Posts: 6,957, member
    Soli said:
    This wasn't a big issue since the data was already anonymized and yet they're taking even more steps to help ensure user security. I wish more companies acted this way.
    The real problem is that Apple didn’t give folk an opt-out from day one. 

    The irony is that they ran into a privacy problem while trying to ensure Siri didn’t run into a privacy problem. 

    It’ll be interesting to see how they get around this one. The conversational changes coming to Siri are game-changing; shame if they get derailed. 


  • Reply 15 of 35
    gatorguy Posts: 24,153, member
    cropr said:
    Soli said:
    This wasn't a big issue since the data was already anonymized and yet they're taking even more steps to help ensure user security. I wish more companies acted this way.
    Anonymizing is great for structured data, where you can delete the sensitive data fields.  But for listening to a human voice, anonymizing does not make sense.  People don't talk in "structured fields".

    Apple has violated the GDPR rules in Europe because Apple did not ask an explicit permission to the user.  Maybe this is the reason Apple temporarily stopped the quality assurance and will ask an explicit permission in future, but anyhow Apple  (like Google) can expect to get a fine for this violation. 

    GDPR is almost certainly the reason. Apple was put on notice this week when the German privacy czar initiated an investigation of Google for doing exactly the same thing. The Irish had already started looking at Apple for not properly disclosing their Siri voice recordings and permissions, and Apple's response was that there was no reason to since their recordings are anonymised. This exposé gives the Irish data commissioner ammunition.

    GDPR is opening a big ol' can of worms in unexpected places. 

    By the way, the Apple Watch is being portrayed as the biggest offender for errant and unintended Siri activations.
    Quote:
    "If your phone or HomePod are accidentally triggered, they will start recording whatever it hears, which could be a sensitive conversation or romantic encounter that you probably don’t want on record. According to the whistleblower, Siri regularly activates without the explicit “Hey Siri” command, including something as innocuous as “the sound of a zip” on a jacket...
    If that snippet is then selected as one of the ones used for grading, a contractor could hear it. The possibility of an accidental trigger rises significantly with the Apple Watch, which only needs to be raised to activate Siri. And since it's always on your person, the probability of Siri inadvertently recording a sensitive conversation is higher than with a phone or a HomePod."

    "Recordings are accompanied by user data showing location, contact details, and app data.” is still completely unclear and neither denied nor confirmed by Apple. It might be the EU that eventually clarifies it. 

    edited August 2019
  • Reply 16 of 35
    seanismorris said:
    The problem:
    Apple has a tendency to do nothing (or conceal) until it becomes a PR problem.
    How do you know?

    That is something you cannot know by definition. You only see the things that get reported; you have no way of knowing what they did to solve issues that were not reported.
    edited August 2019
  • Reply 17 of 35
    It's so simple, Apple: take your customers seriously! Stop acting reactively and inform your customers before something like this comes out!
    All these slogans about 'it's YOUR data' and 'on-device computing' seem hollow now...
    It's time Apple returned to the reality of life and stopped acting so deviously with such things: communicate to your user base about privacy while they use your devices!

  • Reply 18 of 35
    s07 Posts: 1, member
    Has the whistleblower thought of all the people who were sent home today and are now out of work?
    edited August 2019
  • Reply 19 of 35
    gatorguy Posts: 24,153, member
    Bart Y said:
    Contractor sounds more like a disgruntled ex-employee who has now violated their NDA and confidentiality agreement.

    s07 said:
    Has the person who blabbed about this thought of all the people who were sent home today and are now out of work?
    Wait, wasn't that one of the complaints about Google's response, going after the contractor and ending the business relationship for breaking their legal obligations? I think it was referred to here as Google being "despicable".
    edited August 2019
  • Reply 20 of 35
    rogifan_new Posts: 4,297, member
    s07 said:
    Has the whistleblower thought of all the people who were sent home today and are now out of work?
    Apple said it was suspending not ending. I’m sure it will find other things for these people to do in the meantime.