Apple announces plans to improve Siri's privacy protections for users

Posted in iPhone · edited December 2019
Apple has completed its review of Siri privacy practices, and the company is making a few changes going forward to further protect users' privacy and data.

Beyond just its privacy page, Apple on Wednesday reiterated that Siri isn't used to build a marketing profile and any collected data is never sold. Apple says that it uses Siri data only to improve Siri, and is "constantly developing technologies to make Siri even more private."

The company shared more details about the grading process as well.

"Before we suspended grading, our process involved reviewing a small sample of audio from Siri requests -- less than 0.2 percent -- and their computer-generated transcripts, to measure how well Siri was responding and to improve its reliability," wrote Apple. "For example, did the user intend to wake Siri? Did Siri hear the request accurately? And did Siri respond appropriately to the request?"

Apple says that its internal review has resulted in a few changes.
  • Users will be able to opt in to help Siri improve by learning from the audio samples of their requests. Those who choose to participate will be able to opt out at any time.
  • Apple will no longer retain audio recordings of Siri interactions, and will continue to use computer-generated transcripts to help Siri improve.
  • When customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions.
  • Apple will also "work to delete" any recording which is determined to be an inadvertent trigger of Siri.
On July 28, a "whistleblower" who was allegedly a contractor for Apple, detailed a program that used voice recordings to improve the responses from Apple's voice assistant. The report said that a small team of contractors working for Apple, were tasked with determining if the Siri activation was accidental or on purpose, if it was a query within the range of Siri's capabilities, and whether Siri acted properly.

The main thrust of the report claims that Apple does not explicitly disclose to consumers that recordings are passed along to contractors -- but Apple does tell users that some queries are manually reviewed, and has since the release of the service. Despite the information having been public-facing for at least six iterations of iOS, the "whistleblower" advised that they were concerned over the lack of disclosure, especially considering the contents of some recordings containing "extremely sensitive personal information."

The nature of the information, sometimes unintentional and not part of the query, is wide-ranging, the whistleblower said.

"You can definitely hear a doctor and patient, talking about the medical history of the patient," said the source. "Or you'd hear someone, maybe with car engine background noise - you can't say definitely, but it's a drug deal. You can definitely hearing it happening," they advised.

The whistleblower went on to state there are many recordings "featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."

Allegedly, there wasn't a procedure in place to deal with sensitive recordings, with the whistleblower stepping forward over the suggestion the data could be easily misused.

Wednesday's discussion by Apple also reminds users that as much processing as possible is done on the device, that Siri uses as little data as possible to deliver an accurate result, and that the contents of Siri queries are not returned to Apple.

Furthermore, even before the announced changes, Apple's Siri used a random identifier -- a long string of letters and numbers associated with a single device -- to keep track of data while it's being processed, rather than tying it to your identity through your Apple ID or phone number -- a process that we believe is unique among the digital assistants in use today. For further protection, after six months, the device's data is disassociated from the random identifier.
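Apple hasn't published its implementation, but the pattern it describes, keying data to a random per-device identifier rather than an account identity, and stripping that identifier after six months, can be sketched roughly as follows (all names and the exact window length here are illustrative assumptions):

```python
import uuid
from datetime import datetime, timedelta

# Assumed window approximating "after six months" in Apple's description.
DISASSOCIATION_WINDOW = timedelta(days=183)

class RequestLog:
    """Hypothetical sketch: requests keyed to a random device identifier,
    never to an account identity such as an Apple ID or phone number."""

    def __init__(self):
        # A long random string of letters and numbers, generated per device,
        # with no link back to the user's identity.
        self.device_key = uuid.uuid4().hex
        self.entries = []  # list of (timestamp, transcript)

    def record(self, transcript, when=None):
        self.entries.append((when or datetime.utcnow(), transcript))

    def disassociate(self, now=None):
        """Detach the device key from entries older than the window.

        Returns the aged transcripts with no identifier attached; only
        recent entries remain keyed to the random device identifier."""
        now = now or datetime.utcnow()
        aged = [txt for (t, txt) in self.entries
                if now - t > DISASSOCIATION_WINDOW]
        self.entries = [(t, txt) for (t, txt) in self.entries
                        if now - t <= DISASSOCIATION_WINDOW]
        return aged
```

The point of the design is that even a breach of the processing pipeline would expose only a random key, not who was speaking, and after the window, not even that.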

Apple has also published a support page with frequently asked questions about Siri privacy and grading.

Comments

  • Reply 1 of 24
    Disappointing.

    "will continue to use computer-generated transcripts"
    "
    only Apple employees will be allowed to listen to audio samples"
    "work to delete"

    Jargon I would have expected from other companies.  Not Apple.
  • Reply 2 of 24
    gatorguy Posts: 23,510 (member)
    The main thrust of the report claims that Apple does not explicitly disclose to consumers that recordings are passed along to contractors -- but Apple does tell users that some queries are manually reviewed, and has since the release of the service. Despite the information having been public-facing for at least six iterations of iOS, the "whistleblower" advised that they were concerned over the lack of disclosure, especially considering the contents of some recordings containing "extremely sensitive personal information."

    Where does Apple specifically mention it? The Privacy Page clipping AI has been using was from iOS 6 AFAICT, but I've not been able to find any disclosure of queries being reviewed since then, and never by outside contractors anyway. Is there a more recent link to it, or is "six iterations of iOS" meant as a backhanded way of saying that Apple removed it from the privacy policy a few years ago?

    FWIW Google too says recordings may be reviewed, using so many words to say so in a very legal and broad manner, but never says by humans employed by a 3rd party contractor. They should be far more transparent and no doubt will be going forward. They also intimated a few weeks ago that they will be moving the review program in-house just as Apple plans to. TBH I would have assumed that was always the case and more than surprised Amazon, Apple and Google were all sending recordings to outside companies and facilities. Was it just cheaper to do so? 
    edited August 2019
  • Reply 3 of 24
    Mike Wuerthele Posts: 6,567 (administrator)
    gatorguy said:
    The main thrust of the report claims that Apple does not explicitly disclose to consumers that recordings are passed along to contractors -- but Apple does tell users that some queries are manually reviewed, and has since the release of the service. Despite the information having been public-facing for at least six iterations of iOS, the "whistleblower" advised that they were concerned over the lack of disclosure, especially considering the contents of some recordings containing "extremely sensitive personal information."

    Where does Apple specifically mention it? The Privacy Page clipping AI has been using was from iOS6 but I've not been able to find any mention of queries being reviewed since then, and never by outside contractors anyway. Is there a more recent link to it, or is "six iterations of iOS" meant as a backhanded way of saying that Apple stopped disclosing it a few years ago? FWIW Google too says recordings may be reviewed, using so many words to say so, but never says by humans employed by a 3rd party contractor.
    The clipping is substantively the same in iOS 6 through 11. Discussion of review has been in the privacy page since inception.

    Nobody explicitly says "by humans," relying on terms like "agents" and "affiliates" but what else would review them?
    edited August 2019
  • Reply 4 of 24
    rogifan_new Posts: 4,297 (member)
    These are good changes. It’s too bad it took Apple being embarrassed by a salacious news story for it to happen.
  • Reply 5 of 24
    rogifan_new Posts: 4,297 (member)
    Disappointing.

    "will continue to use computer-generated transcripts"
    "only Apple employees will be allowed to listen to audio samples"
    "work to delete"

    Jargon I would have expected from other companies.  Not Apple.
    How do you expect Apple to improve Siri then? 
  • Reply 6 of 24
    mike1 Posts: 3,133 (member)
    Disappointing.

    "will continue to use computer-generated transcripts"
    "
    only Apple employees will be allowed to listen to audio samples"
    "work to delete"

    Jargon I would have expected from other companies.  Not Apple.
    That definitely is not jargon. Pretty plain English, actually.
  • Reply 7 of 24
    Good.  They did almost exactly what I suggested they should do (no, I am not taking credit, just noting similarities).  Should not have been necessary.  Said it before, and unfortunately, will probably have to say it again.  Self inflicted wound was caused by ambiguous language.  Plain English lessens the chances of having to apologize and explain later.  
  • Reply 8 of 24
    Just wake me when SIRI actually works properly.
  • Reply 9 of 24
    gatorguy Posts: 23,510 (member)
    gatorguy said:
    The main thrust of the report claims that Apple does not explicitly disclose to consumers that recordings are passed along to contractors -- but Apple does tell users that some queries are manually reviewed, and has since the release of the service. Despite the information having been public-facing for at least six iterations of iOS, the "whistleblower" advised that they were concerned over the lack of disclosure, especially considering the contents of some recordings containing "extremely sensitive personal information."

    Where does Apple specifically mention it? The Privacy Page clipping AI has been using was from iOS6 but I've not been able to find any mention of queries being reviewed since then, and never by outside contractors anyway. Is there a more recent link to it, or is "six iterations of iOS" meant as a backhanded way of saying that Apple stopped disclosing it a few years ago? FWIW Google too says recordings may be reviewed, using so many words to say so, but never says by humans employed by a 3rd party contractor.
    The clipping is substantively the same in iOS 6 through 11. Discussion of review has been in the privacy page since inception.

    Nobody explicitly says "by humans," relying on terms like "agents" and "affiliates" but what else would review them?
    Mike, do you have a link to it in iOS 12 or even 11, 'cause I just can't find where it is. There's actually a reason I ask that beyond "Apple shoulda said".  On another Apple-focused site there's a claim of proof Apple stopped the program a few years ago because they stopped disclosing it a couple iOS gens and TOS's back. 
    edited August 2019
  • Reply 10 of 24
    22july2013 Posts: 3,219 (member)
    Apple could pay people to be willing voice control subjects. If they had the money. Oh, wait.
  • Reply 11 of 24
    gatorguy Posts: 23,510 (member)
    Apple could pay people to be willing voice control subjects. If they had the money. Oh, wait.
    That wouldn't fill the need. Voice services need a lot of correction still, and a few thousand voice snippets or even a million won't get them there. 
  • Reply 12 of 24
    techno Posts: 735 (member)
    Maybe they could improve Siri on the HomePod to stop saying "uh..mmm" out of the blue or just negatively answering unasked questions.
  • Reply 13 of 24
    LOL “Siri, the pioneering intelligent assistant”, cannot even handle 2 levels deep of interaction, Siri is by FAR the most “stupid” “intelligent” assistant, that’s why Apple “invented” Shortcuts to make the impression of “AI” by making scripts for Siri because it cannot handle multi level request! What is Apple doing with their 231 billion$ on their bank account? Ref Apple Statement / https://www.apple.com/newsroom/2019/08/improving-siris-privacy-protections/
    edited August 2019
  • Reply 14 of 24
    The key words are “Users will be able to opt in”.  

    I’m assuming drug dealers won’t opt in.  Other users are now made aware of the Siri recording process and should realize if they opt in they might want to turn on “airplane mode” when entering confidential meetings (like doctor, lawyer, or business related).

    Intimate encounters will probably still get recorded, because the participants are usually a bit distracted...

     ¯\_(ツ)_/¯ 

    ......after further review not many people worship Siri so my concerns are overblown.

    On a side note, my new cult of Osiris (that I started 10 minutes ago) isn’t finding much interest... Thanks Apple!

    edited August 2019
  • Reply 15 of 24
    elfig2012 said:
    LOL “Siri, the pioneering intelligent assistant”, cannot even handle 2 levels deep of interaction, Siri is by FAR the most “stupid” “intelligent” assistant, that’s why Apple “invented” Shortcuts to make the impression of “AI” by making scripts for Siri because it cannot handle multi level request! What is Apple doing with their 231 billion$ on their bank account? Ref Apple Statement / https://www.apple.com/newsroom/2019/08/improving-siris-privacy-protections/
    Furiously buying back AAPL stock.  It's why they only have a little over $100 billion in the bank instead of $230 billion.  ;)  
    All of the assistants are left wanting in one way or another.  Siri, according to the most recent Loup Ventures survey test, is the 2nd smartest assistant... based on the criteria used in the test.  Make of that what you will.  
  • Reply 16 of 24
    StrangeDays Posts: 12,325 (member)
    Disappointing.

    "will continue to use computer-generated transcripts"
    "only Apple employees will be allowed to listen to audio samples"
    "work to delete"

    Jargon I would have expected from other companies.  Not Apple.
    It's not at all. Computer-generated transcripts are a thing. Apple employees-only is a thing, and a change.  Gruber has a much more sensible summary of the changes here than yours:

    https://daringfireball.net/2019/08/apple_siri_privacy

    Of note, the simple-language FAQ:

    https://support.apple.com/en-us/HT210558

  • Reply 17 of 24
    StrangeDays Posts: 12,325 (member)

    These are good changes. It’s too bad it took Apple being embarrassed by a salacious news story for it to happen.
    Mistakes happen, now the policies are refactored and made even more clear to users.

    More curious to me is why some people work so hard to find negative angles to every story. I would so love to know what all these wonderful critics do that their endeavors are eternally successful and free of mistakes or lessons learned.
  • Reply 18 of 24
    lkrupp Posts: 10,326 (member)
    Disappointing.

    "will continue to use computer-generated transcripts"
    "only Apple employees will be allowed to listen to audio samples"
    "work to delete"

    Jargon I would have expected from other companies.  Not Apple.
    Remember, you will have to opt-in to have your recordings reviewed at all. By default they will not be. But then you will be back here complaining about how Siri isn’t improving its understanding of your voice. This privacy crap is a two edged sword you know.
  • Reply 19 of 24
    lkrupp Posts: 10,326 (member)


    These are good changes. It’s too bad it took Apple being embarrassed by a salacious news story for it to happen.
    Mistakes happen, now the policies are refactored and made even more clear to users.

    More curious to me is why some people work so hard to find negative angles to every story. I would so love to know what all these wonderful critics do that their endeavors are eternally successful and free of mistakes or lessons learned.
    This is Apple we’re talking about. You know, the one with the giant target on its back. And half of the comments here are from people who neither own nor use Apple products but make it their life’s work to spin anything negative about the company. With the vitriol and venom they spew you have to question whether they are really users of the gear, and if they are why do they continue to use the products.
  • Reply 20 of 24
    lkrupp said:
    Remember, you will have to opt-in to have your recordings reviewed at all. By default they will not be. 
    This is incorrect.  If you opt-in, your Siri recordings can be listened to by an Apple employee.  If you opt-out Apple will still keep a computer-generated transcription of your Siri recording for review.  From the Apple FAQ: https://support.apple.com/en-us/HT210558
    Why do you keep transcripts for customers who do not opt in?
    Computer generated transcripts are used to improve Siri and its reliability. These transcripts are used in machine learning training to improve Siri, determine common usage patterns, and update language and understanding models. The transcripts may also be used to resolve critical problems that affect Siri reliability.

    The only way to ensure your Siri interactions are not recorded or transcribed is to disable Siri and Dictation in Settings.

