Amazon, Google follow Apple's lead on voice assistant review policies

Posted in General Discussion
Following Apple's decision to temporarily halt Siri grading as it evaluates the program's privacy safeguards, Amazon and Google this week followed suit and updated their respective policies on human reviews of recorded voice assistant audio.

Amazon Echo


Apple on Thursday suspended its Siri grading program, which seeks to make the virtual assistant more accurate by having workers review snippets of recorded audio, after a contractor raised privacy concerns about the quality control process.

Now, Apple's competitors in the space, namely Google and Amazon, are making similar moves to address criticism about their own audio review policies.

Shortly after Apple's announcement, Google in a statement to Ars Technica on Friday said it, too, halted a global initiative to review Google Assistant audio. Like Siri grading, Google's process runs audio clips by human operators to enhance system accuracy.

Unlike Apple's Siri situation, however, a contractor at one of Google's international review centers leaked 1,000 recordings to VRT NWS, a news organization in Belgium. In a subsequent report in July, the publication claimed it was able to identify people from the audio clips, adding that a number of snippets were of "conversations that should never have been recorded and during which the command 'OK Google' was clearly not given."

The VRT leak prompted German authorities to investigate Google's review program and impose a three-month ban on the transcription of voice assistant recordings.

"Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars Technica.

Google did not divulge the halt to global reviews until Friday.

Amazon is also taking steps to temper negative press about its privacy practices and on Friday rolled out a new Alexa option that allows users to opt out of human reviews of audio recordings, Bloomberg reports. Enabling the feature in the Alexa app excludes recorded audio snippets from analysis.

"We take customer privacy seriously and continuously review our practices and procedures," an Amazon spokeswoman said. "We'll also be updating information we provide to customers to make our practices more clear."

Amazon came under fire in April after a report revealed the company records, transcribes and annotates audio clips recorded by Echo devices in an effort to train its Alexa assistant.

While it may come as a surprise to some, human analysis of voice assistant accuracy is common practice in the industry; it is up to tech companies to anonymize and protect that data to preserve customer privacy.

Apple's method is outlined in a security white paper (PDF link) that notes the company ingests voice recordings, strips them of identifiable information, assigns a random device identifier and saves the data for six months, over which time the system can tap into the information for learning purposes. Following the six-month period, the identifier is erased and the clip is saved "for use by Apple in improving and developing Siri for up to two years."
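
To make that retention schedule concrete, here is a minimal, purely hypothetical sketch of the flow the white paper describes; the names, types and durations below are illustrative assumptions, not Apple's actual implementation: strip identifying information, file the clip under a random identifier, erase that identifier after roughly six months, and delete the clip after two years.

    # Hypothetical sketch of the retention flow described above -- illustrative only, not Apple's code.
    import uuid
    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone
    from typing import Optional

    IDENTIFIER_RETENTION = timedelta(days=183)   # ~six months with a random device identifier attached
    CLIP_RETENTION = timedelta(days=730)         # clip kept for up to two years in total

    @dataclass
    class StoredClip:
        audio: bytes
        device_identifier: Optional[str]   # random identifier, not tied to the user's account
        stored_at: datetime

    def ingest(audio: bytes) -> StoredClip:
        """File an incoming clip under a freshly generated random identifier."""
        return StoredClip(audio=audio,
                          device_identifier=str(uuid.uuid4()),
                          stored_at=datetime.now(timezone.utc))

    def age_out(clip: StoredClip, now: datetime) -> Optional[StoredClip]:
        """Erase the identifier after six months; drop the clip entirely after two years."""
        age = now - clip.stored_at
        if age > CLIP_RETENTION:
            return None                      # clip deleted outright
        if age > IDENTIFIER_RETENTION:
            clip.device_identifier = None    # identifier erased; audio kept for Siri improvement
        return clip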

Apple does not explicitly mention the possibility of manual review by human contractors or employees, nor does it currently offer an option for Siri users to opt out of the program. The company will address the latter issue in a future software update.

Comments

  • Reply 1 of 24
    gatorguy Posts: 24,153 member
    Odd wording and you have the timeline wrong anyway.
    "Google did not divulge the halt to global reviews until Friday" is incorrect.

    On Thursday the German Data Protection Commissioner announced that Google had previously paused its program and additionally would now be barred from running it for the next three months. In the same statement the German agency recommended that Apple and Amazon follow Google's lead.

    "In 
    a statement released Thursday, (and due to time zone differences I believe Wednesday here in the US) Germany’s data protection commissioner said the country was investigating that contractors listen to audio captured by Google’s AI-powered Assistant to improve speech recognition. In the process, according to the reports, contractors found themselves listening to conversations accidentally recorded by products like the Google HomeIn the statement, the German regulator writes that other speech assistant providers, including Apple and Amazon, are “invited” to “swiftly review” their policies. 

    Nothing may come of it, but common business practices may not be acceptable going forward, and certainly more transparency will be required.

    Of some note, Ireland had already been looking at Apple and Siri due to a citizen complaint, but Apple's response at the time was that no disclosure was necessary since the recordings were anonymised. That defense tactic has probably now changed.
  • Reply 2 of 24
    chasm Posts: 3,267 member
    Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.

    Interesting that you're so quick to defend Google that you'd make a careless error like that.

    Also unchanged: Apple among the three companies was the only one that was always anonymizing its voice clips before all this controversy even started, as per their white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, they identify themselves in the recording, or it's obviously a famous person with a distinctive voice, etc.), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.
  • Reply 3 of 24
    Apple had no choice but to stop doing this. This was obvious.

    Good to see the others doing the same. But also a bit unnerving to see that what Apple was doing in this regard was qualitatively no different from what Google and Facebook were. 
  • Reply 4 of 24
    knowitall Posts: 1,648 member
    Such a big FUCK UP.
    'Grading' sounds like 'culling'; unbelievable that this term is used, wow.

  • Reply 5 of 24
    gatorguy Posts: 24,153 member
    chasm said:
    Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.

    Interesting that you're so quick to defend Google that you'd make a careless error like that.

    Also unchanged: Apple among the three companies was the only one that was always anonymizing its voice clips before all this controversy even started, as per their white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, they identify themselves in the recording, or its obviously a famous person with a distinctive voice etc), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.
    Wrong, Chasm. Beginning with the headline, the article is not entirely accurate.

    For example, The Verge reported Thursday that Google had voluntarily stopped the reviews at some earlier date, which could have been days or weeks before, without specifying whether the pause applied to just the EU (it was never just Germany) or worldwide. Common sense would say worldwide, since it was a contractor problem that needed to be addressed, not a regional one.

    Ars didn't report the story until Friday and added the word "globally", which still doesn't show "Google and Amazon following Apple's lead". Factually it was the German authorities telling Apple and Amazon they should seriously consider following Google's lead. I assume you read what was announced. Saying Google didn't divulge the program pause until Friday is a half-truth at best IMO.

    No, that is not defending Google; it's pointing out that it had nada to do with anything Apple chose to do with their program. Rather than Apple being proactive and everyone else following their lead, Apple for their part was forced into it to avoid a formal inquiry by yet another country's data privacy commissioner, IMO, tho it's reasonable to assume Apple might have done so of their own volition at some point.

    Also Google was not supplying voice snippets for review that were not anonymized. They did not "leave identifying information intact" for contractors. What we don't know is whether Apple was more similar to Amazon and including things like location, gender and device ID for instance, not that it would make those recordings as presented personally identifiable anyway. The leak implies some form of identifying data was attached but Apple has volunteered very little information, nor has Google for that matter. They are both pretty vague and not exactly forthcoming, avoiding comment unless pushed into it. 
  • Reply 6 of 24
    Rayz2016 Posts: 6,957 member
    But why do they need folk to listen to the recordings? I’m a bit unclear on that. 
  • Reply 7 of 24
    chasm said:
    Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.

    Interesting that you're so quick to defend Google that you'd make a careless error like that.

    Also unchanged: Apple among the three companies was the only one that was always anonymizing its voice clips before all this controversy even started, as per their white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, they identify themselves in the recording, or its obviously a famous person with a distinctive voice etc), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.
    I think you're wrong on both counts. Read the headline. It says "Google follows Apple's lead..." That's not true. They didn't. There's no way to misinterpret that.
    Even from the article: "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars Technica.  AI changed the context of that quote when they changed the attribution. 
    The quote from the Ars article actually reads:
    "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars today.  They had already paused the reviews when they spoke to Ars on Friday.

    You're wrong about the claim that Apple was the only one anonymizing the voice clips. Google was anonymizing clips that were listened to as well. Google also made the storing of audio clips opt-in, and even if you opt in you can opt out at any time. If you've opted in, you can set your account to auto-delete recordings every 3 or 18 months. You can also manually delete them at any time. From the earlier Ars article on 11 Jul:
    https://arstechnica.com/information-technology/2019/07/google-defends-listening-to-ok-google-queries-after-voice-recordings-leak/

    In case you misunderstand my intent, I'm not defending Google.  I am countering your misinformation.
  • Reply 8 of 24
    gatorguy Posts: 24,153 member
    chasm said:
    Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.

    Interesting that you're so quick to defend Google that you'd make a careless error like that.

    Also unchanged: Apple among the three companies was the only one that was always anonymizing its voice clips before all this controversy even started, as per their white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, they identify themselves in the recording, or its obviously a famous person with a distinctive voice etc), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.
    I think you're wrong on both accounts.  Read the headline.  It says "Google follows Apple's lead... "  That's not true.  They didn't.  There's no way to misinterpret that. 
    Even from the article: "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars Technica.  AI changed the context of that quote when they changed the attribution. 
    The quote from the Ars article actually reads:
    "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars today.  They had already paused the reviews when they spoke to Ars on Friday.

    You're wrong about the claim that Apple was the only one anonymizing the voice clips.  Google was anonymizing clips that were listened to as well.  Google also made the storing of audio clips opt-in and even if you do you can opt-out at any time.  If you've opted-in, you can set your account to auto-delete every 3 or 18 months.  You can also manually delete them at any time.   From the earlier Ars article on 11 Jul https://arstechnica.com/information-technology/2019/07/google-defends-listening-to-ok-google-queries-after-voice-recordings-leak/

    In case you misunderstand my intent, I'm not defending Google.  I am countering your misinformation.
    Actually better explained than I did, @1STnTENDERBITS

    There almost appears to be a concerted spin effort, since in general each of the articles here at AI these past few days concerning Apple's review program has contained bits of misinformation, inaccuracies or misleading statements; too many to be coincidental IMO, but perhaps they are just simple mistakes. I know it's not typical of AI writers to make such obvious ones, though.

  • Reply 9 of 24
    avon b7 Posts: 7,593 member
    Rayz2016 said:
    But why do they need folk to listen to the recordings? I’m a bit unclear on that. 
    At least one of the reasons is to diagnose accidental activations. In those cases a human is probably better suited to discover what the 'machine' mistakenly thought was its service activator prompt. 
  • Reply 10 of 24
    Rayz2016 said:
    But why do they need folk to listen to the recordings? I’m a bit unclear on that. 
    Piggybacking on Avon B7's comment about accidental activations, humans listening to the recording are helpful when the Assistants misunderstand what's being said.   With all the accents and dialects that are spoken, there are going to be times the assistants don't understand the command or query.  Humans can better decipher the meaning and that info can be fed back into the knowledge base to increase accuracy.

    As many have said, there are very legitimate reasons for human listening.  There just has to be improvement.  First thing, they need to get rid of the 3rd party outsourcing and use in-house employees only.  Yeah, it's going to cost more for them but none of these companies are hurting for a dollar.  Second, implement controls in the work environment that make stealing data a difficult process.  Third, in plain language inform customers there's a chance their interactions can be recorded and used for blah, blah, blah.  Third B - a pop up that makes participation OPT-IN not opt out.

    Not high priority, but I think they should all use raw numbers when reporting to the public. A vague "less than 1%" or "approximately 0.2%" doesn't really paint an illuminating picture for customers using these services. What raw number represents less than 1% of interactions per day x 365 days per year? 1,000 per day? 100,000? A million?
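
    A quick back-of-the-envelope example of why raw numbers matter (the daily volume here is invented purely for illustration):

        requests_per_day = 500_000_000   # hypothetical daily query volume for a voice assistant
        reviewed_share = 0.002           # "approximately 0.2%"
        clips_per_year = requests_per_day * reviewed_share * 365
        print(f"{clips_per_year:,.0f}")  # 365,000,000 reviewed clips a year from a "tiny" percentage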
  • Reply 11 of 24
    Rayz2016 said:
    But why do they need folk to listen to the recordings? I’m a bit unclear on that. 
    Another reason for people to listen is to understand what was really being asked when the language was not interpreted correctly, or when misheard, low-volume or mumbled words that define the command are missed and not accurately reflected in a transcript. I often tell Siri things with prefixes such as "Hey Siri, here is some FEEDBACK for anyone who might hear this: I asked Siri to play a specific artist from my iTunes library (that is not available in Apple Music/iTunes Store) and the only action she ever takes is to play some unrelated song and artist; however she can recognize it if I ask by a song title, but then she only plays the one song..." All in the hope that someone is searching transcripts for "feedback". If they don't, they should, as it would be much simpler to speak feedback and bug reports at the point of contention, rather than filing a bug/feedback report online or elsewhere.
  • Reply 12 of 24
    rob53 Posts: 3,239 member
    maxperts said:
    Rayz2016 said:
    But why do they need folk to listen to the recordings? I’m a bit unclear on that. 
    Another reason for people to listen is to understand what was really being asked if the language was not interpreted correctly, or the misheard/low volume/mumbled words that are missed define what the command is and those are not necessarily/accurately reflected in a transcript. I often tell Siri things with prefixes such as "Hey Siri, here is some FEEDBACK for anyone who might hear this: I asked Siri to play a specific artist from my iTunes library (That is not available in AppleMusic/iTunes Store) and the only action she ever takes is to play some unrelated song and artist, however she can recognize it if I ask by a song title, but then she only plays the one song..." All in the hope that someone is searching for transcripts with "feedback". If they don't, they should, as it would be much simpler to speak feedback and bug reports at the point of contention, rather than filing a bug/feedback report online or elsewhere.
    I didn't realize we were already in the age of computers knowing how to do everything. Computers need to be programmed, and the only way to know if that programming works is to, as you've stated, check what Siri and the others are actually doing. I have to talk slowly and enunciate my words to make sure Siri gets my messages correct. When it speaks a message back on my Alpine head unit, maybe 50% of the time it gets it right. I have to record the entire message again (no editing on the Alpine that I can see) and it can get that wrong as well. I've been using it a lot, so I would think it should have started to understand my voice. I am a native English speaker on the west coast, so no real accent. If Apple never checks to see if Siri is understanding what's being asked, it will get to the point where nobody will use it.
  • Reply 13 of 24
    Gaby Posts: 190 member
    A rather curious turn of events, considering both Google and Amazon had been reported on many months ago, and yet at the time they felt no compulsion to halt or review their respective systems of AI/voice review. All I have to say is
    Baaaaa!!!! 🐑 
  • Reply 14 of 24
    gatorguy Posts: 24,153 member
    Gaby said:
    A rather curious turn of events considering both google and amazon had been reported on many months ago and yet at the time they felt no compulsion to halt or review their respective systems of ai/voice review. All I have to say is
    Baaaaa!!!! 🐑
    @Gaby
    You didn't actually read the article, did you?

    Google stopped human reviews some time back, shortly after the initial stories that a contractor had stolen several hundred recordings and given them to a news organization. If anything, be surprised that Apple wasn't more forthcoming that they were doing the same thing, but would stop doing so and review the program. That would have been proactive.

    It's not as tho Amazon and Google had already been "discovered" and received negative press, while Apple remained silent and carried on. 
  • Reply 15 of 24
    gatorguy said:
    chasm said:
    Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.

    Interesting that you're so quick to defend Google that you'd make a careless error like that.

    Also unchanged: Apple among the three companies was the only one that was always anonymizing its voice clips before all this controversy even started, as per their white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, they identify themselves in the recording, or its obviously a famous person with a distinctive voice etc), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.
    I think you're wrong on both accounts.  Read the headline.  It says "Google follows Apple's lead... "  That's not true.  They didn't.  There's no way to misinterpret that. 
    Even from the article: "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars Technica.  AI changed the context of that quote when they changed the attribution. 
    The quote from the Ars article actually reads:
    "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars today.  They had already paused the reviews when they spoke to Ars on Friday.

    You're wrong about the claim that Apple was the only one anonymizing the voice clips.  Google was anonymizing clips that were listened to as well.  Google also made the storing of audio clips opt-in and even if you do you can opt-out at any time.  If you've opted-in, you can set your account to auto-delete every 3 or 18 months.  You can also manually delete them at any time.   From the earlier Ars article on 11 Jul https://arstechnica.com/information-technology/2019/07/google-defends-listening-to-ok-google-queries-after-voice-recordings-leak/

    In case you misunderstand my intent, I'm not defending Google.  I am countering your misinformation.
    Actually better explained than I did @1STnTENDERBITS 

    There almost appears to be a concerted spin effort since in general each of the articles these past days here at AI concerning Apple's review program have contained bits of misinformation, inaccuracies or misleading statements, too many to be coincidental IMO but perhaps they are just simple mistakes. I know it's not typical of AI writers to make such obvious ones though.
    Wow! You guys are TOO MUCH! "Separately, Google today confirmed that it recently "paused" human reviews of Google Assistant queries worldwide." What part of TODAY do you guys not get? Google (they say) started the pause earlier, but didn't announce it until TODAY, after Apple announced their pause. I suppose if you were a purist about it, you'd rewrite the headline "Amazon, Google follow Apple's lead and CONFIRM voice assistant review policies".

    There almost appears to be a concerted spin effort on your parts. You certainly never asked why they waited so long to "confirm"....
  • Reply 16 of 24
    gatorguy Posts: 24,153 member
    sacto joe said:
    gatorguy said:
    chasm said:
    Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.

    Interesting that you're so quick to defend Google that you'd make a careless error like that.

    Also unchanged: Apple among the three companies was the only one that was always anonymizing its voice clips before all this controversy even started, as per their white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, they identify themselves in the recording, or its obviously a famous person with a distinctive voice etc), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.
    I think you're wrong on both accounts.  Read the headline.  It says "Google follows Apple's lead... "  That's not true.  They didn't.  There's no way to misinterpret that. 
    Even from the article: "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars Technica.  AI changed the context of that quote when they changed the attribution. 
    The quote from the Ars article actually reads:
    "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars today.  They had already paused the reviews when they spoke to Ars on Friday.

    You're wrong about the claim that Apple was the only one anonymizing the voice clips.  Google was anonymizing clips that were listened to as well.  Google also made the storing of audio clips opt-in and even if you do you can opt-out at any time.  If you've opted-in, you can set your account to auto-delete every 3 or 18 months.  You can also manually delete them at any time.   From the earlier Ars article on 11 Jul https://arstechnica.com/information-technology/2019/07/google-defends-listening-to-ok-google-queries-after-voice-recordings-leak/

    In case you misunderstand my intent, I'm not defending Google.  I am countering your misinformation.
    Actually better explained than I did @1STnTENDERBITS 

    There almost appears to be a concerted spin effort since in general each of the articles these past days here at AI concerning Apple's review program have contained bits of misinformation, inaccuracies or misleading statements, too many to be coincidental IMO but perhaps they are just simple mistakes. I know it's not typical of AI writers to make such obvious ones though.
    Wow! You guys are TOO MUCH! "Separately, Google today confirmed that it recently "paused" human reviews of Google Assistant queries worldwide." What part of TODAY do you guys not get? Google (they say) started the pause earlier, but didn't announce it until TODAY, after Apple announced their pause. I suppose if you were purist about it, you'd rewrite the headline "Amazon, Google follow Apple's lead and CONFIRM voice assistant review policies".

    There almost appears to be a concerted spin effort on your parts. You certainly never asked why they waited so long to "confirm"....
    Google and the German authority both discussed it .... THURSDAY. That was before any announcement from Apple. You're not reading. Ars just didn't get around to the story until Friday. Others had already written articles the day before, and FWIW AI's writer chose Ars as the source rather than one of the earlier blogs.

    Perhaps more accurately the AI author didn't know until Friday, that's possible. Many others already knew. The article as written is misleading IMHO. ;)
    In truth it was Apple and Amazon following Google's lead, with a little prodding from the Germans. 

    As for not wondering why Google took so long to discuss the review pause, it might be for the same reason Apple chose not to say a word about doing the same kind of human reviews, and continued doing so even after the negative press surrounding Google and Amazon being discovered. The less said the better: say nothing and admit to nothing unless pressured to do so.
  • Reply 17 of 24
    AppleExposed Posts: 1,805 unconfirmed, member
    *wink* *wink*
  • Reply 18 of 24
    Rayz2016 Posts: 6,957 member
    But why do they need folk to listen to the recordings? I’m a bit unclear on that. 
    Okay, got that. Thanks, folks.

    I have to agree, I think the headline is Gaitorbait, mainly because it says something different to what is written in the article.

    Google did not divulge the halt to global reviews until Friday.

    But

    Google has declared to the HmbBfDI in the course of these administrative proceedings that transcriptions of voice recordings will no longer be carried out at present and for at least three months from 1 August 2019. This covers all of the EU. In this respect, the competent authorities for other providers of speech assistance systems, such as Apple or Amazon, are invited to also swiftly review the implementation of appropriate measures.

    This appeared on the 1st August, which I think was the Thursday.

    Still, there seem to be a few details that have been lost in the media rush to attract clicks.

    Google suffered a data breach.
    Apple did not.

    In Google's case, "thousands of recordings" were leaked to the press.
    https://arstechnica.com/information-technology/2019/07/google-defends-listening-to-ok-google-queries-after-voice-recordings-leak/

    In Apple's case, a contractor claimed to have heard people having sex and people making drug deals over the sound of a car engine. But as far as I've read, he did not provide any recordings to back up his claims.
    https://arstechnica.com/gadgets/2019/07/siri-records-fights-doctors-appointments-and-sex-and-contractors-hear-it/

    Loath as I am to accept any claim without evidence, I'm going to assume he's an upstanding citizen and that he's talking to the press out of an innate sense of altruism, though that still doesn't really answer the question as to why he didn't provide any evidence.
    He also claims that accidental activations on the Apple Watch are alarmingly regular. This doesn't surprise me, since Siri on the Apple Watch can be activated by raising your wrist. Not sure that counts as an accidental activation if you've deliberately set the watch to behave that way.

    Others seem to think that the problem is hiring outside contractors. Again, I'm not sure that going completely in-house would fix the problem. I have seen little evidence that Apple employees are less likely to leak information than contractors.

    https://www.businessinsider.com/apple-fires-iphone-x-engineer-viral-video-2017-10?r=US&IR=T

    Past behaviour is no guarantee of future conduct.

    So no recordings were leaked. Is Apple hiring a better class of contractor, or does Apple have a system in place that ties the contractor to a recording, making capturing recordings and releasing them to the press a much riskier proposition?

    I'm also guessing that they will continue to store these snippets (with the user's explicit consent this time) in the future, but right now they're looking at ways to make it less likely these recordings are leaked. 

    In any case, those of us who use Apple gear should be glad that he has spoken out, because he's highlighted something that could possibly turn into a serious data breach in the future. Apple has staked its reputation on respecting its customers' privacy; Google has not.

    Not allowing customers to opt out? C'mon Apple! Even Google allowed this! (though Google has been known to ignore what their users have requested, and run roughshod over their privacy regardless).

    If Apple holds itself to a higher privacy standard than Google, then I think Apple's more knowledgeable customers should do the same.
  • Reply 19 of 24
    Rayz2016 Posts: 6,957 member
    Heh.

    “There have been countless instances of recordings featuring […] sexual encounters and so on,”

    The wake word is the phrase “hey Siri,” but the anonymous source said that it could be activated by similar-sounding words or with the noise of a zipper.

    Yeah, okay, I think I see what happened there …
  • Reply 20 of 24
    StrangeDays Posts: 12,821 member
    gatorguy said:
    chasm said:
    Gatorguy: I believe you misread the article and jumped to conclusions. Mikey's article clearly states that Google did not reveal that they had paused the reviews globally until Friday. Your statement doesn't contradict that at all -- it refers exclusively to the pausing of audio review in Germany.

    Interesting that you're so quick to defend Google that you'd make a careless error like that.

    Also unchanged: Apple among the three companies was the only one that was always anonymizing its voice clips before all this controversy even started, as per their white paper. Anonymizing is not 100 percent foolproof against identifying someone (for example, they identify themselves in the recording, or its obviously a famous person with a distinctive voice etc), but it was and is better than what was previously the policy at Google and Amazon, which left identifying information intact.
    I think you're wrong on both accounts.  Read the headline.  It says "Google follows Apple's lead... "  That's not true.  They didn't.  There's no way to misinterpret that. 
    Even from the article: "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars Technica.  AI changed the context of that quote when they changed the attribution. 
    The quote from the Ars article actually reads:
    "Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate. This paused reviews globally," Google told Ars today.  They had already paused the reviews when they spoke to Ars on Friday.

    You're wrong about the claim that Apple was the only one anonymizing the voice clips.  Google was anonymizing clips that were listened to as well.  Google also made the storing of audio clips opt-in and even if you do you can opt-out at any time.  If you've opted-in, you can set your account to auto-delete every 3 or 18 months.  You can also manually delete them at any time.   From the earlier Ars article on 11 Jul https://arstechnica.com/information-technology/2019/07/google-defends-listening-to-ok-google-queries-after-voice-recordings-leak/

    In case you misunderstand my intent, I'm not defending Google.  I am countering your misinformation.
    There almost appears to be a concerted spin effort since in general each of the articles these past days here at AI concerning Apple's review program have contained bits of misinformation, inaccuracies or misleading statements, too many to be coincidental IMO but perhaps they are just simple mistakes. I know it's not typical of AI writers to make such obvious ones though.
    Ahh yes, the google guy is suggesting AI staff are peddling an anti-google narrative. Keep dropping those FUD pellets, fella! 