First class action suit alleging damages from Siri recordings filed
A class action lawsuit has been filed against Apple over recordings made for the Siri quality control program, alleging that the filer, who may or may not have been recorded, was damaged by Apple's conduct, and that Apple never disclosed that recordings may be retained.

Filed on August 7, the suit arises from what it calls Apple's "unlawful and intentional recording of individuals' confidential communications without their consent" from late 2011 to the present day. The suit alleges that this violates several California laws, including the California Invasion of Privacy Act.
California law prohibits the recording of verbal communications without the consent of all parties involved. The suit claims that users give consent only when "Hey Siri" has been explicitly said or a gesture has been performed, and that any other recording is non-consensual. The lawsuit also alleges that there is reasonable concern that government agencies are looking to voiceprint the populace by recording and storing voice samples, even though Apple anonymizes recordings and keeps them separate from identifying iCloud account data.
Puzzlingly, the suit specifically spells out that Apple has in fact informed users that voice samples may be passed to Apple or its associates. The suit also notes that the possibility that Siri recordings might be examined for quality control purposes has been stated in the iOS terms of use, and is presently noted on the privacy pages that Apple maintains.
The class action suit is currently seeking a court ruling that Apple has violated California's privacy laws and civil code. Additionally, it demands that Apple must delete all recordings of class members and take measures to prevent further recording of the class members without their consent.
Apple suspended the Siri quality control program five days before the suit was filed. Apple has also said that once it is reinstated in a future update, Apple will allow users to opt out.
The filers are also seeking nominal, statutory, and punitive damages where applicable, with interest. To be awarded damages, each individual member of the class would be required to prove they were recorded without their consent and suffered financial damage as a result.
Anyone who wishes to have their voice recordings deleted is able to do so, as we have detailed here. The number of recordings retained for quality assurance purposes is very low, so the odds are that any given user has no retained data in the first place.

Comments
I think we would all agree this case would be better leveled against Google and Amazon, since it has been shown they were outright recording conversations.
Best to find a slower ambulance to chase.
Apple said it might review the data for quality control purposes.
This is going nowhere.
I swear, Apple could give money to the poor, or find a cure for cancer, and some AH would sue them over it.
"When you use Siri or Dictation, the things you say will be recorded and sent to Apple in order to convert what you say into text and to process your requests. Your device will also send Apple other information, such as your first name and nickname; the names, nicknames, and relationship with you (e.g., “my dad”) of your address book contacts; and song names in your collection (collectively, your “User Data”). All of this data is used to help Siri and Dictation understand you better and recognize what you say. It is not linked to other data that Apple may have from your use of other Apple services. By using Siri or Dictation, you agree and consent to Apple’s and its subsidiaries’ and agents’ transmission, collection, maintenance, processing, and use of this information, including your voice input and User Data, to provide and improve Siri, Dictation, and dictation functionality in other Apple products and services."
Collection and use are in the last sentence.
Sometimes that "key phrase" that's supposed to trigger the voice request was never actually uttered. It was a service failure which the companies are doing what they can to minimize, if not eliminate entirely. Other times their services just misunderstand a legitimate request, again something all three want to eliminate as much as they can. FWIW, Microsoft is another tech company dragged into this today with the revelation that humans are listening to select Skype and Cortana voice recordings in order to improve them.
AI (!) may be a thing now but it's going to be a long time if ever before humans aren't better at deciphering what humans meant or said.
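For a rough picture of the false-trigger failure described in the comment above: wake-word detectors continuously score incoming audio against a threshold, and acoustically similar sounds occasionally land above it. The sketch below is purely illustrative and reflects no real detector; the scoring function, the 0.8 threshold, and the false-positive rate are all invented.

```swift
import Foundation

struct WakeWordDetector {
    let threshold: Double = 0.8

    /// Placeholder scorer: a real detector runs a small acoustic model that
    /// rates how closely a frame resembles the wake phrase. Here we fake
    /// mostly-low scores with the rare near-miss that similar-sounding
    /// audio also produces in practice.
    func score(_ audioFrame: [Float]) -> Double {
        Double.random(in: 0...1) < 0.001 ? 0.9 : Double.random(in: 0...0.5)
    }

    func shouldActivate(_ audioFrame: [Float]) -> Bool {
        score(audioFrame) > threshold
    }
}

// Even a very low per-frame false-positive rate, applied to the thousands
// of frames a device hears per day, yields occasional spurious activations.
let detector = WakeWordDetector()
let frames = Array(repeating: [Float](), count: 10_000)
let spurious = frames.filter { detector.shouldActivate($0) }.count
print("Spurious activations in 10,000 frames: \(spurious)")
```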
Then I can hear how many times I ended my commands with bitch.
Maybe the plaintiffs should have tried Judge Judy first before trying to sue in San Jose.
It's going to be super hard to prove they were harmed, because they can't prove whether or not their voices were recorded without their permission.
"Apple will be relinquishing some control of these recordings to companies outside of Apple so that their 3rd party employees can listen and transcribe what was said to your Apple device"
All the companies, Amazon, Apple, Google, Microsoft, and more should be far more transparent about how user data is handled and shared. I suspect there's a whole lot of stuff not actually disclosed but technically permitted by wording in ToS. Hands may not be nearly as clean as we like to think they are.
I suspect the lawyers are hoping that failure to explicitly invoke the "Hey Siri" phrase means that they were not intentionally using Siri.
In my case, my understanding of "record" meant the audio was being stored to media such as an HDD and kept for later retrieval. You can also record (digitize) lots of stuff that cannot always be replayed at a later date if it sits in temporary storage like DRAM. As Apple pointed out, they record (digitize) audio to process the request, but that does not specifically mean they will always maintain an original copy of the information (replayable at a later date). You may want to infer this, but it's not clear Apple is retaining the information, i.e. that there is a large server farm of voice recordings of everyone who ever invoked the "Hey Siri" prompt. Apple said they only do it to improve Siri, so I do not believe Apple is storing every word spoken to Siri, just when they detect an error. Apple also pointed out that if you tell Siri to play a song on your device, the content of that request never leaves your device.
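The digitize-versus-store distinction the comment above draws can be pictured with a short sketch. This is not Apple's actual pipeline; every name here (AudioBuffer, VoicePipeline, persistToDisk) and the 2% sampling rate are hypothetical, for illustration only.

```swift
import Foundation

struct AudioBuffer {
    var samples: [Int16]  // PCM audio held in RAM; gone once the buffer goes out of scope
}

enum VoicePipeline {
    /// Digitize-and-process path: the audio exists only transiently in memory
    /// while the request is handled, so nothing is replayable afterward
    /// unless it is explicitly persisted.
    static func handleRequest(_ buffer: AudioBuffer) -> String {
        let transcript = transcribe(buffer)
        // Hypothetical: only a small sampled fraction is kept for quality review.
        if shouldRetainForReview(samplingRate: 0.02) {
            persistToDisk(buffer)  // "recording" in the stored, replayable sense
        }
        return transcript
    }

    static func transcribe(_ buffer: AudioBuffer) -> String {
        // Placeholder for a speech-to-text model.
        "transcribed text"
    }

    static func shouldRetainForReview(samplingRate: Double) -> Bool {
        Double.random(in: 0..<1) < samplingRate
    }

    static func persistToDisk(_ buffer: AudioBuffer) {
        // Placeholder: write samples to durable storage for later human review.
    }
}
```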
Keep in mind the lawsuit is claiming Apple is recording (digitizing and storing) conversations outside of invoking "Hey Siri". I have pointed this out before in other posts: I wish Apple would figure out why my HomePod will all of a sudden begin trying to answer questions from a conversation going on around it, even though no one ever said "Hey Siri" or anything close to it. Something is triggering the system to digitize the audio and send it to Apple. This is a good example of it recording (digitizing) conversations and most likely storing them for later retrieval and analysis. In this case I am not using Siri dictation, so I did not give Apple permission; also, Apple's language does not say that the mere fact of having a system capable of "Hey Siri" in your presence automatically grants them permission to record.
California law addresses this specific case: you are not allowed to record (digitize and/or store) people's conversations without all parties' specific consent. I did not read the law, but I would imagine it only applies in cases where someone believes they have a reasonable expectation of privacy, i.e. not in public spaces.
I believe this is a drive-by lawsuit; the people behind it have no evidence of any wrongdoing, and they sue on the mere belief that the consent requirement of the law was violated. They are fishing for any information to back their case, and if the Court allows this, their discovery requests will demand all retained records, and they will look for the one example where something was recorded without "Hey Siri" being invoked.
Meanwhile, Equifax gets a slap on the wrist.
What I don't understand is why anyone would think they weren't doing this as a QA check to improve the product.