Apple suspends Siri quality control program, will let users opt out in update


Comments

  • Reply 21 of 35
    bobolicious Posts: 1,201 member
    ...I've long asked whether a Siri-less HomePod with manual EQ and an external input at a lower cost would appeal, as well as offering a satellite stereo speaker...  Software opt-ins are only as good as the code, and can also obsolesce quickly...
  • Reply 22 of 35
    gatorguy Posts: 24,769 member
    s07 said:
    Has the whistleblower thought of all the people who were sent home today and are now out of work?
    Apple said it was suspending, not ending. I’m sure it will find other things for these people to do in the meantime.
    They aren't Apple employees. This would affect the outside third-party contractors who were being used to listen to and transcribe Siri recordings. IMO it's doubtful that Apple would "find something else for them to do".
    edited August 2019
  • Reply 23 of 35
    Soli Posts: 10,038 member
    Rayz2016 said:
    Soli said:
    This wasn't a big issue since the data was already anonymized and yet they're taking even more steps to help ensure user security. I wish more companies acted this way.
    The real problem is that Apple didn’t give folk an opt-out from day one. 

    The irony is that they ran into a privacy problem while trying to ensure Siri didn’t run into a privacy problem. 

    It’ll be interesting to see how they get around this one. The conversational changes coming to Siri are game-changing; shame if they get derailed. 
    I agree about the lack of an opt-out (or, preferably, an opt-in).

    I've mentioned countless times on this forum that I really like how Amazon keeps the user involved with Alexa requests. Now, I don't mean how they handle security, but how the user can access their recordings and see what Alexa thought they said after the wake word was spoken, at which point they can rate the result. I've only ever let Amazon know when it's been particularly bad or wrong (which usually means getting the context wrong, not the actual words spoken). I'd still love for Apple to incorporate this option so you can choose to send a bad result, whatever it may be, to a Siri team, who can then see whether they can tweak the system down the road for better accuracy, not unlike how you can submit incorrect Maps listings to their mapping team.
  • Reply 24 of 35
    dewme Posts: 6,098 member
    gatorguy said:
    By the way, the Apple Watch is being portrayed as the biggest offender for errant and unintended Siri activations.
    Quote:
    "If your phone or HomePod are accidentally triggered, they will start recording whatever it hears, which could be a sensitive conversation or romantic encounter that you probably don’t want on record. According to the whistleblower, Siri regularly activates without the explicit “Hey Siri” command, including something as innocuous as “the sound of a zip” on a jacket...
    If that snippet is then selected as one of the ones used for grading, a contractor could hear it. The possibility of an accidental trigger rises significantly with the Apple Watch, which only needs to be raised to activate Siri. And since it’s always on your person, the probability of Siri inadvertently recording a sensitive conversation is higher than with a phone or a HomePod."

    "Recordings are accompanied by user data showing location, contact details, and app data.” is still completely unclear and neither denied nor confirmed by Apple. It might be the EU that eventually clarifies it. 

    Not to nitpick words in the quote, but how does one raise the possibility of an accidental trigger? Something is either possible or it isn't; what changes is the probability. The general lack of precision, clarity, and quantification of the probability of an individual's personal information or activities being leaked after the anonymization process makes me think the whistleblower is striving for maximum damage through hyperbole rather than a scientific assessment of the actual threat. In reality, the probability of an individual Siri user being impacted by the grading process is likely extremely small. You would need to know the false trigger rate for Siri, the percentage of requests that are selected for grading, the probability that the user is engaged in a compromising activity that is conveyed via the captured audio, the probability that other correlated data (like location) is significant to the leak, and the probability that the captured audio contains personally identifiable information that is traceable to a single user.
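    Just to illustrate how quickly those factors multiply down, here is a rough back-of-envelope sketch. Every number in it is a made-up assumption for illustration (the sub-1% grading share is roughly what Apple has reportedly claimed), not a measured figure:

    ```python
    # Back-of-envelope odds that one user's accidental Siri trigger ends up
    # in front of a human grader AND is both sensitive and identifiable.
    # All inputs below are assumptions for illustration, not Apple figures.
    accidental_triggers_per_day = 2   # assumed false activations per user per day
    p_graded = 0.01                   # assumed share of requests sampled for grading (reportedly under 1%)
    p_sensitive = 0.05                # assumed chance the captured snippet is actually compromising
    p_identifiable = 0.01             # assumed chance the audio/metadata identifies the speaker

    p_per_trigger = p_graded * p_sensitive * p_identifiable
    triggers_per_year = accidental_triggers_per_day * 365
    p_per_year = 1 - (1 - p_per_trigger) ** triggers_per_year

    print(f"Per accidental trigger: {p_per_trigger:.1e}")   # 5.0e-06
    print(f"Over a full year:       {p_per_year:.1e}")      # about 3.6e-03
    ```

    Even with deliberately generous inputs, the yearly odds land in fraction-of-a-percent territory, which is exactly the point.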

    I suppose if you are continuously engaged in compromising activities, perhaps with a constant stream of zipper noises, all while flailing your Apple Watch around wildly, doing a play-by-play announcement of the intimate details of your illicit or private activities, and verbally uttering your full name, SSN, and the names and IDs of those around you, with local weather thrown in for good measure, then there would be an infinitesimally small probability that one of your sessions would be selected for review. But you'd better also hope that your special moment is reviewed by a rogue contractor who is intent on using the captured data and information against you. Hey, it's possible. But it's highly improbable, unless you really like your chances at lottery odds.

    The general public and media outlets all too often overreact to published stories like this one because they don't understand the difference between possibility and probability. Unfortunately, nearly all attempts to explain the difference fall on deaf ears, and companies like Apple are forced to take heavy-handed measures simply to dampen the PR hysteria, even when they know it's pure silliness.
    edited August 2019
  • Reply 25 of 35
    gatorguy Posts: 24,769 member
    dewme said:
    [...] In reality, the probability of an individual Siri user being impacted by the grading process is likely extremely small. [...] The general public and media outlets all too often overreact to published stories like this one because they don't understand the difference between possibility and probability.
    I agree with you. It was a non-story when a contractor stole anonymized recordings from Google, and only a slightly more interesting story when it was discovered that Apple does the same thing, perhaps while connecting some data points à la Amazon. That part does come as a surprise to nearly all of us.

    The "violation of privacy" angle is far overblown IMHO, and EU regulators are going to extremes to try and find some way of penalizing companies for otherwise doing the right thing. No one's personal privacy is realistically being violated tho there might be an argument for poor transparency. Yes a contractor might be able to identify a person by putting in some additional investigational skills but the data itself is not directly connected to a user account when the contractor receives it, at least as far as we know. 
  • Reply 26 of 35
    holyone said:
    The Apple legion is a hilarious bunch. When Google was caught doing the very same thing for the very same reasons, they were lambasted; but when Apple gets caught and subsequently stops, it's not because they found some horrendous oversight in their supposedly customer-first process, but because, like Google, they did the most sensible/deceptive thing in pursuit of a better AI and were caught and exposed, LOL. It's going to be so funny when Apple finally realizes how much data it takes to make a competent AI and has to do a U-turn on this privacy nonsense.
    Was about to write the same - a blatant double standard.
  • Reply 27 of 35
    If an Apple employee hears evidence of a crime while reviewing Siri recordings, are they required to report it to the police?
  • Reply 28 of 35
    gatorguy Posts: 24,769 member
    If an Apple employee hears evidence of a crime while reviewing Siri recordings, are they required to report it to the police?
    Not likely. IF the recordings are anonymized, they would not be of use anyway.
  • Reply 29 of 35
    DAalseth Posts: 3,297 member
    So don't anyone whine about Siri or Google Assistant not getting better. This is how the algorithms get tweaked. You don't want someone listening in on a few seconds of conversation to figure out why a query went wrong or why the AI was triggered in error? Fine, but then you're stuck with them as they are.
  • Reply 30 of 35
    zroger73 Posts: 787 member
    Considering how often Siri misunderstands me compared to Alexa, I'd gladly allow third parties to listen to me moan during climax, toot, and sing in the shower if it would improve Siri's hearing and comprehension.
  • Reply 31 of 35
    JWSC Posts: 1,203 member
    s07 said:
    Has the whistleblower thought of all the people who were sent home today and are now out of work?
    Apple said it was suspending, not ending. I’m sure it will find other things for these people to do in the meantime.
    I wouldn’t. A new approach all around is needed. And contractors of this sort need not apply.
  • Reply 32 of 35
    philboogie Posts: 7,675 member
    Hey Siri, play the latest episode of Game of Thrones on the living room TV.

    “OK, timer set for 9 minutes.”

    So, yes, I think Apple should work on Siri. Any way they see fit, abiding by the law.
  • Reply 33 of 35
    SpamSandwich Posts: 33,407 member
    Bart Y said:
    The contractor sounds more like a disgruntled ex-employee who has now violated their NDA and confidentiality agreement.
    Agreed. Apple might be able to narrow down who leaked this information and have them sued or arrested, but it would probably be better to outsource to another company and make them 100% liable for leaks, NDA violations, etc.
  • Reply 34 of 35
    kimberly said:
    mobird said:
    What does it take to invoke Siri to listen in on a complete conversation, as is being suggested? Most people know that it is a stretch to get Siri to do the most basic task asked of it.
     :D :D :D
    I returned my first HomePod to Apple for crummy sound (audiophile, my ass) and for the sheer idiocy of Siri on the device. "Hey, Siri, play 'Jump' by Van Halen." Siri: "OK. Here's '---------' by an artist I've never heard of." I couldn't get it to pause or stop if the music was turned up. It turned itself on in the middle of the night several times and began playing music. I live alone, the HomePod was far away in a living room corner, my cat doesn't know how to say "Hey, Siri," and there was no intruder (I was ready; I had my pistol out). Siri in general needs more than a little work. On the HomePod, Siri was the kid who couldn't get past coloring in the lines in kindergarten. So I'm not surprised at what it "accidentally" slurps up; it needs all the help it can get. Stop acting SO SHOCKED, everyone.

    No more HomePod for me, thanks just the same. I wasn't impressed with its sound or alleged capabilities. I'm going back to plain ole stereo/mini-stereo for apartment sound. And I've got other ways to get the weather.