Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'


Comments

  • Reply 41 of 106
    jdw said:
    crowley said:
    Apple's software, Apple's services, no breakage.
    LOL.  I can't wait to hear Tim Cook quote your words on his next TV interview about how Apple is defending privacy as a right...

    Interviewer: "Mr. Cook, you have long defended Privacy as a right and outlined in detail the steps Apple has taken to protect that right.  How do you harmonize your current plan to hash-scan for CSAM images on-device prior to their being uploaded to iCloud?"

    Tim Cook: "Apple's software. Apple's services. No privacy broken!"

    LOL.
    Now your phrase
    “hash-scan for CSAM images on-device prior to their being uploaded to iCloud”
    makes me think of how your luggage is x-rayed before being put on a plane.
    They don't scan your luggage in your house before being put on a plane.
  • Reply 42 of 106
    mcdave said:

    Apple’s story about calculating hashes of pictures doesn’t add up. If they, as they claim, only want to target the photos that hit iCloud, they could just as well have done that in the cloud and nobody would have been the wiser. Yet they chose to do it upstream, on the users’ phones. That is odd! I can see two possible reasons. One is that they want to save some CPU cycles on their servers; hashing isn’t a big deal, and I see little support for that option. The other is that, as everyone keeps pointing out, Apple is establishing a snooping platform that they can willy-nilly update with a new iOS patch to do whatever kind of shady surveillance tasks they, or somebody else, want it to do, effectively turning an iPhone into an iSnoop. I find that to be the most likely reason. 

    I don’t like private companies acting like law enforcement agencies, because they are not subject to transparency, the Bill of Rights, court orders, and so on. That is a framework that has taken us hundreds of years to get properly tuned, and it is all being pushed aside with moves like this one, where Apple just turns on the mics. That irks me. I find it especially despicable when it comes from a company that continuously brags about how it values user privacy, and a company I have admired for close to forty years. 

    Your argument makes no sense as it would be easier to update server-side scanning than on-device scanning without the user knowing.
    The privacy concern isn’t with scanning (as all devices scan libraries); it’s with reporting the results. The only thing Apple’s system will report is a large collection of verified CSAM images. Can the same be guaranteed for the other services?
    That is exactly the point I was making. Apple has all the information required to carry out CSAM hashing in the cloud. They don't need to have it done on the device. Why on earth do they have to install these components on our phones? Unless, of course, there is more at play than the 'it's for the children' argument they keep pushing. My view is that they have received a combined governmental order and gag order to install an on-device spying framework. What do you think? 
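For readers following this debate, a minimal sketch of the hash-list matching being argued about may help. This is illustrative only: the plain SHA-256 hash and the placeholder byte strings are my assumptions; Apple's actual system uses a perceptual NeuralHash against a blinded database the device cannot read, not a cryptographic hash like this.

```python
import hashlib

# Hypothetical blocklist: in the real system these hashes come from NCMEC
# and are blinded so the device never sees them in the clear.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_blocklist(image_bytes: bytes) -> bool:
    """True only for an exact match against a known image.

    A never-before-seen photo hashes to a value that appears in no
    database, so it can never match, regardless of what it depicts.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

print(matches_blocklist(b"known-image-bytes"))   # True  -- in the database
print(matches_blocklist(b"family-photo-bytes"))  # False -- unknown image
```

The design choice under dispute in this thread is only *where* this lookup runs (on the phone versus on Apple's servers), not how the lookup itself works.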
  • Reply 43 of 106
    I rarely comment, but I wanted to share some thoughts I didn’t see anywhere else.

    1. If you are a pedophile, by now you know that you can store 29 photos and you are fine. As a result, Apple, you achieved nothing; but for honest people you implemented a massive surveillance tool. Bravo.

    2. When tech engineers say "we don’t get it": yes, we do. Replace the CSAM database, or inject into it any other database, and you will get any desired content flagged: political, and so on. 

    3. Over 30 bad pictures, ah! Apple will check the photos themselves when a threshold is reached? What?
    Let me give you an example from a slightly different angle: imagine Walmart installing cameras in your house, or Tesla using car video to check whether you exceed the speed limit or smoke a joint in your house, all because these things are illegal. And by the way, they would report you. 
    Hash or no hash, this is exactly what a Nest camera or Ring could do, and there are many other examples. 
    Since when can a private company replace the government and the authorities? Cops cannot search your trunk without probable cause or your approval, but Apple can? 
    This is plainly unacceptable. 

    4. Well done, Apple: the king of privacy and good principles, caught in this mess. You could not have done better. 
    Ultimately you are no better than the others. This is disappointing; I expected more from you.
  • Reply 44 of 106

    Apple’s story about calculating hashes of pictures doesn’t add up. If they, as they claim, only want to target the photos that hit iCloud, they could just as well have done that in the cloud and nobody would have been the wiser. Yet they chose to do it upstream, on the users’ phones. That is odd! I can see two possible reasons. One is that they want to save some CPU cycles on their servers; hashing isn’t a big deal, and I see little support for that option. The other is that, as everyone keeps pointing out, Apple is establishing a snooping platform that they can willy-nilly update with a new iOS patch to do whatever kind of shady surveillance tasks they, or somebody else, want it to do, effectively turning an iPhone into an iSnoop. I find that to be the most likely reason. 

    I don’t like private companies acting like law enforcement agencies, because they are not subject to transparency, the Bill of Rights, court orders, and so on. That is a framework that has taken us hundreds of years to get properly tuned, and it is all being pushed aside with moves like this one, where Apple just turns on the mics. That irks me. I find it especially despicable when it comes from a company that continuously brags about how it values user privacy, and a company I have admired for close to forty years. 


    Idiotic take and also a logical fallacy.

    A much more likely reason is Apple is going to add end-to-end encryption for Photos, which will make it impossible for Apple to scan anything server-side.
  • Reply 45 of 106
    Then we could ask: why look for end-to-end encryption at all?

  • Reply 46 of 106
    yensid98yensid98 Posts: 311member
    Let's say I buy a brand new house shortly after it's been built. Unbeknownst to me, a person from the company that built the house is hiding inside. Their sole job function is to make sure I'm not doing anything illegal. They can't see me, but if they hear certain sounds they can report me to the authorities. I live in the house for years without ever discovering this hidden person.

    I doubt most anyone would be ok with this situation once they knew about the person living in their house. This person didn't break in. They couldn't see me. But it's still a very obvious that a breach of privacy has occurred. I'd imagine most people would feel violated in this situation. 

    Apple has a lot of damage control to do if they are adamant about moving forward with this technology. 
  • Reply 47 of 106
    tedz98 said:
    The general public has no understanding of what a file hash is. So the techies at Apple have no understanding of how the general public perceives what they are doing. They just think Apple is scanning their phone. I’m not a huge Bill Maher fan, but I agree with him here. It’s a slippery slope that Apple is embarking on.
    People wouldn't get so bent out of shape if Apple didn't wear the "privacy! privacy! privacy!" mantra on its sleeve. Cook kept shouting about it from every damn rooftop he could find. 

    It's the sheer, seemingly blatant hypocrisy of it that rankles, rings hollow for a lot of people. Including me. Why? It's from a company that refused to break into a terrorist's phone despite all manner of strong-arming from the governments of the world (a stance that many of us, on principle, agreed with, even though that was not easy). 
  • Reply 48 of 106
    Lots of good comments. Apple cannot have it both ways.

    What's next, iMessage, because a lot of photos are shared there too? Or maybe FaceTime, because who knows what can be transmitted?

    I can foresee Apple doing on-device scanning for all apps, in fact; and given how they forced Facebook to disclose the data it collects, this is only a bridge to cross.

    Government has to step in.
  • Reply 49 of 106
    I think we can all agree the only thing in contention is the on-device aspect which many believe is a privacy violation and clearly will not fly as an option globally.

    This should all be done in the cloud; I mistakenly thought it already was... 
  • Reply 50 of 106
    GeorgeBMacGeorgeBMac Posts: 11,421member
    For Apple to be this bad at damage control is very off-brand.

    ....
    "Bad at damage control"?
    No, it's pretty normal...

    But if you don't hold it wrong, nobody will know...   Especially as Apple slows your phone because the battery is about to die.

    But, the hair guy straightened it all out when he told us:   "We just don't understand".

    No, this is pretty much normal damage control for Apple!   (Fortunately, they don't have to do it very often)

  • Reply 51 of 106
    GeorgeBMacGeorgeBMac Posts: 11,421member
    tedz98 said:
    The general public has no understanding of what a file hash is. So the techies at Apple have no understanding of how the general public perceives what they are doing. They just think Apple is scanning their phone. I’m not a huge Bill Maher fan, but I agree with him here. It’s a slippery slope that Apple is embarking on.
    I rarely ever post, but as a dad of three I've got to speak up. I 100% agree this is a slippery slope, and I hate anything that could lead to infringing on rights, but I think this discussion could involve more watchdogging, since we have the tech to catch predators. I don't trust humans as far as I can throw them, but there should be a transparent solution that allows CSAM scanning with a lot of scrutiny to ensure it's limited to just that. I see the argument from both sides, but at the same time I think we can all agree efforts should be made to protect our kids somehow, and I believe most if not all camps can unite around both using the tech and making sure engineers don't abuse it into anything else. I don't have a solution for this, but there's a lot of intelligence on these boards, which is why I'm bringing it up. 
    You just nailed the single biggest benefit to democracy and a free press:  Decisions (whether by government, private industry or even private individuals) that impact society are discussed and debated.

    Is this a great idea to protect kids?  Or is it slippery slope that will have unintended consequences and collateral damage?
    And, who decides?

    Maher's position and the responses to it show that few have the answer -- or is it that everybody has AN answer?

  • Reply 52 of 106
    GeorgeBMacGeorgeBMac Posts: 11,421member
    yensid98 said:
    Let's say I buy a brand new house shortly after it's been built. Unbeknownst to me, a person from the company that built the house is hiding inside. Their sole job function is to make sure I'm not doing anything illegal. They can't see me, but if they hear certain sounds they can report me to the authorities. I live in the house for years without ever discovering this hidden person.

    I doubt most anyone would be ok with this situation once they knew about the person living in their house. This person didn't break in. They couldn't see me. But it's still a very obvious that a breach of privacy has occurred. I'd imagine most people would feel violated in this situation. 

    Apple has a lot of damage control to do if they are adamant about moving forward with this technology. 
    That's a very good, valid argument.
    The counter argument is: "If 'it' is done to protect an innocent child (or whatever your hot-button issue is), then it's OK."

    Maybe this is a bit like war -- where everybody loses no matter who wins -- where once it starts there is no right answer.
  • Reply 53 of 106
    GeorgeBMacGeorgeBMac Posts: 11,421member
    Lots of good comments. Apple cannot have it both ways.

    What's next, iMessage, because a lot of photos are shared there too? Or maybe FaceTime, because who knows what can be transmitted?

    I can foresee Apple doing on-device scanning for all apps, in fact; and given how they forced Facebook to disclose the data it collects, this is only a bridge to cross.

    Government has to step in.
    Ultimately, this is a societal issue -- and government is the arbiter of societal issues.

    By the way, whatever you do, don't look at the iPhone of an adolescent boy (or girl, these days)! It's likely inundated with "child pornography"!
  • Reply 54 of 106
    StrangeDaysStrangeDays Posts: 12,879member
    Hey, AppleInsider writers: you are repeatedly characterizing this technology as just fine because of hashes. If you think it's great, I guess that's your opinion. However, other people have very different opinions, such as: clever technology does not negate the fact that this is an invasion of privacy, as the comedian says, without probable cause. Believing the method makes that OK is ridiculous. You can be very polite while you break into someone's home. You could even mostly close your eyes. But you are still breaking in. 
    Except they aren’t. You’re paying to use iCloud Photos, a commercial cloud service. They won’t let you store child porn on it. They scan the hash of the images prior to uploading to their commercial server. I see no problem with this. Don’t like it, don’t use iCloud Photos. 

    Google, Microsoft, Dropbox have done the same for years. Nobody allows images to be stored on their servers w/o scanning to ensure it isn't child porn. Microsoft has PhotoDNA, and Google has its own tools:

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    Where was your outrage?
  • Reply 55 of 106
    StrangeDaysStrangeDays Posts: 12,879member
    mcdave said:

    Apple’s story about calculating hashes of pictures doesn’t add up. If they, as they claim, only want to target the photos that hit iCloud, they could just as well have done that in the cloud and nobody would have been the wiser. Yet they chose to do it upstream, on the users’ phones. That is odd! I can see two possible reasons. One is that they want to save some CPU cycles on their servers; hashing isn’t a big deal, and I see little support for that option. The other is that, as everyone keeps pointing out, Apple is establishing a snooping platform that they can willy-nilly update with a new iOS patch to do whatever kind of shady surveillance tasks they, or somebody else, want it to do, effectively turning an iPhone into an iSnoop. I find that to be the most likely reason. 

    I don’t like private companies acting like law enforcement agencies, because they are not subject to transparency, the Bill of Rights, court orders, and so on. That is a framework that has taken us hundreds of years to get properly tuned, and it is all being pushed aside with moves like this one, where Apple just turns on the mics. That irks me. I find it especially despicable when it comes from a company that continuously brags about how it values user privacy, and a company I have admired for close to forty years. 

    Your argument makes no sense as it would be easier to update server-side scanning than on-device scanning without the user knowing.
    The privacy concern isn’t with scanning (as all devices scan libraries); it’s with reporting the results. The only thing Apple’s system will report is a large collection of verified CSAM images. Can the same be guaranteed for the other services?
    That is exactly the point I was making. Apple has all the information required to carry out CSAM hashing in the cloud. They don't need to have it done on the device. Why on earth do they have to install these components on our phones? Unless, of course, there is more at play than the 'it's for the children' argument they keep pushing. My view is that they have received a combined governmental order and gag order to install an on-device spying framework. What do you think? 
    It’s speculated they are doing it on-device because it’s more private — Apple never knows how many hits you have prior to the 30ish threshold. If it were done on server they would. And this could also enable them to implement E2E encryption on iCloud, which it doesn’t do currently. 
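The threshold behavior described above can be sketched roughly as follows. This is illustrative only: the simple counter and function name are my inventions standing in for Apple's actual design, which reportedly uses threshold secret sharing so the server is cryptographically unable to read any match voucher until roughly 30 of them exist.

```python
from typing import List, Optional

THRESHOLD = 30  # approximate figure from Apple's published description

def server_learns(match_flags: List[bool]) -> Optional[int]:
    """Return the match count only once the threshold is crossed.

    Below the threshold, the server learns nothing at all (None),
    which is the privacy property being claimed for the on-device design.
    """
    count = sum(1 for flag in match_flags if flag)
    return count if count >= THRESHOLD else None

print(server_learns([True] * 29 + [False] * 100))  # None -- nothing revealed
print(server_learns([True] * 31))                  # 31   -- threshold crossed
```

The contrast with pure server-side scanning is that a server which hashes every uploaded photo necessarily sees every individual match as it happens, with no below-threshold blindness.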
  • Reply 56 of 106
    netrox said:
    This whole CSAM thing is tiring; it's been done by Google, MS, Tumblr, Reddit, and so on for YEARS! Apple just announced it so we would know they're doing the same thing. I don't recall anyone batting an eye at MS, Google, Facebook, and so on. 

    https://www.microsoft.com/en-us/photodna


    Just because something has been done for YEARS doesn’t necessarily make it right or wrong; that is fallacious thinking (appeal to tradition). I for one don’t agree with the other companies doing this, which is why I moved away from using Microsoft and Google over a decade ago because of privacy issues. I’ve also never signed up for accounts with any social media services, for the same reason. I had found a better (not perfect) alternative in Apple, which is why I was not very concerned about CSAM; but now that I’ve come to rely on Apple’s products, this feels like a "bait and switch." I’d switch phones, but all the others are far worse in terms of privacy, with the exception of a burner phone.

    So yes, I can see how many Apple users are up in arms about this.
  • Reply 57 of 106
    Do people really believe that a company that has full vertical integration over their hardware, operating system, applications and services doesn’t already track their data for the company’s own benefit?
  • Reply 58 of 106
    StrangeDaysStrangeDays Posts: 12,879member
    tedz98 said:
    The general public has no understanding of what a file hash is. So the techies at Apple have no understanding of how the general public perceives what they are doing. They just think Apple is scanning their phone. I’m not a huge Bill Maher fan, but I agree with him here. It’s a slippery slope that Apple is embarking on.
    People wouldn't get so bent out of shape if Apple didn't wear the "privacy! privacy! privacy!" mantra on its sleeve. Cook kept shouting about it from every damn rooftop he could find. 

    It's the sheer, seemingly blatant hypocrisy of it that rankles, rings hollow for a lot of people. Including me. Why? It's from a company that refused to break into a terrorist's phone despite all manner of strong-arming from the governments of the world (a stance that many of us, on principle, agreed with, even though that was not easy). 
    Dumb take. You’re referring to the San Bernardino workplace shooting, in which Apple did offer to help law enforcement, handing over all server-side data. What they didn’t and can’t do is decrypt someone’s device; they don’t have the tools for that. That’s the point of encryption. 
  • Reply 59 of 106
    yensid98 said:
    Let's say I buy a brand new house shortly after it's been built. Unbeknownst to me, a person from the company that built the house is hiding inside. Their sole job function is to make sure I'm not doing anything illegal. They can't see me, but if they hear certain sounds they can report me to the authorities. I live in the house for years without ever discovering this hidden person.

    I doubt most anyone would be ok with this situation once they knew about the person living in their house. This person didn't break in. They couldn't see me. But it's still a very obvious that a breach of privacy has occurred. I'd imagine most people would feel violated in this situation. 

    Apple has a lot of damage control to do if they are adamant about moving forward with this technology. 
    This isn't a parallel to what Apple is doing since Apple publicly announced everything that they would be implementing. Apple has also previously specified in the User Agreement for iCloud that they reserved the right to scan files to ensure users were abiding by the terms of the agreement. 

    IMO, it's more like buying a house that is subject to an HOA and not bothering to read the terms of the HOA. You might be unhappy to learn certain things are restricted after the fact, but it's not like the information wasn't available to you from the start. 
  • Reply 60 of 106
    StrangeDaysStrangeDays Posts: 12,879member
    netrox said:
    This whole CSAM thing is tiring; it's been done by Google, MS, Tumblr, Reddit, and so on for YEARS! Apple just announced it so we would know they're doing the same thing. I don't recall anyone batting an eye at MS, Google, Facebook, and so on. 

    https://www.microsoft.com/en-us/photodna


    Just because something has been done for YEARS doesn’t necessarily make it right or wrong; that is fallacious thinking (appeal to tradition). I for one don’t agree with the other companies doing this, which is why I moved away from using Microsoft and Google over a decade ago because of privacy issues. I’ve also never signed up for accounts with any social media services, for the same reason. I had found a better (not perfect) alternative in Apple, which is why I was not very concerned about CSAM; but now that I’ve come to rely on Apple’s products, this feels like a "bait and switch." I’d switch phones, but all the others are far worse in terms of privacy, with the exception of a burner phone.

    So yes, I can see how many Apple users are up in arms about this.
    So tell me, which commercial cloud data provider did you switch to that doesn’t use hash checking for known child porn? Not Microsoft, Google, or Dropbox, since they all do. Who, then?