Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'


Comments

  • Reply 61 of 106
    tedz98 said:
    The general public has no understanding of what a file hash is. So the techies at Apple have no understanding of how the general public perceives what they are doing. They just think Apple is scanning their phone. I’m not a huge Bill Maher fan, but I agree with him here. It’s a slippery slope that Apple is embarking on.
    People wouldn't get so bent out of shape if Apple didn't wear the "privacy! privacy! privacy!" mantra on its sleeve. Cook kept shouting about it from every damn rooftop he could find. 

    It's the sheer, seemingly blatant hypocrisy of it that rankles and rings hollow for a lot of people, including me. Why? Because it comes from a company that refused to break into a terrorist's phone despite all manner of strong-arming from the governments of the world (a stance that many of us, on principle, agreed with, even though that was not easy).
    Dumb take. You're referring to the San Bernardino workplace shooting, in which Apple did offer help to law enforcement, handing over all server-side data. What they didn't and can't do is decrypt someone's device; they don't have the tools for that. That's the point of encryption.
    Um... no. You win the dumbness contest today. Apple said that it might be able to, but that it would take many, many man-hours. All of Apple's public pronouncements at that time were about how it would not violate a user's privacy on the device. The company pushed back strongly and, despite all manner of political pressure, stayed firm.

    It became moot, in any event, because an Israeli company was able to help the FBI do it. 
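    For anyone wondering what a file hash actually is: it's a short fingerprint computed from a file's bytes. Here is a minimal Swift sketch using CryptoKit; the file path is made up for illustration. Note that this is a cryptographic hash, whereas Apple's CSAM system uses a perceptual NeuralHash, designed to match visually similar images rather than exact bytes.

        import CryptoKit
        import Foundation

        // Compute the SHA-256 digest of a file's bytes.
        // "/tmp/photo.jpg" is a hypothetical path for illustration.
        do {
            let data = try Data(contentsOf: URL(fileURLWithPath: "/tmp/photo.jpg"))
            let digest = SHA256.hash(data: data)
            // Byte-identical files always produce the same hex string;
            // changing even one byte yields a completely different one.
            print(digest.map { String(format: "%02x", $0) }.joined())
        } catch {
            print("Could not read file: \(error)")
        }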
  • Reply 62 of 106
    netrox said:
    This whole CSAM thing is tiring; it's been done by Google, MS, Tumblr, Reddit, and so on for YEARS! Apple just announced it to let us know they're doing the same thing. I don't recall anyone batting an eye at MS, Google, Facebook, and so on.

    https://www.microsoft.com/en-us/photodna


    Just because something has been done for YEARS doesn't necessarily make it right or wrong; that is fallacious thinking (appeal to tradition).  I for one don't agree with the other companies doing this, which is why I moved away from using Microsoft and Google over a decade ago over privacy issues.  I've also never signed up for accounts with any social media services, for the same reason.  I had found a better (not perfect) alternative in Apple, which is why I was not very concerned about CSAM, but now that I've come to rely on Apple's products, this feels like a "bait and switch".  I'd switch phones, but all the others are far worse in terms of privacy, with the exception of a burner phone.

    So yes, I can see how many Apple users are up in arms about this.
    So tell me, which commercial cloud data provider did you switch to that doesn't use hash checking for known child porn? Not Microsoft, Google, or Dropbox; they all do. Who, then?
    I store all my images on a local NAS; if I need remote access to them, I either use my VPN or wait till I get home.  I've only used iCloud Photos for sharing with family and friends. I'm more or less referencing their services/software as a whole (Windows, Office, search, Gmail, etc.).
  • Reply 63 of 106
    wood1208 Posts: 2,913 member
    Maybe he loves child pornography and, who knows, is a not-yet-caught pedophile.
  • Reply 64 of 106
    I rarely comment, but I wanted to share some thoughts I didn't see anywhere else:

    1. If you are a pedo, by now you know that you can store 29 photos and you are good. As a result, Apple, you achieved nothing, but for honest people you implemented a massive surveillance tool. BRAVO.
    2. When tech engineers are told WE don't get it: yep, we do. Replace the CSAM database, or inject anything into it, and you'll get any desired content flagged: political material, and so on.

    3. Over 30 bad pictures, ah! Apple will review the photos themselves when a threshold is reached? What?!
    Let me give you an example from a slightly different angle:
    it's as if Walmart installed cameras in your house, or Tesla used car video to check that you don't exceed the speed limit or smoke a joint in your house, all because these things are illegal. And by the way, they would report you.
    Hash or no hash, this is EXACTLY what a Nest camera or Ring could do, and there are many other examples.
    Since when can a private company replace the government and the authorities?
    Cops cannot search your trunk without probable cause or your approval, but Apple can?
    This is plainly unacceptable.

    4. Well done, Apple: the king of privacy and good principles, caught in this mess. You could not have done better.
    Ultimately you are no better than the others. This is disappointing; I expected more from you.
    This is not about "your house."  It's about using Apple's server infrastructure to store your stuff.  And I'm pretty sure that the child porn industry doesn't have a lot of members who would limit themselves to 29 images or fewer. Obviously that limit is there to (try to) avoid a situation where the police are called on me because you sent me one picture from the database.
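    For what it's worth, the published design only lets Apple inspect anything after the match count crosses the threshold; below it, the safety vouchers are cryptographically unreadable (threshold secret sharing). Here is a toy Swift sketch of just the counting logic; the names and the plain counter are invented for illustration and bear no resemblance to the actual cryptography:

        // Toy model of the threshold gate (illustration only; the real
        // system uses threshold secret sharing, so the server learns
        // nothing at all below the threshold).
        struct MatchGate {
            let threshold: Int              // e.g. 30 known-image matches
            private(set) var matches = 0

            // Returns true once human review would be triggered.
            mutating func record(isMatch: Bool) -> Bool {
                if isMatch { matches += 1 }
                return matches >= threshold
            }
        }

        var gate = MatchGate(threshold: 30)
        for _ in 1...29 { _ = gate.record(isMatch: true) }
        print(gate.record(isMatch: false))   // false: 29 matches, nothing reviewable
        print(gate.record(isMatch: true))    // true: the 30th match crosses the gate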
  • Reply 65 of 106
    netrox said:
    This whole CSAM thing is tiring; it's been done by Google, MS, Tumblr, Reddit, and so on for YEARS! Apple just announced it to let us know they're doing the same thing. I don't recall anyone batting an eye at MS, Google, Facebook, and so on.

    https://www.microsoft.com/en-us/photodna


    Just because something has been done for YEARS doesn't necessarily make it right or wrong; that is fallacious thinking (appeal to tradition).  I for one don't agree with the other companies doing this, which is why I moved away from using Microsoft and Google over a decade ago over privacy issues.  I've also never signed up for accounts with any social media services, for the same reason.  I had found a better (not perfect) alternative in Apple, which is why I was not very concerned about CSAM, but now that I've come to rely on Apple's products, this feels like a "bait and switch".  I'd switch phones, but all the others are far worse in terms of privacy, with the exception of a burner phone.

    So yes, I can see how many Apple users are up in arms about this.
    There is nothing faulty about the logic at all.  Apple--and only Apple--is getting skewered in the press. I read a piece in Apple News last night about how "millions" of users are going to leave the Apple ecosystem when they learn about this--and presumably go to Android. That makes no sense if it's clear that everyone else has been doing this server-side for years and that, in a sense, Apple is late to the party.

    The uproar/exposé should be about how cloud providers screen your stuff, and how Apple is part of that club.
  • Reply 66 of 106
    netrox said:
    This whole CSAM thing is tiring; it's been done by Google, MS, Tumblr, Reddit, and so on for YEARS! Apple just announced it to let us know they're doing the same thing. I don't recall anyone batting an eye at MS, Google, Facebook, and so on.

    https://www.microsoft.com/en-us/photodna


    Just because something has been done for YEARS doesn't necessarily make it right or wrong; that is fallacious thinking (appeal to tradition).  I for one don't agree with the other companies doing this, which is why I moved away from using Microsoft and Google over a decade ago over privacy issues.  I've also never signed up for accounts with any social media services, for the same reason.  I had found a better (not perfect) alternative in Apple, which is why I was not very concerned about CSAM, but now that I've come to rely on Apple's products, this feels like a "bait and switch".  I'd switch phones, but all the others are far worse in terms of privacy, with the exception of a burner phone.

    So yes, I can see how many Apple users are up in arms about this.
    So tell me, which commercial cloud data provider did you switch to that doesn't use hash checking for known child porn? Not Microsoft, Google, or Dropbox; they all do. Who, then?
    And much to the annoyance of many of my friends and family, if I MUST use one of these other cloud services for personal use because it's what they use, I always encrypt the file first for general file-storage services.  If I can't encrypt it (for a photo- or video-specific service, for instance), I'll just share it directly with them via Messages instead, since it's e2e encrypted (for now).
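    For the curious, encrypting before upload is a short pattern with CryptoKit's AES-GCM. A minimal sketch, assuming you manage the key yourself; the path is invented for illustration:

        import CryptoKit
        import Foundation

        do {
            // The key stays with you (Keychain, password manager), never with the provider.
            let key = SymmetricKey(size: .bits256)
            let plaintext = try Data(contentsOf: URL(fileURLWithPath: "/tmp/taxes.pdf"))

            // Seal: the cloud service only ever stores opaque ciphertext.
            let sealed = try AES.GCM.seal(plaintext, using: key)
            let ciphertext = sealed.combined!   // nonce + ciphertext + auth tag

            // Open: only someone holding the key can recover the file.
            let box = try AES.GCM.SealedBox(combined: ciphertext)
            let decrypted = try AES.GCM.open(box, using: key)
            assert(decrypted == plaintext)
        } catch {
            print("Round trip failed: \(error)")
        }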
  • Reply 67 of 106
    mcdave said:
    jdw said:
    tedz98 said:
    The general public has no understanding of what a file hash is. 

    That really is the entire point that many who are defending Apple's move are ignoring. Nothing else matters, and certainly not the technical way in which CSAM scanning works.  That's precisely why I've said in other threads that Apple is now obligated to at the very least DELAY the release until they can do PR damage control and at least try to win more public support.  They cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and the release delayed until at least iOS 16.  And if they never win public support and the matter only gets worse and worse, then the idea may need to be permanently shelved.

    This is Tim Cook's call now.  It's no doubt a hard call for him because he's played social justice warrior at times in the past, and this no doubt would seem like a step back for him.  But it's a call he has to make and make soon.
    How do you feel about Google & Facebook’s server-side CSAM scanning? And all the other scanning they do? Apple’s is the lightest of all.
    Totally wrong. Server-side scanning is much more acceptable because:
    1. You decide what information you put on the server (on my iPhone there is currently a lot of information I would never put in the cloud).
    2. The device scan could be altered or hacked.
    3. You could be sent images that are automatically synced to iCloud.
    Doing this on the device is just bad, and probably only done because it would require too much processing time to calculate hashes for all the uploaded images on the servers.
    Apple needs to delay this "feature" and put it into iCloud.
  • Reply 68 of 106
    netrox said:
    This whole CSAM thing is tiring; it's been done by Google, MS, Tumblr, Reddit, and so on for YEARS! Apple just announced it to let us know they're doing the same thing. I don't recall anyone batting an eye at MS, Google, Facebook, and so on.

    https://www.microsoft.com/en-us/photodna


    Just because something has been done for YEARS doesn't necessarily make it right or wrong; that is fallacious thinking (appeal to tradition).  I for one don't agree with the other companies doing this, which is why I moved away from using Microsoft and Google over a decade ago over privacy issues.  I've also never signed up for accounts with any social media services, for the same reason.  I had found a better (not perfect) alternative in Apple, which is why I was not very concerned about CSAM, but now that I've come to rely on Apple's products, this feels like a "bait and switch".  I'd switch phones, but all the others are far worse in terms of privacy, with the exception of a burner phone.

    So yes, I can see how many Apple users are up in arms about this.
    There is nothing faulty about the logic at all.  Apple--and only Apple--is getting skewered in the press. I read a piece in Apple News last night about how "millions" of users are going to leave the Apple ecosystem when they learn about this--and presumably go to Android. That makes no sense if it's clear that everyone else has been doing this server-side for years and that, in a sense, Apple is late to the party.

    The uproar/exposé should be about how cloud providers screen your stuff, and how Apple is part of that club.
    Apple is also the only one that has made strong claims about privacy being important, and has (mostly) made good on them.  Which is why this is a "bait and switch", as I said earlier.  Everybody knows the other cloud services are snooping through your stuff, so CSAM is actually the least of the worries there.  That is not a claim about whether what they are doing is right or wrong; it has simply not been a concern for me, since I stopped using their services, so why would I care what they do?  The privacy claims are a reason people have chosen Apple over others, and then they do this.

    This is the reason Apple is being targeted specifically: it's about the claims they made about privacy, and then backpedaling a bit. And rightfully so.

    I don’t condone what the other services do either, but I’ve also voted with my wallet.
  • Reply 69 of 106
    danox Posts: 2,858 member
    Law enforcement has been hammering Apple about their strong security potentially hampering capture of child pornographers; well Apple is doing something about it. I don’t care about conceptual discussions about it being constitutionally sound - if it helps prevent this kind of crime I’m all for it.
    A massive fishing trip for the government without a warrant.
  • Reply 70 of 106
    killroy Posts: 276 member
    danox said:
    Law enforcement has been hammering Apple about their strong security potentially hampering capture of child pornographers; well Apple is doing something about it. I don’t care about conceptual discussions about it being constitutionally sound - if it helps prevent this kind of crime I’m all for it.
    A massive fishing trip for the government without a warrant.
    They can do it now. Did you think the NSO app needed one?
  • Reply 71 of 106
    mcdave said:
    Your argument makes no sense as it would be easier to update server-side scanning than on-device scanning without the user knowing.
    The privacy concern isn’t with scanning (as all devices scan libraries) it’s with reporting the results. The only thing Apple’s system will report is a large collection of verified CSAM images. Can the same be guaranteed for the other services?
    Neither does yours, because nobody can transparently check what kind of pattern recognition these hashes are looking for.
    There is no legal structure sufficient to guarantee protection for citizens around the world. Not even in the US.
  • Reply 72 of 106
    GeorgeBMac Posts: 11,421 member
    Roderikus said:
    mcdave said:
    Your argument makes no sense as it would be easier to update server-side scanning than on-device scanning without the user knowing.
    The privacy concern isn’t with scanning (as all devices scan libraries) it’s with reporting the results. The only thing Apple’s system will report is a large collection of verified CSAM images. Can the same be guaranteed for the other services?
    Neither does yours, because nobody can transparently check what kind of pattern recognition these hashes are looking for.
    There is no legal structure sufficient to guarantee protection for citizens around the world. Not even in the US.

    True...  But then if there is nothing to hide....
  • Reply 73 of 106
    gatorguy Posts: 24,213 member

    Apple's story about calculating hashes of pictures doesn't add up. If, as they claim, they only want to target the photos that hit iCloud, they could just as well have done that in the cloud and nobody would have been the wiser. Yet they chose to do it upstream, on the users' phones. That is odd! I can see two possible reasons. One is that they want to save some CPU cycles on their servers; hashing isn't a big deal, and I see little support for that option. The other is that, as everyone keeps pointing out, Apple is establishing a snooping platform that they can willy-nilly update with a new iOS patch to do whatever kind of shady surveillance tasks they, or somebody else, want it to do. Effectively turning an iPhone into an iSnoop. I find that to be the most likely reason.

    I don't like private companies acting like law enforcement agencies, because they are not subject to transparency, the Bill of Rights, court orders, etc.: a framework that has taken us hundreds of years to get properly tuned. All of that is being pushed aside with moves like this one, where Apple just turns on the mikes. That irks me. I find it especially despicable when it comes from a company that continuously brags about how it values user privacy, and a company I have admired for close to forty years.


    Idiotic take and also a logical fallacy.

    A much more likely reason is Apple is going to add end-to-end encryption for Photos, which will make it impossible for Apple to scan anything server-side.
    As good a guess as any. That's one of a few different reasons they might have done this. 
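    On the "what are these hashes looking for" point: a cryptographic hash matches only exact bytes, while perceptual hashes (the family NeuralHash belongs to) are built to survive resizing and recompression. A toy difference-hash over a grayscale grid shows the idea; this is purely illustrative and is not NeuralHash, which is a learned neural embedding:

        // Toy dHash: 64 bits from an 8-row x 9-column grayscale thumbnail.
        // Each bit records whether a pixel is darker than its right neighbor,
        // so mild resizing or recompression barely changes the hash,
        // unlike SHA-256, where any byte change flips the whole digest.
        func dHash(_ gray: [[UInt8]]) -> UInt64 {
            precondition(gray.count == 8 && gray.allSatisfy { $0.count == 9 })
            var bits: UInt64 = 0
            for row in 0..<8 {
                for col in 0..<8 {
                    bits <<= 1
                    if gray[row][col] < gray[row][col + 1] { bits |= 1 }
                }
            }
            return bits
        }

        // Similarity between two images is then just the Hamming distance.
        func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
            (a ^ b).nonzeroBitCount
        }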
  • Reply 74 of 106
    fahlman Posts: 740 member
    The Constitution guarantees that the government has no right to search you or your belongings without a warrant issued by a court. It does not guarantee the same limitations from a corporation. Those limitations are at your choosing when you decide whether or not to use a company's product.
  • Reply 75 of 106
    bluefire1 Posts: 1,302 member
    I agree with Bill Maher 100%. No matter how laudable Apple's intentions, there shouldn't be any back door to users' iPhones or iCloud accounts. Otherwise Apple's constant reminders of how much they value privacy ring hollow.
  • Reply 76 of 106
    netrox said:
    This whole CSAM thing is tiring; it's been done by Google, MS, Tumblr, Reddit, and so on for YEARS! Apple just announced it to let us know they're doing the same thing. I don't recall anyone batting an eye at MS, Google, Facebook, and so on.

    https://www.microsoft.com/en-us/photodna

    You're comparing apples (pun intended) to oranges.  The other companies you mention are platforms that are either designed specifically to share content, or are cloud providers.  The difference between a cloud provider scanning for CSAM and Apple's strategy is that when you use the "cloud" you're really just using someone else's computer, and they have the right to scan their computers for content they do not like.  I have the right to inspect anything in my house any time I like, right?  Apple is looking on my phone, using a CPU and storage that I paid for and own.  That's the difference, and it's not a matter of semantics.  iCloud is ON by default, and Apple is proactively making the decision to (a) upload my photo and (b) scan it.  The "spin" they're putting on it, that they do not have to look at the picture to understand its content, is poppycock; many others, who are better writers than I, have addressed that.  You are not correct.
  • Reply 77 of 106
    danox Posts: 2,858 member
    killroy said:
    danox said:
    Law enforcement has been hammering Apple about their strong security potentially hampering capture of child pornographers; well Apple is doing something about it. I don’t care about conceptual discussions about it being constitutionally sound - if it helps prevent this kind of crime I’m all for it.
    A massive fishing trip for the government without a warrant.
    They can do it now. Did you think the NSO app needs one.

    Then Apple doesn't need to help them?
  • Reply 78 of 106
    aguyinatx said: You're comparing apples (pun intended) to oranges.  The other companies you mention are platforms that are either designed specifically to share content, or are cloud providers.  The difference between a cloud provider scanning for CSAM and Apple's strategy is that when you use the "cloud" you're really just using someone else's computer, and they have the right to scan their computers for content they do not like.  I have the right to inspect anything in my house any time I like, right?  Apple is looking on my phone, using a CPU and storage that I paid for and own.  That's the difference, and it's not a matter of semantics.  iCloud is ON by default, and Apple is proactively making the decision to (a) upload my photo and (b) scan it.  The "spin" they're putting on it, that they do not have to look at the picture to understand its content, is poppycock; many others, who are better writers than I, have addressed that.  You are not correct.
    iCloud has to be set up by the user; it's not on by default. Part of the setup process is choosing which apps actually back up files to the cloud. Apple's CSAM hash scanning only involves files going to the cloud, so you're probably the one playing semantics. The user has already chosen which files from a particular app go to the cloud, so it doesn't actually matter whether the hash scan happens on the device or in the cloud. Same difference: the files that would be scanned wouldn't change in either scenario.
  • Reply 79 of 106
    crowley Posts: 10,453 member
    bluefire1 said:
    I agree with Bill Maher 100%. No matter how laudable Apple's intentions, there shouldn't be any back door to users' iPhones or iCloud accounts. Otherwise Apple's constant reminders of how much they value privacy ring hollow.
    Not a back door.  Apple does not get any more access than they already had.
  • Reply 80 of 106
    dewme Posts: 5,368 member
    crowley said:
    jdw said:
    tedz98 said:
    The general public has no understanding of what a file hash is. 

    That really is the entire point that many who are defending Apple's move are ignoring. Nothing else matters, and certainly not the technical way in which CSAM scanning works.  That's precisely why I've said in other threads that Apple is now obligated to at the very least DELAY the release until they can do PR damage control and at least try to win more public support.  They cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and the release delayed until at least iOS 16.  And if they never win public support and the matter only gets worse and worse, then the idea may need to be permanently shelved.

    This is Tim Cook's call now.  It's no doubt a hard call for him because he's played social justice warrior at times in the past, and this no doubt would seem like a step back for him.  But it's a call he has to make and make soon.
    They should shelve it not because of the merits or demerits of the system, but because the public don't understand it?  Spare me the pandering to the ignorant.  Proper arguments only.
    Apple should probably wait until Martha Stewart weighs in on the efficacy of the hashing algorithm Apple is using before deciding whether to delay the rollout of this function to its customer base. Perhaps her recommendation would be the honor system: you know, only Apple customers who are actually dealing in child porn should turn this feature on in their iCloud account settings.

    All kidding aside, we shouldn't keep referring to this as a matter of "public opinion." This is only a concern for Apple customers. There is clearly a separation between those two groups, with membership in one being completely voluntary. If Samsung were rolling out an on-device scanner in their phones, we wouldn't care.