Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'

Comments

  • Reply 21 of 106
    Bill Maher is an idiot. 
    And, apparently, a self-appointed constitutional expert.

    Self-appointed experts are always the best. ;-)
  • Reply 22 of 106
    larryjw Posts: 930, member
    Oh, if Bill Maher says so, it must be true!

    I know. My law degree isn't from The Bill Maher School of Law, or Facebook School of Law, so I must be missing something in my legal education 
  • Reply 23 of 106
    crowley said:
    jdw said:
    tedz98 said:
    The general public has no understanding of what a file hash is. 

    That really is the entire point which many who are defending Apple's move are ignoring. Nothing else matters, and certainly not the technical way in which CSAM scanning works. That's precisely why I've said in other threads that Apple is now obligated to at the very least DELAY the release until they can do PR damage control and at least try to win more public support. They cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and its release delayed until at least iOS 16. And if they never get public support and the matter only gets worse and worse, then the idea may need to be permanently shelved.

    This is Tim Cook's call now.  It's no doubt a hard call for him because he's played social justice warrior at times in the past, and this no doubt would seem like a step back for him.  But it's a call he has to make and make soon.
    They should shelve it not because of the merits or demerits of the system, but because the public don't understand it?  Spare me the pandering to the ignorant.  Proper arguments only.
    I will be shocked if Apple doesn't delay the implementation of this until they can get in front of the story.  No company can afford to ignore a potential firestorm of bad PR even if the main problem is public ignorance.  What's the rush?

    Apple is full of smart people. They will explain it and do the right things to get the doubters and haters to change their stance. Or, if they can't do that, then shelve the damn feature altogether. They tried to do something to interfere with the disgusting child porn industry, but if "society" doesn't want Apple to interfere, then so be it. It's not like this "feature" was going to sell one additional iPhone.
  • Reply 24 of 106
    mobird said:
    Wonder if an injunction by who knows who might put the kibosh on the release date of iOS 15?

    No chance of succeeding, because you do not have to use iCloud. But if you do use iCloud, its ToS will contain the verbiage covering what Apple will do as part of the agreement to use the service. Courts (and the populace in general) let this cat out of the bag many years ago.

    Consider: when Govs and Orgs ultimately just can't get a company to bend to their will, they go after one of the links in that business's chain. Banks, credit processing, and shipping are some of the places they can get at upstream or downstream -- getting at a company by other means when that company refuses to submit. Porn sites have been feeling this regulation/lawfare squeeze for some time now -- lawfare aimed at the money system those sites depend on -- forcing them to obey or driving them out of business. And it isn't just porn they're doing this to.
    Make no mistake, Apple didn't put this new policy in place simply as part of a business plan. This has got the long arm of Govs/Orgs written all over it.

  • Reply 25 of 106
    mcdave Posts: 1,917, member
    Any talk of privacy not directly also including an outright attack on Android/Google and Facebook is no talk of privacy at all. 
    I'm glad to have the talk about privacy and Apple's CSAM system. It's not a great place to plant the flag for privacy, but that's the hand that's been dealt. Now consider that the majority of smartphone users in the world have their privacy tracked to such a degree that if the collected data were printed to paper, it would easily surpass half a million pages.
    Unfortunately most of the media people railing against CSAM are probably iPhone users. So a privacy issue knocked on their door, and now they care. How wonderful of them...
    That’s the thing about Apple’s system. The only time data escapes the device or gets reported is when a significant collection has been matched to validated CSAM. Until then, nothing gets reported and no privacy is breached.
    This obfuscated FUD is up there with Windows & Android users claiming Apple has the same malware issue they do.
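    The threshold idea described above can be sketched in a few lines of Python. This is a toy illustration only: the hash values, the threshold number, and the function names are placeholders, not Apple's actual system, which matches perceptual NeuralHash values and gates reporting with cryptographic threshold secret sharing rather than a plain counter.

    ```python
    # Toy sketch of threshold-gated matching (illustrative, not Apple's code).
    KNOWN_HASHES = {"deadbeef01", "deadbeef02", "deadbeef03"}  # placeholder digests
    THRESHOLD = 30  # placeholder value, chosen only for illustration

    def match_count(photo_hashes):
        """Count how many of a library's hashes appear in the known set."""
        return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

    def should_report(photo_hashes):
        """Nothing is surfaced for human review until the threshold is crossed."""
        return match_count(photo_hashes) >= THRESHOLD
    ```

    The point of the gate is that a handful of matches, including any false positives, produces no report at all; only a collection crossing the threshold does.
    
    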
  • Reply 26 of 106
    tedz98 said:
    The general public has no understanding of what a file hash is. So the techies at Apple have no understanding of how the general public perceives what they are doing. They just think Apple is scanning their phone. I’m not a huge Bill Maher fan, but I agree with him here. It’s a slippery slope that Apple is embarking on.
    I rarely ever post, but as a dad of 3 I’ve got to speak up. I 100% agree this is a slippery slope and hate anything that could lead to infringing on rights, but I think this discussion could involve more watchdogging, since we’ve got the tech to minimize pedos. I don’t trust humans as far as I can throw them, but there should be a transparent solution that allows CSAM scanning with a lot of scrutiny to ensure it’s limited to just that. I see the argument from both sides, but at the same time I think we can all agree efforts should be made to protect our kids somehow, and I believe most if not all camps can unite both to defend the use of the tech and to make sure engineers don’t abuse it into anything else. I don’t have a solution for this, but there’s a lot of intelligence on these boards, which is why I’m bringing it up.
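    For readers wondering what "a file hash" actually is, here is a minimal Python sketch using a standard cryptographic digest. Note that Apple's CSAM system uses a perceptual hash (NeuralHash) that tolerates resizing and recompression, unlike the exact-match digest below; this sketch only shows the basic idea of comparing fingerprints instead of inspecting content.

    ```python
    import hashlib

    def file_hash(path: str) -> str:
        """Return the SHA-256 hex digest of a file's raw bytes."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Read in chunks so large files never need to fit in memory.
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        # 64 hex characters; changes completely if even one byte changes.
        return h.hexdigest()
    ```

    Matching against a database then means comparing these short digests, never looking at the image content directly.
    
    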
  • Reply 27 of 106
    fastasleep Posts: 6,119, member
    tedz98 said:
    The general public has no understanding of what a file hash is. So the techies at Apple have no understanding of how the general public perceives what they are doing. They just think Apple is scanning their phone. I’m not a huge Bill Maher fan, but I agree with him here. It’s a slippery slope that Apple is embarking on.
    I rarely ever post, but as a dad of 3 I’ve got to speak up. I 100% agree this is a slippery slope and hate anything that could lead to infringing on rights, but I think this discussion could involve more watchdogging, since we’ve got the tech to minimize pedos. I don’t trust humans as far as I can throw them, but there should be a transparent solution that allows CSAM scanning with a lot of scrutiny to ensure it’s limited to just that. I see the argument from both sides, but at the same time I think we can all agree efforts should be made to protect our kids somehow, and I believe most if not all camps can unite both to defend the use of the tech and to make sure engineers don’t abuse it into anything else. I don’t have a solution for this, but there’s a lot of intelligence on these boards, which is why I’m bringing it up.
    … what? 
  • Reply 28 of 106
    fastasleep Posts: 6,119, member
    roake said:
    genovelle said:
    tedz98 said:
    The general public has no understanding of what a file hash is. So the techies at Apple have no understanding of how the general public perceives what they are doing. They just think Apple is scanning their phone. I’m not a huge Bill Maher fan, but I agree with him here. It’s a slippery slope that Apple is embarking on.
    The general public has no idea except what the media highlights here. He is a part of the problem, likely because he has or has had such materials on his phone. Only a hit dog hollers that loud.
    Or because he doesn’t agree with spyware being factory-installed on billions of devices.  
    You mean what he erroneously believes to be spyware. 
  • Reply 29 of 106
    fastasleep Posts: 6,119, member
    Heh, AppleInsider writers. You repeatedly characterize this technology as just fine because of hashes. If you think it’s great, I guess that’s your opinion. However, other people have very different opinions, such as: clever technology does not negate the fact that this is an invasion of privacy, and, like the comedian says, without probable cause. Believing the method makes that OK is ridiculous. You can be very polite while you break into someone’s home. You could even mostly close your eyes. But you are still breaking in.
    Not really. Nobody is breaking in as it’s all local, and not a thing happens to you until you meet a detected threshold of known CSAM images matched by the file hashes, at which point you have bigger concerns on your hands than your perceived privacy violations when a real person verifies you’re in possession of illegal child abuse material. At no point before that is anyone “breaking in” or invading your privacy. 
  • Reply 30 of 106
    Breaking: A polemicist expressed an opinion about Apple. AppleInsider will stay on this story as it develops.
  • Reply 31 of 106
    jdw Posts: 1,067, member
    mcdave said:
    jdw said:
    tedz98 said:
    The general public has no understanding of what a file hash is. 

    That really is the entire point which many who are defending Apple's move are ignoring. Nothing else matters, and certainly not the technical way in which CSAM scanning works. That's precisely why I've said in other threads that Apple is now obligated to at the very least DELAY the release until they can do PR damage control and at least try to win more public support. They cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and its release delayed until at least iOS 16. And if they never get public support and the matter only gets worse and worse, then the idea may need to be permanently shelved.

    This is Tim Cook's call now.  It's no doubt a hard call for him because he's played social justice warrior at times in the past, and this no doubt would seem like a step back for him.  But it's a call he has to make and make soon.
    How do you feel about Google & Facebook’s server-side CSAM scanning? And all the other scanning they do? Apple’s is the lightest of all.
    Why do people keep asking that ridiculous question to Apple users like myself who come to AppleInsider, not GoogleInsider or FaceBookInsider to read stories EXCLUSIVELY about Apple?  I care not about what FB, Google or anyone else does.  I care only about what Apple does or does not do.  Any other platform has zero meaning in my mind to discuss.
  • Reply 32 of 106
    jdw Posts: 1,067, member
    crowley said:
    jdw said:
    tedz98 said:
    The general public has no understanding of what a file hash is. 

    That really is the entire point which many who are defending Apple's move are ignoring. Nothing else matters, and certainly not the technical way in which CSAM scanning works. That's precisely why I've said in other threads that Apple is now obligated to at the very least DELAY the release until they can do PR damage control and at least try to win more public support. They cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and its release delayed until at least iOS 16. And if they never get public support and the matter only gets worse and worse, then the idea may need to be permanently shelved.

    This is Tim Cook's call now.  It's no doubt a hard call for him because he's played social justice warrior at times in the past, and this no doubt would seem like a step back for him.  But it's a call he has to make and make soon.
    They should shelve it not because of the merits or demerits of the system, but because the public don't understand it?  Spare me the pandering to the ignorant.  Proper arguments only.
    I will be shocked if Apple doesn't delay the implementation of this until they can get in front of the story.  No company can afford to ignore a potential firestorm of bad PR even if the main problem is public ignorance.  What's the rush?

    If Crowley were in charge at Apple, not only would it not be delayed, he'd accelerate the release date just to spite the general public and government officials!
  • Reply 33 of 106
    jdw Posts: 1,067, member
    jdw said:
    tedz98 said:
    The general public has no understanding of what a file hash is. 

    That really is the entire point which many who are defending Apple's move are ignoring. Nothing else matters, and certainly not the technical way in which CSAM scanning works. That's precisely why I've said in other threads that Apple is now obligated to at the very least DELAY the release until they can do PR damage control and at least try to win more public support. They cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and its release delayed until at least iOS 16. And if they never get public support and the matter only gets worse and worse, then the idea may need to be permanently shelved.

    This is Tim Cook's call now.  It's no doubt a hard call for him because he's played social justice warrior at times in the past, and this no doubt would seem like a step back for him.  But it's a call he has to make and make soon.
    You keep arguing they need to shelve this because you don’t understand it. The people that do understand it are not ignoring how file hashing works, and it’s intrinsically related to why people defend the feature. 
    Fast asleep at the wheel again, I see.  I never said that "I" do not understand it.  To suggest that what I have written is based on my own individual misunderstanding of the technical side of Apple's plan is ridiculous.  Believe me, AppleInsider has repeatedly told us what that technical side is, so much so that few of us can say we don't really understand it.

    What matters is what the general iPhone-using public thinks, seeing that their governmental representatives will ultimately get involved at some point.  We have free-market killers in government already talking antitrust action against all our American business success stories.  The last thing we need is for this to become more fodder for those "I'm from the government, and I'm here to help" elected officials to justify going after Apple.
    edited August 2021
  • Reply 34 of 106
    jdw Posts: 1,067, member
    crowley said:
    Apple's software, Apple's services, no breakage.
    LOL.  I can't wait to hear Tim Cook quote your words on his next TV interview about how Apple is defending privacy as a right...

    Interviewer: "Mr. Cook, you have long defended Privacy as a right and outlined in detail the steps Apple has taken to protect that right.  How do you harmonize your current plan to hash-scan for CSAM images on-device prior to their being uploaded to iCloud?"

    Tim Cook: "Apple's software. Apple's services. No privacy broken!"

    LOL.
  • Reply 35 of 106
    jdw said:
    crowley said:
    Apple's software, Apple's services, no breakage.
    LOL.  I can't wait to hear Tim Cook quote your words on his next TV interview about how Apple is defending privacy as a right...

    Interviewer: "Mr. Cook, you have long defended Privacy as a right and outlined in detail the steps Apple has taken to protect that right.  How do you harmonize your current plan to hash-scan for CSAM images on-device prior to their being uploaded to iCloud?"

    Tim Cook: "Apple's software. Apple's services. No privacy broken!"

    LOL.
    Now your phrase "hash-scan for CSAM images on-device prior to their being uploaded to iCloud" makes me think of how your luggage is x-rayed before being put on a plane.
  • Reply 36 of 106

    Apple’s story about calculating hashes of pictures doesn’t add up. If they, as they claim, only want to target the photos that hit iCloud, they could just as well have done that in the cloud and nobody would have been the wiser. Yet they chose to do it upstream, on the users’ phones. That is odd! I can see two possible reasons. One is that they want to save some CPU cycles on their servers; hashing isn’t a big deal, and I see little support for that option. The other is that, as everyone keeps pointing out, Apple is establishing a snooping platform that they can update willy-nilly with a new iOS patch to do whatever kind of shady surveillance tasks they, or somebody else, want them to do, effectively turning an iPhone into an iSnoop. I find that to be the most likely reason.

    I don’t like private companies acting like law enforcement agencies, because they are not subject to transparency, the Bill of Rights, court orders, etc., a framework that has taken us hundreds of years to get properly tuned. All of that is being pushed aside with moves like this one, where Apple just turns on the mikes. That irks me. I find it especially despicable when it comes from a company that continuously brags about how it values user privacy, and a company I have admired for close to forty years.

  • Reply 37 of 106
    crowley Posts: 10,234, member
    jdw said:
    crowley said:
    Apple's software, Apple's services, no breakage.
    LOL.  I can't wait to hear Tim Cook quote your words on his next TV interview about how Apple is defending privacy as a right...

    Interviewer: "Mr. Cook, you have long defended Privacy as a right and outlined in detail the steps Apple has taken to protect that right.  How do you harmonize your current plan to hash-scan for CSAM images on-device prior to their being uploaded to iCloud?"

    Tim Cook: "Apple's software. Apple's services. No privacy broken!"

    LOL.
    Tim Cook would never be so undiplomatic. Doesn’t make it wrong though.
  • Reply 38 of 106
    Bill Maher: yet another person who never bothered to read the User Agreement for iCloud.
  • Reply 39 of 106
    mcdave Posts: 1,917, member
    jdw said:
    mcdave said:
    jdw said:
    tedz98 said:
    The general public has no understanding of what a file hash is. 

    That really is the entire point which many who are defending Apple's move are ignoring. Nothing else matters, and certainly not the technical way in which CSAM scanning works. That's precisely why I've said in other threads that Apple is now obligated to at the very least DELAY the release until they can do PR damage control and at least try to win more public support. They cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and its release delayed until at least iOS 16. And if they never get public support and the matter only gets worse and worse, then the idea may need to be permanently shelved.

    This is Tim Cook's call now.  It's no doubt a hard call for him because he's played social justice warrior at times in the past, and this no doubt would seem like a step back for him.  But it's a call he has to make and make soon.
    How do you feel about Google & Facebook’s server-side CSAM scanning? And all the other scanning they do? Apple’s is the lightest of all.
    Why do people keep asking that ridiculous question to Apple users like myself who come to AppleInsider, not GoogleInsider or FaceBookInsider to read stories EXCLUSIVELY about Apple?  I care not about what FB, Google or anyone else does.  I care only about what Apple does or does not do.  Any other platform has zero meaning in my mind to discuss.
    Isolating Apple’s intended privacy practice from competitors’ current practices would be an effective way of holding Apple to a higher standard of accountability while never acknowledging that they’ve met it. Nice try; not fooled.
  • Reply 40 of 106
    mcdave Posts: 1,917, member

    Apple’s story about calculating hashes of pictures doesn’t add up. If they, as they claim, only want to target the photos that hit iCloud, they could just as well have done that in the cloud and nobody would have been the wiser. Yet they chose to do it upstream, on the users’ phones. That is odd! I can see two possible reasons. One is that they want to save some CPU cycles on their servers; hashing isn’t a big deal, and I see little support for that option. The other is that, as everyone keeps pointing out, Apple is establishing a snooping platform that they can update willy-nilly with a new iOS patch to do whatever kind of shady surveillance tasks they, or somebody else, want them to do, effectively turning an iPhone into an iSnoop. I find that to be the most likely reason.

    I don’t like private companies acting like law enforcement agencies, because they are not subject to transparency, the Bill of Rights, court orders, etc., a framework that has taken us hundreds of years to get properly tuned. All of that is being pushed aside with moves like this one, where Apple just turns on the mikes. That irks me. I find it especially despicable when it comes from a company that continuously brags about how it values user privacy, and a company I have admired for close to forty years.

    Your argument makes no sense, as it would be easier to update server-side scanning than on-device scanning without the user knowing.
    The privacy concern isn’t with scanning (all devices scan libraries); it’s with reporting the results. The only thing Apple’s system will report is a large collection of verified CSAM matches. Can the same be guaranteed for the other services?