Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'


  • Reply 81 of 106
    aguyinatx said: You're comparing apples (pun intended) to oranges. The other companies you mention are platforms that are either designed specifically to share content, or are cloud providers. The difference between a cloud provider scanning for CSAM and Apple's strategy is that when you use the "cloud" you're really just using someone else's computer, and they have the right to scan their computers for content they do not like. I have the right to inspect anything in my house any time I like, right? Apple is looking on my phone using a CPU and storage that I paid for and own. That's the difference, and it's not a matter of semantics. iCloud is ON by default, and Apple is proactively making the decision to (a) upload my photo and (b) scan it. The "spin" they're putting on it, that they do not have to look at the picture to understand its content, is poppycock; many others, who are better writers than I, have addressed that. You are not correct.
    iCloud has to be set up by the user. It's not on by default. Part of the setup process is choosing which apps actually back up files to the cloud. Apple's CSAM hash scanning only involves files going to the cloud, so you're probably the one arguing semantics. The user has already chosen files from a particular app to go to the cloud, so it doesn't actually matter whether the hash scan happens on the device or in the cloud. Same difference. The files that would be scanned in either scenario wouldn't change.
    I'm sorry, I can see that I wasn't clear. iCloud Photos is on by default when an iCloud account is created. Thanks for the catch. It isn't semantics, as I explained the difference. The blog I'm linking to makes the technical case better than I can: https://www.hackerfactor.com/blog/
  • Reply 82 of 106
    jdw said:

    That's precisely why I've said in other threads that Apple is now obligated to at the very least DELAY the release until they can do PR damage control and at least try to win more public support. They cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and its release delayed until at least iOS 16.
    What's funny is that everybody seems to think this is coming in iOS 15.0, when clearly Apple can't even get SharePlay ready in time for next month.

    In reality, Apple already said in the original announcement that it wasn't coming until "later this year," which likely means iOS 15.1, 15.2, or 15.3, depending on how many point releases it gets out this year...
    These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
    So Apple has a bit more time to make up its mind, and I can see them at least pushing it into early next year, much like they did with ATT last year. I doubt that they'll hold off until iOS 16 on this one, though... 
  • Reply 83 of 106
    Hey AppleInsider writers. You are repeatedly characterizing this technology as just fine because of hashes. If you think it's great, I guess that's your opinion. However, other people have very different opinions. Such as: clever technology does not negate the fact that this is an invasion of privacy, as the comedian says, without probable cause. Believing the method makes that OK is ridiculous. You can be very polite while you break into someone's home. You could even mostly close your eyes. But you are still breaking in.
    Except they aren’t. You’re paying to use iCloud Photos, a commercial cloud service. They won’t let you store child porn on it. They scan the hash of the images prior to uploading to their commercial server. I see no problem with this. Don’t like it, don’t use iCloud Photos. 

    Google, Microsoft, and Dropbox have done the same for years. Nobody allows images to be stored on their servers without scanning to ensure they aren't child porn. Microsoft has PhotoDNA, and Google has its own tools:

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    Where was your outrage?

    What makes you think this person wasn't outraged?  They very well could have found out and quit using the aforementioned services in disgust.

    The point isn't that cloud services scan for CSAM; the issue people are having is that it will now be done on the device. The nuance of where the scan gets done is a pretty strong privacy issue for people.
  • Reply 84 of 106
    roake said:
    genovelle said:
    tedz98 said:
    The general public has no understanding of what a file hash is. So the techies at Apple have no understanding of how the general public perceives what they are doing. They just think Apple is scanning their phone. I’m not a huge Bill Maher fan, but I agree with him here. It’s a slippery slope that Apple is embarking on.
    The general public has no idea except what the media highlights here. He is a part of the problem, likely because he has or has had such materials on his phone. Only a hit dog hollas that loud. 
    Or because he doesn’t agree with spyware being factory-installed on billions of devices.  
    You mean what he erroneously believes to be spyware. 

    What would you call software that watches for child pornography and lets Apple know if you are in possession of that material?

    I worked with a guy who was pinched for CP; every computer he used, and his phone, had this exact kind of monitoring software on it. I'm not really interested in being treated like a perpetual suspect.
  • Reply 85 of 106
    Roderikus said:
    mcdave said:
    Your argument makes no sense as it would be easier to update server-side scanning than on-device scanning without the user knowing.
    The privacy concern isn’t with scanning (as all devices scan libraries) it’s with reporting the results. The only thing Apple’s system will report is a large collection of verified CSAM images. Can the same be guaranteed for the other services?
    Neither does yours, because nobody can transparently check what kind of pattern recognition these hashes are looking for.
    There is no legal structure sufficient to guarantee protection for citizens around the world. Not even in the US.

    True...  But then if there is nothing to hide....
    I'm always suspicious of the "something to hide" argument.  There is a difference between private and secret.  Your parents did something in private that resulted in you, but others weren't allowed to watch.  That's privacy; even though a crime might have been committed in the act.  Secrecy is what criminals and governments want.  It's never that I have something to hide, but I do not have anything that I want to show, and it should be my right to decide what I show to a company and when.
  • Reply 86 of 106
    Beats Posts: 3,073 member
    Law enforcement has been hammering Apple about their strong security potentially hampering capture of child pornographers; well Apple is doing something about it. I don’t care about conceptual discussions about it being constitutionally sound - if it helps prevent this kind of crime I’m all for it.

    No it doesn’t.

    This crap DOES NOT protect children in any shape or form and never did. It’s just marketing speak to allow back doors.
  • Reply 87 of 106
    bbh Posts: 130 member
    The problem, to me, is that even after opting out by declining to use iCloud upload for photo storage, the scanning software remains on your iPhone. I think that, and the overall privacy issue, could be addressed by making a separate "opt in" download of the scanning software a requirement for activating iCloud Photo upload.

    A separate download of the scanning software for those who wish to continue with iCloud Photo storage. No scanning software on your iPhone, no ability to use iCloud Photo upload. This does not seem technologically difficult to me. 
    edited August 2021
  • Reply 88 of 106
    bbh Posts: 130 member
    Hey AppleInsider writers. You are repeatedly characterizing this technology as just fine because of hashes. If you think it's great, I guess that's your opinion. However, other people have very different opinions. Such as: clever technology does not negate the fact that this is an invasion of privacy, as the comedian says, without probable cause. Believing the method makes that OK is ridiculous. You can be very polite while you break into someone's home. You could even mostly close your eyes. But you are still breaking in.
    Except they aren’t. You’re paying to use iCloud Photos, a commercial cloud service. They won’t let you store child porn on it. They scan the hash of the images prior to uploading to their commercial server. I see no problem with this. Don’t like it, don’t use iCloud Photos. 

    Google, Microsoft, and Dropbox have done the same for years. Nobody allows images to be stored on their servers without scanning to ensure they aren't child porn. Microsoft has PhotoDNA, and Google has its own tools:

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    Where was your outrage?
    The problem is that even after you opt out of using iCloud Photos, the scanning software is still sitting there on your iPhone "awaiting orders". If one does not use iCloud Photos, WHY SHOULD HIS PHONE REMAIN BUGGED?
  • Reply 89 of 106
    bbh Posts: 130 member
    netrox said:
    This whole CSAM thing is tiring; it's been done by Google, MS, Tumblr, Reddit, and so on for YEARS! Apple just made its announcement to let us know they're doing the same thing. I don't recall anyone batting an eye at MS, Google, Facebook, and so on.

    https://www.microsoft.com/en-us/photodna


    Just because something has been done for YEARS doesn't necessarily make it right or wrong; that is fallacious thinking (appealing to tradition). I for one don't agree with the other companies doing this, which is why I moved away from Microsoft and Google over a decade ago because of privacy issues. I've also never signed up for accounts on any social media services, for the same reason. I had found a better (not perfect) alternative in Apple, which is why I was not very concerned about CSAM, but now that I've come to rely on Apple's products this feels like a "bait and switch". I'd switch phones, but all the others are far worse in terms of privacy, with the exception of a burner phone.

    So yes, I can see how many Apple users are up in arms about this.
    So tell me, which commercial cloud data provider did you switch to that doesn't use hash checking for known child porn? Not Microsoft, Google, or Dropbox, as they all do. Who, then?
    I store all my images on a local NAS; if I need remote access to them, I either use my VPN or I wait till I get home. I've only used iCloud Photos for sharing with family and friends. I'm more or less referencing their services/software as a whole (Windows, Office, search, Gmail, etc.).
    I do the same thing with my Synology NAS. I can upload the photos from all my devices to the NAS from anywhere I have internet access, and I can access those same photos from anywhere with internet access. Synology's latest software, DSM 7, coupled with their Photos App, completely decouples you from any cloud storage. I have the DS 218+. The latest 2 drive model, the DS220+, is available for around $275, plus the cost of the hard drives you choose. 
  • Reply 90 of 106
    bbh Posts: 130 member
    tedz98 said:
    The general public has no understanding of what a file hash is. So the techies at Apple have no understanding of how the general public perceives what they are doing. They just think Apple is scanning their phone. I’m not a huge Bill Maher fan, but I agree with him here. It’s a slippery slope that Apple is embarking on.
    I rarely ever post, but as a dad of 3 I've got to speak up. I 100% agree this is a slippery slope, and I hate anything that could lead to infringing on rights, but I think this discussion could involve more watchdogging, since we've got the tech to minimize pedos. I don't trust humans as far as I can throw them, but there should be a transparent way to allow CSAM scanning, with a lot of scrutiny to ensure it's limited to just that. I see the argument from both sides, but at the same time I think we can all agree that efforts should be made to protect our kids somehow, and I believe most if not all camps can unite around both using the tech and making sure engineers don't abuse it into anything else. I don't have a solution for this, but there's a lot of intelligence on these boards, which is why I'm bringing it up.
    I have a solution! Make an independent download of the scanning software a requirement to activate iCloud Photos. Don't care about iCloud Photos? Great. Don't download the scanning software.
  • Reply 91 of 106
    sflocal Posts: 5,977 member
    Am I missing something here?

    All the rhetoric about Apple installing spyware on iPhones is confusing me. From everything I have read, Apple is only scanning cloud data. If you store child porn on your iPhone, nothing happens. If you store it in iCloud, then that's where all this CSAM scanning happens. That is nothing different from what other cloud providers are doing. Don't want your data scanned? Don't put it on the servers of other companies.

    Have you ever placed your personal belongings in one of those storage garages? I have. Part of the rental agreement is that they can enter your storage locker/garage if they suspect something is going on in there. Why hasn't anyone raised a stink about that? The point is, if you don't want anyone sniffing around your stuff, then keep it with you or in your house. Don't store stuff elsewhere that others have access to. Duh.
  • Reply 92 of 106
    Beats said:
    Law enforcement has been hammering Apple about their strong security potentially hampering capture of child pornographers; well Apple is doing something about it. I don’t care about conceptual discussions about it being constitutionally sound - if it helps prevent this kind of crime I’m all for it.

    No it doesn’t.

    This crap DOES NOT protect children in any shape or form and never did. It’s just marketing speak to allow back doors.
    How do you know that, Beats? Do you work for Apple?
  • Reply 93 of 106
    Bill Maher. Newsman? Errrr, not exactly. Technical expert? Errrr, not exactly. Credible? Errrr…

    Washed up hack? Yeah, that's the ticket.
  • Reply 94 of 106
    sdbryan Posts: 349 member
    This may be the wrong place to pose this question, but maybe someone can explain. Is CSAM scanning designed only for detecting lazy/dumb perps? Hash functions are designed to give a completely different result if you change even one byte in the source file. It shouldn't be hard to design a program that makes a visually undetectable change to the file of a photograph such that its hash is nowhere near the hash of the original "offending" file. It could even be done "randomly", so the range of 'valid' identifying hash values could be enormous. I know I should read the technical notes that Apple is making available, but maybe there is a glaring error in my assumptions.
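    The avalanche behaviour described above is easy to demonstrate with an ordinary cryptographic hash. A quick Python sketch, using SHA-256 as a stand-in for "a hash function" (as later replies note, this is exactly the property a perceptual hash like Apple's NeuralHash is designed not to have):

```python
import hashlib

# A toy "image file": some arbitrary bytes standing in for a photo.
original = bytes(range(256)) * 4
tweaked = bytearray(original)
tweaked[100] ^= 0x01  # flip a single bit in a single byte

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

# The digests share almost nothing: roughly 15 of every 16 hex digits differ.
diff = sum(a != b for a, b in zip(h1, h2))
print(h1)
print(h2)
print(diff, "of", len(h1), "hex digits differ")
```

    With a hash like this, the one-bit tweak would indeed defeat matching, which is why the question is a fair one for cryptographic hashes specifically.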
  • Reply 95 of 106
    sdbryan said:
    This may be the wrong place to pose this question, but maybe someone can explain. Is CSAM scanning designed only for detecting lazy/dumb perps? Hash functions are designed to give a completely different result if you change even one byte in the source file. It shouldn't be hard to design a program that makes a visually undetectable change to the file of a photograph such that its hash is nowhere near the hash of the original "offending" file. It could even be done "randomly", so the range of 'valid' identifying hash values could be enormous. I know I should read the technical notes that Apple is making available, but maybe there is a glaring error in my assumptions.
    Apple claims that the hash value returned would be the same even for modified images (cropped, or with other simple edits), but without providing details on how it is done (or how it is even possible). Will they provide more details on how this is achieved? I don't have much hope of that. That grey area will be left open to the imagination of Apple users. Those who "trust" Apple blindly will support them, while others will keep asking questions without ever getting an answer on this key aspect.
  • Reply 96 of 106
    Marvin Posts: 14,786 moderator
    sdbryan said:
    This may be the wrong place to pose this question, but maybe someone can explain. Is CSAM scanning designed only for detecting lazy/dumb perps? Hash functions are designed to give a completely different result if you change even one byte in the source file. It shouldn't be hard to design a program that makes a visually undetectable change to the file of a photograph such that its hash is nowhere near the hash of the original "offending" file. It could even be done "randomly", so the range of 'valid' identifying hash values could be enormous. I know I should read the technical notes that Apple is making available, but maybe there is a glaring error in my assumptions.
    Apple claims that the hash value returned would be the same even for modified images (cropped, or with other simple edits), but without providing details on how it is done (or how it is even possible). Will they provide more details on how this is achieved? I don't have much hope of that. That grey area will be left open to the imagination of Apple users. Those who "trust" Apple blindly will support them, while others will keep asking questions without ever getting an answer on this key aspect.
    Apple described the algorithm:

    "Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image. The system computes these hashes by using an embedding network to produce image descriptors and then converting those descriptors to integers using a Hyperplane LSH (Locality Sensitivity Hashing) process. This process ensures that different images produce different hashes.
    The embedding network represents images as real-valued vectors and ensures that perceptually and semantically similar images have close descriptors in the sense of angular distance or cosine similarity. Perceptually and semantically different images have descriptors farther apart, which results in larger angular distances. The Hyperplane LSH process then converts descriptors to unique hash values as integers."

    There are examples online showing how this works:

    https://pippy360.github.io/transformationInvariantImageSearch/

    1. The algorithm finds keypoints in the input using edge detection.
    2. Each set of three keypoints is converted into a triangle.
    3. These triangles are transformed into equilateral triangles.
    4. Each equilateral triangle is rotated to each of its 3 edges, and a perceptual hashing algorithm (in this case pHash) is used to produce a hash for each side.
    5. The algorithm compares the hash to those stored in the database and returns all matching images.

    The ability to work with multiple image changes is also why it has a higher collision rate than data hash algorithms and why they use a threshold with multiple matches.
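    For a feel of how a perceptual hash differs from a cryptographic one, here is a toy sketch of the "average hash" idea, a much simpler cousin of pHash (and not Apple's actual NeuralHash): downscale the image to a small grayscale grid, then record one bit per cell depending on whether that cell is brighter than the mean. Small edits barely move the bits, so similar images sit within a small Hamming distance of each other:

```python
# Toy "average hash": one bit per cell of an 8x8 grayscale grid (0-255),
# set when the cell is brighter than the image mean. Similar images
# produce similar bit vectors; unrelated images do not.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    # Number of bit positions where the two hashes disagree.
    return sum(a != b for a, b in zip(h1, h2))

# A simple gradient image and a slightly brightened copy of it.
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
brighter = [[min(255, p + 10) for p in row] for row in img]
# An unrelated image: a checkerboard.
checker = [[255 if (r + c) % 2 else 0 for c in range(8)] for r in range(8)]

print(hamming(average_hash(img), average_hash(brighter)))  # small
print(hamming(average_hash(img), average_hash(checker)))   # large
```

    The matching step then becomes "is the Hamming distance below a threshold?" rather than "are the hashes equal?", which is also why perceptual schemes tolerate some collisions and require multiple matches before flagging anything.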
  • Reply 97 of 106
    crowley Posts: 10,238 member
    sdbryan said:
    This may be the wrong place to pose this question, but maybe someone can explain. Is CSAM scanning designed only for detecting lazy/dumb perps? Hash functions are designed to give a completely different result if you change even one byte in the source file. It shouldn't be hard to design a program that makes a visually undetectable change to the file of a photograph such that its hash is nowhere near the hash of the original "offending" file. It could even be done "randomly", so the range of 'valid' identifying hash values could be enormous. I know I should read the technical notes that Apple is making available, but maybe there is a glaring error in my assumptions.
    Apple claims that the hash value returned would be the same even for modified images (cropped, or with other simple edits), but without providing details on how it is done (or how it is even possible). Will they provide more details on how this is achieved? I don't have much hope of that. That grey area will be left open to the imagination of Apple users. Those who "trust" Apple blindly will support them, while others will keep asking questions without ever getting an answer on this key aspect.
    Unfortunately, the more information they reveal about the hash, the more they advertise how to get around it. I don't think the low-level technical details of how the hash is performed are of much consequence to the privacy debate that is going on.
    edited August 2021
  • Reply 98 of 106
    bbh Posts: 130 member
    sflocal said:
    Am I missing something here?

    All the rhetoric about Apple installing spyware on iPhones is confusing me. From everything I have read, Apple is only scanning cloud data. If you store child porn on your iPhone, nothing happens. If you store it in iCloud, then that's where all this CSAM scanning happens. That is nothing different from what other cloud providers are doing. Don't want your data scanned? Don't put it on the servers of other companies.

    Have you ever placed your personal belongings in one of those storage garages? I have. Part of the rental agreement is that they can enter your storage locker/garage if they suspect something is going on in there. Why hasn't anyone raised a stink about that? The point is, if you don't want anyone sniffing around your stuff, then keep it with you or in your house. Don't store stuff elsewhere that others have access to. Duh.

    Caramba... did you not read my previous three posts, numbers 87, 88, and 89? The scanning DOES NOT happen in iCloud. It happens on your iPhone. That is different from "what other cloud providers are doing". That's what all the complaining is about.

    How can you be so confused about something so fundamental?
  • Reply 99 of 106
    sdbryan Posts: 349 member
    Thank you Crowley, Marvin, and Muthuk_vanalingam for clarifying the technical aspect of this issue. I carelessly assumed Apple was using the narrow meaning of “hash” with which I am familiar. But Apple is bringing a huge amount of processing to every picture stored on every iPhone (if the picture is uploaded to iCloud) in order to produce this more durable hash value. I am beginning to see why so many find this offensive.

    Previously my response was that if a person was opposed to this sort of mass surveillance, they could just turn off iCloud Photos. I suppose it is illogical, but before, the idea of performing a simple hash on every photo seemed innocuous. Now the level of scrutiny, and the implicit assumption that every photo on every iPhone is guilty unless proven 'innocent', just seems obscene. I don't want a piece of code like that in the OS of a device I chose to buy and use.
  • Reply 100 of 106
    I don't follow the guy, but has he condemned Twitter and other social media sites for banning/shadow-banning Trump and other conservative voices?

    If not, then he doesn't have any basis for his complaint. If it's OK for private companies to censor on their own platforms, and still somehow be content neutral and not be publishers, then clearly it's OK for Apple to look for criminal content on their servers. As I understand it, Apple is only looking at content on iCloud servers.

    I do find it interesting that all the left-wing types who are chill with the social media companies censoring anyone who doesn't buy into "right think" are suddenly getting all 1st Amendmentish over protecting pedophiles.

    After all, to avoid this "problem" pedophiles only have to not put their photos on iCloud.