Open letter asks Apple not to implement Child Safety measures


  • Reply 21 of 91
    crowleycrowley Posts: 10,453member
    darkvader said:
    crowley said:
    iadlib said:
    This is admirable and I like the intention behind it. But to play devil's advocate, I bring up that this technique could, maybe, possibly, be adapted for other kinds of data. Say emails, or text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked by a hacker, or a government, to search for anti-state speech in China, or even for industrial espionage? This is a Pandora's box that could never be shut. 
    FUD.

    It's a private method for comparing an image hash against a specific database of image hashes.  

    No indication that it'll become a generic API, or a feature for anyone else to use.
    No indication that there will be any way for it to be hijacked, short of rooting the entire phone, in which case all bets are off anyway.
    No indication that it could be used to search for anything other than images, or more specifically the hashes of those images.
    No indication that it couldn't be shut down any time Apple wanted to shut it down.

    It's not FUD.  There is no indication that it can't be used for scanning any type of data, and in fact it's obvious that it absolutely can be.
    What?  There's no indication that code written to turn photos into a hash and compare it against a stored value can't be used for something completely different?  There's no indication that Calculator.app can't be used for launching nuclear missiles either, except for the obvious reason that it absolutely can't be.

    Code does what it's designed to do.
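    The narrow mechanism crowley describes, stripped to its essentials, is just a set-membership test. A minimal sketch, using SHA-256 as a stand-in for Apple's NeuralHash (which is a perceptual hash that matches visually similar images, whereas a cryptographic digest only matches exact bytes):

```python
import hashlib

# Toy stand-in for a perceptual hash: a cryptographic digest of the bytes.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# The database is just a set of known hashes; the device never holds the images.
known_hashes = {image_hash(b"known-bad-image")}

def matches_database(image_bytes: bytes) -> bool:
    # Pure membership test: no content leaves the device, nothing is "read".
    return image_hash(image_bytes) in known_hashes

print(matches_database(b"known-bad-image"))  # True
print(matches_database(b"holiday-photo"))    # False
```

    Nothing in this shape can "search" arbitrary data; it can only answer yes/no for content whose hash is already in the set.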
  • Reply 22 of 91
    GeorgeBMac said: What about laws passed by democratic governments?   Are they absolved of responsibility?  Are they pure and holy and without  fault?

    Have you heard of the Patriot Act?
    Or the data Apple supplied to the government on Democratic lawmakers?
    Of course I've heard of the Patriot Act. It was a public document available to anyone to read when it was passed/signed in 2001, and plenty of the people that read it were vocally opposed to it right from the start. That's what eventually led to the warrantless wiretapping provision being eliminated by Congress: public pressure. However, there were plenty of people that were ignorant of the Patriot Act and what it contained, thus Edward Snowden's success in repackaging old news from the Patriot Act a decade later as if it were something new to be worried about.
    edited August 2021
  • Reply 23 of 91
    Rayz2016Rayz2016 Posts: 6,957member
    iadlib said:
    This is admirable and I like the intention behind it. But to play devil's advocate, I bring up that this technique could, maybe, possibly, be adapted for other kinds of data. Say emails, or text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked by a hacker, or a government, to search for anti-state speech in China, or even for industrial espionage? This is a Pandora's box that could never be shut. 

    Apple takes hash values of child abuse images from the NCMEC database and loads them onto the phone (which is not something I want to think about being loaded onto my phone, even if they are just hashed values). The phone then runs the comparison.

    So the hijacking is basically just swapping in whichever database of hashes Apple is told to track. 
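    Rayz2016's point can be made concrete: the matching code is database-agnostic, so "hijacking" it requires no new code at all, only a different set of hashes. A hypothetical sketch (class name and data invented for illustration; this is not Apple's code):

```python
import hashlib

class HashMatcher:
    """Compares hashes of local content against whatever database it is given."""
    def __init__(self, hash_database: set):
        self.db = hash_database

    def flag(self, content: bytes) -> bool:
        return hashlib.sha256(content).hexdigest() in self.db

# The same class, unchanged, serves either purpose -- only the data differs.
csam_db = {hashlib.sha256(b"known-abuse-image").hexdigest()}
political_db = {hashlib.sha256(b"banned-protest-image").hexdigest()}

matcher = HashMatcher(csam_db)         # what Apple describes
# matcher = HashMatcher(political_db)  # what critics fear a government could demand
```

    Which side of the thread this supports depends on whether you believe Apple controls what goes into the database; the code itself is neutral.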
  • Reply 24 of 91
    robin huberrobin huber Posts: 3,960member
    Attaching Snowden’s name to this motivates me to ignore it. Old line liberal here, but selling out your country is a bridge too far. 
  • Reply 25 of 91
    hucom2000hucom2000 Posts: 149member
    I was just wondering how different (or not) this idea is from Safari now warning you about compromised passwords that have appeared in data leaks?

    It seems like the same principle: Apple is taking something private (passwords in my keychain) from me and comparing it to something public (lists of known leaked passwords). If there's a match, an action is taken. Sure, the action looks different, but the basic idea seems the same.

    Why didn’t anyone get upset about that feature?
    Because you can turn it off?
    edited August 2021
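    For comparison, leaked-password checks are typically built along the lines of the widely documented Have I Been Pwned range API (Apple's own implementation differs in detail and uses its own protocol): hash the password with SHA-1, send only the first five hex characters of the digest, and compare the returned suffixes locally, so neither the password nor even its full hash leaves the device. A self-contained sketch with the server lookup simulated:

```python
import hashlib

# Simulated server-side table of SHA-1 hashes of leaked passwords.
LEAKED = {hashlib.sha1(p).hexdigest().upper() for p in (b"password123", b"letmein")}

def server_range_query(prefix: str) -> list:
    """The server sees only a 5-character prefix and returns all matching suffixes."""
    return [h[5:] for h in LEAKED if h.startswith(prefix)]

def password_was_leaked(password: bytes) -> bool:
    digest = hashlib.sha1(password).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    # The final comparison happens on-device; the server never learns the full hash.
    return suffix in server_range_query(prefix)

print(password_was_leaked(b"password123"))                # True
print(password_was_leaked(b"correct horse battery staple"))  # False
```

    The structural parallel to the CSAM design is real: a private value is hashed locally and tested against a list of known-bad hashes. The difference the thread is arguing about is what happens on a match.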
  • Reply 26 of 91
    GeorgeBMacGeorgeBMac Posts: 11,421member
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.

    You have the option of doing that yourself.  Apple doesn't have to eliminate it for everybody -- they simply gave everybody the option to do so if they wished.

    The ”good guys” turning off iCloud does not solve the actual problem of the “bad guys” storing their photos in Apple’s iCloud.

    Assuming we believe CSAM is something that needs tackling, the only other workable option I can see would be to shut down iCloud Photos.

    This would have the considerable disadvantage of having to store all our photos on device, but this was how it used to be (in the good old days), and then the "bad guys'" photos would be stored safely on their own devices.


    The only ones who fear storing stuff on iCloud -- or fear having the government see their "private" stuff -- are, as you call them, the "bad guys".

    They can see all of mine that they want to.  They'll be bored to death.
  • Reply 27 of 91
    cochococho Posts: 8member
    What will they search for next?
  • Reply 28 of 91
    WgkruegerWgkrueger Posts: 352member
    With respect to the image scanning, I think Apple can make a good case that they want illegal and horrific images off of their servers. I applaud the use of technology to eradicate these images from the face of the earth and, more specifically, to stop child abuse in general. 

    My personal perspective is related to Apples methodology. 

    The method Apple chose involves uploading data to customer-owned devices, without their permission, for the needs of a third party. Further, they are using the processing power of their customers' equipment, without their permission, to access customers' data, without their permission, with the sole purpose of finding out if that customer is breaking the law. It seems like this violates the customers' trust in Apple. It also seems like this makes Apple an agent for the government. 

    The reason for what they're scanning for and the methods they're using is that the target is justifiably horrific, i.e. child abuse, and the images are widespread, so there is a limit to Apple's privacy stance, IMO. 

  • Reply 29 of 91
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.

    You have the option of doing that yourself.  Apple doesn't have to eliminate it for everybody -- they simply gave everybody the option to do so if they wished.

    The ”good guys” turning off iCloud does not solve the actual problem of the “bad guys” storing their photos in Apple’s iCloud.

    Assuming we believe CSAM is something that needs tackling, the only other workable option I can see would be to shut down iCloud Photos.

    This would have the considerable disadvantage of having to store all our photos on device, but this was how it used to be (in the good old days), and then the "bad guys'" photos would be stored safely on their own devices.


    The only ones who fear storing stuff on iCloud -- or fear having the government see their "private" stuff -- are, as you call them, the "bad guys".

    They can see all of mine that they want to.  They'll be bored to death.
    Whilst Apple wasn't looking at the bad guys' photos, they may have believed it was safe to store them in iCloud.

    In theory, at least, they now know that is probably not going to be the case, and they will move them (deleting them would be better). Apple's problem is now solved, and the good guys are irritated.
  • Reply 30 of 91
    lkrupp said:
    So I wonder what the authors of this letter will do if Apple goes ahead? Who will they turn to for privacy? Do they not know that Google is doing this? So if they stop buying Apple then who will they buy from, or will they get off the internet?

    Sputtering about privacy, government surveillance, etc. when you are impotent to do anything because the only option is to leave the internet is not productive. Several have threatened or stated that they will no longer buy Apple products because of this. Who are they kidding, where will they go? This will blow over, will be implemented, and life will go on. By the way, both political parties are on board with this, so no matter who you vote for it’s a done deal.

    So how about those of you clamoring for Apple to stop this tell us what you plan to do when it is implemented. What will be your response? Writing your Congressperson? Or will you be hypocrites and go about your business?
    I'd guess, correct me if I'm wrong, that all online storage systems have some type of this, and that if you don't implement it you may well be liable. This raises interesting questions, and approaching it from the 18th century isn't going to help. Just like the issue of Apple 'bending' to foreign laws on technology: lots of companies 'bend' to US laws on what they do and sell. If you want to do business in a country, you follow its laws, or you remove yourself from that territory. But I think those who are screaming the most about Apple 'bending' to outrageous foreign laws are not for a world government with one set of laws everywhere, correct, conservative folks who make this an issue? So what is the alternative? Don't make a product and sell it in that country. So start your own company and slowly stop selling everywhere you disagree with a law. I think eventually Apple will be faced with having to back out of a country that goes rabid on some of these monopoly/right-to-repair BS issues. But it may well be a 'western' country that goes down that stupid road.
  • Reply 31 of 91
    wood1208wood1208 Posts: 2,913member
    To me, as part of humanity, anyone sexually abusing or exploiting children must be put to death. Period. Apple's intention is good, but if Apple reports someone to law enforcement over one picture in their cloud storage without them even knowing, that could be a problem. But if bad people are in the business of doing bad things to children, then not only Apple but every company, government agency, and good person on earth must report such monstrous behavior to law enforcement agencies. 
  • Reply 32 of 91
    omasouomasou Posts: 576member
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    edited August 2021
  • Reply 33 of 91
    macplusplusmacplusplus Posts: 2,112member
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...
    edited August 2021
  • Reply 34 of 91
    crowleycrowley Posts: 10,453member
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...
    "might" and "may" doing a lot of work there.
  • Reply 35 of 91
    omasouomasou Posts: 576member
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...
    Take the time to read before posting. They are hashing on your device before it’s encrypted in iCloud.

    Apple doesn’t want to be an accessory to your CSAM crimes or enable you to hide.

    The high level of security we enjoy from Apple’s ecosystem is to protect our privacy not to engage in crimes against children.
    edited August 2021
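    A toy illustration of the ordering omasou describes: the hash is computed on-device while the plaintext is still available, and only ciphertext is uploaded, which is why an equivalent server-side scan of encrypted uploads would see nothing useful. (Apple's real design uses NeuralHash with private set intersection and threshold secret sharing; the XOR cipher and key here are invented stand-ins, not real cryptography.)

```python
import hashlib

KNOWN_HASHES = {hashlib.sha256(b"known-bad-image").hexdigest()}

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR stream cipher -- illustration only, not real cryptography."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

def upload(photo: bytes, device_key: bytes):
    # 1. Hash on-device, while the plaintext is still available.
    flagged = hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    # 2. Encrypt before upload; the server only ever sees ciphertext.
    return toy_encrypt(photo, device_key), flagged

ct, flagged = upload(b"known-bad-image", b"device-demo-key")  # fixed key for reproducibility
# A server-side scan of `ct` is useless: the ciphertext's hash matches nothing.
assert hashlib.sha256(ct).hexdigest() not in KNOWN_HASHES
assert flagged
```

    This is the trade the two sides of the thread are weighing: on-device hashing is what lets the check coexist with encrypted storage, and it is also what puts the check on hardware the customer owns.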
  • Reply 36 of 91
    What happens if someone sends child porn images to someone else's iPhone? Will it be taken down? This could be a way to silence critics. 
  • Reply 37 of 91
    How can you promote privacy yet invade privacy at the same time?
  • Reply 38 of 91
    entropysentropys Posts: 4,168member
    Rayz2016 said:
    Yup, this basically. 



    Not sure what the EFF expects when it comes to laws passed by authoritarian governments. The only options would be to comply or leave the market. If every private company leaves the market, then state-run companies will fill the void. So on the one hand you can say "we're not participating", but on the other you would have to admit that you haven't changed anything in terms of protections for the people that live in the country. 
    I am sure the executives of state run companies (and the politicians) would be devastated by that scenario.
    edited August 2021
  • Reply 39 of 91
    entropysentropys Posts: 4,168member
    DAalseth said:
    No matter how well intentioned, this effort will be used to damage Apple's reputation, severely. It should be abandoned immediately. 
    Remember how Apple was excoriated by some last year for having a "monopoly" on COVID reporting apps? And that was a free thing they did with Google, and they kept no data. Apple just stuck a big red Kick Me sign on their back. 
    Quite so. 
    It’s started already



    Amazing. So the justification is that Apple is allowing a hash to identify rock spiders. Who could object to that? Yeah, any decent person hates those people, so we should all agree it is a Good Thing. 
    But now the path is created to end our privacy. 
    Once it has introduced the ability, it will have to, if the government demands it, add a hash to identify everyone with a photograph at Epstein's island. Not so bad, you say. What about a photograph of Trump? Or Hunter Biden? Or whoever the ruling class doesn't like anymore. And that probably applies sooner in countries without a First Amendment. 

    Just don't create the ability in the first place, and keep your credibility, Apple.
    edited August 2021
  • Reply 40 of 91
    22july201322july2013 Posts: 3,573member
    It's Apple's OS. Nobody can force Apple to write its OS to their specifications. Even the EFF probably supports Apple's legal right to decide what goes into its OS. This isn't a question of legality; it's a question of morality, reputation, and profitability. And that's fine.

    For the most part people here are arguing what Apple should be doing, not what it legally must be doing. But even those people who are arguing that Apple should not be doing this aren't arguing that Apple legally must not be doing this.

    I'm finding questions of morality to be rather boring. I'm interested in issues that are of legal importance.
    edited August 2021