Open letter asks Apple not to implement Child Safety measures

135 Comments

  • Reply 41 of 91
    palegolas Posts: 1,361 member
    Rayz2016 said:
    DAalseth said:
    … I have this notion that I no longer trust it…
    This has been my reason for never ever using Chrome, unless absolutely necessary for compatibility. I don’t trust Google.

  • Reply 42 of 91
    omair Posts: 9 member
    Not a big fan of some of the wording in the letter, but I understand the meaning behind the larger message. IMHO Apple made a mistake, at minimum, in how they rolled this out. DAalseth's comment was correct: people will attempt to use this to bludgeon Apple. How well it succeeds is doubtful, IMHO. But the surreal irony is that some of those doing the bludgeoning will be among the worst purveyors of data collection and exploitation.
    No problem with the signers of the letter expressing this very important point. But here's my problem with the signers of this letter: where the hell have they been while the vast majority of smartphone users on the planet were using a platform that was tracking the hell out of them? They've just been given a big platform to condemn a user privacy issue based on Apple's NCMEC-based scanning, so where the hell is page two, protecting the hundreds of millions of people whose private data is tracked constantly? Speak now or show yourselves to be looking for a few headlines.

    EFF has been there since day one, calling out Facebook and Google but also calling out Apple when it was needed. Where were the rest of these letter signers? Unfortunately, a few of them were probably, I suspect, collecting "third party research" grants. See how that works?
    Well, to be fair, the rest of them never promised and championed privacy the way Apple does. And this is nothing but a two-step backdoor. I have no choice but to divest from everything I own from Apple. I now need to look for alternatives to my iPad Pro, MacBook Pro, Watch, iPhone, and iMac. I can't believe Apple did this.
  • Reply 43 of 91
    dewme Posts: 5,356 member
    Wgkrueger said:
    With respect to the image scanning, I think Apple can make a good case that they want illegal and horrific images off of their servers. I applaud the use of technology to eradicate these images from the face of the earth and, more specifically, to stop child abuse in general.

    My personal perspective is related to Apple’s methodology.

    The method Apple chose involves uploading data to customer-owned devices, without their permission, for the needs of a third party. Further, they are using the processing power of their customers' equipment, again without permission, to access customers' data for the sole purpose of finding out whether that customer is breaking the law. It seems like this violates the customers' trust in Apple. It also seems like this makes Apple an agent for the government.

    The justification for what they’re scanning for and the methods they’re using is that it’s horrific, i.e. child abuse, and the images are widespread, so there is a limit to Apple’s privacy stance IMO.

    I agree with your line of reasoning, but there are some subtleties that need to be surfaced. First of all, Apple doesn’t really have to make a case for what they are doing because they have already established the right to do what they are proposing as part of the existing iCloud terms of service (https://www.apple.com/legal/internet-services/icloud/en/terms.html). Specifically (from the linked agreement):

    C. Removal of Content

    You acknowledge that Apple is not responsible or liable in any way for any Content provided by others and has no duty to screen such Content. However, Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.

    As you can imagine, Apple also clearly states in the agreement what types of content it deems to be in violation of the agreement, for example …

    B. Your Conduct

    You agree that you will NOT use the Service to:

    a. upload, download, post, email, transmit, store or otherwise make available any Content that is unlawful, harassing, threatening, harmful, tortious, defamatory, libelous, abusive, violent, obscene, vulgar, invasive of another’s privacy, hateful, racially or ethnically offensive, or otherwise objectionable;

    Your points about on-device processing sound reasonable to me, but I would assume that Apple could easily justify the on-device agents as being part and parcel of the “iCloud service” as a whole.

    Apple undoubtedly worked through a wide array of legal, ethical, and customer response scenarios prior to going public with the details of this initiative. Again, it comes down to somebody finally trying to do something to address a very serious problem rather than staying the course of “doing nothing other than flapping lips” about it. I’d really like to hear exactly what the opponents of Apple’s proposal suggest that Apple (or the government or whomever) do instead.

    Flapping lips ain’t solving the problem. People who are impacted by these violations, the victims, aren’t comforted in any way with philosophical and ideological arguments, what ifs, and what abouts. Making progress often comes down to doing the most good with the least amount of collateral damage. Doing nothing is telling the victims that they are part of the acceptable collateral damage. I hope that isn’t the card we want to play.

  • Reply 44 of 91
    DAalseth Posts: 2,783 member
    At the end of the day, the biggest indictment of this program is that it simply will not work. The bad guys will just work around it: use other platforms, pre-encrypt the images, store them in other systems, and evade detection. Meanwhile this puts everyone else’s data at risk.
  • Reply 45 of 91
    carnegie Posts: 1,078 member
    It's Apple's OS. Nobody can force Apple to write its OS to their specifications. Even the EFF probably supports Apple's legal right to decide what goes into their OS. This isn't a question of legality; it's a question of morality, reputation, and profitability. And that's fine.

    For the most part people here are arguing what Apple should be doing, not what it legally must be doing. But even those people who are arguing that Apple should not be doing this aren't arguing that Apple legally must not be doing this.

    I'm finding questions of morality to be rather boring. I'm interested in issues that are of legal importance.
    People might not be talking about them as much, but what Apple is doing does raise some legal questions.

    Apple is generally a private actor. But courts have found that private actions can represent violations of the Fourth Amendment under some circumstances. There are a number of tests which can apply, depending on context. There's a function test which asks whether a given private actor was performing a traditional government function. There's a compulsion test which asks whether a given private actor was coerced by the government. And there's a nexus test which asks whether a given private actor was cooperating with the government. There's still considerable gray area on this issue as courts haven't agreed on how or which tests apply.

    But one of the questions here is whether Apple was somehow coerced by government actors to implement this software and program. Even if Apple wasn't coerced, was it encouraged by government actors? And if so, what's the intent behind Apple's policy? Is it trying to help law enforcement or is Apple doing this for its own business reasons? If Apple is undertaking searches in order to help government actors catch criminals and not because Apple thinks this policy helps its business, then the Fourth Amendment might be implicated.

    There are also questions when it comes to whether these actions would constitute Fourth Amendment searches anyway. If the only thing a hash-value scan can reveal is whether something illegal is present, then under Supreme Court doctrine (see, e.g., U.S. v Place (1983)) such a scan might not be considered a Fourth Amendment search because it doesn't implicate legitimate privacy interests. Notably this is a question which the 6th Circuit recently expressly left open in its U.S. v Miller (2020) decision.
  • Reply 46 of 91
    StrangeDays Posts: 12,871 member
    omair said:
    Not a big fan of some of the wording in the letter, but I understand the meaning behind the larger message. IMHO Apple made a mistake, at minimum, in how they rolled this out. DAalseth's comment was correct: people will attempt to use this to bludgeon Apple. How well it succeeds is doubtful, IMHO. But the surreal irony is that some of those doing the bludgeoning will be among the worst purveyors of data collection and exploitation.
    No problem with the signers of the letter expressing this very important point. But here's my problem with the signers of this letter: where the hell have they been while the vast majority of smartphone users on the planet were using a platform that was tracking the hell out of them? They've just been given a big platform to condemn a user privacy issue based on Apple's NCMEC-based scanning, so where the hell is page two, protecting the hundreds of millions of people whose private data is tracked constantly? Speak now or show yourselves to be looking for a few headlines.

    EFF has been there since day one, calling out Facebook and Google but also calling out Apple when it was needed. Where were the rest of these letter signers? Unfortunately, a few of them were probably, I suspect, collecting "third party research" grants. See how that works?
    Well, to be fair, the rest of them never promised and championed privacy the way Apple does. And this is nothing but a two-step backdoor. I have no choice but to divest from everything I own from Apple. I now need to look for alternatives to my iPad Pro, MacBook Pro, Watch, iPhone, and iMac. I can't believe Apple did this.
    Dropbox, Google, Microsoft, and Twitter also hash-check uploads against known CSAM databases. You’re going to stop using all their products too, right? So… what products and services will you be using?
  • Reply 47 of 91
    killroy Posts: 275 member
    Rayz2016 said:
    Yup, this basically.

    Not sure what the EFF expects when it comes to laws passed by authoritarian governments. The only options would be to comply or leave the market. If every private company leaves the market, then state-run companies will fill the void. So on the one hand you can say "we're not participating", but on the other you would have to admit that you haven't changed anything in terms of protections for the people who live in the country.

    They already use NSO Group Technologies to do what they want.
  • Reply 48 of 91
    killroy Posts: 275 member
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple's servers they can't see it, because it's encrypted. You have to check it before it's encrypted or it won't work.
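    A minimal Swift sketch of the ordering this comment describes, offered for clarity. Every name below is hypothetical, and a plain SHA-256 lookup stands in for Apple's actual design (a perceptual NeuralHash matched via private set intersection against a blinded on-device database); the only point it illustrates is that the check runs on the device, before the photo is encrypted for upload.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch -- all names invented. A plain SHA-256 lookup
// stands in for Apple's perceptual NeuralHash + private set
// intersection; the point is the check-then-encrypt ordering.

func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// Check-then-encrypt: the match must happen client-side, because once
/// the photo is sealed the server only ever sees ciphertext.
func prepareForUpload(_ photo: Data,
                      knownBadHashes: Set<String>,
                      uploadKey: SymmetricKey) throws -> (ciphertext: Data, matched: Bool) {
    let matched = knownBadHashes.contains(hexDigest(of: photo))
    let sealed = try AES.GCM.seal(photo, using: uploadKey)
    // `combined` is non-nil for the default 12-byte nonce seal() uses.
    return (sealed.combined!, matched)
}
```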
  • Reply 49 of 91
    killroy Posts: 275 member
    What happens if someone sends child porn images to someone else's iPhone? Will they be taken down? This could be a way to silence critics.

    There's a difference between sent and downloaded. They can send you photos now and then call the cops, making it a swatting call.
  • Reply 50 of 91
    omasou Posts: 572 member
    DAalseth said:
    At the end of the day, the biggest indictment of this program is that it simply will not work. The bad guys will just work around it: use other platforms, pre-encrypt the images, store them in other systems, and evade detection. Meanwhile this puts everyone else’s data at risk.
    That’s fine. We need to be continually vigilant and do what we can to make it harder. The more trouble it is, the less profitable it is.

    No one’s data is at risk; that statement is pure contrived FUD.

    What we shouldn’t do is throw our arms up in the air and say we cannot make a difference.

    I applaud Apple for trying to make this a better world. Think about the development costs to create this and the loss in device and services revenue, yet Apple still made the investment b/c they felt it was important to try and make a difference.

    Think differently.

    Act differently.

    Make a difference.
    edited August 2021
  • Reply 51 of 91
    macplusplus Posts: 2,112 member
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple's servers they can't see it, because it's encrypted. You have to check it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted, but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords, certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement at any time (with a search warrant), they can do the same for their hash business too. Since they already have permission to scan your content on iCloud under the license agreement, what is the point of injecting another, questionable tool into your device, your own property?
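    For contrast, a hedged sketch of the server-side alternative being argued for here, again with invented names: if the provider holds the encryption key, as this comment says Apple does for iCloud photos, the same hash comparison can run in the data center, and nothing extra has to run on the customer's device.

```swift
import CryptoKit
import Foundation

// Hedged, illustrative sketch -- names invented. If the provider holds
// the key, it can decrypt and hash-check entirely on its own servers.

func serverSideScan(ciphertext: Data,
                    providerKey: SymmetricKey,
                    knownHashes: Set<String>) throws -> Bool {
    // The provider, not the user's device, performs the decryption.
    let box = try AES.GCM.SealedBox(combined: ciphertext)
    let photo = try AES.GCM.open(box, using: providerKey)

    // Same comparison as any client-side version, just relocated to
    // the data center instead of the customer's phone.
    let hex = SHA256.hash(data: photo)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownHashes.contains(hex)
}
```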
    edited August 2021
  • Reply 52 of 91
    macplusplus Posts: 2,112 member
    killroy said:
    What happens if someone sends child porn images to someone else's iPhone? Will they be taken down? This could be a way to silence critics.

    There's a difference between sent and downloaded. They can send you photos now and then call the cops, making it a swatting call.
    Sent or downloaded doesn't matter. In most jurisdictions possession, i.e. having it stored on your device, is the crime. You should check what your messaging apps automatically save to your photo library.
  • Reply 53 of 91
    jungmark Posts: 6,926 member
    What happens if someone sends child porn images to someone else's iPhone? Will they be taken down? This could be a way to silence critics.
    Nothing unless you upload the images to iCloud. 
  • Reply 54 of 91
    DAalseth Posts: 2,783 member
    lkrupp said:
    So I wonder what the authors of this letter will do if Apple goes ahead? Who will they turn to for privacy? Do they not know that Google is doing this? So if they stop buying Apple then who will they buy from, or will they get off the internet?

    Sputtering about privacy, government surveillance, etc. when you are impotent to do anything because the only option is to leave the internet is not productive. Several have threatened or stated that they will no longer buy Apple products because of this. Who are they kidding, where will they go? This will blow over, will be implemented, and life will go on. By the way, both political parties are on board with this, so no matter who you vote for it’s a done deal.

    So how about those of you clamoring for Apple to stop this tell us what you plan to do when it is implemented. What will be your response? Writing your Congressperson? Or will you be hypocrites and go about your business?
    I agree completely.

    Let me be honest, I've seen this movie before. There's outrage, and then after a few weeks people see something else shiny and everyone shifts their attention to that. Whether it was the Black Lives Matter movement, or climate change, or the Patriot Act, or criminality in the White House, or trash in the oceans, people start out invested and then... SQUIRREL! There's a lot of lip service, but in the end the surveillance grows, CO2 pollution continues unabated, and the police are given more and more tools.

    So I agree, this will be the subject of the moment, and then everyone will go back to cat videos. Yes I am totally cynical about the future.
    edited August 2021
  • Reply 55 of 91
    macplusplus Posts: 2,112 member
    jungmark said:
    What happens if someone sends child porn images to someone else's iPhone? Will they be taken down? This could be a way to silence critics.
    Nothing unless you upload the images to iCloud. 
    You don't "upload" images to iCloud. The operating system automatically syncs your photo library to iCloud once you activate iCloud Photos. So a photo sent to someone may make the recipient a real criminal if they forget to check their photo library after deleting it in the receiving messaging app, since messaging apps may come with the default option of saving media to the photo library.

    Many people use their phones by trial and error and don't have a coherent view of the operating system; they are unaware of what resides where.

    And Apple's scheme doesn't warn you in such cases. Even if it detects a hash match, it continues to quietly upload the photo to iCloud while updating your "perversion" score. If that score reaches some threshold it reports you, to catch you red-handed right where you are, damn pervert!
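    Stripped of the sarcasm, the threshold behavior described here can be sketched as a simple counter. This is only illustrative: Apple's actual design encodes match results in cryptographic "safety vouchers" using threshold secret sharing, so the server learns nothing until the threshold is crossed, and Apple later said the initial threshold would be around 30 matches.

```swift
// Illustrative sketch only -- Apple's real system uses encrypted
// "safety vouchers" the server cannot read below the threshold;
// this plain counter just shows the control flow being described.

struct MatchThresholdTracker {
    private(set) var matchCount = 0
    let reportThreshold: Int

    init(reportThreshold: Int = 30) {  // Apple later cited roughly 30.
        self.reportThreshold = reportThreshold
    }

    /// Called once per photo at upload time. The upload proceeds either
    /// way; only the hidden count changes -- the commenter's complaint.
    mutating func recordUpload(isHashMatch: Bool) -> Bool {
        if isHashMatch { matchCount += 1 }
        return matchCount >= reportThreshold  // true => flag for human review
    }
}
```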
    edited August 2021
  • Reply 56 of 91
    crowley said:

    Assuming we believe CSAM is something that needs tackling,

    You’re suggesting that it might not be?
    I think he means: assuming we think it's Apple's job to stop CSAM. Which also assumes that all of their users, including you, are treated as potential pedophiles whose photos don't need to be unencrypted to be scanned, because your own phone has turned into the scanner… a spy that will report on you. Apple's feature is also framed in such a way that if you say you don't want your private photos scanned and reported on, you must be a pedophile.
  • Reply 57 of 91
    DoctorQ Posts: 50 member
    Interesting that this news came out before the weekend. Wonder what the stock will do Monday?
    Also, this system doesn’t stop CSAM originating on an iPhone with iCloud Photos turned off. If it’s not in the hash DB, it won’t be flagged. One could argue that this encourages production of new CSAM (sarcasm tag here).
    The point is, this is a way for Apple to get the FBI and other LEOs off their back in the most non-intrusive way they could think of. Is it flawed? Hell yes. But they hope Congress will look kindly on them come the antitrust hearings.
  • Reply 58 of 91
    Mike Wuerthele Posts: 6,861 administrator
    DoctorQ said:
    Interesting that this news came out before the weekend. Wonder what the stock will do Monday?
    Also, this system doesn’t stop CSAM originating on an iPhone with iCloud Photos turned off. If it’s not in the hash DB, it won’t be flagged. One could argue that this encourages production of new CSAM (sarcasm tag here).
    The point is, this is a way for Apple to get the FBI and other LEOs off their back in the most non-intrusive way they could think of. Is it flawed? Hell yes. But they hope Congress will look kindly on them come the antitrust hearings.
    The news was announced on Thursday afternoon and was immediately covered by just about everybody.

    The stock closed at $146.81 at the close of business on Thursday, and is at $145.96 as I write this. No appreciable change on the news, and it fell more when Apple announced blockbuster earnings a few weeks ago. The market overall is down about the same as Apple, though it can be argued that AAPL's pricing had an impact on that.
    edited August 2021
  • Reply 59 of 91
    iadlib Posts: 95 member
    You think Apple cares? Like how they care about their employees not wanting to return to in-office work? Laughable.
  • Reply 60 of 91
    GeorgeBMac Posts: 11,421 member
    carnegie said:
    It's Apple's OS. Nobody can force Apple to write its OS to their specifications. Even the EFF probably supports Apple's legal right to decide what goes into their OS. This isn't a question of legality; it's a question of morality, reputation, and profitability. And that's fine.

    For the most part people here are arguing what Apple should be doing, not what it legally must be doing. But even those people who are arguing that Apple should not be doing this aren't arguing that Apple legally must not be doing this.

    I'm finding questions of morality to be rather boring. I'm interested in issues that are of legal importance.
    People might not be talking about them as much, but what Apple is doing does raise some legal questions.

    Apple is generally a private actor. But courts have found that private actions can represent violations of the Fourth Amendment under some circumstances. There are a number of tests which can apply, depending on context. There's a function test which asks whether a given private actor was performing a traditional government function. There's a compulsion test which asks whether a given private actor was coerced by the government. And there's a nexus test which asks whether a given private actor was cooperating with the government. There's still considerable gray area on this issue as courts haven't agreed on how or which tests apply.

    But one of the questions here is whether Apple was somehow coerced by government actors to implement this software and program. Even if Apple wasn't coerced, was it encouraged by government actors? And if so, what's the intent behind Apple's policy? Is it trying to help law enforcement or is Apple doing this for its own business reasons? If Apple is undertaking searches in order to help government actors catch criminals and not because Apple thinks this policy helps its business, then the Fourth Amendment might be implicated.

    There are also questions when it comes to whether these actions would constitute Fourth Amendment searches anyway. If the only thing a hash-value scan can reveal is whether something illegal is present, then under Supreme Court doctrine (see, e.g., U.S. v Place (1983)) such a scan might not be considered a Fourth Amendment search because it doesn't implicate legitimate privacy interests. Notably this is a question which the 6th Circuit recently expressly left open in its U.S. v Miller (2020) decision.

    When it comes to social hot-button issues, the rule of law, democracy, and the Constitution with all its amendments go out the window.

    The NY governor just got tried and convicted and will soon be sentenced without ever entering a courtroom.

    My question is: what will be the next social hot button? I like to be prepared for these things...