Open letter asks Apple not to implement Child Safety measures


  • Reply 61 of 91
    coolfactor Posts: 2,243 member
    I have no problem with this "service" provided by Apple since:
    1. It's 100% Opt-In, not activated automatically.
    2. Not new. One commenter mentioned that "this could be applied to email". Well, newsflash! Our emails have been getting scanned for spam by our email providers for decades. We expect that! Why aren't we protesting that that be stopped???

  • Reply 62 of 91
    anonymouse Posts: 6,860 member
    I think there's a lot of ignorance about what Apple is actually doing here, in large part based on the many misleading media reports that were based on a complete lack of understanding of how this works and what it actually is, or in some cases just a desire to malign Apple at any opportunity.

    You should really read John Gruber's take on this on Daring Fireball before you comment.

    I am very much a hard line privacy advocate, but I don't see this as at all a threat to user privacy, or any sort of slippery slope.
  • Reply 63 of 91
    Mike Wuerthele Posts: 6,861 administrator
    I have no problem with this "service" provided by Apple since:
    1. It's 100% Opt-In, not activated automatically.
    2. Not new. One commenter mentioned that "this could be applied to email". Well, newsflash! Our emails have been getting scanned for spam by our email providers for decades. We expect that! Why aren't we protesting that that be stopped???

    Google's been scanning emails for CSAM for about eight years.
  • Reply 64 of 91
    coolfactor Posts: 2,243 member

    There are a lot of "What if...?" questions being thrown around, and sure, they would be valid concerns IF this weren't Apple putting privacy first. Is it not possible for such scanning of content to be done privately and securely on-device, exactly as Apple has explained it? Could Apple have found a way to keep the content safe? Maybe? Possibly?

    Secondly, this is built around machine learning, and I think a lot of commenters don't understand how this works. Such a mechanism needs to be trained with actual good and bad matches over and over again before it can recognize anything. You can't just flip a switch and start scanning for images that the machine doesn't yet know how to recognize. Since this is on-device processing, how would phones be trained to recognize something outside of their existing training? If the phones are compromised, well... so is everything on the phone.
  • Reply 65 of 91
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.

    You have the option of doing that yourself.  Apple doesn't have to eliminate it for everybody -- they simply gave everybody the option to do so if they wished.

    The ”good guys” turning off iCloud does not solve the actual problem of the “bad guys” storing their photos in Apple’s iCloud.

    Assuming we believe CSAM is something that needs tackling, the only other workable option I can see would be to shut down iCloud photos.

    This would have the considerable disadvantage of having to store all our photos on device, but this was how it used to be (in the good old days); then the "bad guys'" photos would be stored safely on their own devices.


    The only ones who fear storing stuff on iCloud -- or fear having the government see their "private" stuff -- are, as you call them, the "Bad guys".

    They can see all of mine that they want to.  They'll be bored to death.
    Lol, if they would see my “private” stuff then it would be the ultimate snooze fest for sure. Although I do have an encrypted hard drive, it is not that I’m doing anything wrong, it is where I store all of my financial statements and also now, unfortunately for me, my end of life documentation as well since I was diagnosed with terminal cancer back in January. My wife and daughter know where I keep the password, but it is still “my private stuff” and no way do I need anyone besides me, and when my life ends my wife and daughter, seeing this. A lot of private thoughts are there.
  • Reply 66 of 91
    killroy Posts: 276 member
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords etc., certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement anytime (with a search warrant), they can do so for their hash business too. Since they already get the permission to scan your content on iCloud by license agreement, what is the point in injecting another but questionable tool into your device, your own property?

    Your phone, yes; the OS, not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want to. What Apple is proposing to do is add a firewall on the OS to keep the nefarious stuff off their servers.
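The check-before-encryption "firewall" killroy describes can be sketched in a few lines. This is a toy illustration only: it uses a plain SHA-256 digest against an invented blocklist, whereas Apple's actual proposal uses a perceptual NeuralHash matched blindly via safety vouchers, and every name below is hypothetical.

```python
import hashlib

# Hypothetical blocklist of known-bad digests. The real system ships
# blinded perceptual hashes, not plain cryptographic digests like these.
KNOWN_BAD_DIGESTS = {
    # Digest of the stand-in "bad" image bytes used below
    hashlib.sha256(b"known bad image bytes").hexdigest(),
}

def digest(image_bytes: bytes) -> str:
    """Digest computed on-device, before the photo is encrypted for upload."""
    return hashlib.sha256(image_bytes).hexdigest()

def ok_to_upload(image_bytes: bytes) -> bool:
    """The 'firewall': the match happens client-side, because once the
    photo is encrypted for the server, the server can't inspect it."""
    return digest(image_bytes) not in KNOWN_BAD_DIGESTS

print(ok_to_upload(b"holiday photo"))          # True: no match
print(ok_to_upload(b"known bad image bytes"))  # False: matched before upload
```

In Apple's actual proposal a match doesn't block the upload; it attaches a cryptographic voucher that only becomes readable after a threshold of matches. But the ordering -- check first, encrypt after -- is the point being made here.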

  • Reply 67 of 91
    That's what eventually led to the warrantless wiretapping provision being eliminated by Congress: public pressure. However, there were plenty of people that were ignorant of the Patriot Act and what it contained, thus Edward Snowden's success in repackaging old news from the Patriot Act a decade later as if it were something new to be worried about.
    Warrantless wiretapping was never eliminated. The only restriction put into place was on wiretaps that affected everybody in the US. So you could still have one order for "all devices east of the Mississippi" and another order for "all devices west of the Mississippi" and still be in compliance with the new law. Or an order allowing surveillance of "all Democrats" (or "all Republicans") and still be in compliance. From a privacy standpoint, it was a completely worthless gesture. From a propaganda perspective, it allows those who favor panopticon-level surveillance to lie and tell people that "Something Was Done!!!"

    But nice try, fedboi.
  • Reply 68 of 91
    anonymouse Posts: 6,860 member
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.

    You have the option of doing that yourself.  Apple doesn't have to eliminate it for everybody -- they simply gave everybody the option to do so if they wished.

    The ”good guys” turning off iCloud does not solve the actual problem of the “bad guys” storing their photos in Apple’s iCloud.

    Assuming we believe CSAM is something that needs tackling, the only other workable option I can see would be to shut down iCloud photos.

    This would have the considerable disadvantage of having to store all our photos on device, but this was how it used to be (in the good old days); then the "bad guys'" photos would be stored safely on their own devices.


    The only ones who fear storing stuff on iCloud -- or fear having the government see their "private" stuff -- are, as you call them, the "Bad guys".

    They can see all of mine that they want to.  They'll be bored to death.
    Lol, if they would see my “private” stuff then it would be the ultimate snooze fest for sure. Although I do have an encrypted hard drive, it is not that I’m doing anything wrong, it is where I store all of my financial statements and also now, unfortunately for me, my end of life documentation as well since I was diagnosed with terminal cancer back in January. My wife and daughter know where I keep the password, but it is still “my private stuff” and no way do I need anyone besides me, and when my life ends my wife and daughter, seeing this. A lot of private thoughts are there.
    No one should ever feel they need to justify a need or desire for privacy. The idea that only "bad guys" have a problem with other people seeing their "private stuff", whatever that is and whatever reason one doesn't want other people to see it, is pernicious nonsense. Some people may feel fine having anyone and everyone in the world know anything and everything about them, but I think most people need and desire some degree of privacy. Where exactly that line is is very personal, but privacy is a basic human right and should be recognized as such, and no one should be required to defend their need for it, be told how much of it they need, or be told that wanting it means they must be doing something wrong.

    That being said, this isn't about personal privacy, this is about people using Apple's services and servers to, among other things, store and distribute known child pornography and Apple attempting to stop them from doing so, without violating personal privacy, and without creating a surveillance program the government can exploit.
  • Reply 69 of 91
    DAalseth Posts: 2,783 member
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.

    You have the option of doing that yourself.  Apple doesn't have to eliminate it for everybody -- they simply gave everybody the option to do so if they wished.

    The ”good guys” turning off iCloud does not solve the actual problem of the “bad guys” storing their photos in Apple’s iCloud.

    Assuming we believe CSAM is something that needs tackling, the only other workable option I can see would be to shut down iCloud photos.

    This would have the considerable disadvantage of having to store all our photos on device, but this was how it used to be (in the good old days); then the "bad guys'" photos would be stored safely on their own devices.


    The only ones who fear storing stuff on iCloud -- or fear having the government see their "private" stuff -- are, as you call them, the "Bad guys".

    They can see all of mine that they want to.  They'll be bored to death.
    Lol, if they would see my “private” stuff then it would be the ultimate snooze fest for sure. Although I do have an encrypted hard drive, it is not that I’m doing anything wrong, it is where I store all of my financial statements and also now, unfortunately for me, my end of life documentation as well since I was diagnosed with terminal cancer back in January. My wife and daughter know where I keep the password, but it is still “my private stuff” and no way do I need anyone besides me, and when my life ends my wife and daughter, seeing this. A lot of private thoughts are there.
    Whoah Dude, Cancer is a real SOB. I had stage 4 colon and beating it was the hardest thing I've ever done. So sorry to hear your prognosis. 
  • Reply 70 of 91
    GeorgeBMac Posts: 11,421 member
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.

    You have the option of doing that yourself.  Apple doesn't have to eliminate it for everybody -- they simply gave everybody the option to do so if they wished.

    The ”good guys” turning off iCloud does not solve the actual problem of the “bad guys” storing their photos in Apple’s iCloud.

    Assuming we believe CSAM is something that needs tackling, the only other workable option I can see would be to shut down iCloud photos.

    This would have the considerable disadvantage of having to store all our photos on device, but this was how it used to be (in the good old days); then the "bad guys'" photos would be stored safely on their own devices.


    The only ones who fear storing stuff on iCloud -- or fear having the government see their "private" stuff -- are, as you call them, the "Bad guys".

    They can see all of mine that they want to.  They'll be bored to death.
    Lol, if they would see my “private” stuff then it would be the ultimate snooze fest for sure. Although I do have an encrypted hard drive, it is not that I’m doing anything wrong, it is where I store all of my financial statements and also now, unfortunately for me, my end of life documentation as well since I was diagnosed with terminal cancer back in January. My wife and daughter know where I keep the password, but it is still “my private stuff” and no way do I need anyone besides me, and when my life ends my wife and daughter, seeing this. A lot of private thoughts are there.
    No one should ever feel they need to justify a need or desire for privacy. The idea that only "bad guys" have a problem with other people seeing their "private stuff", whatever that is and whatever reason one doesn't want other people to see it, is pernicious nonsense. Some people may feel fine having anyone and everyone in the world know anything and everything about them, but I think most people need and desire some degree of privacy. Where exactly that line is is very personal, but privacy is a basic human right and should be recognized as such, and no one should be required to defend their need for it, be told how much of it they need, or be told that wanting it means they must be doing something wrong.

    That being said, this isn't about personal privacy, this is about people using Apple's services and servers to, among other things, store and distribute known child pornography and Apple attempting to stop them from doing so, without violating personal privacy, and without creating a surveillance program the government can exploit.

    But they ARE violating privacy "rights".  If they're looking at my private stuff they are violating my privacy.
    The question is:  Is it worth it to cut down on child pornography and child sexual abuse?
    And, beyond that, whatever the next hot button issue is will face the same question...

    These are issues that society has been dealing with since the advent of society:  personal liberty vs society enforcing its ethics and morals and sometimes enforcing things to assure its safety.

    In most people's opinion (and mine), this is a valid exercise. 
    The next one may or may not be.
    ... Trump searched the private information of his political opponents -- as did Nixon.   Most people would frown on that.
    ... But we also search the private information of drug lords, terrorists and such.
    .......  Each case will have to stand on its own merits (or lack of).
  • Reply 71 of 91
    macplusplus Posts: 2,112 member
    killroy said:
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords etc., certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement anytime (with a search warrant), they can do so for their hash business too. Since they already get the permission to scan your content on iCloud by license agreement, what is the point in injecting another but questionable tool into your device, your own property?

    Your phone, yes; the OS, not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want to. What Apple is proposing to do is add a firewall on the OS to keep the nefarious stuff off their servers.

    Today iCloud Photos. Tomorrow what? iCloud Drive? Including the Desktop and Documents folders of all your Macs? To protect themselves from your nefarious junk?
  • Reply 72 of 91
    trinko Posts: 5 member
    I never thought I'd live in a world where the same people who are cool with Big Tech censoring political voices the left doesn't like are upset about Apple trying to combat the sexual exploitation of children.

    Given that all perverts have to do to avoid being caught is not upload to iCloud and not send children sexually explicit materials, it's not clear what privacy rights are being violated. Unless of course one says that criminals have a right to privacy.

    Further while the general concept of monitoring photos could be misused we don't condemn search warrants because they too could be misused.

    Perhaps we might better defend the true privacy of millions of Americans by banning the practices of Google et al who know pretty much everything about us and use that information to target us with advertising.  We also know that Big Tech has used that information to try and sway elections, by sending alerts only to certain people for example. That's a bigger issue than Apple putting child molesters at risk.
  • Reply 73 of 91
    urahara Posts: 733 member
    omasou said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...
    Take the time to read before posting. They are hashing on your device before it’s encrypted in iCloud.

    Apple doesn’t want to be an accessory to your CSAM crimes or enable you to hide.

    The high level of security we enjoy from Apple’s ecosystem is to protect our privacy not to engage in crimes against children.
    You are wrong. iCloud isn’t encrypted. 
  • Reply 74 of 91
    killroy Posts: 276 member
    killroy said:
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords etc., certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement anytime (with a search warrant), they can do so for their hash business too. Since they already get the permission to scan your content on iCloud by license agreement, what is the point in injecting another but questionable tool into your device, your own property?

    Your phone, yes; the OS, not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want to. What Apple is proposing to do is add a firewall on the OS to keep the nefarious stuff off their servers.

    Today iCloud Photos. Tomorrow what? iCloud Drive? Including the Desktop and Documents folders of all your Macs? To protect themselves from your nefarious junk?

    You can just opt out. I don't think Google or Facebook and others that have been doing it for years give you that.
  • Reply 75 of 91
    killroy Posts: 276 member
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords etc., certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement anytime (with a search warrant), they can do so for their hash business too. Since they already get the permission to scan your content on iCloud by license agreement, what is the point in injecting another but questionable tool into your device, your own property?

    So you are saying this is not true.

  • Reply 76 of 91
    macplusplus Posts: 2,112 member
    killroy said:
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords etc., certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement anytime (with a search warrant), they can do so for their hash business too. Since they already get the permission to scan your content on iCloud by license agreement, what is the point in injecting another but questionable tool into your device, your own property?

    So you are saying this is not true.

    What that support document doesn't mention is that iCloud data is encrypted using Apple's own keys, not yours. More accurately, the document states that indirectly, by enumerating the limited cases in which your (own) device keys are used, i.e. the end-to-end encryption cases. All other cases don't use your device keys, meaning by deduction that they use Apple's keys.

    edited August 2021
  • Reply 77 of 91
    crowley Posts: 10,453 member
    killroy said:
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords etc., certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement anytime (with a search warrant), they can do so for their hash business too. Since they already get the permission to scan your content on iCloud by license agreement, what is the point in injecting another but questionable tool into your device, your own property?

    Your phone, yes; the OS, not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want to. What Apple is proposing to do is add a firewall on the OS to keep the nefarious stuff off their servers.

    Today iCloud Photos. Tomorrow what? iCloud Drive? Including the Desktop and Documents folders of all your Macs? To protect themselves from your nefarious junk?
    If Apple wanted to scan my computers for photos that hash to a CSAM match then I wouldn't have any problem with that.  But there's no indication that is coming, or that it wouldn't be announced in advance.
  • Reply 78 of 91
    macplusplus Posts: 2,112 member
    killroy said:
    killroy said:
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords etc., certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement anytime (with a search warrant), they can do so for their hash business too. Since they already get the permission to scan your content on iCloud by license agreement, what is the point in injecting another but questionable tool into your device, your own property?

    Your phone, yes; the OS, not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want to. What Apple is proposing to do is add a firewall on the OS to keep the nefarious stuff off their servers.

    Today iCloud Photos. Tomorrow what? iCloud Drive? Including the Desktop and Documents folders of all your Macs? To protect themselves from your nefarious junk?

    You can just opt out. I don't think Google or Facebook and others that have been doing it for years give you that.
    Such superficial comparisons with Google or Facebook don't help and are misleading. iCloud Drive is much more than a photo repository; it is a fundamental component of macOS. Google and Facebook don't host your Desktop and Documents folders. Apple's responsibility is deeper than Google's or Facebook's or Dropbox's. Pushing the opt-out option as an argument is just an implicit way to admit how such projects may put the whole ecosystem at stake.
    edited August 2021
  • Reply 79 of 91
    crowley Posts: 10,453 member
    killroy said:
    killroy said:
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT mass surveillance method for the government. It is a system for REPORTING CSAM and designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted, but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords etc., certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement anytime (with a search warrant), they can do so for their hash matching too. Since they already get permission to scan your content on iCloud by license agreement, what is the point in injecting another, questionable tool into your device, your own property?

    Your phone, yes; the OS, not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want to. What Apple is proposing to do is add a firewall on the OS to keep the nefarious stuff off their servers.

    Today iCloud Photos. Tomorrow what? iCloud Drive? Including the Desktop and Documents folders of all your Macs? To protect themselves from your nefarious junk?

    You can just opt out. I don't think Google or Facebook and others that have been doing it for years give you that.
    Such superficial comparisons with Google or Facebook don't help and are misleading. iCloud Drive is much more than a photo repository; it is a fundamental component of macOS. Google and Facebook don't host your Desktop and Documents folders. Apple's responsibility is deeper than Google's or Facebook's or Dropbox's. Pushing the opt-out option as an argument is just an implicit way to admit how such projects may put the whole ecosystem at stake.
    If you can opt out then it's not a fundamental component.
    killroy
  • Reply 80 of 91
    longfang Posts: 456 member
    Rayz2016 said:
    DAalseth said:
    No matter how well intentioned, this effort will be used to damage Apple's reputation, severely. It should be abandoned immediately. 
    Remember how Apple was excoriated by some last year for having a "monopoly" on COVID reporting apps, and that was a free thing they did with Google and kept no data. Apple just stuck a big red Kick Me sign on their back. 
    Apple will insist there is no back door into the system, but what they don’t realise is that this the back door. This is the back door that the governments have been asking for. All they need to do is add hashes from other databases (searching for pictures of dissidents, words from subversive poetry), tweak the threshold (you have to have four hits instead of eight) and you have an authoritarian government’s wet dream. It is the ultimate surveillance tool. 

    More of a back passage than a back door, centrally controlled by Apple and law enforcement, allowing every phone to spy on its user. 

    It’s odd, but I’m typing this message on my iPad, and I have this notion that I no longer trust it, nor my iPhone, nor my Macs. I’m wary of them now. Even if Apple did reverse course (which they won’t), I don’t think that trust is coming back. 

    However, the announcement by itself has already created the blueprint of the “back door” so might as well utilize it for something good. 