Apple's CSAM detection system may not be perfect, but it is inevitable

Comments

  • Reply 21 of 50
lkrupp Posts: 10,557 member
    DAalseth said:
    So the bad guys will just move their files to somewhere that the system won’t touch. 
The rest of us are stuck with scanning of our private data without our consent.
    Really we’re left with a choice of scanning by Apple or by Google who has a record of mining any and all data for profit. 
    Sucks.
    No, you have one more choice. Get off the internet and cellular networks. Live in a cabin in the wilderness and write manifestos. 
  • Reply 22 of 50
amar99 Posts: 181 member
Like it or not, this feature is the equivalent of an unreasonable search, but on an ongoing basis. It is framed as a safety feature in order to acclimate the public to this sort of invasion of privacy and turn it into the "new normal". It is the 1984 equivalent of the ever-watching eye/camera in everyone's home, framed as "We must observe you in order to keep you safe. If you have nothing to hide, you have nothing to lose." That's what they all say.

  • Reply 23 of 50
lkrupp Posts: 10,557 member
    dutchlord said:
    I don’t want any CSAM scanning on my devices so I expect an
    opt-out button or I will stop upgrading.
Well, an opt-out button would totally defeat the entire purpose of CSAM detection, wouldn't it? The purpose, of course, is to detect and report child pornography. So no, there will not be an opt-out button; you can be absolutely sure of that.
  • Reply 24 of 50
byronl Posts: 371 member
    davidw said:
    >And then, the complaints started. And, they started, seemingly ignorant that Microsoft had been scanning uploaded files for about 10 years, and Google for eight.<

That is an ignorantly false assumption. Consumers complaining about Apple's upcoming policy of scanning for CSAM were objecting to Apple doing it on the consumers' devices, instead of on its iCloud servers. These consumers were not ignorant of the fact that other companies were already scanning for CSAM, and knew that Apple was doing the same on its own servers. Hell, most of Apple's customer data probably isn't even stored on Apple's own servers; it's stored on cloud servers provided by Amazon and Google. I'm sure those companies have policies requiring Apple to scan for CSAM if it wants to use their cloud servers for storing Apple customers' data.

    >And yet, Apple's implementation of CSAM detection in iCloud Photos is only a matter of time, simply because its system strikes a middle ground. Governments can't tell Apple to include terrorist content into the CSAM database.<

What's the point? The US government can't tell Apple to actively scan for CSAM either. It's supposed to be "voluntary" on Apple's part. Under Federal law, Apple is only required to report CSAM if they find it or are informed of it. Otherwise, Apple would become an agency of the government and would have to adhere to the US Constitution when performing the search (even on their own servers). And just like any other US government law enforcement agency, Apple would need a search warrant to justify a search for criminal activity on the customer's own device. The SCOTUS has already ruled that a smartphone can be an extension of one's "home" and requires a search warrant to search it without the owner's permission, just like how an automobile can be considered an extension of one's "home". Unless there's some sort of emergency, in order to get a search warrant, the government must show "probable cause".

    https://crsreports.congress.gov/product/pdf/LSB/LSB10713
    That’s the US government. What about the Chinese government? The Saudi Arabian one?
  • Reply 25 of 50
bobolicious Posts: 1,161 member
... is Linux as an open source option the next frontier for the traditional values of 'think different' long-time Apple customers ...?
  • Reply 26 of 50
crowley Posts: 10,453 member
    davidw said:
    lkrupp said:
    dutchlord said:
    I don’t want any CSAM scanning on my devices so I expect an
    opt-out button or I will stop upgrading.
Dumbass. Read the damn article. It won’t be scanning your devices. It will be scanning your iCloud photos only. And, as the article clearly states, if you turn off uploading your photos to iCloud they won’t be scanned. So, you already have the opt-out ability right now. Dumbass.
No, you're being the "dumbass". Did you read the part about Apple scanning iMessages of minors under a parent account, and Siri and Spotlight searches? Or do you think Apple can detect CSAM on those services by scanning only iCloud Photos?
    Do you think anyone here is a minor?

    Doubtful.
  • Reply 27 of 50
davidw Posts: 2,093 member
    crowley said:
    davidw said:
    lkrupp said:
    dutchlord said:
    I don’t want any CSAM scanning on my devices so I expect an
    opt-out button or I will stop upgrading.
Dumbass. Read the damn article. It won’t be scanning your devices. It will be scanning your iCloud photos only. And, as the article clearly states, if you turn off uploading your photos to iCloud they won’t be scanned. So, you already have the opt-out ability right now. Dumbass.
No, you're being the "dumbass". Did you read the part about Apple scanning iMessages of minors under a parent account, and Siri and Spotlight searches? Or do you think Apple can detect CSAM on those services by scanning only iCloud Photos?
    Do you think anyone here is a minor?

    Doubtful.
And did YOU read the article before commenting? Doubtful.

    >Siri, along with search bars in Safari and Spotlight, steps in next. It intervenes when an Apple user of any age performs search queries related to CSAM. A popup warns that the search is illegal and provides resources to "learn more and get help." Siri can also direct people to file a report of suspected child abuse material.<
  • Reply 28 of 50
crowley Posts: 10,453 member
    davidw said:
    crowley said:
    davidw said:
    lkrupp said:
    dutchlord said:
    I don’t want any CSAM scanning on my devices so I expect an
    opt-out button or I will stop upgrading.
Dumbass. Read the damn article. It won’t be scanning your devices. It will be scanning your iCloud photos only. And, as the article clearly states, if you turn off uploading your photos to iCloud they won’t be scanned. So, you already have the opt-out ability right now. Dumbass.
No, you're being the "dumbass". Did you read the part about Apple scanning iMessages of minors under a parent account, and Siri and Spotlight searches? Or do you think Apple can detect CSAM on those services by scanning only iCloud Photos?
    Do you think anyone here is a minor?

    Doubtful.
And did YOU read the article before commenting? Doubtful.

    >Siri, along with search bars in Safari and Spotlight, steps in next. It intervenes when an Apple user of any age performs search queries related to CSAM. A popup warns that the search is illegal and provides resources to "learn more and get help." Siri can also direct people to file a report of suspected child abuse material.<
Ok, fair enough, I didn't see that bit. Is that new? I don't recall that ever being talked about before; it was always about iCloud Photo upload hash matching and minor messaging.
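For what it's worth, the iCloud Photos piece is hash-list matching rather than content analysis. Here is a minimal sketch of the idea in Swift, with big caveats: Apple's actual design uses a perceptual NeuralHash plus a match threshold and a private set intersection protocol, none of which is modeled here, and the hash list below is a hypothetical placeholder.

```swift
import Foundation
import CryptoKit

// Hypothetical placeholder for the known-image hash list; the real database
// comes from child-safety organizations and is never shipped in readable form.
let knownHashes: Set<String> = []

// Returns true if this exact file's hash appears in the known list.
// Apple's proposed system uses a perceptual NeuralHash (robust to resizing
// and re-encoding) and flags nothing until a threshold of matches is hit;
// the cryptographic SHA-256 used here is illustration only, and would miss
// even a trivially re-encoded copy.
func matchesKnownList(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

The point being that matching against a fixed list of known images is a very different operation from analyzing what a photo contains.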
  • Reply 29 of 50
xyzzy-xxx
I won't use any system that scans any of my images – if Apple wants to do this they should do it in iCloud or I will stop using their devices.
  • Reply 30 of 50
    xyzzy-xxx said:
    I won't use any system that scans any of my images – if Apple wants to do this they should do it in iCloud or I will stop using their devices.
Your phone already scans your images. That’s how part of the search function in Photos works: it scans your photos so that when you search for “trees” or “water” or “dogs” or any of a few hundred (thousand?) other terms, Photos can show you the images that match.
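For anyone curious, here is roughly what that kind of on-device classification looks like using the public Vision framework's VNClassifyImageRequest. This is a minimal sketch only; the Photos app's internal pipeline is Apple's own and may well differ.

```swift
import Foundation
import Vision

// A rough sketch of on-device image classification, the kind of scanning
// behind Photos search. Nothing leaves the device here: the model runs
// locally and simply returns text labels for an image on disk.
func searchLabels(forImageAt url: URL) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url)
    try handler.perform([request])
    // Keep only reasonably confident labels, e.g. "tree", "water", "dog".
    return (request.results ?? [])
        .filter { $0.confidence > 0.5 }
        .map { $0.identifier }
}
```

Note the contrast with the CSAM proposal, which does not classify content at all; it only checks whether a photo's hash matches a known, previously catalogued image.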
  • Reply 31 of 50
    To me, being in favor of CSAM scanning on someone’s phone, or on iCloud, is equivalent to being in favor of having your home searched at law enforcement’s convenience… just because they can, not because the person is guilty of anything at all. 

    Just because a technology is available or possible doesn’t make it a good idea, or even an adequate idea. I’m pretty sure the “innocent until proven guilty” concept is considered a positive aspiration of the legal system. 

    My only purpose in buying a phone or computer is to solve my work needs or to use apps. I don’t buy it to have government intrusion. It’s astonishing that this big company that supposedly supports privacy has decided to actively pursue technology that reduces it. 

There is certainly a slippery slope, where unscrupulous governments will push and push. Elements of this already happen in China, where, for example, the grey areas of which apps are allowed are controlled by the government in power, even when it hurts the citizens.

    Corporations generally are in business to make money. I have supported Apple, in part, because they usually appear to also contribute to the world in positive ways. This is not positive. And there aren’t viable alternatives. 
  • Reply 32 of 50
lkrupp Posts: 10,557 member
    xyzzy-xxx said:
    I won't use any system that scans any of my images – if Apple wants to do this they should do it in iCloud or I will stop using their devices.
    I love reading this kind of line-in-the-sand pronouncement. And whose devices will you use instead? Have you even thought about that?
  • Reply 33 of 50
lkrupp Posts: 10,557 member
    I have supported Apple, in part, because they usually appear to also contribute to the world in positive ways. This is not positive. And there aren’t viable alternatives. 
This is the crux of the matter that no one is talking about. It’s all blathering and posturing and anger, but no realization that when this happens there’s nowhere to go. We talk about freedom, but we voluntarily give up our freedom to travel whenever we pass through a TSA checkpoint and our personal belongings are scanned/rummaged through by a stranger. No one here has ever vowed to never set foot on an airplane again, have you? Yet you voluntarily remove your shoes, belt, and iPhone and drop them in a tray. And what about mask mandates, and all the screaming about personal choice and freedom? Yet you complied.
  • Reply 34 of 50
    georgie01 said:
Interesting read, but such naivety…

    “Governments can't tell Apple to include terrorist content into the CSAM database.”

    Right now. But that is being pushed step by step. Is the author unaware of the pressure the current government has put on private companies to censor content? Does the author not understand that the obsession with ‘misinformation’ is little more than a means to inject that government control?

    Is the author unaware that ‘misinformation’ has been going on forever and that it’s not a new phenomenon? I watched a TV show on a reputable TV network decades ago on the moon landing being fake. Back then exploring a wide range of ideas, no matter how ridiculous, was considered valuable for mature humans.

    The only thing that has changed is people in society are increasingly immoral, and therefore people are increasingly unable to express self-responsibility and think critically. 

The more we take power away from the people, the more likely it will end up in the government's hands. The government is not an altruistic, benign organisation.
You are misinforming people regarding the reputable network claiming that the Moon landing was fake. There may have been a show where they talked about the Moon landing being fake, but the network would not have presented it as fact. I have seen shows like that a couple of times and it was always presented as “some people believe.” And they go through the so-called evidence and then explain how the theories don’t make sense. One example would be that the flag is supposedly waving even though there is no wind on the Moon, and the answer is that the flag is not waving; it just looks that way to some because of the way the flag was set up. And they explain all of this in the show. You would have to be pretty gullible to watch the show and actually come away believing that the Moon landing was fake.
    I always find it funny when someone refuses to believe that the Moon landing occurred but the same person has no problem in believing in aliens visiting Earth.
  • Reply 35 of 50
    dutchlord said:
    I don’t want any CSAM scanning on my devices so I expect an
    opt-out button or I will stop upgrading.
Pressing an opt-out button like that would probably be like inviting a spotlight on the user. Such as when a person goes to a bank to deposit a large sum of cash and asks the teller, “you don’t have to file a currency transaction report, do you?” Even if the amount doesn’t technically meet the legal threshold, just asking that question requires the bank to fill out a CTR.
  • Reply 36 of 50
baconstang Posts: 1,140 member
    lkrupp said:
    I have supported Apple, in part, because they usually appear to also contribute to the world in positive ways. This is not positive. And there aren’t viable alternatives. 
This is the crux of the matter that no one is talking about. It’s all blathering and posturing and anger, but no realization that when this happens there’s nowhere to go. We talk about freedom, but we voluntarily give up our freedom to travel whenever we pass through a TSA checkpoint and our personal belongings are scanned/rummaged through by a stranger. No one here has ever vowed to never set foot on an airplane again, have you? Yet you voluntarily remove your shoes, belt, and iPhone and drop them in a tray. And what about mask mandates, and all the screaming about personal choice and freedom? Yet you complied.
OK, but comparing masking during a pandemic to unrequested on-device scanning is specious at best.
  • Reply 37 of 50
davidw Posts: 2,093 member
    crowley said:
    davidw said:
    crowley said:
    davidw said:
    lkrupp said:
    dutchlord said:
    I don’t want any CSAM scanning on my devices so I expect an
    opt-out button or I will stop upgrading.
Dumbass. Read the damn article. It won’t be scanning your devices. It will be scanning your iCloud photos only. And, as the article clearly states, if you turn off uploading your photos to iCloud they won’t be scanned. So, you already have the opt-out ability right now. Dumbass.
No, you're being the "dumbass". Did you read the part about Apple scanning iMessages of minors under a parent account, and Siri and Spotlight searches? Or do you think Apple can detect CSAM on those services by scanning only iCloud Photos?
    Do you think anyone here is a minor?

    Doubtful.
And did YOU read the article before commenting? Doubtful.

    >Siri, along with search bars in Safari and Spotlight, steps in next. It intervenes when an Apple user of any age performs search queries related to CSAM. A popup warns that the search is illegal and provides resources to "learn more and get help." Siri can also direct people to file a report of suspected child abuse material.<
Ok, fair enough, I didn't see that bit. Is that new? I don't recall that ever being talked about before; it was always about iCloud Photo upload hash matching and minor messaging.
I recalled it being mentioned (CSAM screening of Siri searches), but thought it was part of Apple's "Child Safety" features. And it might have been that way in the beginning; most were focusing on the on-device CSAM scanning of iCloud photos. But now it seems to have been expanded to all Siri searches, not just those of minors. It might have been due to pressure from countries like the UK. This goes to show how easy it would be for Apple to expand CSAM scanning to all iMessages, not just those of minors.

    https://www.macrumors.com/2022/08/13/apple-silent-on-csam-detection-plans/

    And it may already be a reality where you are. And the scanning is done on your device.  

    https://www.theguardian.com/technology/2022/apr/20/apple-says-new-child-safety-feature-to-be-rolled-out-for-uk-iphones?CMP=oth_b-aplnews_d-1


  • Reply 38 of 50
DAalseth Posts: 2,937 member
    lkrupp said:
    DAalseth said:
    So the bad guys will just move their files to somewhere that the system won’t touch. 
The rest of us are stuck with scanning of our private data without our consent.
    Really we’re left with a choice of scanning by Apple or by Google who has a record of mining any and all data for profit. 
    Sucks.
    No, you have one more choice. Get off the internet and cellular networks. Live in a cabin in the wilderness and write manifestos. 
    Not a realistic option. 

    lkrupp said:
    xyzzy-xxx said:
    I won't use any system that scans any of my images – if Apple wants to do this they should do it in iCloud or I will stop using their devices.
    I love reading this kind of line-in-the-sand pronouncement. And whose devices will you use instead? Have you even thought about that?
Exactly. It's reached the point where you HAVE to have a device to function. We parked in a public ramp last night; you paid by using an app, and there was no other option. I run into a lot of things that have to be scheduled/ordered/paid for with a device or computer. And that means using Apple or Android. I would love to see a third option outside of the Ap/An hegemony.
  • Reply 39 of 50
    And until Apple announces their plans for CSAM I’m sticking with iOS 14, regardless of any security patches or features.
  • Reply 40 of 50
    lkrupp said:
    xyzzy-xxx said:
    I won't use any system that scans any of my images – if Apple wants to do this they should do it in iCloud or I will stop using their devices.
    I love reading this kind of line-in-the-sand pronouncement. And whose devices will you use instead? Have you even thought about that?
    Yes I thought about that and I would use an Android device capable of running (not Android but) GrapheneOS.