Apple slammed for not doing enough to prevent CSAM distribution

Posted in General Discussion
Apple and Microsoft have provided details of their methods for detecting or preventing child sexual abuse material distribution, and an Australian regulator has found their efforts lacking.

iMessage can warn minors about nudity in photos


The Australian eSafety Commissioner demanded that major tech firms like Apple, Facebook, Snapchat, Microsoft, and others detail their methods for preventing child abuse and exploitation on their platforms. The demand was made on August 30, and the companies had 29 days to comply or face fines.

Apple and Microsoft are the first companies to receive scrutiny from this review, and according to Reuters, the Australian regulator found their efforts insufficient. The two companies do not proactively scan user files on iCloud or OneDrive for CSAM, nor do they use algorithmic detection in FaceTime or Skype.

Commissioner Julie Inman Grant called the companies' responses "alarming." She stated that there was "clearly inadequate and inconsistent use of widely available technology to detect child abuse material and grooming."

Apple recently announced that it had abandoned its plans to scan photos being uploaded to iCloud for CSAM. The method would have become less effective in any case, since users now have access to fully end-to-end encrypted photo storage and backups.

Instead of scanning for existing content being stored or distributed, Apple has decided to take a different approach that will evolve over time. Currently, devices used by children can be configured by parents to alert the child if nudity is detected in photos being sent over iMessage.

Apple plans to expand this capability to detect nudity in videos, then to move the detection and warning system into FaceTime and other Apple apps. Eventually, the company hopes to create an API that would let developers use the detection system in their apps as well.

Abandoning CSAM detection in iCloud has been celebrated by privacy advocates. At the same time, it has been condemned by child safety groups, law enforcement, and government officials.

Read on AppleInsider

Comments

  • Reply 1 of 7
    lkrupp Posts: 10,557 member
    Classic case of damned if you do and damned if you don't. Are any of you critics even beginning to see the predicament companies like Apple are in? No, I guess not. All you do is threaten to boycott if they don't comply with YOUR position. I always guffaw when I read the "I have purchased my last Apple product" rants be it China, CSAM, unionization, or some other sociopolitical cause du jour. And yes, I have even succumbed to that rage myself recently. It accomplishes nothing.
  • Reply 2 of 7
    darkvader Posts: 1,146 member
    Australia needs to fuck right the fuck off.

    As we've said before, this isn't about CSAM, it's about getting a backdoor into encryption.
  • Reply 3 of 7
    So if they do it they’re horrible spies and if they don’t they’re in league with child perverts. Governments need to shut the f up. 
  • Reply 4 of 7
    Under the new Advanced Data Protection feature, which will be an opt-in feature for users (starting now in the USA, but likely to expand to other countries), Apple won't even have the ability to scan iCloud images because iCloud photos will be encrypted with keys that only the end user has access to. So is Australia going to ban the use of this new security feature in Australia?

    https://appleinsider.com/articles/22/12/13/apples-advanced-data-protection-feature-is-here---what-you-need-to-know <--

    "Apple's new Advanced Data Protection feature goes a step further and allows you to encrypt additional information in iCloud with a few new layers of security. 

    Data encrypted by enabling Advanced Data Protection

    • Device backups
    • Messages backups
    • iCloud Drive <------ includes photos
    • Notes
    • Photos
    • Reminders
    • Safari Bookmarks
    • Voice Memos
    • Wallet passes

    Apple notes that the only major iCloud data categories that aren't covered are Calendar, Contacts, and iCloud Mail, as these features need to interoperate with global systems."

  • Reply 5 of 7
    I thought people got mad at Apple because they WANTED to start doing it, and the backlash was so bad, they backed off. Now people are mad about that? Sheesh, people, figure it out!
  • Reply 6 of 7
    swat671 said:
    I thought people got mad at Apple because they WANTED to start doing it, and the backlash was so bad, they backed off. Now people are mad about that? Sheesh, people, figure it out!
    Reading comprehension issues? It is the governments that are mad about Apple not including the spyware. Now you know who pushed Apple to include that feature in the first place. This is just evidence that the people who opposed it were correct.
  • Reply 7 of 7
    Protection on a child's devices, as Apple hints at, does seem like the better way to go. Frankly, that would probably be better in general. Either provide evidence to Apple or Microsoft that you are an adult, which would then remove a flag on the internet suggesting you are a child, or provide a very easy and obvious way for parents to create child accounts on their devices. Would it be possible to work around this? Sure. Determined kids are going to find ways around most things, but it would work better than what we do now. It would likely still require government intervention to mandate broad use of "I am a child" indicators, and governments might have to standardize how privacy must work for child accounts. This won't stop CSAM, but it is likely a lot better for detecting and preventing adult-to-child sexting and actual grooming (rather than whatever social conservatives think "grooming" is these days). It does seem unlikely as a path, though. Governments really do want access to unencrypted data, and it is still true that CSAM fears remain their preferred way to get there.