Australia orders Apple & others to disclose anti-CSAM measures

Posted in General Discussion, edited August 2022
An Australian regulator has told Apple, Meta, and Microsoft to detail their strategies for combating child abuse material, or face fines.




Apple famously announced measures to prevent the spread of Child Sexual Abuse Material (CSAM) in August 2021, only to face criticism over security and censorship. Apple backpedaled on its plans in September 2021, but as AppleInsider has noted, the addition of such measures is now inevitable.

According to Reuters, Australia's e-Safety Commissioner has sent what are described as legal letters to Apple and other Big Tech platforms. A spokesperson for the Commissioner said that the letters use new Australian laws to compel disclosure.

"This activity is no longer confined to hidden corners of the dark web," said commissioner Julie Inman Grant in a statement seen by Reuters, "but is prevalent on the mainstream platforms we and our children use every day."

"As more companies move towards encrypted messaging services and deploy features like livestreaming," she continued, "the fear is that this horrific material will spread unchecked on these platforms."

It's not known in what detail or form Apple, Meta, and Microsoft are now required to report on their measures. However, they have 28 days from the issuing of the letter to do so.

The Australian regulator has asked for information about strategies to detect and also to remove CSAM. If the firms do not respond within the 28 days, the regulator is empowered to fine them $383,000 for every subsequent day they do not comply.

Apple has not commented on the issue. A Meta spokesperson said the company was reviewing the letter and continued to "proactively engage with the e-Safety Commissioner on these important issues."

A Microsoft spokesperson told Reuters that the company received the letter and plans to respond within the 28 days.


Comments

  • Reply 1 of 9
    entropys Posts: 4,434 member
    Big brother wants a back door.
  • Reply 2 of 9
    byronl Posts: 384 member
    it begins…

    and of course it’s australia 
  • Reply 3 of 9
    JaiOh81
    Why do you keep pushing the narrative that this stupid idea is inevitable? Apple already does this with iCloud. There is absolutely no reason to do this on device. And the lie that apple will simply refuse to change or add no csam to the hash is nonsense. This is why we don’t want this done on device. People who engage in these disgusting acts will simply use android phones with customized apps to hide behind
  • Reply 4 of 9
    byronl Posts: 384 member
    JaiOh81 said:
    Why do you keep pushing the narrative that this stupid idea is inevitable? Apple already does this with iCloud. There is absolutely no reason to do this on device. And the lie that apple will simply refuse to change or add no csam to the hash is nonsense. This is why we don’t want this done on device. People who engage in these disgusting acts will simply use android phones with customized apps to hide behind 
    of course these people will easily find other ways. you’re acting as if this is actually about the protection of children…
  • Reply 5 of 9
    JaiOh81 said:
    Why do you keep pushing the narrative that this stupid idea is inevitable? Apple already does this with iCloud. There is absolutely no reason to do this on device. And the lie that apple will simply refuse to change or add no csam to the hash is nonsense. This is why we don’t want this done on device. People who engage in these disgusting acts will simply use android phones with customized apps to hide behind 
    Also, that linked article about it being “inevitable” is clearly marked as an editorial.
    but as AppleInsider has noted, the addition of such measures is now inevitable.
    An author’s opinion does not mean it’s inevitable.

    Apple has probably been doing server side CSAM for a while now.  I see no reason to move it client side, as I’ve detailed before.

    I’ve also detailed that the “smart” criminals probably already use something else to transit their materials.  Using these big name services is like doing your business in broad daylight.  Client side CSAM only punishes the honest and the really stupid.
  • Reply 6 of 9
    danox Posts: 3,695 member
    It’s inevitable, call it any name you want: control and a back door are the end game.

    edited August 2022
  • Reply 7 of 9
    entropys Posts: 4,434 member
    byronl said:
    JaiOh81 said:
    Why do you keep pushing the narrative that this stupid idea is inevitable? Apple already does this with iCloud. There is absolutely no reason to do this on device. And the lie that apple will simply refuse to change or add no csam to the hash is nonsense. This is why we don’t want this done on device. People who engage in these disgusting acts will simply use android phones with customized apps to hide behind 
    of course these people will easily find other ways. you’re acting as if this is actually about the protection of children…
    Exactly. The thinking is that if they pretend it is about the children, those who oppose the surveillance state will be unable to argue against it.
  • Reply 8 of 9
    elijahg Posts: 2,890 member
    Instead of saying it's "inevitable", AI should be informing those who think this kind of surveillance is OK "because of the children" that it is, in fact, not OK. Kowtowing to Apple's line of "we won't add more hashes because we said we won't", which means absolutely nothing, is disappointing. I know it's an Apple fan site, but there is no need to suck up to Apple on everything - especially since Apple has historically been privacy-focused and this blows it completely out of the water.
  • Reply 9 of 9
    badmonk Posts: 1,354 member
    “The Australian regulator has asked for information about strategies to detect and also to remove CSAM. If the firms do not respond within the 28 days, the regulator is empowered to fine them $383,000 for every subsequent day they do not comply.”

    Who is the criminal here?

    I think this proves that AI’s recent editorial is correct: Apple’s CSAM screening program is inevitable.