Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...

Posted:
in General Discussion edited April 2022
Apple is rolling out the Communication Safety feature that scans iMessages for nudity on devices owned by younger users in the U.K., Canada, New Zealand, and Australia, months after debuting it in the U.S.

Apple iMessage

The feature, which is distinct from the controversial on-device CSAM detection proposed for Photos, will automatically blur potentially harmful images in incoming or outgoing messages on devices owned by children.

As first reported by The Guardian, Apple is expanding the feature to the U.K. after rolling it out in the U.S. with iOS 15.2. AppleInsider has learned that the feature is expanding to Canada, New Zealand, and Australia as well.

How the feature works depends on whether a child receives or sends an image with nudity. Received images will be blurred, and the child will be provided with safety resources from child safety groups. Nudity in photos sent by younger users will trigger a warning advising that the image should not be sent.

The feature is privacy-focused and is available only on an opt-in basis. It must be enabled by parents. All detection of nudity is done on-device, meaning that any potentially sensitive image never leaves the iPhone.
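Apple has not published how the detection is implemented, but the behavior described above maps onto a simple on-device pipeline: run a local classifier over each image and, if it crosses a threshold, display a blurred version alongside the warning. The Swift sketch below is only an illustration of that flow; nudityConfidence(for:) is a hypothetical stand-in for whatever model Messages actually uses, and the blur is an ordinary Core Image Gaussian blur.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Hypothetical stand-in for the on-device classifier; Apple has not
// published this API. Inference would run entirely on the device, so
// the image never leaves the phone.
func nudityConfidence(for image: CIImage) -> Float {
    return 0.0 // placeholder score in 0...1
}

/// Returns the image to display and whether the safety warning should be shown.
/// Flagged images are blurred; everything else passes through untouched.
func screenMessageImage(_ image: CIImage,
                        threshold: Float = 0.8) -> (display: CIImage, showWarning: Bool) {
    guard nudityConfidence(for: image) >= threshold else {
        return (image, false)
    }
    let blur = CIFilter.gaussianBlur()
    blur.inputImage = image.clampedToExtent() // avoid dark edges from the blur kernel
    blur.radius = 40
    let blurred = blur.outputImage?.cropped(to: image.extent) ?? image
    return (blurred, true) // caller overlays the warning and child-safety resources
}
```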

Apple first announced the Communication Safety function alongside a suite of features meant to provide better safety mechanisms for children. That suite included a system that scanned Photos for child sexual abuse material (CSAM).

The CSAM scanning had several privacy mechanisms and never looked through a user's images. Instead, it matched potentially abusive material against known hashes provided by child safety organizations. Despite that, Apple faced a backlash and delayed the feature until further notice.
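For contrast, and very roughly: that shelved system was a matching problem rather than a classification one. Setting aside the cryptographic layers Apple designed around it (blinded lookups, a match threshold, and human review), the core idea was to compare a perceptual fingerprint of each photo against a database of known hashes, never the photo content itself. A heavily simplified sketch, where perceptualHash(of:) is a hypothetical stand-in for Apple's NeuralHash, which is not a public API:

```swift
import Foundation

// Hypothetical stand-in for a perceptual hash such as Apple's NeuralHash.
// Unlike a cryptographic hash, it is designed to stay stable when an image
// is resized or recompressed.
func perceptualHash(of photoData: Data) -> Data {
    return Data() // placeholder fingerprint
}

/// Counts how many photos match a set of known hashes supplied by
/// child-safety organizations. In Apple's proposal nothing was reported
/// unless this count crossed a threshold, and no image content was inspected.
func matchCount(photos: [Data], knownHashes: Set<Data>) -> Int {
    photos.reduce(0) { total, photo in
        knownHashes.contains(perceptualHash(of: photo)) ? total + 1 : total
    }
}
```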

Apple's Communication Safety function is completely unrelated to the CSAM scanning mechanism. It first debuted in the U.S. in November as part of iOS 15.2, and its expansion to these regions suggests a wider rollout to other markets will follow.

Read on AppleInsider

Comments

  • Reply 1 of 23
    dutchlord Posts: 206 member
    I am against this device-based scanning by Apple. It's none of their business. This is called privacy, Cook, and you are not going to mess with it.
  • Reply 2 of 23
    crowley Posts: 10,453 member
    dutchlord said:
    I am against this device-based scanning by Apple. It's none of their business. This is called privacy, Cook, and you are not going to mess with it.
    So don't opt in.
  • Reply 3 of 23
    person Posts: 31 member
    crowley said:
    dutchlord said:
    I am against this device-based scanning by Apple. It's none of their business. This is called privacy, Cook, and you are not going to mess with it.
    So don't opt in.
    Nobody was given a choice. It's just a precursor of all the anti-privacy policies and features to come.
  • Reply 4 of 23
    crowley Posts: 10,453 member
    person said:
    crowley said:
    dutchlord said:
    I am against this device-based scanning by Apple. It's none of their business. This is called privacy, Cook, and you are not going to mess with it.
    So don't opt in.
    Nobody was given a choice. It's just a precursor of all the anti-privacy policies and features to come.
    What do you mean? You have a choice; the feature is opt-in, it clearly says so in the article.

    The feature is privacy-focused and is available only on an opt-in basis. It must be enabled by parents. 

  • Reply 5 of 23
    entropys Posts: 4,152 member
    Opting in or out is not the same as a company creating the ability to do so. iMessage's main thing, apart from the cool blue boxes, is its privacy. This is helping to wreck it.

    On the particular matter, my first thought was: how many kids actually have iPhones anyway? But then I am an old fogey from The Time Before Mobiles. Every kid has a phone. And privacy, once important, along with protection against the creation of a Big Brother in all its possible forms, no longer matters.
  • Reply 6 of 23
    jdw Posts: 1,324 member
    I was against the proposed Apple CSAM features because, even though the risk was low, there was still the chance it could basically "call the cops" on someone, even by accident. However, I am not averse to the blurring of nudity as mentioned in the article because it doesn't secretly phone home or otherwise call the cops on the user. Some naughty people who send those kinds of photos won't be happy, but since I don't do that, it's obviously a non-issue for me. And while I don't engage in other illicit activities regarding kiddie porn either, the fact that it potentially could alert authorities to target a given iPhone user keeps me dead set against it.

    So go ahead and blur naughty photos, Apple! Just keep the cops and prosecution out of it. Seems like that is the case here, so it doesn't bother me that this could be the precursor to something else. When that something else arrives, we consumers can evaluate it at the time. For now, no objections from me.
  • Reply 7 of 23
    Paul_B Posts: 82 member
    Apple should work on macOS's speed and on making iCloud's interface mimic that of macOS - yet they are working on kiddy porn.
  • Reply 8 of 23
    Paul_B said:
    Apple should work on macOS's speed and on making iCloud's interface mimic that of macOS - yet they are working on kiddy porn.
    Why does it have to be either/or? Are they not allowed to work on multiple things at the same time?
  • Reply 9 of 23
    Paul_B Posts: 82 member
    Paul_B said:
    Apple should work on macOS's speed and on making iCloud's interface mimic that of macOS - yet they are working on kiddy porn.
    Why does it have to be either/or? Are they not allowed to work on multiple things at the same time?

    My point was it's a waste of time. They can work on nuclear technology for all I care, but don't waste engineers' time.
  • Reply 10 of 23
    Wesley Hilliard Posts: 181 member, administrator, moderator, editor
    entropys said:
    Opting in or out is not the same as a company creating the ability to do so. iMessage's main thing, apart from the cool blue boxes, is its privacy. This is helping to wreck it.
    Can you explain how? How does this affect the privacy of any user in any way? Because it doesn't.
  • Reply 11 of 23
    bobolicious Posts: 1,139 member
    ...if one accepts that Apple can outsmart multinational or hostile state iCloud hackers, does that extend to customers as well...?
    That said, at face value this opt-in sounds like a welcome feature for many...
    edited April 2022
  • Reply 12 of 23
    crowley Posts: 10,453 member
    Paul_B said:
    Paul_B said:
    Apple should work on macOS's speed and on making iCloud's interface mimic that of macOS - yet they are working on kiddy porn.
    Why does it have to be either/or? Are they not allowed to work on multiple things at the same time?

    My point was it's a waste of time. They can work on nuclear technology for all I care, but don't waste engineers' time.
    Attempting to prevent child sexual abuse is a waste of time?
  • Reply 13 of 23
    jdw said:
    I was against the proposed Apple CSAM features because, even though the risk was low, there was still the chance it could basically "call the cops" on someone, even by accident. However, I am not averse to the blurring of nudity as mentioned in the article because it doesn't secretly phone home or otherwise call the cops on the user. Some naughty people who send those kinds of photos won't be happy, but since I don't do that, it's obviously a non-issue for me. And while I don't engage in other illicit activities regarding kiddie porn either, the fact that it potentially could alert authorities to target a given iPhone user keeps me dead set against it.

    So go ahead and blur naughty photos, Apple! Just keep the cops and prosecution out of it. Seems like that is the case here, so it doesn't bother me that this could be the precursor to something else. When that something else arrives, we consumers can evaluate it at the time. For now, no objections from me.
    I agree. I was and still am against on-device CSAM detection. I'm OK if CSAM detection happens in the cloud, however… their house, their rules. On-device CSAM detection necessitates the circumvention of encryption to report data that violates the rules to a third, originally unintended party.

    This nudity detection feature, if, as stated, it all happens on-device with no reporting to any party outside the original participants, does not violate privacy and is a tool that can be used by parents to protect their kids (when the kids can afford their own phone/plan, they can decide for themselves to turn it on or off, imo).

    I don't have an issue with scanning; what I take issue with is what is done with the results of a scan (is it reported or not).
  • Reply 14 of 23
    Beats Posts: 3,073 member
    This is stupid and “opt-in” just means “testing”. Eventually it will be required if we allow it. Then photo scanning etc. 
  • Reply 15 of 23
    Beats Posts: 3,073 member
    crowley said:
    Paul_B said:
    Paul_B said:
    Apple should work on macOS's speed and on making iCloud's interface mimic that of macOS - yet they are working on kiddy porn.
    Why does it have to be either/or? Are they not allowed to work on multiple things at the same time?

    My point was it's a waste of time. They can work on nuclear technology for all I care, but don't waste engineers' time.
    Attempting to prevent child sexual abuse is a waste of time?

    It’s none of their business. Let the lazy trigger-happy government agencies/cops do their damn job!!
  • Reply 16 of 23
    crowley Posts: 10,453 member
    Beats said:
    crowley said:
    Paul_B said:
    Paul_B said:
    Apple should work on macOS's speed and on making iCloud's interface mimic that of macOS - yet they are working on kiddy porn.
    Why does it have to be either/or? Are they not allowed to work on multiple things at the same time?

    My point was it's a waste of time. They can work on nuclear technology for all I care, but don't waste engineers' time.
    Attempting to prevent child sexual abuse is a waste of time?
    It’s none of their business. Let the lazy trigger-happy government agencies/cops do their damn job!!
    It's none of parents' business if their child is sending and/or receiving explicit images?
  • Reply 17 of 23
    command_f Posts: 421 member
    entropys said:
    Opting in or out is not the same as a company creating the ability to do so. iMessage's main thing, apart from the cool blue boxes, is its privacy. This is helping to wreck it.

    On the particular matter, my first thought was: how many kids actually have iPhones anyway? But then I am an old fogey from The Time Before Mobiles. Every kid has a phone. And privacy, once important, along with protection against the creation of a Big Brother in all its possible forms, no longer matters.
    How is this a privacy issue?

    The article says "All detection of nudity is done on-device, meaning that any potentially sensitive image never leaves the iPhone." If only the phone in question is involved then no one else is involved, so by definition it's private.
  • Reply 18 of 23
    command_f Posts: 421 member
    jdw said:
    <Snip>.
    I agree. I was and still am against on-device CSAM detection. I'm OK if CSAM detection happens in the cloud, however… their house, their rules. On-device CSAM detection necessitates the circumvention of encryption to report data that violates the rules to a third, originally unintended party.
    I don't follow your logic. CSAM detection is OK if it's done on someone else's computer (Apple's server in the cloud) but not if it's done on your own phone? If it's done on your phone, you then get the warning and no one else can tell (unless you keep doing it, which is my memory of what Apple proposed). If it's done on someone else's server then the data is present somewhere out of your control.

    I also don't understand what encryption you think is being "circumvented". The data on your phone is encrypted but, within the phone, the encryption keys are available and the data unlocked: if it wasn't, how would Photos be able to display your photos to you? 
    jony0watto_cobra
  • Reply 19 of 23
    command_f said:
    jdw said:
    <Snip>.
    I agree. I was and still am against on-device CSAM detection. I'm OK if CSAM detection happens in the cloud, however… their house, their rules. On-device CSAM detection necessitates the circumvention of encryption to report data that violates the rules to a third, originally unintended party.
    I don't follow your logic. CSAM detection is OK if it's done on someone else's computer (Apple's server in the cloud) but not if it's done on your own phone? If it's done on your phone, you then get the warning and no one else can tell (unless you keep doing it, which is my memory of what Apple proposed). If it's done on someone else's server then the data is present somewhere out of your control.

    I also don't understand what encryption you think is being "circumvented". The data on your phone is encrypted but, within the phone, the encryption keys are available and the data unlocked: if it wasn't, how would Photos be able to display your photos to you? 
    As I've said before: I treat anything I send to iCloud Photos as public knowledge (even if it's meant only for friends and family). Anything I want to keep private stays on my devices. On-device scanning and reporting encroaches on that privacy (even given the current software "limitations" of thresholds and scanning only what is sent to iCloud)… so, given that, I would much rather the CSAM reporting stay server-side.

    You are correct that the data is encrypted before being stored on the phone, and it has to be decrypted to be displayed on your screen. This is different from the encryption used to send something via Messages, for instance, and is also different from the encryption used to send things via iCloud Photos. When sending images via iCloud Photos, the image would have to be decrypted from storage to be sent. It should then be encrypted in transit (HTTPS) to Apple's servers using a different protocol and a different key. But at this point Apple has access to the contents to check for CSAM, and they've probably been doing this since before the announcement of on-device CSAM detection. I suspect Apple then re-encrypts the data at rest on the server side, so only people with the keys have access to it.

    I highly doubt iCloud Photos will ever be fully E2E encrypted, meaning only the originally intended participants are privy to the encrypted contents, with no one in the middle able to view them, including but not limited to Apple or law enforcement. Threshold encryption is a way to circumvent the original participants and report to another, originally unintended party. Thus, I see little difference between the privacy of on-device vs. server-side CSAM detection. On-device gives a false sense of privacy. Since I treat anything I send off-device (already decrypted from local storage) as public knowledge, this is effectively the same.

    It all boils down to trust. Do I trust Apple? More so than other companies. Do I trust that they won't look at my images? Generally I do, but I'm also not naive. Do I trust their software not to be hacked or to leak these images? To some degree, but again, I'm not naive. Do I trust Apple's claims about encryption? Generally, yes. How about end-to-end encryption (meaning it stays encrypted once it leaves the device and is only decrypted once it arrives at the final destination)? Enough that I'll send private details to someone on the other side.

    And that's just part of it. Do I trust the receiver not to re-share the details? Do I trust the receiver to have taken precautions against inadvertently leaking the info, whether through untrusted third-party software, screen lookers, or even hackers? This is why things I want to keep private stay on my phone, and things I send out of it I regard as public knowledge despite promises made about the implementation (which I generally consider trustworthy for now).
  • Reply 20 of 23
    darkvader Posts: 1,146 member
    dutchlord said:
    I am against this device-based scanning by Apple. It's none of their business. This is called privacy, Cook, and you are not going to mess with it.

    The on-device CSAM scanning is absolutely abhorrent and Apple should never have even considered it. The appropriate solution for photos is end-to-end encryption so Apple has no access to any data. And that's what should happen for iMessage in the cloud, Photos, and all other content stored on Apple servers; it should not be possible for Apple or law enforcement to access any of it under ANY circumstances. No, I'm not condoning kiddy porn in any way, but the privacy violation is a much more egregious offense.

    This is a different feature.  As long as it can only be used on child accounts and does not have the ability to call the police or any other government agency, I'm fine with it.  If either of those conditions changes, then it becomes a huge problem, but as of now there's no indication it will.