Apple rolling out nudity-blurring child safety feature to more countries

Posted in iOS
Apple is expanding the availability of the iOS Communications Safety feature, bringing it to six more countries.

iMessage on the iPhone


Communications Safety is a part of iMessage that examines inbound and outbound messages for nudity on devices owned by children. After an initial rollout in the U.S. in iOS 15.2, and an expansion in 2022, the feature's now coming to another six countries.

Along with the United States, Communications Safety is now available in the U.K., Canada, New Zealand, and Australia.

According to iCulture, Communications Safety will be heading out to the Netherlands, Belgium, Sweden, Japan, South Korea, and Brazil. In the case of the Netherlands and Belgium, it will be rolling out in the coming weeks.

Originally introduced alongside the controversial and since-abandoned on-device Photos scanning feature, Communications Safety tries to detect whether a child is sending or receiving images that could contain nudity. Received images are blurred, with the young user provided links to safety resources from child safety groups.

If an image containing nudity is about to be sent, the same system will offer another warning, advising that the image should not be sent at all.

The feature is designed to be privacy-focused, offered as an opt-in feature with all detection handled on-device so the data never leaves the iPhone.

The expansion of the feature arrives two weeks after the 20th annual Safer Internet Day, a European initiative that saw Apple promote privacy features and free educational resources on how to stay safe online.


Comments

  • Reply 1 of 9
    oldenboom Posts: 30 unconfirmed, member
    I'm from the Netherlands. I don't really care about my child seeing any nudity (she'll immediately yell "Yuck!" anyway). However, I do care about child abuse, blackmail between kids, bullying and pornography (especially abusive, aggressive or disrespectful pornography). I do have my doubts about American companies trying to enforce American puritan morals upon my child. I hate to see that my child already feels nudity is bad, just because of the prevailing puritan American morals in a lot of series and movies, while at the same time aggression and gore are apparently quite accepted in the US when they're not on this side of the Atlantic. So Apple now decides she cannot see nudity but she can see violence? Yes, I would prefer my kid not to send any nude pictures and to stay away from Omegle etc. But that's far more my task as a parent, not Apple's.

    Anyway, I just read the iCulture article. It seems that upon receipt of a suspected nude picture, Apple will just blur the photo and warn the child with "This photo could be sensitive. Are you sure you want to view it?". And upon sending a possible nude picture: "It's your choice but make sure you feel safe". Now, teens are by definition very curious and my kid is no exception: she'd definitely want to view such a picture, but will learn pretty fast not to view any more such pictures from certain people in a chat group. So, it's not as bad as I initially thought it was. It might even be quite helpful, but still, the message it bears is "nudity is bad" and that's something I definitely don't want my child to learn.


  • Reply 2 of 9
    oldenboom said:
    I'm from the Netherlands. I don't really care about my child seeing any nudity (she'll immediately yell "Yuck!" anyway). However, I do care about child abuse, blackmail between kids, bullying and pornography (especially abusive, aggressive or disrespectful pornography). I do have my doubts about American companies trying to enforce American puritan morals upon my child. I hate to see that my child already feels nudity is bad, just because of the prevailing puritan American morals in a lot of series and movies, while at the same time aggression and gore are apparently quite accepted in the US when they're not on this side of the Atlantic. So Apple now decides she cannot see nudity but she can see violence? Yes, I would prefer my kid not to send any nude pictures and to stay away from Omegle etc. But that's far more my task as a parent, not Apple's.

    Anyway, I just read the iCulture article. It seems that upon receipt of a suspected nude picture, Apple will just blur the photo and warn the child with "This photo could be sensitive. Are you sure you want to view it?". And upon sending a possible nude picture: "It's your choice but make sure you feel safe". Now, teens are by definition very curious and my kid is no exception: she'd definitely want to view such a picture, but will learn pretty fast not to view any more such pictures from certain people in a chat group. So, it's not as bad as I initially thought it was. It might even be quite helpful, but still, the message it bears is "nudity is bad" and that's something I definitely don't want my child to learn.


    Yup, I agree. Americans think it’s fine to show people being violently brutalised. But a nipple is a step too far. Seems odd on this side of the pond!
  • Reply 3 of 9
    This goes back to when teens would watch scrambled Cinemax or HBO late at night hoping to catch a glimpse of a boob! Parents should be parents and monitor their children's activities, rather than Apple playing policeman or second parent. Like what oldenboom said. In some countries, nudity is far more normal than in the US, and we weren't born dressed. Of course child porn, extortion, and bullying are horrible, but it seems Apple wants to block everything that they deem inappropriate. What if it is an image of famous artwork that contains nudity? Apple will block it, claiming it might be unsafe to view, when it is a piece of art.
  • Reply 4 of 9
    mjtomlin Posts: 2,673 member
    Rogue01 said:
    This goes back to when teens would watch scrambled Cinemax or HBO late at night hoping to catch a glimpse of a boob! Parents should be parents and monitor their children's activities, rather than Apple playing policeman or second parent. Like what oldenboom said. In some countries, nudity is far more normal than in the US, and we weren't born dressed. Of course child porn, extortion, and bullying are horrible, but it seems Apple wants to block everything that they deem inappropriate. What if it is an image of famous artwork that contains nudity? Apple will block it, claiming it might be unsafe to view, when it is a piece of art.

    This is a feature that parents can turn on in iMessage if they feel the need to. This is not Apple imposing its morals on you.

    This is about a child having an unsupervised conversation in iMessage on their iPad or iPhone with someone who is sending them nude photos. Chances are those aren't photos of art. While you may not care that a predator is grooming your child, most parents would have a problem with it.
    edited February 2023
  • Reply 5 of 9
    Of course the problem that isn’t addressed is sexual abuse within the home. In those cases nudity is the “norm” for the wrong reasons. 
  • Reply 6 of 9
    entropys Posts: 4,166 member
    Kids all around the world approaching their 21st birthday party sigh in relief.
  • Reply 7 of 9
    mjtomlin Posts: 2,673 member
    oldenboom said:
    I'm from the Netherlands. I don't really care about my child seeing any nudity (she'll immediately yell "Yuck!" anyway). However, I do care about child abuse, blackmail between kids, bullying and pornography (especially abusive, aggressive or disrespectful pornography). I do have my doubts about American companies trying to enforce American puritan morals upon my child. I hate to see that my child already feels nudity is bad, just because of the prevailing puritan American morals in a lot of series and movies, while at the same time aggression and gore are apparently quite accepted in the US when they're not on this side of the Atlantic. So Apple now decides she cannot see nudity but she can see violence? Yes, I would prefer my kid not to send any nude pictures and to stay away from Omegle etc. But that's far more my task as a parent, not Apple's.

    Anyway, I just read the iCulture article. It seems that upon receipt of a suspected nude picture, Apple will just blur the photo and warn the child with "This photo could be sensitive. Are you sure you want to view it?". And upon sending a possible nude picture: "It's your choice but make sure you feel safe". Now, teens are by definition very curious and my kid is no exception: she'd definitely want to view such a picture, but will learn pretty fast not to view any more such pictures from certain people in a chat group. So, it's not as bad as I initially thought it was. It might even be quite helpful, but still, the message it bears is "nudity is bad" and that's something I definitely don't want my child to learn.



    Did you miss the part about this being an opt-in feature? It is not on by default...

    1. you have to add your child to your Family Group.
    2. you have to turn on Parental controls on your child's phone.
    3. you have to activate this feature.

    That's an awful lot of steps that you have to choose to do before Apple imposes their puritan morals on your kid.
  • Reply 8 of 9
    mjtomlin said:
    oldenboom said:
    I'm from the Netherlands. I don't really care about my child seeing any nudity (she'll immediately yell "Yuck!" anyway). However, I do care about child abuse, blackmail between kids, bullying and pornography (especially abusive, aggressive or disrespectful pornography). I do have my doubts about American companies trying to enforce American puritan morals upon my child. I hate to see that my child already feels nudity is bad, just because of the prevailing puritan American morals in a lot of series and movies, while at the same time aggression and gore are apparently quite accepted in the US when they're not on this side of the Atlantic. So Apple now decides she cannot see nudity but she can see violence? Yes, I would prefer my kid not to send any nude pictures and to stay away from Omegle etc. But that's far more my task as a parent, not Apple's.

    Anyway, I just read the iCulture article. It seems that upon receipt of a suspected nude picture, Apple will just blur the photo and warn the child with "This photo could be sensitive. Are you sure you want to view it?". And upon sending a possible nude picture: "It's your choice but make sure you feel safe". Now, teens are by definition very curious and my kid is no exception: she'd definitely want to view such a picture, but will learn pretty fast not to view any more such pictures from certain people in a chat group. So, it's not as bad as I initially thought it was. It might even be quite helpful, but still, the message it bears is "nudity is bad" and that's something I definitely don't want my child to learn.



    Did you miss the part about this being an opt-in feature? It is not on by default...

    1. you have to add your child to your Family Group.
    2. you have to turn on Parental controls on your child's phone.
    3. you have to activate this feature.

    That's an awful lot of steps that you have to choose to do before Apple imposes their puritan morals on your kid.
    Why single out nudity? Perhaps there ought to be a number of sub-options at step 3: nudity, violence, sexual activity?