New iOS 15.2 beta includes Messages feature that detects nudity sent to kids
Apple's latest iOS 15.2 beta has introduced a previously announced opt-in communication safety feature designed to warn children -- and not parents -- when they send or receive photos that contain nudity.
Credit: Andrew O'Hara, AppleInsider
The Messages feature was one part of a suite of child safety initiatives announced back in August. Importantly, the iMessage feature is not the controversial system designed to detect child sexual abuse material (CSAM) in iCloud.
The feature is not enabled by default, but parents or guardians can switch it on for child accounts on a Family Sharing plan. It will detect nudity in Messages, blur the image, and warn the child.
Apple says children will be given helpful resources and reassured that it's okay if they don't want to view the image. If a child attempts to send photos that contain nudity, similar protections will kick in. In either case, the child will be given the option to message someone they trust for help.
Unlike the previously planned version of the feature, parents will not be notified if the system detects a message contains nudity. Apple says this is because of concerns that a parental notification could present a risk for a child, including the threat of physical violence or abuse.
Apple says that the nudity detection flag will never leave the device, and the system doesn't encroach upon the end-to-end encryption of iMessages.
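Apple hasn't detailed how the classifier itself works. As a rough, assumption-heavy sketch of what on-device image analysis and blurring can look like using Apple's public Vision and Core Image frameworks — the label names and confidence threshold below are placeholders, not Apple's implementation — consider something like this:

```swift
import Vision
import CoreImage
import UIKit

// Illustrative sketch only. Apple has not published how the Messages classifier works;
// this just shows the general shape of on-device image analysis with public APIs.
enum MessageImageScreener {

    // Hypothetical label set -- the identifiers Apple's private model uses are not public.
    private static let flaggedLabels: Set<String> = ["nudity", "explicit"]

    /// Classifies the image entirely on-device and returns a blurred copy if it is flagged.
    /// No image data or classification result leaves the device.
    static func screen(_ image: UIImage, threshold: Float = 0.8) -> UIImage {
        guard let cgImage = image.cgImage else { return image }

        // Run Apple's built-in on-device classifier.
        let request = VNClassifyImageRequest()
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])

        let flagged = (request.results ?? []).contains {
            flaggedLabels.contains($0.identifier) && $0.confidence >= threshold
        }
        return flagged ? blurred(image) : image
    }

    /// Blurs the image with Core Image before it would be shown to the child.
    private static func blurred(_ image: UIImage) -> UIImage {
        guard let input = CIImage(image: image),
              let filter = CIFilter(name: "CIGaussianBlur") else { return image }
        filter.setValue(input, forKey: kCIInputImageKey)
        filter.setValue(30.0, forKey: kCIInputRadiusKey)
        guard let output = filter.outputImage,
              let rendered = CIContext().createCGImage(output, from: input.extent) else { return image }
        return UIImage(cgImage: rendered)
    }
}
```

Because both the classification and the blur in a flow like this happen locally, nothing needs to touch a server, which is consistent with Apple's claim that the flag never leaves the device.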
It's important to note that the feature is opt-in and only being included in the beta version of iOS, meaning it is not currently public-facing. There's no timeline on when it could reach a final iOS update, and there's a chance that the feature could be pulled from the final release before that happens.
Again, this feature is not the controversial CSAM detection system that Apple announced and then delayed. Back in September, Apple said it would debut the CSAM detection feature later in 2021. Now, the company says it is taking additional time to collect input and make necessary improvements.
There's no indication of when the CSAM system will debut. However, Apple did say that it will be providing additional guidance to children and parents in Siri and Search. In an update to iOS 15 and other operating systems later in 2021, the company will intervene when users perform searches for queries related to child exploitation and explain that the topic is harmful.
Comments
Does anyone know how this is being done on device?
This does still have the potential to be contentious, since it's still scanning the photos being sent (on device). That means that, much like before, the tech for further erosion of privacy is already implemented, potentially allowing a country to force Apple to scan for particular pictures as it sees fit. Yes, it's enabled for child accounts only, but it wouldn't be much trouble to enable scanning for anyone, and not much more of a stretch to force them to send a surreptitious notification to a government minion.
What happens if Xi Jinping tells Apple it has to scan for any photos of Winnie the Pooh? Will Apple still say no under threat of being removed from China? That question was never really answered before and still stands - Apple's only response is "we won't bend to government demands", with no answer to "even if it's China?".
Though until this service also alerts me, the parent, as to who’s sending nudity, my daughter won’t be getting iMessage switched on.
In its planned form it's an irrelevant feature. If anything, stating that a picture may be objectionable will only pique a child's interest, and with no parental involvement this shouldn't be considered a parental control. If improving child safety is its intent, it's likely to achieve the opposite. Worse, it's a placebo.
No more Updates for me !!!
Principally f-ed up… It's surveillance, regardless of what they want to call it or misrepresent it as!!!
If they can scan photos… they can scan for anything their heart desires.
F-Them
This is content analysis, just like Photos has been doing for years now. When it's analysing your photos for content, so that you can search for "cat", it's also analysing for nudity.
But reporting the results of a scan to a third party without my consent is what I take issue with. As long as the data remains on device, scanning is a useful, time-saving tool.
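For anyone wondering how this can be done on device: here's a rough sketch of that kind of local content analysis using the public VNClassifyImageRequest API. Photos' actual indexing pipeline and Messages' classifier are private, so treat this purely as an illustration:

```swift
import Vision
import UIKit

// A rough illustration of local content analysis with the public Vision API.
// It shows that labels such as "cat" can be produced entirely on-device.
func topLabels(for image: UIImage, minimumConfidence: Float = 0.5) -> [String] {
    guard let cgImage = image.cgImage else { return [] }
    let request = VNClassifyImageRequest()
    try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
    return (request.results ?? [])
        .filter { $0.confidence >= minimumConfidence }
        .sorted { $0.confidence > $1.confidence }
        .map { $0.identifier }  // e.g. ["cat", "animal", "pet"] for a photo of a cat
}
```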
Also, being a concerned parent, I probably won't be handing over a smartphone or tablet to my kids that I don't have access to. So this feature is not really useful to me. They have an iPad, but I also have it locked down so the password can't be changed (and technically, at this point, they don't know the password, so they have to ask to use it).
I guess it could be useful if an unknown number/account sends illicit images to have it blurred out, just in case.
When they are old enough to buy/afford a phone and pay the monthly charges, they can do what they want.
You have it 100% backwards. The concept was that it would "warn" kids about the image they were about to send and then notify the parents as well if the kids send the image.
Also - you clearly haven't spent much time on the internet. Example: back when Periscope was still active you could go on and find a kid (clearly under 14 so not following Twitter's terms and conditions) doing something completely innocent and normal and they would be bombarded with "do you have cash app?" and emoji suggestions such as 👚⬆️ and 👖⬇️