Apple App Store loopholes endanger children, claims watchdog
A non-profit watchdog organization claims that Apple isn't doing enough to protect children from adult content in the App Store, and also refuses to accept responsibility for the failures.

One of the apps the Campaign for Accountability claims is too easily accessed by minors
The Campaign for Accountability (CfA), a group that previously reported on Apple removing App Store games at the behest of the Chinese government, now says that the store is failing to protect children worldwide. Through its Tech Transparency Project (TTP), it claims that Apple fails to take even "obvious" child safety measures over adult material.
"Apple claims that it maintains a tight grip over App Store creators to protect consumers from harmful content," said Michelle Kuppersmith, CfA Executive Director in a statement, "but it hasn't even put up the most obvious safeguard to keep underage users safe."
Specifically, the organization says that it created a test account for a fictitious 14-year-old, and then attempted to download content ranging from gambling to dating apps and pornography.
Attempting to download such apps reportedly resulted in a pop-up message asking the user to confirm they were over 17. If the test user claimed to be, the app downloaded, often without any further checks.
CfA says its test Apple ID was set as being that of a 14-year-old, and that apps should be aware of that setting.
"If Apple already knows that a user is under 18, how can it let the user download adult apps in the first place?" continues Kuppersmith.
"Apple has clearly chosen to pass the buck on protecting children to the app developers," she says. "But, while it seemingly has no desire to accept responsibility, Apple has no problem taking its cut of the profits that arise from age-inappropriate transactions."
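The gap CfA describes comes down to two different kinds of age check. A minimal sketch in Python, with hypothetical function names and dates (none of this reflects a real Apple API), contrasts the self-attestation pop-up the testers encountered with the account-based check CfA argues Apple should enforce:

```python
# Illustrative sketch of the two age-gating approaches described in the report.
# Function names, dates, and the threshold constant are hypothetical.
from datetime import date

OVER_17_THRESHOLD = 17  # App Store "17+" rating boundary

def self_attestation_gate(user_claims_over_17: bool) -> bool:
    """The pop-up flow CfA observed: the app simply trusts the user's claim."""
    return user_claims_over_17

def account_age_gate(birthdate: date, today: date) -> bool:
    """The check CfA argues for: derive the age from the account's
    registered birthdate instead of asking the user."""
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age > OVER_17_THRESHOLD

# A 14-year-old test account, as in CfA's experiment (dates are made up):
today = date(2021, 8, 1)
birthdate = date(2007, 3, 15)

print(self_attestation_gate(True))         # True: the minor just taps "yes"
print(account_age_gate(birthdate, today))  # False: the registered age blocks it
```

The point of the contrast is that the first gate depends entirely on honesty, while the second uses information the platform already holds, which is precisely Kuppersmith's complaint.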
It's not clear whether the organization enabled the hard restrictions on age-gated content for its testing. By default, the setting allows all apps. Parents can impose a hard block on age-inappropriate apps from within Screen Time settings -- but doing so requires changing the setting manually.
In its full report, the organization says that it downloaded 75 apps designated 17+ on the App Store. Many of the 75 had their own age verification systems, but these were chiefly inadequate.
CfA says that many gambling apps were properly thorough, some requiring scans of government-issued ID, but not all were. Other adult content was offered with only token age restrictions.
In 37 of the apps, users had to register an account, and CfA says these accepted the user's Apple ID -- even though that ID was registered as underage. Other apps, such as the chat service Yubo, are listed on the App Store as 17+ but say in-app that they allow users as young as 13.
Reportedly, 31 of the apps tested offered signup via a Facebook account -- and Facebook blocked the registration. CfA concludes that Facebook refused the registration because the test user was underage, though other apps did allow the same account to register via the social network.
Apple has not yet responded to the Campaign for Accountability and its Tech Transparency Project. Separately, the Project claimed in June 2021 that a Chinese wind farm Apple partner was allegedly linked to forced labor of Uyghurs.
Read on AppleInsider

Comments
If the child is on a Family account and they say "yes" to the suspect image a message gets sent to the parent.
"When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it."
I have to agree with CfA's complaint to Apple. Why wouldn't it use the age set on the Apple ID? That's so simple!
Prevent explicit content and content ratings
You can also prevent the playback of music with explicit content and movies or TV shows with specific ratings. Apps also have ratings that can be configured using content restrictions.
To restrict explicit content and content ratings:
Here are the types of content that you can restrict:
That makes my eyes roll. I cannot stand people who think "greed is the reason."
But I do agree that if the iCloud account was already set up with an age given, that age should be carried across the entire Apple ecosystem, including the App Store.