Apple App Store loopholes endanger children, claims watchdog

Posted in iPhone, edited August 2021
A non-profit watchdog organization claims that Apple isn't doing enough to protect children from adult content in the App Store, and also refuses to accept responsibility for the failures.

One of the apps the Campaign for Accountability claims is too easily accessed by minors


The Campaign for Accountability (CfA), a group that previously reported on Apple removing App Store games at the behest of the Chinese government, now says that the store is failing to protect children worldwide. Through its Tech Transparency Project (TTP), it claims that Apple fails to take even "obvious" child safety measures over adult material.

"Apple claims that it maintains a tight grip over App Store creators to protect consumers from harmful content," said Michelle Kuppersmith, CfA Executive Director in a statement, "but it hasn't even put up the most obvious safeguard to keep underage users safe."

Specifically, the organization says that it created a test account for a fictitious 14-year-old, and then attempted to download content ranging from gambling to dating apps and pornography.

Attempting to download such apps reportedly resulted in a pop-up message asking the user to confirm they were over 17. If the test user claimed to be, the app downloaded, often without any further checks.

CfA says its test Apple ID was set as being that of a 14-year-old, and that apps should be aware of that setting.

"If Apple already knows that a user is under 18, how can it let the user download adult apps in the first place?" continues Kuppersmith.

"Apple has clearly chosen to pass the buck on protecting children to the app developers," she says. "But, while it seemingly has no desire to accept responsibility, Apple has no problem taking its cut of the profits that arise from age-inappropriate transactions."

It's not clear whether the organization enabled the hard restrictions on age-gated content for its testing. By default, the setting allows all apps; parents can impose a hard block on age-inappropriate apps from within Screen Time settings, but doing so requires a manual change.

In its full report, the organization says it downloaded 75 apps designated 17+ on the App Store. Many of the 75 had their own age-verification systems, but these were largely inadequate.

CfA says many gambling apps were suitably thorough, with some requiring scans of government-issued ID, but not all were. Other adult content was offered with only token age restrictions.

In 37 of the apps, users had to register an account, but CfA says they accepted the user's Apple ID, even though that ID was underage. Other apps, such as chat service Yubo, are listed on the App Store as 17+, but say in-app that they allow users as young as 13.

Reportedly, 31 of the apps tested offered signups via a Facebook account, and Facebook blocked the registration. CfA concludes that Facebook refused the registration because the test user was underage, though other apps did allow the same account to register via the social network.

Apple has not yet responded to the Campaign for Accountability and its Tech Transparency Project. Separately, the Project claimed in June 2021 that a Chinese wind farm Apple partner was allegedly linked to forced labor of Uyghurs.


Comments

  • Reply 1 of 6
    Apple’s policy seems consistent and intentional here.  With the recently announced “Communication Safety in Messages” feature, Apple has put forth the notion that children under 13 are capable of giving informed consent to send/view sexually explicit images. And parents should have no role in monitoring or restricting access to such content if their teenagers have an iPhone. 
  • Reply 2 of 6
    mknelson Posts: 1,139 member
    Apple’s policy seems consistent and intentional here.  With the recently announced “Communication Safety in Messages” feature, Apple has put forth the notion that children under 13 are capable of giving informed consent to send/view sexually explicit images. And parents should have no role in monitoring or restricting access to such content if their teenagers have an iPhone. 
    That's not correct.

    If the child is on a Family account and they say "yes" to the suspect image a message gets sent to the parent.

    "When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it."

    I have to agree with CfA in their complaint to Apple. Why wouldn't it use the age set on the AppleID? That's so simple!
    edited August 2021
  • Reply 3 of 6
    rcfa Posts: 1,124 member
    Exactly how is any of that Apple’s problem? It’s a matter of bad parenting. There are plenty of ways parents can prevent that, including restricting access to the internet or the damn phone.
  • Reply 4 of 6
    Perhaps the Campaign for Accountability should have read this: https://support.apple.com/en-us/HT201304

    Prevent explicit content and content ratings

    You can also prevent the playback of music with explicit content and movies or TV shows with specific ratings. Apps also have ratings that can be configured using content restrictions.

    To restrict explicit content and content ratings:

    1. Go to Settings and tap Screen Time.
    2. Tap Content & Privacy Restrictions, then tap Content Restrictions.
    3. Choose the settings you want for each feature or setting under Allowed Store Content.

    Here are the types of content that you can restrict:

    • Ratings For: Select the country or region in the ratings section to automatically apply the appropriate content ratings for that region
    • Music, Podcasts & News: Prevent the playback of music, music videos, podcasts, and news containing explicit content
    • Music Videos: Prevent finding and viewing music videos
    • Music Profiles: Prevent sharing what you're listening to with friends and seeing what they're listening to
    • Movies: Prevent movies with specific ratings
    • TV shows: Prevent TV shows with specific ratings
    • Books: Prevent content with specific ratings
    • Apps: Prevent apps with specific ratings


  • Reply 5 of 6
    netrox Posts: 1,479 member
    "Apple has clearly chosen to pass the buck on protecting children to the app developers," she says. "But, while it seemingly has no desire to accept responsibility, Apple has no problem taking its cut of the profits that arise from age-inappropriate transactions."

    That makes my eyes roll. I cannot stand people who think "greed is the reason."

    But I do agree that if the iCloud account was already set up with the age given, it should be carried across the entire Apple ecosystem, including the App Store.


  • Reply 6 of 6
    mknelson said:
    Apple’s policy seems consistent and intentional here.  With the recently announced “Communication Safety in Messages” feature, Apple has put forth the notion that children under 13 are capable of giving informed consent to send/view sexually explicit images. And parents should have no role in monitoring or restricting access to such content if their teenagers have an iPhone. 
    That's not correct.

    If the child is on a Family account and they say "yes" to the suspect image a message gets sent to the parent.

    "When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it."

    I have to agree with CfA in their complaint to Apple. Why wouldn't it use the age set on the AppleID? That's so simple!
    What is incorrect about what I said? While a parent can get notified for the 12 and under crowd, children still are given the choice to see and send such images. And if the child is 13+, a teenager, the parents will not receive any notification. 