Purged screen time monitoring apps misused enterprise tools, Schiller says
A critical New York Times report on Saturday suggested Apple culls screen time monitoring apps that compete with similar first-party features built into iOS. In response, SVP of Worldwide Marketing Phil Schiller said the titles in question abuse a device management technology designed for enterprise users.

In its report, The Times highlights an uptick in restrictions or removals of popular screen time and parental control apps over the past year.
Some developers impacted by the new policy intimated to the publication that the tech giant aims to quash competition with Screen Time, an iOS feature and toolset designed to encourage iPhone and iPad owners to spend less time on their devices. Others, like the CEO of Freedom, an app removed from the App Store last year, said they remain dubious of Apple's stated intent to fight so-called mobile device addiction.
Apple spokeswoman Tammy Levine said the apps were removed, or had their features limited, because they were able to access "too much information from users' devices," the report said. Levine added that the app purge is not related to Apple's release of "similar tools," a likely reference to Screen Time.
"We treat all apps the same, including those that compete with our own services," Levine said, echoing a statement Apple issued to AppleInsider on Saturday. "Our incentive is to have a vibrant app ecosystem that provides consumers access to as many quality apps as possible."
The report apparently hit a nerve with Apple executives, as Schiller offered a much more detailed account of the situation in response to a customer email, reports MacRumors.
Expanding on Levine's comments, Schiller said the apps in question leveraged Mobile Device Management (MDM) technology to monitor, limit and control device usage. Designed for enterprise users managing large device deployments, MDM tools offer wide, and in some cases unfettered, device access to system administrators. As such, the incorporation of MDM technology in public-facing apps poses a material threat to user privacy and security.
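To give a sense of the breadth of that access, here is a minimal sketch in Swift, assuming a hypothetical Restrictions payload exported from an MDM configuration profile to a local plist file. The file name and structure are illustrative; the payload keys (allowCamera, allowAppInstallation and so on) are drawn from Apple's published configuration profile documentation, and a profile that sets them to false disables those capabilities on every enrolled device.

import Foundation

// Illustrative only: inspect a hypothetical exported Restrictions payload
// and report which device capabilities an MDM profile would toggle.
let profileURL = URL(fileURLWithPath: "example-restrictions.plist") // hypothetical file

if let data = try? Data(contentsOf: profileURL),
   let plist = try? PropertyListSerialization.propertyList(from: data, options: [], format: nil),
   let payload = plist as? [String: Any] {
    // A handful of real Restrictions payload keys; many more exist.
    let capabilities = ["allowCamera", "allowAppInstallation", "allowSafari", "allowCloudBackup"]
    for key in capabilities {
        // Restriction keys default to allowed when absent from the payload.
        let allowed = payload[key] as? Bool ?? true
        print("\(key): \(allowed ? "allowed" : "blocked by the MDM profile")")
    }
}

A consumer-facing parental control app built on this mechanism effectively gives the developer, not just the parent, administrator-level reach into the child's device, which is the risk Schiller describes.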
Apple's App Store policies have become the target of increased scrutiny as the iPhone and iOS gain market share.

Schiller details the issue in his letter, reproduced below. AppleInsider has not seen the correspondence and is therefore unable to vouch for its authenticity.

Thank you for being a fan of Apple and for your email.
I would like to assure you that the App Store team has acted extremely responsibly in this matter, helping to protect our children from technologies that could be used to violate their privacy and security. After you learn of some of the facts I hope that you agree.
Unfortunately the New York Times article you reference did not share our complete statement, nor explain the risks to children had Apple not acted on their behalf. Apple has long supported providing apps on the App Store, that work like our ScreenTime feature, to help parents manage their children's access to technology and we will continue to encourage development of these apps. There are many great apps for parents on the App Store, like "Moment - Balance Screen Time" by Moment Health and "Verizon Smart Family" by Verizon Wireless.
However, over the last year we became aware that some parental management apps were using a technology called Mobile Device Management or "MDM" and installing an MDM Profile as a method to limit and control use of these devices. MDM is a technology that gives one party access to and control over many devices, it was meant to be used by a company on it's [sic] own mobile devices as a management tool, where that company has a right to all of the data and use of the devices. The MDM technology is not intended to enable a developer to have access to and control over consumers' data and devices, but the apps we removed from the store did just that. No one, except you, should have unrestricted access to manage your child's device, know their location, track their app use, control their mail accounts, web surfing, camera use, network access, and even remotely erase their devices. Further, security research has shown that there is risk that MDM profiles could be used as a technology for hacker attacks by assisting them in installing apps for malicious purposes on users' devices.
When the App Store team investigated the use of MDM technology by some developers of apps for managing kids devices and learned the risk they create to user privacy and security, we asked these developers to stop using MDM technology in their apps. Protecting user privacy and security is paramount in the Apple ecosystem and we have important App Store guidelines to not allow apps that could pose a threat to consumers privacy and security. We will continue to provide features, like ScreenTime, designed to help parents manage their children's access to technology and we will work with developers to offer many great apps on the App Store for these uses, using technologies that are safe and private for us and our children.
Thank you,
Phil
The Times in its report notes Kidslox and Qustodio, both parental control apps, recently filed a complaint with the European Commission after Apple forced changes to Kidslox that allegedly impacted the developer's business. In March, Spotify filed an antitrust complaint against Apple with the same European Union office, while Kaspersky Lab did the same in Russia.
Comments
But anyway ... TOLD YA.
Many people who are easily manipulated and naive fell for the fake news from the NYT.
Obviously, that is a privacy violation - particularly if this gives the DEVELOPER access to your child's iPhone.
Next thing you know, Child Molesters and Hackers will create apps using MDM technology to take snapshots of your children or yourself in compromising situations.
Apple is RIGHT to BAN these apps and BAN these Developers for misusing Apple Technology.
Fake news too?
With that being said, Apple may not be so innocent. I have worked for a company in the past where they suddenly kicked an app of ours out of the store after it had been there for seven years with monthly updates. The reasoning was quite vague. We had phone calls and escalated it up the chain, but they just kept repeating the rule instead of telling us how it even applied to us. The crazy thing was we had other apps that did the same thing, but they were hung up on this one. In the end we changed the category and the name of the app and then it was perfectly acceptable. But the entire lack of specifics is where the frustration stemmed from. We would have been more than happy to change the app if they could tell us what to do or what specifically was wrong.
If these companies got the same runaround, I can easily see how they could conclude that Apple is Sherlocking the industry. There seems to be a communication breakdown somewhere.
"there is risk that MDM profiles could be used as a technology for hacker attacks by assisting them in installing apps for malicious purposes on users' devices"
Does this risk still exist (perhaps to a lesser extent) with Screen Time?
Should the push of Screen Time raise a privacy red flag too...?
...and does Apple in the end have any possible access to such data, especially if synced over iCloud...?
Is the data Apple increasingly layers into each iOS upgrade, with such things as Screen Time, of concern?
To name a few:
Find My iPhone (location tracking),
Contacts (with 'Profile Pictures' + 'Note', potentially sent to iCloud by others unknown),
Photos (auto tagging),
Safari (bookmarks),
Keychain,
Mail,
Maps,
News,
Music,
Home,
Health,
ApplePay,
Watch,
HomePod,
etc, etc...
Is this remarkable...?
Is this potential data all the more valuable as verifiable with Touch ID and/or Facial Recognition, etc.?
Even with the best of intent (to balance user benefits with privacy) might this put an incredible amount of user data at risk of 'unintended consequences'...?
Edit: I believe you’re missing the point I was making about their communication.