Telegram was pulled because of child pornography, says Apple's Phil Schiller
The "inappropriate content" that saw Telegram briefly disappear from the App Store last week was child pornography, Apple's marketing chief explained in response to a customer question.
"The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps," said Phil Schiller in an email seen by 9to5Mac. "After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children)."
Apple worked with the Telegram team to get the pornography removed and the people who posted it banned, Schiller continued. Telegram and Telegram X were only allowed back once this was completed and controls were put in place to prevent a repeat incident.
"We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk -- child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral," Schiller added.
It's unclear what safeguards might have been put in place. Telegram is a mainly personal messaging app, though it does allow group chats with up to 30,000 people. One possibility is that Apple and Telegram took steps to block any groups that might be sharing child pornography.
"The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps," said Phil Schiller in an email seen by 9to5Mac. "After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children)."
Apple worked with the Telegram team to get the pornography removed and the people who posted it banned, Schiller continued. Telegram and Telegram X were only allowed back once this was completed and controls were put in place to prevent a repeat incident.
"We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk -- child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral," Schiller added.
It's unclear what safeguards might have been put in place. Telegram is a mainly personal messaging app, though it does allow group chats with up to 30,000 people. One possibility is that Apple and Telegram took steps to block any groups that might be sharing child pornography.
Comments
And where is the illegal content kept, and why were Apple employees looking at it? You have to wonder about those employees. Maybe Apple hires people in India to look for images that aren't allowed. Facebook does this; they have rooms full of Indian workers who look at all the pictures you post and determine whether they're acceptable for others to view.
Which goes back to my question: where is the illegal content? It's not stored in the app. If someone posts it somewhere and it can be viewed through the app, how is that the app's fault? Does that mean Safari can be pulled if someone accesses a website with illegal content? I get the plug-in piece, but how do the plug-ins get installed? It's starting to sound like Apple is becoming the moral compass for the internet, and if someone says they found an app that lets someone break the law, then Apple will pull the app. I know Apple has pulled some games because they contained objectionable images. This doesn't sound like the same thing.
So if I downloaded the app, I was getting child porn? I don't think so. There may have been content that was accessible through the app, but the illegal content wasn't IN the app.
That statement is just all sorts of wrong.
If you'd spent 10 seconds doing research, you'd find the facts of the Weiner case have nothing to do with what you claimed. Weiner was using other tools for his illegal sexting with the minor:
"The 15-year-old girl initially contacted Weiner through a direct message on Twitter in January 2016. Weiner and the girl continued to communicate on social media sites, including Facebook Messenger, Skype, Kik, Confide and Snapchat. ... Weiner asked the teen to 'engage in sexually explicit conduct via Skype and Snapchat, where her body was on display, and where she was asked to sexually perform for him'"
https://www.cnn.com/2017/09/25/politics/anthony-weiner-sentencing/index.html
...facts, what a concept, huh?
As for Telegram, I'm sure somebody tipped off somebody.
Nope, you're just making stuff up now. You should take up creative writing rather than put those skills into posts here.
Are you fucking serious? Apple has clear rules for their app store -- no app shall be used to distribute child pornography. Period. If this is upsetting to you then you're fucked in the head. (And note that Safari doesn't distribute web content, it browses it, which is why they wouldn't pull Safari if someone used it to browse child porn).
Go peddle your FUD nonsense on an Android site, as you're clearly from outer space on every discussion you've contributed to here. You hate Apple, we get it. Tell your handlers you tried but no one on AI wanted to listen to you.
Did Apple take any of those other apps off, like Twitter, Facebook, etc.? Not that I'm aware of. Seems like there was nothing wrong with this messaging app either, just some 3rd-party plugin. So I still don't get why the app was pulled like it was. Ban the plugin. Go after the people passing around the pictures, since it wasn't encrypted; problem solved.
"Telegram and Telegram X were only allowed back once this was completed and controls were put in place to prevent a repeat incident."
...yes, big bad Apple, requiring app devs to implement some tools to reduce or report child pornographers. Twitter and Facebook already have tools to reduce and report sex criminals.
You guys are whack.
Bitcoin is tied to huge amounts of illegal activity, but you can still have a Bitcoin wallet on the App Store. Discord has issues with child porn as well, but it hasn't been taken down. Snapchat is up, but we can guess how that could be used for evil.
I think Apple should be doing this, but again, give the developers a chance to address these issues.