
Telegram’s developer provided few details as to why its secure messaging apps briefly disappeared from and then reappeared in Apple’s App Store several days ago, but Apple today confirmed that it pulled the apps for a serious reason: Telegram was serving child pornography to users, and it wouldn’t be allowed back into the App Store until the issue was fixed.

According to a report from 9to5Mac, App Store chief Phil Schiller said that Apple had been alerted that Telegram’s apps were sharing child pornography, which Apple verified, removing the apps and notifying authorities. Rather than remaining passive about the problem, Apple then pushed the developer to remove the content, ban the users responsible for posting it, and install “more controls to keep this illegal activity from happening again.”

Apple’s removal of Telegram from the App Store coincided with seemingly minor updates to the developer’s Android apps, suggesting that nothing serious was amiss until Apple said otherwise. Telegram has long promised users ultra-secure communications that cannot be read even by foreign governments, but it has been targeted by Iranian state-sponsored hackers, and more recently criticized by Russian authorities for facilitating terrorism. Telegram has previously brushed off complaints about bad uses of its service, suggesting that dangerous users will simply change apps, and truly blocking them would require blocking the internet.

Nonetheless, Apple clearly believed that Telegram could do more to police bad users, and apparently leveraged the possible loss of its iOS user base to force a rapid change. Schiller explained in an email that Apple “will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity… we have zero tolerance for any activity that puts children at risk.”

Apple’s action on behalf of at-risk children comes several weeks after activist investors asked the company to do more to protect children from “iPhone addiction,” and CEO Tim Cook suggested that social media might share blame for device “overuse.” Schiller’s email is reproduced below.

The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).

The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.

We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk — child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.

I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.
