Telegram has ramped up its moderation efforts, blocking 15.4 million groups and channels in 2024. This is part of an intensified crackdown on harmful content such as fraud, terrorism, and child sexual abuse material (CSAM). The announcement came via its new content moderation page, which aims to boost transparency around the platform's efforts to combat illegal activity.
Telegram attributed its progress to AI-powered moderation tools capable of scanning and removing millions of pieces of content daily. In 2024, it blocked 705,688 groups and channels linked to CSAM and stated that public images on the platform had been checked against an extensive hash database of banned content. The database, first introduced in 2018, was expanded in 2024 with contributions from organizations like the Internet Watch Foundation.
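To illustrate the general idea behind hash-database matching (Telegram's actual implementation is not public), here is a minimal sketch in Python: known-bad digests are loaded into a set, and each uploaded image's digest is checked against it. The `BANNED_HASHES` set, the `uploads` directory, and the function names are all hypothetical, and real systems typically use perceptual hashes such as PhotoDNA rather than plain cryptographic digests, so that re-encoded or slightly altered copies still match.

```python
import hashlib
from pathlib import Path

# Hypothetical set of SHA-256 hex digests of known banned images.
# In practice this would be a large database supplied by partners
# such as the Internet Watch Foundation.
BANNED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_banned(path: Path) -> bool:
    """Return True if the file's digest matches a known banned hash."""
    return sha256_of(path) in BANNED_HASHES

if __name__ == "__main__":
    # Hypothetical upload directory; in a real pipeline every public
    # image would be checked at upload time.
    for image in Path("uploads").glob("*.jpg"):
        if is_banned(image):
            print(f"blocked: {image}")
```

A cryptographic digest like SHA-256 only matches byte-identical files; that limitation is the reason production moderation pipelines favor perceptual hashing, which scores visual similarity instead of exact equality.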
The platform has also intensified its anti-terrorism measures, blocking 129,986 terrorist-related communities in 2024. Since 2022, Telegram has partnered with organizations such as ETIDAL, the Global Center for Combating Extremist Ideology, and has removed over 100 million pieces of terrorist content through the partnership. "Calls to violence and terrorist propaganda have no place on Telegram," the company reiterated in its statement.
Telegram's recent actions are widely linked to the legal troubles of its founder, Pavel Durov, who was arrested in France in August 2024 on charges of failing to adequately moderate the platform and was later released on bail. French authorities allege that Telegram has become a haven for illegal activity such as terrorism and drug trafficking.
In response to the accusations, Durov acknowledged the difficulty of moderating a platform of Telegram's scale, pointed to the company's efforts to comply with EU regulations, and criticized the trend of holding tech founders personally liable for their users' actions.
With a user base of over 900 million, Telegram remains in the spotlight for hosting controversial content. Even as the platform steps up moderation and reports regularly on CSAM removals, the legal case against Durov could set a precedent for how tech companies are held accountable for user-generated content.