TikTok, the popular short-video platform owned by ByteDance, has revealed that it removed over 2.4 million videos posted by Nigerian users in the fourth quarter of 2024 for violating its content policies. Beyond content moderation, the platform also carried out a massive purge of fake and underage accounts as part of its efforts to ensure a safer digital space.
In its latest Community Guidelines Enforcement report, TikTok disclosed that Nigeria ranked among the top 50 countries with the most policy violations during the period. An even more staggering revelation, however, was the removal of a large number of accounts, many of them deemed fake or belonging to underage users.
According to the report, a total of 211.5 million accounts were removed globally between October and December 2024. Out of these, 185.3 million were fake accounts, while 20.5 million belonged to users suspected to be under 13 years old. An additional 5.6 million accounts were deleted for other unspecified reasons.
While TikTok did not break down country-specific figures for account removals, Nigeria, being among the top violators, is expected to have had a sizable share of the deletions.
The company said fake accounts are often used for artificial engagement, misinformation, or scams, and stated that they pose a major threat to the platform’s credibility. Similarly, underage users are removed to ensure compliance with child safety regulations and to prevent exposure to potentially harmful content.
“We remain vigilant in our efforts to detect external threats and safeguard the platform from fake accounts and engagement,” TikTok stated. “These threats persistently probe and attack our systems, leading to occasional fluctuations in the reported metrics within these areas.”
With over 2.4 million videos already removed in Q4 2024, and countless accounts deleted, TikTok’s moderation efforts in Nigeria are far from over. Whether these actions will be enough to appease regulators and concerned parents remains to be seen.