580,000 Kenyan Videos Pulled Down by TikTok for Rule Violations

TikTok has revealed that it removed more than 580,000 videos in Kenya between July and September 2025 for breaching its content rules, underscoring the scale of moderation on one of the country’s most widely used social media platforms.

The figures, published in the company’s latest enforcement report, come at a moment when concerns over privacy, consent and online safety are intensifying.

The disclosure follows days of online uproar in Kenya over a Russian content creator accused of secretly recording encounters with women and posting the clips on social media platforms, including TikTok and YouTube.

Although there has been no official confirmation, many social media users speculated that smart glasses may have been used to film women in public spaces without their knowledge or clear consent.

The controversy has reignited debate about whether platforms are moving quickly enough to detect and remove harmful or exploitative material.

Smart glasses are capable of capturing photos and video hands-free. Meta, which manufactures one such product, says its glasses display an LED light to signal when recording is taking place and that its policies prohibit harassment or privacy violations.

However, privacy advocates argue that public awareness of such indicators remains limited.

Against this backdrop, TikTok said that 99.7% of the videos it removed in Kenya during the quarter were taken down before they were reported by users.

Furthermore, 94.6% were removed within 24 hours of being posted.

In addition to video removals, the company said it interrupted about 90,000 live sessions in Kenya over the same period for violating its content policies. That figure represents roughly 1% of all livestreams in the country during those three months.

Globally, TikTok reported removing 204.5 million videos between July and September, equivalent to around 0.7% of total uploads.

According to the company, 99.3% of those removals were proactive, while nearly 95% were taken down within a day.

Automated systems were responsible for 91% of the removals worldwide.

The report also states that more than 118 million fake accounts were deleted, alongside over 22 million accounts suspected of belonging to users under the age of 13.

Meanwhile, legal experts say the Kenyan controversy highlights gaps in how platforms handle covert recording.

In response to mounting scrutiny, TikTok says its moderation efforts rely on a combination of automated detection tools and human reviewers to tackle harmful content, including harassment and misinformation.

The company also says it has expanded wellbeing features aimed at helping users — particularly teenagers — manage screen time and build healthier digital habits.

Nevertheless, as new recording technologies become more discreet and accessible, questions remain over whether enforcement systems can keep pace with emerging forms of online abuse.