TikTok has released its Q2 2024 Community Guidelines Enforcement report, shedding light on its content moderation efforts in Kenya.
This follows growing scrutiny after a petition was submitted to Parliament last year, calling for the platform’s ban due to concerns about the spread of inappropriate content.
Why It Matters
• Government Scrutiny: TikTok’s content moderation efforts have been under the spotlight in Kenya, where lawmakers expressed concerns about harmful content circulating on the platform. The petition to ban TikTok was rejected in September 2023, but the Kenyan Parliament urged the platform to ramp up its moderation efforts.
• Transparency and Accountability: TikTok’s latest report aims to provide greater transparency, showing that over 360,000 videos were removed in Kenya for violating its policies during Q2 2024. This proactive approach to moderation reflects the company’s commitment to improving user safety and meeting regulatory expectations.
Details
• Video Removals: 360,000+ videos were removed in Kenya, accounting for 0.3% of all uploads during the period. Of these, 99.1% were flagged and taken down before users could report them.
• Account Suspensions: TikTok also suspended 60,465 accounts for policy violations. A significant portion of these (57,262) were removed because they were suspected to belong to users under the age of 13, in line with the platform’s policies protecting younger audiences.
• Global Reach: Globally, TikTok removed over 178 million videos in June 2024, 144 million of which were taken down automatically. This underscores the platform’s reliance on AI-powered moderation tools to detect and remove harmful content quickly, often before it reaches users.
What’s Next
TikTok continues to invest in advanced AI moderation technologies, with a global proactive detection rate of 98.2%. This technology is expected to play a pivotal role in further reducing harmful content across the platform, improving the user experience, and aligning with regulatory demands.
Source: Nairobi News