The news: TikTok is laying off hundreds of UK workers as it shifts more of its content moderation to artificial intelligence, according to the Wall Street Journal.
- The cuts are part of a global restructuring that also affects South and Southeast Asia, with the company claiming over 85% of takedowns already happen via automation.
- The move comes as the UK's Online Safety Act ramps up pressure on platforms to curb harmful content.
Zooming out: It’s not just TikTok. Other major players are recalibrating, too: Meta and X have pulled back from fact-checking and restrictions, reflecting a wider industry pivot away from intensive human oversight.
While Pinterest and Snapchat emphasize centralized controls, Reddit moderation skews community-led. Meta, TikTok, and YouTube sit in the middle with hybrid AI- and platform-led systems.
Why it matters: In the US, a plurality of social media users (44%) prefer fact-checkers to moderate misinformation, while 36% favor community-driven notes. Only 12% want no moderation at all, per a Two Cents Insights and Rep Data survey.
- Opinions on moderation practices split along partisan lines: Trump voters (17%) are far more likely than Harris voters (6%) to oppose moderation.
- Two-thirds of adults believe thriving online communities require clear rules and active moderation, alongside interaction tools like messaging. Scalability (36%) and navigation (26%) matter, but rule-setting and participation remain the pillars of digital community health.
- While TikTok doubles down on AI, users are asking for the opposite: less algorithmic content (44%) and stronger quality control (36%). Privacy (38%) and tools for genuine connections (31%) are also top demands.
Our take: A pullback from human moderation may reduce costs for platforms, but it risks eroding user trust—especially when users are calling for more human oversight, not less. That mismatch between user expectations and platform practices could affect engagement, ad safety, and brand risk.
For advertisers, this is a double-edged sword. Less moderation can create openings for broad reach and edgy creative, but also raises the chance of adjacency to harmful or misleading content. As consumers demand more governance, privacy, and authenticity, brands that prioritize safe, well-moderated environments may find themselves with a competitive edge in trust and loyalty.