MPs Urged to Stop TikTok Sacrificing Online Safety with Mass AI-driven Job Cuts

Unions and some of the UK’s highest profile online safety campaigners have today called on MPs to urgently investigate a proposed wave of over 400 job cuts from TikTok’s London office.

The redundancies are targeted at the ‘Trust and Safety Team’, which is responsible for protecting users and communities from harmful online content - including deep fakes, toxicity and abuse.

In an open letter to Chi Onwurah MP, Chair of the Science, Innovation and Technology Committee, online safety campaigners along with TUC General Secretary Paul Nowak and CWU General Secretary Dave Ward say the committee must investigate and examine the implications for UK online safety and workers’ rights.

Letter signatories – which include prominent online safety campaigners such as Ian Russell, Adele Zeynep Walton and Alice Hendy MBE – warn that up to 30 million TikTok users (including an estimated 1 million children under the age of 13) are at risk without safety-critical staff working in content moderation.

Threat to online safety

On the threat to online safety, the letter warns:

“Every single redundancy is targeted at the ‘Trust and Safety Team’, effectively ending content moderation in London - with similar cuts to human moderation happening worldwide.

“These safety-critical workers are the frontline of protecting users and communities from deep fakes, toxicity and abuse.

“The UK has 30 million TikTok users, of which well over 1 million are estimated by the official regulator to be children under 13 - despite TikTok’s own rules stating that 13 is the minimum age to create an account.

“TikTok is already subject to an investigation by the Information Commissioner’s Office for misuse of children’s data.

“And now it is looking to replace skilled UK workers with unproven AI-driven content moderation and with workers in places like Kenya or the Philippines who are subject to gruelling conditions, poverty pay and precarity as they toil for Big Tech’s billionaires.”

The CWU says the over 400 job cuts were announced just eight days before workers were scheduled to vote on union recognition with the United Tech and Allied Workers – with workers seeking job security and protection for whistleblowing on unethical practices.

Accusing TikTok of “cutting corners” at the cost of workers’ rights, user safety and the integrity of online information, signatories say TikTok must reverse the planned Trust and Safety redundancies and respect workers’ rights to organise.

The TUC says replacing UK jobs with AI and offshoring to countries where workers have fewer rights is “reprehensible”. The letter points out that there is “no proper business case” for the redundancies.

“Union busting” tactics

On TikTok’s “union busting”, the union leaders and other letter signatories say:

“The mass UK job cuts were announced just eight days before workers were scheduled to vote on union recognition with the United Tech and Allied Workers, the tech-focused branch of the Communication Workers Union. These workers sought a union for job security and protection for whistleblowing on unethical practices.

“There is no proper business case for making these redundancies. TikTok’s revenues are booming – with a 40% increase for the UK and Europe alone.

“Yet the company has decided to cut corners. We believe this decision is an act of union-busting - at the cost of workers’ rights, user safety and the integrity of online information.”

TUC General Secretary Paul Nowak said: “Replacing more than 400 safety-critical UK jobs with AI and smaller numbers of low-paid workers abroad is reprehensible. These devastating cuts will put millions of Brits – many of them children – at risk of accessing harmful content online.

“Select Committee MPs should now investigate the impacts for workers’ rights, user safety and the integrity of online information.”

Online safety campaigner Adele Zeynep Walton said: “At a time when the majority of British children have seen harmful content online, TikTok’s decision to cut safety staff will cost young people their lives. TikTok’s algorithm has consistently been found to amplify dangerous content that promotes suicide and self-harm to child accounts, and this decision would only exacerbate this crisis.

“As someone bereaved by online harms, I hear daily of new tragedies that could have been prevented if social media platforms put user safety over profit. Rather than following in the reckless footsteps of Meta and X, TikTok ought to prioritise user safety, if it genuinely cares about its community’s wellbeing.”

A TikTok content moderator said: “TikTok’s decision to drastically cut human moderation comes at the expense of our jobs and your safety.

“We take pride in working to make the internet a safe place for everyone and have serious concerns about TikTok’s cost-cutting and offshoring, which we know will have a huge impact on the platform’s safety.

“Despite TikTok’s best efforts to smash the union and silence our voices, they won’t succeed.

“We know organising is the only way to hold TikTok to account, protect jobs, and make the internet a better place.”

Calls for action

The signatories call on TikTok to:

  • Reverse the planned Trust and Safety redundancies in London
  • Maintain robust human-led content moderation standards
  • Respect workers' rights to organise without interference

And ask Chi Onwurah MP to:

  • Investigate these developments through the Science, Innovation and Technology Committee, inviting union representatives to provide evidence on the importance of human moderation and on TikTok's union-busting.
  • Examine the implications of TikTok’s actions for UK online safety and workers’ rights and investigate what legislative steps could be taken to prevent offshoring or replacing human moderators with AI.
