TikTok is being threatened with legal action over cuts to its UK online safety teams.
In August, the social media company announced more than 400 workers would lose their jobs, with AI replacing some of the staff and other roles being rehired overseas.
TikTok is accused of threatening these safety workers with redundancy days before they were due to vote on forming a union.
Read more: TikTok moderators warn users may be at risk
Now, two moderators have sent a legal letter to TikTok laying out the terms of a possible legal case on grounds of unlawful detriment and automatic unfair dismissal.
Unlawful detriment is when an employer treats a worker unfairly because they exercised a protected employment right, for example, acting as a union representative, asking for flexible working or whistleblowing about the company.
“In June, TikTok said it was going to hire hundreds more content moderators, then two months later, they fired everyone,” said Stella Caram, head of legal at Foxglove, a non-profit supporting the moderators.
“What changed? Workers exercised their legal right to try to form a trade union. This is obvious, blatant and unlawful union-busting,” she said.
TikTok has been given one month to respond to the legal claim.
A TikTok spokesperson said: “We once again strongly reject this baseless claim.
“These changes were part of a wider global reorganisation, as we evolve our global operating model for Trust and Safety with the benefit of technological advancements to continue maximising safety for our users.”
The two moderators launching the case are working with the United Tech & Allied Workers (UTAW), the non-profit legal organisation Foxglove and law firm Leigh Day.
In exclusive interviews last month, three whistleblowers told Sky News the cuts would put UK users at risk, a claim repeated by Julio Miguel Franco, one of the moderators behind the legal action.
“TikTok needs to tell the truth,” he said.
“When it says AI can do our job of keeping people safe on TikTok, it knows that is rubbish.
“Instead, they want to steal our jobs and send them to other countries where they can pay people less and treat them worse. The end result is TikTok becomes less safe for everyone.”
Read more on social media:
Online sleuths and fake news: The world of missing people
Parents of sextortion victim who took his own life sue Meta
Internal documents seen by Sky News show that TikTok planned to keep its human moderators in London for at least the rest of 2025.
The documents lay out the growing need for dedicated moderators because of the increasing volume and complexity of moderation.
TikTok’s head of governance, Ali Law, also told MPs in February that “human moderators … have to use their nuance, skills and training” to be able to moderate hateful behaviour, misinformation and misleading information.
After a series of letters between TikTok and MPs, Dame Chi Onwurah, chair of the science and technology select committee, said she was “deeply” concerned about the cuts.
“There is a real risk to the lives of TikTok users,” she said.
Last month, in an exclusive sit-down with Sky News, however, Mr Law said user safety would not be compromised.
“We set a high benchmark when it comes to rolling out new moderation technology.
“Specifically, we make sure that we satisfy ourselves that the output of existing moderation processes is either matched or exceeded by anything that we’re doing on a new basis.
“We also make sure that the changes are introduced on a gradual basis with human oversight so that if there isn’t a level of delivery in line with what we expect, we can address that.”