TikTok is being accused of "backtracking" on its safety commitments after putting hundreds of moderator jobs at risk in its London office.
In August, the company announced that hundreds of jobs were at risk in its Trust and Safety offices.
In an open letter to MPs, the Trades Union Congress (TUC) said on Thursday that TikTok is "seeking to replace skilled UK staff with unproven AI-driven content moderation and with staff in places like Kenya or the Philippines who are subject to gruelling conditions, poverty pay".
Sky News understands that a number of the 400+ moderators losing their jobs will be replaced by agency workers in other countries, as part of TikTok's efforts to streamline its trust and safety operations.
TikTok's moderators in Dublin and Berlin have also reported they are at risk of redundancy.
Now, the chair of the Science, Innovation and Technology Select Committee, Dame Chi Onwurah MP, has told the company the job losses "bring into question" TikTok's ability to protect users from harmful and misleading content.
"TikTok appear to be backtracking on statements it made only half a year ago," said Dame Chi.
"This raises alarming questions not only about its accountability […], but also about its plans to keep users safe.
"They should provide clarity urgently and answer key questions on its changes to its content moderation process, otherwise, how can we have any confidence in their ability to properly moderate content and safeguard users?"
She set a 10 November deadline for the firm to respond.
In an exchange of letters with the social media giant, Dame Chi pointed out that as recently as February this year, TikTok's director of public policy and government, Ali Law, had "highlighted the importance of the work of staff to support TikTok's [AI] moderation processes".
In the exchange that the committee published on Thursday, Mr Law said: "We reject [the committee's] claims in their entirety, which are made without evidence.
"To be clear, the proposals that have been put forward, both in the UK and globally, are solely designed to improve the speed and efficacy of our moderation processes in order to improve safety on our platform."
A TikTok spokesperson also told Sky News: "As we set out in our letter to the committee, we strongly reject these claims.
"This reorganisation of our global operating model for trust and safety will ensure we maximise effectiveness and speed in our moderation processes as we evolve this critical safety function for the company with the benefit of technological advancements."
Last month, TikTok moderators told Sky News that young people in the UK may be exposed to more harmful content if the redundancies go ahead.
"If you speak to most moderators, we wouldn't let our kids on the app," said one moderator, who asked to remain anonymous. He spoke to Sky News at a protest outside the company's UK headquarters.
At the time, TikTok told Sky News they "strongly reject these claims".