TikTok has poached hundreds of content moderators in Europe from outsourcing companies that serve social media rivals such as Facebook, as the group seeks to tackle a growing problem with harmful content.
The short-form video app, owned by China’s ByteDance, has been rapidly expanding its “trust and safety hub” in Dublin as well as hiring other moderators in London tasked with reviewing material posted by European users.
At least 190 of those who have joined since January 2021 previously worked for contractors such as Accenture, Covalen and Cpl, according to an FT analysis of public LinkedIn profiles.
Meta, the parent company of Facebook and Instagram, as well as YouTube and Twitter, are known to rely heavily on these contractors to oversee and remove some of the most violent and harmful content on their platforms. TikTok said it had hired several hundred moderators in the UK and Ireland since January last year, adding to the thousands it employs at similar hubs in California and Singapore.
This month, Meta chief executive Mark Zuckerberg blamed its slowing growth on younger users fleeing Facebook and Instagram in favour of TikTok, leading to more than $220 billion being wiped off the company’s value in a day. But with TikTok’s huge growth comes the problem of dealing with the worst excesses of users, an issue that has put leading social networks in the crosshairs of politicians and regulators across the world.
“Our continuous investment in our trust and safety operations reflects our focus on maintaining TikTok as a place for creativity and entertainment,” said Cormac Keenan, global head of trust and safety at TikTok.
Better benefits
The push meant TikTok’s European workforce rose by more than 1,000 in 2020, a year in which the company’s turnover in the region grew 545 per cent to $170.8 million. But according to UK Companies House filings, pre-tax losses widened fourfold to $644.3 million, “driven primarily by the increase in employees to support the growth of the business [in Europe]”.
TikTok’s strategy has been to offer moderators in-house positions with better salaries and benefits, luring experienced staff from the same limited talent pool used by Facebook, Instagram, YouTube and Snap.
Recruits often speak multiple languages and have experience in content moderation, according to people with direct knowledge of the hiring process. The company said languages were a “key consideration for prospective candidates”.
“I chose TikTok because the benefits are better, the environment is better, and the company values every member,” said one TikTok employee who joined last year from Accenture. “It was better for my career and I wanted to be able to work from home, which was a battle at Accenture.”
Another content moderator who moved from YouTube to TikTok said the levels of disturbing content in the job were similar, but that psychological support was better at TikTok.
Accenture, Cpl and YouTube did not respond to requests for comment and Covalen declined to comment.
Candie Frazier, a former content moderator in California, is suing TikTok, claiming the company failed to protect her mental health after she viewed extreme and violent videos. The company said it does not comment on ongoing litigation but has continued to expand a range of wellness services to support moderators.
Facebook previously agreed to pay $52 million to a group of thousands of US moderators who claimed they were left traumatised after watching disturbing content on the platform.
Meta said it offers wellbeing training and support for internal and contracted content moderators, breakout areas where reviewers can step away from their desks if needed, and technology that ensures reviewers are not exposed to potentially graphic content back-to-back for long periods.
Meta last month revealed that monthly active users of its services had dropped for the first time, to 2.9 billion. TikTok has more than 1 billion monthly active users, putting it in line with Instagram and above Snap, which has more than 500 million. – Copyright The Financial Times Limited 2022