Case against Facebook an opportunity to improve working standards

Content moderators ‘suffering injuries’ to protect internet users from harmful material

Monika Bickert, head of global policy management at Facebook: ‘These jobs can be very challenging. That goes for the full-time employees as well as the contractors.’ Photograph: Mark Wilson/Getty Images

The case against Facebook being pursued in the Irish High Court by a content moderator is being seen as an opportunity to improve international working standards at the internet giants.

Cori Crider, director of Foxglove, a UK-based not-for-profit group supporting the Irish case against Facebook, said moderators employed directly by Facebook or contracted were “the guardians of our public conversation”.

“In a few years’ time we are going to look back on these conditions and see them the way that we now see early unsafe factory work in a steel mill or a meat packing plant in the early 20th century,” she said.

“There may have been a time when Facebook and YouTube and other big social media companies did not realise that this system of work posed an unacceptable risk of PTSD, but I think that time has long since passed.”

Amnesty International has said there is a “serious human rights concern” about the work done by content moderators for international tech companies.

“If people are suffering mental health injuries as a foreseeable result of the work they do, the type of content they’re exposed to, and the conditions in which they’re exposed to that content, that is a serious human rights concern,” said Joshua Franco, a senior research adviser at Amnesty Tech, the organisation’s technology and human rights arm, who has spoken to a number of content moderators in Ireland and internationally.

‘Suffering injuries’

“It’s a necessary job that they’re doing to protect other users of the internet, but they’re suffering injuries as a result. There has been reporting on this issue in various contexts around the world for years now, so we think this is an issue that companies and the sub-contractors who work for those companies should be aware of, and should be taking steps to mitigate. Human beings can’t be treated as expendable in the quest for a cleaner internet.”

Ms Crider believes the legal action taken in Dublin could help bring the working practices of social media companies into line with EU safety standards for their employees across Europe and further afield, given that Facebook’s offices in Dublin are responsible for Europe, the Middle East and Africa.

She said she had spoken to former Facebook employees made ill by as little as 10 to 15 hours of exposure to the images they viewed as part of their work.

At last month’s Seanad hearings of the International Grand Committee on Disinformation and “Fake News” – a group comprising parliamentarians from 10 countries – a senior Facebook executive acknowledged that the work of moderators was difficult.

“These jobs can be very challenging. That goes for the full-time employees as well as the contractors,” Monika Bickert, head of global policy management at Facebook, said, responding to a question from Fianna Fáil TD James Lawless.

The company, she said, placed a major focus on providing resources to the employees and contractors doing this work.

Expensive

Ms Crider conceded it would be expensive for Facebook “to fix” its “factory floor”, but said the company had to accept that it depends on these moderators to protect the integrity of the social network, and to acknowledge that it has “a systematic problem on their hands”.

“I do not deny that fixing it will be expensive but a company that consistently posts profit margins of about 40 per cent and feels that it can expand into cryptocurrency can afford to treat its workers better,” she said.

A spokeswoman for Facebook said it recognised “reviewing certain types of content can sometimes be difficult”, but that it provided extensive training and full-time support to moderators, and employed technical solutions to limit their exposure to graphic material as much as possible.

“This is an important issue and we are committed to getting this right,” said the spokeswoman.