Facebook's treatment of its content moderators: another attempt to avoid paying the costs of scale

Casey Newton has talked to many of Facebook's outsourced content moderators - and the picture is appalling.

Adam Tinworth

The last 24 hours’ must-read piece for social media-inclined journalists has to be Casey Newton’s astonishing reporting on the working conditions of Facebook’s content moderators. While the fact that Facebook treats its moderators badly shouldn’t really be a surprise, the poor conditions and trauma suffered by the staffers are still shocking:

It’s a place where, in stark contrast to the perks lavished on Facebook employees, team leaders micromanage content moderators’ every bathroom and prayer break; where employees, desperate for a dopamine rush amid the misery, have been found having sex inside stairwells and a room reserved for lactating mothers; where people develop severe anxiety while still in training, and continue to struggle with trauma symptoms long after they leave; and where the counseling that Cognizant offers them ends the moment they quit — or are simply let go.

Just think about how the offices of the Silicon Valley elite are usually painted: all the perks, the fun - and the massive salaries. That Facebook treats its moderators this way tells you everything you need to know about how important this job is seen to be at a corporate level.

And the impact on the staff employed there is brutal. Not only does the piece open with the emotional impact on a woman forced to watch and assess a video of a graphic assault, but the long-term impact on the staff’s worldview is also worrying:

The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”

The moderators themselves are being sucked into conspiracy theories. The people whose job it is to decide what content breaks policy are coming to believe that content. This is a profound failure of Facebook’s approach, which seems to give no thought at all to managing the consequences of this work — but that’s Facebook, isn’t it? A business dedicated to avoiding taking responsibility for the consequences of its success.

Facebook’s failure to address criticism

Facebook’s rapid PR response is less than encouraging:

A lot of the recent questions are focused on ensuring the people working in these roles are treated fairly and with respect. We want to continue to hear from our content reviewers, our partners and even the media – who hold us accountable and give us the opportunity to improve. This is such an important issue, and our Global Ops leadership team has focused substantial attention on it and will continue to do so.

Well, now: even the media help hold them accountable. The need to insert the word “even”, and the fact that it survived multiple layers of sign-off, speaks volumes about how little Facebook has liked the media’s reporting on it over the last 18 months, doesn’t it? An organisation that works so very hard to get us to be transparent about all our activities really dislikes that transparency being turned back on it.

Oh, and the post spends far more time offering excuses for why this has happened than making any practical suggestions for a way forward. In fact, it doesn’t mention addressing the vicarious trauma endured by the moderators at all.

Key takeaways

  1. This is another clear example of Facebook trying to minimise the costs of the scale it has achieved. It uses outsourced moderators because each warm body costs a fraction of what the engineers and other anointed ones in the major offices do.
  2. It also indicates that Facebook is still not taking its responsibilities here seriously. It’s doing the work as cheaply as it possibly can, which is shameful for such a wildly profitable company.
  3. The human cost of regular exposure to graphic material on social media is well-known in the journalism world, at the very least. Press Gazette did a good piece on it a couple of years ago, and First Draft has some useful guidelines. Facebook could stand to learn from that.

Adam Tinworth

Adam is a lecturer, trainer and writer. He's been a blogger for over 20 years, and a journalist for more than 30. He lectures on audience strategy and engagement at City, University of London.
