Coronavirus: the social media moderation problem explained

As the social platforms empty their offices of staff, content moderation is taking a serious hit. Here's why.

Adam Tinworth

Why are we seeing spikes in Facebook content marked as spam, or YouTube videos being pulled down?

The Covid-19 pandemic is leading to a worldwide period of social distancing, with as many people as possible asked to work from home. This helps slow the spread of the disease, hopefully easing the pressure on health services.

That means staff from the social networks are being sent home, including content moderators. Most content moderators work in call centre-like facilities where social distancing is not feasible.

Because, in many cases, content moderators cannot work from home, the platforms are relying on machine-learning systems to police content.

Unfortunately, as we learned on 17th March, these systems can be buggy and make serious mistakes.

Why can't the moderators work from home?

There are very serious privacy concerns about letting people working from home have access to other people's private posts. Remember the fuss about contractors having access to people's recordings from smart speakers? Very similar concerns apply here.

As a result, the moderation systems are built on the assumption that moderators will come into the office. They are locked down and monitored so that staff can't share what they see in people's private accounts. These systems appear to have been built without remote access in mind: the privacy risks, and the difficulty of supervising moderators remotely, made it seem like too much work for too little reward.

So, why are the automated systems struggling?

This is a really hard problem to solve.

The systems have to recognise patterns of misleading content and pull them down. That requires a whole bunch of insight into how people phrase things, which sites can be relied on, and so on. And some of those are judgement calls. That's why a combination of AI tools and humans works so well: the AI handles the easy stuff, leaving humans to deal with the rest.
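To make that division of labour concrete, here's a minimal sketch in Python. It assumes a classifier that returns a violation confidence between 0 and 1; the thresholds, names and toy classifier are all illustrative, not any platform's real pipeline.

```python
# Illustrative sketch of hybrid moderation triage: the model's confidence
# decides whether content is removed automatically or queued for a human.

AUTO_REMOVE_THRESHOLD = 0.95   # hypothetical: clear-cut violations
HUMAN_REVIEW_THRESHOLD = 0.60  # hypothetical: uncertain judgement calls


def triage(post_text: str, classify) -> str:
    """Route a post based on a model's violation confidence in [0, 1]."""
    confidence = classify(post_text)
    if confidence >= AUTO_REMOVE_THRESHOLD:
        return "remove"        # the AI handles the easy stuff
    if confidence >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"  # judgement calls go to a moderator
    return "allow"


# Toy lambdas stand in for a trained model's confidence scores.
print(triage("Miracle cure! Doctors hate it!", lambda t: 0.97))  # remove
print(triage("Is this home remedy safe?", lambda t: 0.70))       # human_review
print(triage("Stay home, wash your hands.", lambda t: 0.10))     # allow
```

With moderators unavailable, the middle tier of that sketch has nowhere to go, which is exactly the problem the platforms now face.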

In the absence of the human element, the idea seems to be to turn up the strength of the AI element, risking false positives in the process. As Google said about YouTube:

As we do this, users and creators may see increased video removals, including some videos that may not violate policies. We won’t issue strikes on this content except in cases where we have high confidence that it’s violative. If creators think that their content was removed in error, they can appeal the decision and our teams will take a look. However, note that our workforce precautions will also result in delayed appeal reviews.

So, even when the systems are functioning well, we should expect to see more false positives and more misinformation slipping through. The general quality of moderation will go down.
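To see why leaning harder on the AI produces more false positives, here's a toy Python illustration: lowering a hypothetical auto-removal threshold (turning up the "strength" of the AI element) catches more genuine violations, but sweeps up innocent posts too. All the numbers are invented for illustration.

```python
# Toy illustration of the trade-off: without human review, a lower
# auto-removal threshold removes more content, including innocent posts.

posts = [
    # (model confidence that the post violates policy, actually violates?)
    (0.98, True), (0.92, True), (0.75, False),
    (0.70, True), (0.65, False), (0.40, False),
]


def removal_stats(threshold: float):
    removed = [(c, bad) for c, bad in posts if c >= threshold]
    false_positives = sum(1 for _, bad in removed if not bad)
    return len(removed), false_positives


for threshold in (0.95, 0.60):
    removed, fps = removal_stats(threshold)
    print(f"threshold {threshold}: {removed} removed, {fps} wrongly removed")
# threshold 0.95: 1 removed, 0 wrongly removed
# threshold 0.60: 5 removed, 2 wrongly removed
```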

Any good news?

Yes, actually. The big social media platforms are working together to co-ordinate their response to coronavirus and Covid-19 misinformation.

Further reading

Social media giants warn of AI moderation errors as coronavirus empties offices
Alphabet Inc’s YouTube, Facebook Inc and Twitter Inc warned on Monday that more videos and other content could be erroneously removed for policy violations, as the companies empty offices and rely on automated takedown software during the coronavirus pandemic.
Coronavirus and the emergency in content moderation
Today, let’s talk about some of the front-line workers at Facebook and Google working on the pandemic: the content moderators who keep the site running day in and day out. Like most stories about content moderators, it’s a tale about difficult tradeoffs. And actions taken over the past few days by F…

Adam Tinworth

Adam is a digital journalism lecturer, trainer and writer. He's been a blogger for over 20 years, a journalist for 30 and teaches audience strategy and engagement at City St George’s, London.
