Misinformation spreading on social media is killing people:

Over the last two months, misinformation spread through WhatsApp in India has riled up lynch mobs that killed twelve people.

People sometimes talk about the Twitter mob - and mob justice on social media is a problem - but this is palpably worse. Inevitably, tools like WhatsApp, which speed up human conversation, can propagate damaging information as readily as useful information. And information arriving in a closed chat environment is further separated from its context, while gaining the social weight of being passed on by a friend.

It's no wonder that this problem is popping up all over the world.

The mob justice problem

It's worth noting that this problem isn't confined to social media. Nearly 20 years ago, a paediatrician's house was attacked in Newport after people confused her job title with "paedophile". And that came off the back of a sustained name-and-shame campaign by the (now defunct) News of the World.

However, you can't escape the fact that digital messaging tools have increased the potential for this, and boosted the speed with which it can happen.

What is Facebook, WhatsApp's owner, doing about it? Ironically, it's running full page newspaper ads:

The ads begin with the headline “Together we can fight false information” and offer tips on how users can identify whether news they read via shared messages on WhatsApp is true or not. The ads will run in both English and Hindi and appear in national and regional newspapers around the country.

Tactical versus strategic approaches to dealing with misinformation

This is interesting in the context of an Indian government effort to regulate the news market to deal with the "fake news" problem - both misinformation and disinformation. And it appears to be going badly:

The government’s attempt at regulating fake news by suspending journalistic accreditation was a spectacular failure. Journalists viewed the circular as stifling the press and it was recalled.

Essentially, I think we're still at the stage of trying tactical solutions to a problem that needs a strategic cultural shift. These digital tools aren't going away, so we need to work out the new norms for living with them - through a combination of research and experimentation, followed by education - to help people make smarter decisions about what they believe, as far as our own cognitive biases allow.

The Indian academics quoted above - Sohini Chatterjee and Akshat Agarwal, research fellows at the Vidhi Centre for Legal Policy - have some suggestions:

  • ensure critical media literacy, with critical digital literacy as a component
  • nurture a general culture of scepticism among citizens towards information
  • in a limited set of situations, such as when there is threat to life or national security, targeted and proportionate legal interventions can be explored

Investing in research

Facebook is also doing something that is relatively easy for it, but which should be helpful: throwing money at the problem.

The WhatsApp Research Awards will provide funding for independent research proposals that are designed to be shared with WhatsApp, Facebook, and wider scholarly and policy communities. These are unrestricted monetary awards that offer investigators the freedom to deepen and extend their existing research portfolio. Applications are welcome from individuals with established experience studying online interaction and information technologies, as well as from persons seeking to expand their existing research into these areas.

The societal problems thrown up by our digital communication tools are almost certainly the single most important area for research if we want any degree of political and social stability in the coming decades.