Trump, QAnon and the limits of the “viral” metaphor to describe online communities

Deplatforming Trump and QAnon won’t kill the movement. But it might just stop it growing as fast.

Adam Tinworth

The last few months have made it very plain that there are limits to how far the “viral” metaphor will go online. The best part of a year living with a true viral pandemic has reminded us what viral really means. And last week’s assault on the US Capitol makes it clear that misinformation is not a virus, at least not in the way that we now all understand the word.

Yes, when Richard Dawkins coined the word “meme” and described his concept of the viral spread of ideas, he was certainly on to something. Ideas do spread like viruses, incubating in hosts and then spreading to others. But the metaphor is not exact.

Unlike viruses, ideas that spread virally create communities, and that’s where their true danger lies. A biological viral infection is something your immune system fights off, and you live, or it fails to fight off, and you die. A memetic infection lodges in your brain and binds you to others in a community, with the memetic ideas as the social object you bond around. And it’s harder to cure a community than it is to cure an idea.

It’s hard to kill a community

Look, nobody really needs a middle-aged Brit’s take on the Capitol riots. But in the billions of pixels of writing the last week has generated, I have noted one very distinct trend: attributing the situation to a mix of misinformation, social media and Trump. This, the thinking goes, is an information channel problem that we must fix. That approach is very much a continuation of the narrative of information flows that’s developed around misinformation and polarisation: the information, and those who generate it, are responsible for the problem. Cut off the information flow, and the problem goes away, right?

Well, no.

QAnon is now a community, and that’s what will make it so very difficult to stop. By booting these people off the mainstream platforms, you do take away reach: you go some way to slowing the spread of the memetic infection. But you don’t do anything to cure the infection, nor do you take away the ability of QAnon advocates to gather. They don’t need the mainstream platforms for that.

You see, the QAnon movement wasn’t born on any of those networks: it was born on the 4chan anonymous message board. The central figure of “Q” — who may actually be a group of people — started posting there, and that’s where the cult began. The community formed around the Q drops, and it has persisted even as repeated and focused efforts to shut down 4chan — and its more offensive spin-off, 8chan — have succeeded. Once a community has formed, it can move. It is not dependent on the host platform. QAnon will continue, in forums, on message boards, on Discord servers, in messaging app chats: anywhere people can build a community online is somewhere this community can thrive.

Misinformation ideology as social object

The idea of Q and the war against the corrupt elite that Q philosophy espouses is a social object that the community has gathered around. That community has forged bonds and relationships that will persist through the clampdown on its activities, and which might actually be strengthened by it — that sense of shared oppression can bring people together in a compelling way.

In any reasonable estimation, cutting off Trump and the most egregious QAnon groups from mainstream platforms merely retards the growth of the movement, rather than countering it. And to understand why, you need to understand the role Trump and the social media platforms play in this.

Trump

Trump himself is now merely another social object that the QAnon community gathers around. In fact, the social object is less Trump himself, and more the idea of the crusader Trump that is central to the QAnon ideology. When the real Trump strays from the boundaries of the conceptual Trump, his followers assume deception. There is a communal sense of him as their leader, even if it doesn’t match the reality of the man. Presented with evidence to the contrary, they reject it, and seek to rationalise it away.

Social Media

Much of the mainstream debate is about the big platforms we already know: Facebook, Twitter, YouTube, et al. They have been rapidly deplatforming Trump. But they’re too late, and they might not even be doing the most effective thing.

Given, as we discussed above, that the major social platforms weren’t central to the creation of QAnon, what has their role been? Recruitment and amplification. Where these platforms excel is in putting this extremist content next to much more mainstream content, algorithmically greasing the transition from mainstream political positions to more extreme ones, without the new convert having to hunt out the more obscure places where the ideology grew.

They are engines of acceleration and growth.

So, yes, there is value in driving these fringe ideologies off the major platforms: doing so slows their growth. It’s the social distancing of viral misinformation.

This is a social issue, not merely a technology one

However, we won’t really start to address the problems that led us here until we start seeing this as a community issue, rather than an information one. Once the misinformation is out there and people are bonding around it, you’re too late to kill it by cutting off the information flow: the community has formed. It will generate its own information and media internally, and use that to grow instead.

To borrow from the viral metaphor again, what you have to do is isolate those communities as quickly as you can, to prevent the spread of the memetic infection, and then work on cleaning up the infection itself. In many cases, the only way to do that is to try to separate people from the toxic community, and help them reintegrate into more normal information spheres. And, right now, we have no idea how to do that.

Deprogramming online cultists

I suspect we have a lot to learn from people who work on cult deprogramming here. Alex Kantrowitz published an interview with cult deprogrammer Rick Alan Ross just before Christmas that’s a valuable read in this context:

The online cult members go to the YouTube channel, the web platform of the group and they become completely consumed with the information that the group is putting out online. And this is to the exclusion of any other information to balance it. And so in a sense, they cocoon themselves online and it’s very hard to penetrate that. And it can become a reinforcing wall that is built around an individual that we don’t see, but they live in it on a daily basis.

This is the reason that the internet is so powerful at creating these movements: it’s easy to slip into self-reinforcing bubbles of like-minded people, who support and validate each other in their views. You can create worlds for yourself where you never have to face the cognitive dissonance of engaging socially (as opposed to argumentatively) with someone who holds very different views from you.

Complex problems don’t yield to simple solutions

This is a complex problem, born out of a potent brew of social isolation, technological development and intentional bad actors. And, as with most complex problems, simple solutions won’t have much of an impact. Yes, getting the most extreme beliefs off mainstream platforms will help, but it won’t cure the problem.

If anything, we need to work harder at social inoculation: what’s the human version of vaccination? How do we make people less vulnerable to conspiracy thinking and cults? At one level, a little more human kindness will help. The fewer people who feel alone and isolated, the smaller the recruiting pool in the early days of these movements.

Equally, we need to pay an awful lot more attention to what’s happening in out-of-the-way corners of the internet. People laughed at QAnon as a weird expression of internet culture, just as they did with predecessor movements like Gamergate and the incel movement. We have form for this. We didn’t take the reporting on state-sponsored misinformation seriously enough, early enough, and, despite it being well reported in the mid-2010s, it was left to run unchecked and stoke polarisation in many countries.

We have to get better at this. Likewise, we have to spot these movements earlier, and start acting to counter them more quickly, rather than writing yet more “ho, ho, isn’t the internet strange?” articles.

Can we nip dangerous misinformation-based ideologies in the bud?

By the time an extreme idea has become a social object, and a community has formed around it, you’re too late. The momentum is building. We need to be better at countering this in the early days. And that’s a societal problem as much as a governmental one.

But there’s also a challenge for journalism here: what are we amplifying, and why are we amplifying it? Like the social platforms we’re quick to criticise, we are amplification engines, too.

What role have we played in normalising dangerous, extreme and delusional ideas? That’s a hard and challenging question to answer. Which is why so many are keen to pin the blame solely on the social platforms. Yes, they bear culpability. But the problem is broader and deeper than them. They’re just an excellent amplifying mirror.


Adam Tinworth

Adam is a lecturer, trainer and writer. He's been a blogger for over 20 years, and a journalist for more than 30. He lectures on audience strategy and engagement at City, University of London.
