A caravan of bots, and other news from the misinformation wars

Despite the efforts of the social platforms, the bots are getting more active — and more subtle.

Adam Tinworth

Fascinating reporting from WIRED on the use of bots to amplify the caravan story ahead of the US midterm elections:

Late last week, about 60 percent of the conversation was driven by likely bots. Over the weekend, even as the conversation about the caravan was overshadowed by more recent tragedies, bots were still driving nearly 40 percent of the caravan conversation on Twitter.

Buzzfeed also pulled together a good round-up of the current research into bot activity. One interesting nugget:

The study also suggests that the Internet Research Agency, which runs the Kremlin’s online disinformation efforts from its headquarters in St Petersburg, had to work harder to engage Americans with content designed to appeal to partisans on the left. The Microsoft Research team found that purchased Facebook ads were crucial for driving traffic spikes for the Internet Research Agency’s left-wing content, whereas right-wing posts that took off went viral without this assistance.

I suspect you might see different patterns if you applied the same methodology to the UK left and right, but it's interesting nonetheless.

Also interesting — if slightly more predictable — is that the misinformation teams are adapting their behaviours to try to avoid detection by the big platforms, which are now actively hunting for these sorts of accounts:

“We’ve done a lot of research on fake news and people are getting better at figuring out what it is, so it's become less effective as a tactic,” said Priscilla Moriuchi, a former National Security Agency official who is now a threat analyst at the cybersecurity firm Recorded Future.
Instead, Russian accounts have been amplifying stories and internet "memes" that initially came from the U.S. far left or far right. Such postings seem more authentic, are harder to identify as foreign, and are easier to produce than made-up stories.

So, rather than create stories wholesale, they're looking for the most useful content created by native partisans, and giving it an extra push towards virality. We're in the early stages of an arms race between the misinformation operatives and the social platforms. It will probably escalate fast over the next two years.

fake news, russia, internet research agency, bots, misinformation, social media manipulation


Adam is a lecturer, trainer and writer. He's been a blogger for over 20 years, and a journalist for more than 30. He lectures on audience strategy and engagement at City, University of London.
