Liveblogged notes from the “Fake News” event at City University, co-organised by The Media Society and the Student Publication Association. Prone to error, inaccuracy, horrible typos and screaming crimes against grammar and syntax. Post will be improved over the next 48 hours.
Jonathan Hewett, our chair, isn’t keen on the phrase “fake news”. What are we actually talking about? Intentionally produced misinformation, that’s designed to be propagated.
Alastair Reid, ex-First Draft News, now independent digital journalist
First Draft was initially set up to deal with user generated content from breaking news events – and the verification of it. But over the last 18 months the rapid growth of misinformation and disinformation has shifted that focus.
Megan Lucero, the Bureau of Investigative Journalism
That phrase is not one she’ll be using. It’s being used to delegitimise the press, and so we shouldn’t use it. Yes, a small sliver of it is bad journalism, but there’s far more misinformation, propaganda and so on. There’s a glut of information – we need more journalists to sift through it.
James Ball, special correspondent, Buzzfeed
He’s writing a book on bullshit and “fake news”. The web is full of hoax sites pushing deceptive news – the pantomime villain behind this era of bullshit. You can draw a distinction between fake news and hyperpartisan news, or accurate stories spun out of all proportion. Lots of people on the centre left are sharing stories about “Trump’s secret plan” – the same sort of conspiracy-theory thinking that we criticise the far right for. But this isn’t going to be solved by Google withdrawing ads, or Facebook tagging content.
How did we get here?
AR: Social media has a huge role in that. Anyone can publish anything they want to. The barriers to reaching thousands or millions of people have collapsed. Five years ago everyone was hailing the Arab Spring as a wonderful example of social media use. But that freedom has now been weaponised by those with an agenda. 15 years ago blogging was being talked about in the same way.
ML: It’s exploited a very beautiful part of humanity: trust and integrity. People trust what they read. Your world view was the paper you subscribed to. We have to change that. We need to critically assess what we read online.

The business model problem
JB: The business models of news work against accurate news, and incentivise inaccurate news. The viral video of a woman striking back at a van driver is a perfect example: viral content bought by an agency, without anyone checking that it was true. The Sun started poking into it, and found some evidence it was staged – but by then everyone had made their money, money they wouldn’t have made if they had paused to check it.
At the far end of that business model are people who have no content creation costs – because they make it up and push it out to millions. An extreme partisan site like The Canary has an incentive to go extreme on a story – and “the mainstream media won’t tell you” is a real hook. The key ingredients for them are “bombshell”, “evil Tories” and “Laura Kuenssberg/the BBC covering it up”. We can’t ignore the economics of that.
James Ball makes a good point – we have to re-examine the internet's business model. Cites the recent cyclist video. #cityfakenews
— Keumars Afifi-Sabet (@Keumars) March 1, 2017
JH: Sensationalism doesn’t necessarily mean inaccurate. It’s a way of selling a story.
ML: Language is powerful. We know that. Some of those words can grab people.
AR: The current business model rewards clicks and sensationalism, and so encourages this. What would a business designed for trust look like? A paywall limits access to good information to those able to pay for it.
JB: “Sponsored Links” or “More from the web” content promotion links essentially push false stories, or very, very low quality information. They’re important parts of many sites’ business models – but they feed these sketchy sites. Buzzfeed don’t run display ads. Their viral content teams follow the same editorial standards as everyone else. They run debunking posts as well as reporting posts. Sponsored content pays for this. The idea is to figure out what goes viral but stays trusted. The headlines do what they say.
AR: There aren’t clear enough flags in things like the Facebook news feed for us to see what site the story actually comes from. It’s hard to differentiate between a trusted news site and a nonsense social publisher.
ML: How do we help people determine what is trusted and what isn’t? One researcher created a list, but it had Private Eye and The Onion on it. So satire is a problem? No. The great thing about the internet is it is a level playing field, and anyone can build an audience. Making a list of the “good” sites goes against that.
JB: The New Yorker ended up on one of those lists because it has a satire column. Letting lots more people into the conversation brings great upsides in new journalistic voices, as well as the problems. Boringly, it comes back to educating people.
Sponsored content and trust
JH: Does sponsored content erode trust?
JB: I came from The Guardian which is known for being not-for-profit… It can be very pious about it. Buzzfeed works differently. Commercial watches what works well for editorial, and then does a version of that with a sponsored section at the end. If that’s not defensible, no advertising is.
Readers are probably less aware of this stuff than we think. About ⅓ of users can’t differentiate the paid and organic results in Google. So do they understand what a Telegraph supplement sponsored by China is?
AR: It’s very hard to quantify the effect of fake news on voting. Most of the studies have looked at articles, and ignored the memes and viral images. The Britain First Facebook page is pushing out issue-based memes all the time, and has more followers than the main political parties put together.
Media literacy and social reinforcement
JH: Should we be doing more on media literacy? Should people pay more attention to what they share?
AR: Social media is designed for peer to peer sharing of visual information. We are more likely to believe information from peers and information that is communicated visually. And the feedback mechanisms are all positive reinforcement, not negative reinforcement. People who try to correct often get attacked for “pushing mainstream media propaganda”.
JB: And some of the fixes can make it worse. Try posting something to Facebook that your friends will disagree with – and enjoy the next two hours being screamed at by your friends, even if it’s not true. We fetishise fact checks – but you only look for them when you hear something you don’t like. Fact-checking culture is also really easy to hijack. The “Bowling Green massacre” fact check led to Sean Spicer releasing a list of 78 “underreported” attacks; the newspapers’ triumphant fact-checking of that list just brought the Trump agenda to a bigger audience. They got played.
ML: We can’t regulate people’s personal lives – but we can hold politicians to account. We can disincentivise lying by politicians by making sure there are political consequences for it. We need to do this, because we don’t want a completely corrupt government system. Journalists need to claim this back. There’s a real gap at the local level with the decline of local newspapers – so how do we hold local politicians to account before they get to national level?
Is this symptomatic of a lack of trust in institutions?
JB: It’s not a coincidence that partisan politics and hyper-partisan news have risen at the same time. There’s a lack of confidence in our institutions. But it’s not as simple as one creating the other. We need to break the cycle. The Remain campaign was a perfect example of how not to campaign: putting three people from different parties on stage didn’t show diverse voices saying “remain” – it showed the establishment.
ML: Same with the Clinton campaign.
AR: Social media promotes content that makes you feel good, so it’s easier to retreat into your comfort bubble.
ML: I’m reluctant to go too far down this line – only a sliver of this is bad journalism.
Is this just growing pains of a new medium?
JB: I agree completely. It’s really easy for national newspaper journalists to write for their peers and awards, and not encounter the general public. Now that gap is gone. We can’t hide from what people don’t understand – and we can’t hide when we’re wrong and being told how wrong we are. Yes, it’s a growing pain – but that doesn’t mean we shouldn’t take effort to address it.
ML: At some point people are going to decide who they trust. Eventually all the stuff being flung at us won’t stick, and the good stuff will.
Does human-edited content have a role? Twitter has Moments; Facebook took stick for removing humans from the loop. Is this a solution?
AR: This has been an ongoing conversation. But then the question comes up: who is doing it? What’s their background? And the networks doing this tend to be far more opaque. There’s a big role for everyone, from media literacy in schools, through the networks, to us being more transparent about how we know what we know. Linking, for god’s sake. Just linking to where we got things from. If corrections are tiny or not seen at all, then there’s no trust. If Kellyanne Conway won’t apologise for her mistakes, and we don’t apologise for ours – who will trust us?
JB: Facebook’s human curation of trending was accused of suppressing right-wing content.
Being informed is a process – and the average person isn’t an investigative reporter. It’s people waiting for an Americano in Costa, flipping through their feeds and sharing. What does an investigative workflow for ordinary people look like?
ML: Well, for one, people could start reading articles before they share them. We shouldn’t regulate the internet. We’re still teaching for an old era, not for the modern era. We need to teach people to critically assess the site, the source – and look for a second source. What is the URL? Are there sourcing links? Critically assess.
AR: I’d love to see the social networks run critical thinking house ads.
JB: This is way bigger than social media. There’s no bigger filter bubble than buying one newspaper or watching one news channel.
An attendee pointed out that the audience of the US shock jocks is bigger than all other media…
AR: The New York Times and Washington Post are doing a tremendous job. They’re getting themselves out there in front of that – their new mottos drive home the value of journalism. Too many places assume that the public like and trust us. There’s a real need to go out and make a positive case for journalism.
JB: I can’t talk too much about the Trump dossier as I didn’t work on it. Ben Smith has talked really well about it. There’s a real danger of labelling things “fake news” because we don’t agree with them. The dossier itself is real; its veracity was what was being debated.
ML: Polls aren’t great for data, as we’ve seen. But data journalism is crucial for the debate we’re having. We need to use data in every facet of our decision-making. You’re producing data about yourself on your smartphone – and people are weaponising that and using it against you. Classic journalism doesn’t help you navigate through that – but data journalism does.