The need for intelligent skepticism, proved through links
Can you trust your commenters? Can you trust your transcription service? In the digital world, here be dragons…
Curate your comments — or don't bother
This is an excellent piece exploring the challenge of maintaining the value of reader comments without exposing you and your community to the vilest problems that can emerge. I've shifted position on this over the years: from thinking comments should be the default, to believing you should only open comments if you're prepared to invest the time and people to manage them properly.
If you want an example of how toxic comments can be, look at the responses on any Spectator article that strays from a conventional right wing position.
This article, though targeted at the US election, is applicable much more widely.
Beware the Otter (AI)
This tweet by BuzzFeed's misinformation expert Jane Lytvynenko is worth noting:
I've used Otter for transcribing videos and talks, and found it valuable. But for anyone dealing with confidential sources, this should raise significant concerns.
Facebook finally acts
Well, now, these are interesting, when put together:
So much activity in just a few days, targeting the one end of the political spectrum that Facebook has tended to be hands-off with, is telling. Something is in the air, and Facebook is shifting to protect itself.
As Alex Hern put it in his newsletter:
I think Mark Zuckerberg made a calculated gamble, that staying on the right side of the Republican Party would protect Facebook from suffering at the hands of regulation it did not want. And I think the company is now starting to make a similar calculation: that it needs to begin severing those links.
While you should never count your presidential chickens, Biden has been consistently ahead in polling in the US. If he wins, the easy ride that Facebook has given the less reality-based cohort of the US right might come back to haunt it.
And if Hern is right, and I rather suspect he is, then this should concern us all. A single company is using its power to manipulate the political landscape in its favour, by allowing the spread of outright propaganda. This is… troubling.
Casey Newton offers an alternative take: that Facebook is acting now to head off the worst problems likely to arise when the election result is announced and, most likely, contested.
I don't think we have to choose between these explanations. Both are probably aspects of the same revised approach.
Quick Links
- 💃🏻 TikTok is now the #2 most popular app amongst US teens. As Charles Arthur points out: “TikTok’s algorithm, though, is clearly the first wave of an entirely new sort of app: it generates its network purely from what you’re interested in, not who you know.”
- 🕵🏼‍♂️ Talking of TikTok, they've formed a fact-checking partnership with AFP in the Asia-Pacific region. (Thanks to Corinne for that link.)
- 🇺🇸 Talking of fact-checking, First Draft have launched a US election misinformation dashboard. It's lovely.
- 🗞 A worthwhile piece by Rasmus Nielsen giving context for the current state of news journalism. Nothing new or surprising in there, but it's well-articulated and a useful summary — which betrays its roots as an opening keynote for a conference.
Photo Chaser
We had the most gorgeous sunset yesterday evening: