Up until this week, there was some question as to whether the newsletter platform Substack has a Nazi problem. Now, it undoubtedly does, and it’s entirely Substack’s own fault. They’ve made a series of unforced errors, screwed up the PR spectacularly, and are losing publishers and subscribers as a result. Two big tech/digital focused publications are already on the way out of the door — Platformer and Garbage Day — with more likely to follow.
Happy new year, Substack.
To understand what’s happening here, we need to rewind a little, back to the dim and distant past of November, when Jonathan M Katz wrote a piece for The Atlantic called, unsubtly, Substack has a Nazi Problem:
An informal search of the Substack website and of extremist Telegram channels that circulate Substack posts turns up scores of white-supremacist, neo-Confederate, and explicitly Nazi newsletters on Substack—many of them apparently started in the past year.
Substackers for open letters
There can be no doubt that there is a lot of hateful content on the internet. But Substack has come up with the best solution yet: Giving writers and readers the freedom of speech without surfacing that speech to the masses. In your Substack Inbox, you only receive the newsletters you subscribe to. Whether you’re a reader or a writer, it is unlikely you’ll receive hateful content at all if you don’t follow it. (I never have—though I saw it all the time on Twitter and Facebook despite never having followed an account in that vein.)
This became the first of two open letters on the subject. At pretty much the same time, Katz and Marisa Kabas were working on their own open letter, Substackers Against Nazis:
In the past, you have defended your decision to platform bigotry by saying you “make decisions based on principles not PR” and “will stick to our hands-off approach to content moderation.” But there’s a difference between a hands-off approach and putting your thumb on the scale. We know you moderate some content, including spam sites and newsletters written by sex workers. Why do you choose to promote and allow the monetization of sites that traffic in white nationalism?
The scale question
One element of this debate that has possibly made it more heated than it could have been is that nobody seems quite sure how much of a problem it actually is. Katz originally gave the scale as being small but significant:
At least 16 of the newsletters that I reviewed have overt Nazi symbols, including the swastika and the sonnenrad, in their logos or in prominent graphics. Andkon’s Reich Press, for example, calls itself “a National Socialist newsletter”; its logo shows Nazi banners on Berlin’s Brandenburg Gate, and one recent post features a racist caricature of a Chinese person. A Substack called White-Papers, bearing the tagline “Your pro-White policy destination,” is one of several that openly promote the “Great Replacement” conspiracy theory that inspired deadly mass shootings at a Pittsburgh, Pennsylvania, synagogue; two Christchurch, New Zealand, mosques; an El Paso, Texas, Walmart; and a Buffalo, New York, supermarket. Other newsletters make prominent references to the “Jewish Question.” Several are run by nationally prominent white nationalists; at least four are run by organizers of the 2017 “Unite the Right” rally in Charlottesville, Virginia—including the rally’s most notorious organizer, Richard Spencer.
Note that most of the newsletters he found go unnamed. Casey Newton of Platformer decided to look into it himself:
Over the past few days, the Platformer team analyzed dozens of Substacks for pro-Nazi content. Earlier this week, I met with Substack to press my case that they should remove content that praises Nazis from the network. Late today, we submitted a list of accounts that we believe to be in violation of the company’s existing policies against incitement to violence. I am scheduled to meet with the company again tomorrow.
Note that there’s no number given. Well, Substack chose to provide that information to another Substack publication, Public:
A Substack spokesperson told Public that Newton’s list contains just 6 Substacks with 29 paid subscribers between them, a tiny fraction of the more than 2 million paid subscribers the service has today.
That’s… not a lot.
Jesse Singal critiqued Newton’s reporting and evasion in a typically long and detailed post on his own Substack:
Frankly, I think the more likely explanation here is that Newton and his team didn’t find very many Nazi Substacks, but that this would have been a difficult thing for them to report. If they did, it would complicate the leading role Platformer had played in stoking this controversy, not to mention piss off subscribers already increasingly convinced Substack is basically the film-premiere scene in Inglourious Basterds.
It doesn’t help that, generally, those making the accusations have been unwilling to name the publications that they say are Nazi in nature. The reason both Katz and Newton have given is roughly the same: extremism researchers told them not to, both to avoid drawing attention to the publications, and to help keep the reporters out of the firing line. Fair enough — but it does mean that there is a big “trust us, it’s a problem” aspect to the discussion around this. Certainly, bar the ones Katz explicitly mentioned, I found it quite difficult to find clearly Nazi publications in the hour or so I spent searching.
However, I’d also argue that the scale is less of an issue than the underlying policy decision. And I do agree with the way John Naughton summed up his piece on the developing crisis in The Guardian last week:
And, as the veteran journalist Casey Newton has also noted, the correct number of Nazis on Substack is zero.
The power question
Just to complicate things, there is an issue of power here. Twitter has taught a certain category of person that they can have a strong censorious effect by whipping up enough support that platforms choose to deplatform people. And Substack has set out its stall in making some of those people welcome on its platform. Understandably, those who have been hounded off other platforms are looking at this with some nervousness.
At least some of the people who are defending Substack aren’t doing it because they like Nazis or are defending them. They have the much more reasonable concern that they might be next. Substack has become a comfortable and profitable home for professional journalists and others who have been forced off other platforms for views or reporting that are considered unacceptable by certain political groupings. Examples include:
- Andrew Sullivan, pushed out from New York Magazine. (Full disclosure: I am a very long-term reader of Sullivan and current paid subscriber to his Substack.)
- Bari Weiss, late of the New York Times
- Jesse Singal and Katie Herzog (for reporting on transgender issues perceived as transphobic)
Notably, they also include Graham Linehan, the comedy writer whose participation in the gender critical movement has essentially burned down his career.
No wonder that many of them would rather Substack handled its own moderation policies, rather than powerful online groups. These are people whose very incomes could be threatened a second time if this campaign spreads beyond explicit Nazis. One core feature of online discourse over the last decade or so has been scope creep on things we all agree are bad. For example, we all agree that violence is generally bad. But “violence” has gone from meaning “actual physical assault”, to “words are violence”, to “silence is violence”. That’s scope creep right there.
As Freddie deBoer put it:
A basic part of the point is that, as the past decade and a half proves, contemporary liberals have an incredibly expansive view of what a fascist is. I am a pro-choice, pro-reparations, pro-trans rights, pro-Palestinian, pro-redistribution Marxist, and I am routinely called a fascist by the kind of people who are pushing this line. I promise you that if Substack started banning “literal Nazis,” people would make an effort to include me - it’s happened before on other platforms - and if that effort arose, a lot of people pushing the “we’re only talking about literal Nazis” line would have no problem pushing for me to be deplatformed.
Ban Nazis, then broaden the definition of Nazi, and…
I have no idea if that’s in the minds of the organisers of Substackers Against Nazis (I hope not), but I can clearly see why that might be a worry to some people on the platform. After all, we’ve seen this play out before, most notably on pre-Musk Twitter. And it’s not challenging to find people starting to think about what other kinds of content should be moderated.
Which brings us to the…
The free speech question
Does anybody else roll their eyes when people start talking about free speech?
The difficulty is that pretty much nobody is a free speech absolutist, whatever they might claim. I might describe myself as tending towards a free speech maximalist position, but I believe in libel and slander laws. And I won’t publish bigoted comments on my blog. Most people have their limits on free speech, including me. And most people also understand that your right to speak without consequence varies in different environments.
(Yes, I know the argument about free speech being about protections from governments, not individuals or companies. But let’s be honest: some people are capable of wielding enough power and influence on social media that they can have a disproportionate impact on other people’s lives, just as governments can do. It may be “consequence culture”, but it is entirely possible to have “disproportionate consequences”, too. And that’s a genuine free speech concern.)
So, most of us have a line we wouldn’t want to see crossed. And as publishers and businesses, we get to choose where to set that line. Substack’s problem is that they’ve decided to set that line in an unwise place.
I have some sympathy for the slippery slope argument. I think it’s inevitable that, now that Substack has capitulated slightly on the Nazi issue, people will try again on the gender critical issue. Why “again”? Well, this was an issue a couple of years ago when trans writers Jude Doyle and Grace Lavery left the platform, unwilling to share space with people they saw as transphobic.
But a slippery slope argument is rarely a good argument for doing nothing. And I really do think that Substack could have made this whole issue go away last November by saying “thanks very much for drawing this to our attention, please let us know where the publications advocating violence or genocide are, and we’ll pull them down”. Or even “our trust and safety team are looking into this, and we’ll remove publications that violate our rules”.
But they didn’t. And they didn’t when the open letter started spreading. And they only acted when one of the bigger publishers came to them with specific examples. That creates the view that, actually, they’d be happier leaving the Nazis alone. And that’s how you get a Nazi bar.
The platform question
The other big problem here is that Substack is trying to have its cake and eat it. A lot of its defence is built on the legacy idea of it being a neutral publishing platform, which is how it used to define itself in its earlier days. But it stopped being that when it started making itself into a social network. With Notes, its Twitter clone, and Chats, essentially groups for newsletter subscribers, it’s taken on social network functionality, with the aim of both helping newsletter authors find bigger audiences and making Substack into a destination. It’s a VC-backed company, and it needs a moat. The social element is that moat.
Once upon a time, we used to ask whether Medium was a platform or a publisher, and ended up coining the word “platisher” because it combined both. Substack is no longer in the business of being a publisher, having backed away from its advance scheme, but is almost certainly best described as a “platwork” — a combination between a platform and a network.
And it’s that element that undermines its defence. As Casey Newton put it on Platformer:
By 2023, in other words, Substack no longer could claim to be the simple infrastructure it once was. It was a platform: a network of users, promoted via a variety of ranked surfaces. The fact that it monetized through subscriptions rather than advertising did not change the fact that, just as social networks have at times offered unwitting support to extremists, Substack was now at risk of doing the same.
You can’t claim to be a neutral infrastructure provider when your key selling point over your competitors is the network recommendations and social features that help publishers grow.
The moderation question
Now, it’s completely understandable that Substack wants to dodge moderation. Moderation is damn expensive. And Substack appears to be facing a challenging financial situation. Taking on significant moderation costs would not be good for its survivability. But those costs come with becoming a social platform, as every single social network before it has discovered.
The excuse “people only see what they subscribe to” disappears when you’re recommending other content in Notes, in emails and in the app. At a bare minimum, you need to be moderating those, so more extreme content from the left or the right (because, let’s be honest, we’ve seen genocidal rhetoric from supporters of both sides in the Israel/Hamas conflict recently) isn’t pushed in front of people who really don’t want to see it.
Mike Masnick published a content moderation learning curve last year. Substack are about halfway through it. The problem they’re facing is three-fold:
- Subscribers leaving
- Publishers leaving
- Bad headlines
The headlines will go away because we have notoriously short attention spans in the media, but the other two are unlikely to. I’ve had three conversations in the last two days with friends who publish on Substack and who are wondering if they should move.
The only way to stop the bleeding is to bite the bullet, speed run their way through the content moderation curve, and hire some more trust and safety staff. If they really want to be the platform of free and open (and, yes, profitable) discourse, then they need to learn a fundamental lesson: sometimes it’s the walls that make the community. And who you keep out is as important as who you let in.
Notes and further reading
Ben Whitelaw’s invaluable Everything in Moderation newsletter highlights an academic response:
Further reading: I hope to write more on this but JM Berger of the Middlebury Institute of International Studies at Monterey notes how this episode — which started back in November with an Atlantic story and has included not one but two open letters from hundreds of Substack writers — perfectly falls into the five stages of content moderation.
And here’s the piece he’s referring to.
Jason Kottke will not link to Substacks now, with one exception:
I’m making exceptions to my no-linking-to-Substack policy for why-we’re-leaving-Substack posts.
Matt Birchler notes that Substack is doing one thing right, at least.
Let’s end this on a high note: it’s to Substack’s credit that they continue to let their writers own their subscriber lists.
Phillipe Meda took advantage of that, and is glad now:
Having transitioned our much more modest online activities from WordPress (aka "hot mess") to Ghost more than a year ago now, I must say that the way Substack has handled "free speech" lately has been beyond dreadful. But even before that, when shopping for a new platform, the very business model of Substack and how they deal with your content were non-starters for me (same as Medium).
And, for Hannah Raskin, the Nazi issue was just one thing driving her from the platform:
Since The Food Section launched in September 2021, the platform has introduced various referral and recommendation systems, but those function primarily to draw readers deeper into the Substack universe, rather than promote allegiance to one publication.
And if you want even more detail on the Substack story, this long two-parter is worth a read: