More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:
I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”
Banning nazis is not a slippery slope.
It is when people start agreeing with what is said.
they love what I say just hate the word nazi
I’ll just give an example.
Recently, when discussing defederated instances, I’ve noticed an interesting pattern: people cheered defederating instances of Nazis and…pedophiles.
An average person would see no issue here. Right, one more terrible group banned! Take those perverts down! But there’s a catch that I discovered quite a while ago, and it’s a rabbit hole like no other.
When you see something like that, you realize how much people still badly misunderstand, while feeling certain about positions they haven’t spent even five minutes researching. People are already on the slippery slope, banning groups they haven’t taken the time or effort to understand. And there’s a lot more of that than just pedophiles; this is simply a very stark example, one that will probably make most readers uncomfortable and so illustrates the concept best.
Also, I’m fully aware that most people will likely just downvote this without commenting and walk away thinking I support child molesters (hell no; if you support child molestation, go get some mental help asap, fucking kids is very bad)
sorry what exactly about banning nazis causes one to ban non-offender pedophile support groups. like what is the actual causal link you’re suggesting? if you just mean “I noticed random people endorse this thing I have no opinion on, and also this similar sounding thing I think is bad,” that’s not super compelling
I’m saying that banning Nazis comes from public opinion and perception (which, to my knowledge, is correct), and that banning pedophiles comes from the public opinion of those very same people (which, as far as I know, is wrong). Both groups (a third is instances full of bots and spam) are heavily banned on the Fediverse, so it’s not just “some people’s opinion” but essentially a policy for the majority of instances.
My point is that the organized banning of groups that shouldn’t be banned, and hatred toward groups that shouldn’t be hated, hasn’t stopped. Without venues for free speech, we may never realize that, and we’ll keep hating and banning the very people we need to support to make this world a better place.
by causal link, I mean how does banning nazis cause support groups for non-offending pedophiles to get banned. like how does that actually happen. please be as specific as you can be
I see.
It’s not that banning Nazis directly causes banning non-offending pedophiles; it’s that banning people considered dangerous causes both, with Nazis just setting the precedent (because they are obviously bad, and there’s little disagreement). Fedi is just one example where banning doesn’t stop at Nazis. Other groups get banned too, sometimes without much consideration, and this happens on many different platforms - Tumblr, Discord, Facebook, and even daddy Elon’s Xitter, to name a few.
This goes as part of my argument on why we need spaces with completely free speech. We cannot expect instance admins or even platform owners to be completely objective in their estimations of right and wrong, and we can’t trust them to be unaffected by societal stereotypes.
Moreover, even in the ideal scenario where they are fully objective, their userbase might think differently, forcing admins to take measures against various marginalized groups anyway.
At that point, it seems to me the only way out of this conundrum is having some platforms - not mainstream ones, mind you - that allow everything: platforms from which positive but initially rejected ideas can spread.
nobody but nazis wants to be on those lol. go post on gab or whatever if you want that. it’s free. you can do it. you just don’t actually want to
Why would I want to post anything on Gab, a far-right platform?
I had hoped we’d keep this a sensible conversation.
Substack, for its part, is used by all kinds of authors and is by no means limited to Nazis.
the site you are imagining, the supposed free speech site? it converges to gab. this dynamic is basic and I can’t take you seriously if you don’t get this.
what exactly do you think substack will consist of in two years if they don’t do a 180? the entire reason we’re having this conversation right now is that a bunch of substack writers said they would rather leave than hang out with nazis
If all the content on those instances were AI-generated, your hot take could be taken seriously. We all know it’s not.
I’m talking specifically about instances with strong rules, either prohibiting any child imagery or allowing only drawings (which describes just about any anti-contact place). Both types are heavily defederated from, and barely anyone distinguishes between them and literal child porn instances (which should not just be defederated but seized by the authorities, with their admins brought to justice)
I’ve updated the third bullet point in accordance with your comment, thank you.