Substack says it won’t ban Nazis or extremist speech

Under pressure from critics who say Substack profits from newsletters promoting hate speech and racism, the company’s founders said Thursday they would not ban Nazi symbols and extremist rhetoric from the platform.

“I just want to make it clear that we don’t like Nazis either – we wish no one held these views,” Hamish McKenzie, a co-founder of Substack, said in a statement. “But some people do hold these and other extreme views. We therefore do not believe that censorship (including demonetizing publications) will make the problem go away. On the contrary, it makes it worse.”

The response came weeks after The Atlantic found that at least 16 Substack newsletters featured overt Nazi symbols in their logos or graphics, and that white supremacists had been allowed to publish on the platform and profit from it. Hundreds of newsletter writers signed a letter opposing Substack’s position and threatening to leave. About 100 other people signed a letter supporting the company’s stance.

In the statement, Mr. McKenzie said he and the company’s other founders, Chris Best and Jairaj Sethi, had concluded that censoring or demonetizing posts would not make the problem of hateful rhetoric go away.

“We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to disempower bad ideas,” he said.

This stance has sparked waves of outrage and criticism, including from popular Substack writers who said they don’t feel comfortable working with a platform that allows hateful rhetoric to fester or prosper.

The debate renewed questions that have long plagued tech companies and social media platforms about how content should be moderated, if at all.

Substack, which takes a 10% cut of revenue from writers who charge for newsletter subscriptions, has faced similar criticism in the past, particularly after allowing transphobic and anti-vaccine language from some writers.

Nikki Usher, a communications professor at the University of San Diego, said many platforms face what is known as “the Nazi problem”: if an online forum is around long enough, extremists will show up at some point.

Substack positions itself as a neutral content provider, Professor Usher said, but that also sends a message: “We’re not going to try to police this problem because it’s complicated, so it’s easier not to take a position.”

More than 200 writers who publish newsletters on Substack have signed a letter opposing the company’s passive approach.

“Why do you choose to promote and allow the monetization of sites that traffic in white nationalism?” the letter asks.

The authors also questioned whether part of the company’s vision for success was giving a platform to hateful people, such as Richard Spencer, a prominent white nationalist.

“Let us know,” the letter said. “From there, we can each decide if we are still where we want to be.”

Some popular writers on the platform have already promised to leave. Rudy Foster, who has more than 40,000 subscribers, wrote on Dec. 14 that readers often tell her they “can’t stand paying Substack anymore” and that she feels the same way.

“So, here’s to a 2024 where none of us do that!” she wrote.

Other writers have defended the company. A letter signed by around 100 Substack editors says it’s best to let writers and readers moderate content, not social media companies.

Elle Griffin, who has more than 13,000 subscribers on Substack, wrote in the letter that while “there is a lot of hateful content on the internet,” Substack has “found the best solution yet: giving writers and readers freedom of speech without surfacing that speech to the masses.”

She argued that subscribers only receive the newsletters they sign up for and are therefore unlikely to receive hateful content if they don’t follow it. That’s not the case on X and Facebook, Ms. Griffin said.

She and the other signatories of the letter supporting the company emphasized that Substack is not really one platform but thousands of individualized platforms, each with its own distinct, curated culture.

Alexander Hellene, who writes science fiction and fantasy stories, signed Ms. Griffin’s letter. In a post on Substack, he said a better approach to content moderation was to “take matters into your own hands.”

“Be an adult,” he wrote. “Block people.”

In his statement, Mr. McKenzie also defended his decision to host Richard Hanania, president of the Center for the Study of Partisanship and Ideology, on Substack’s podcast “The Active Voice.” The Atlantic reported that Mr. Hanania had previously described Black people on social media as “animals” who should be subject to “more policing, incarceration, and surveillance.”

“Hanania is an influential voice for some in American politics,” Mr. McKenzie wrote, adding that “it is useful to know his arguments.” He said he had not been aware of Mr. Hanania’s writings at the time.

Mr. McKenzie also argued in his statement that censoring ideas considered hateful only promotes them.

But research in recent years suggests the opposite is true.

“Deplatforming appears to have a positive effect in reducing the spread of far-right propaganda and Nazi content,” said Kurt Braddock, a communications professor at American University who has studied violent extremist groups.

When extremists are kicked off one platform, they often move to another, but much of their audience does not follow them and their income eventually declines, Professor Braddock said.

“I can appreciate someone’s dedication to free speech, but free speech rights involve the government,” he said, emphasizing that companies are free to choose which types of content they host or prohibit.

Although Substack says it does not allow users to call for violence, even that distinction can be blurry, Professor Braddock said, because racists and extremists can walk up to the line without openly crossing it. Their rhetoric can still inspire others to violence, he added.

Allowing Nazi rhetoric on a platform also normalizes it, he said.

“The more they use rhetoric that dehumanizes or demonizes a certain population,” Professor Braddock said, “the more acceptable it becomes for the general population to follow.”

David B. Otero
