Levka<p><a href="https://kolektiva.social/tags/Substack" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Substack</span></a> <a href="https://kolektiva.social/tags/Nazis" class="mention hashtag" rel="nofollow noopener" target="_blank">#<span>Nazis</span></a></p><p>"Substack’s 'Nazi problem' won’t go away after push notification apology</p><p>Substack may be legitimizing neo-Nazis as 'thought leaders,' researcher warns. </p><p>(. . .)</p><p>Substack has long faced backlash for allowing users to share their 'extreme views' on the platform, previously claiming that 'censorship (including through demonetizing publications)' doesn't make 'the problem go away—in fact, it makes it worse,' Lorenz noted. But critics who have slammed Substack's rationale revived their concerns this week, with some accusing Substack of promoting extreme content through features like its push alerts and 'rising' lists, which flag popular newsletters and currently also include Nazi blogs.</p><p>But perhaps even more appealing than Substack's lack of content moderation, Fisher-Birch noted that these groups see Substack as 'a legitimizing tool for sharing content' specifically because the Substack brand—which is widely used by independent journalists, top influencers, cherished content creators, and niche experts—can help them 'convey the image of a thought leader.'</p><p>'Groups that want to recruit members or build a neo-fascist counter-culture see Substack as a way to get their message out,' Fisher-Birch told Ars.</p><p>That's why Substack users deserve more than an apology for the push notification in light of the expanding white nationalist movements on its platform, Fisher-Birch said.</p><p>'Substack should explain how this was allowed to happen and what they will do to prevent it in the future,' Fisher-Birch said.</p><p>Ars asked Substack to provide more information on the number of users who got the push notification and on its general practices promoting 'extreme content' through push alerts—attempting to find out whether there was an intended audience for the 'error' push notification. But Substack did not immediately respond to Ars' request for comment.</p><p>Joshua Fisher-Birch, a terrorism analyst at a nonprofit non-governmental organization called the Counter Extremism Project, has for years been closely monitoring Substack's increasingly significant role in helping far-right movements spread propaganda online. He's calling for more transparency and changes on the platform following the latest scandal.</p><p>In January, Fisher-Birch warned that neo-Nazi groups saw Donald Trump's election 'as a mix of positives and negatives but overall as an opportunity to enlarge their movement.' Since then, he's documented at least one Telegram channel—which currently has over 12,500 subscribers and is affiliated with the white supremacist Active Club movement—launching an effort to expand its audience by creating accounts on Substack, TikTok, and X.</p><p>Of those accounts created in February, only the Substack account is still online, which Fisher-Birch suggested likely sends a message to Nazi groups that their Substack content is 'less likely to be removed than on other platforms.' At least one Terrorgram-adjacent white supremacist account that Fisher-Birch found in March 2024 confirmed that Substack was viewed as a back-up to Telegram because it was that much more reliable to post content there."</p><p><a href="https://arstechnica.com/tech-policy/2025/07/substacks-nazi-problem-wont-go-away-after-push-notification-apology/" rel="nofollow noopener" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">arstechnica.com/tech-policy/20</span><span class="invisible">25/07/substacks-nazi-problem-wont-go-away-after-push-notification-apology/</span></a></p>