Reddit hosts an immense number of communities, dedicated to nearly as many human pursuits as there are. Some of those groups are delightful, like r/showerthoughts or r/dogswithjobs, and some are reprehensible, like the long-since-banned r/jailbait (pictures of underage girls) and r/picsofdeadkids (exactly what it says on the tin). But some subreddits don’t fall neatly into either category, which is why the site implemented a quarantine policy in 2015: some communities are toxic enough that Reddit won’t show them to you unless you explicitly opt in to see them.
As CEO Steve Huffman wrote three years ago in a post explaining the measure, the site “will Quarantine communities whose content would be considered extremely offensive to the average redditor,” though he didn’t say who that average redditor might be. “Our most important policy over the last ten years has been to allow just about anything so long as it does not prevent others from enjoying Reddit for what it is: the best place online to have truly authentic conversations,” Huffman continued. According to Wired at the time, Huffman and the site’s admins took a “you know it when you see it” approach to deciding what’s worth hiding from the average user. Quarantining a subreddit also makes it ineligible to generate revenue from ads or from gold, Reddit’s premium membership feature.
re: Reddit quarantines:
one thing really stood out to me. Reddit basically says having a subreddit dedicated to racism, anti-Semitism and misogyny is fine; they just won’t promote that subreddit or put ads on its pages.
— Julia Alexander (@loudmouthjulia) September 28, 2018
That approach was updated yesterday, when site administrators announced the addition of an appeals process that emphasizes restorative rather than punitive justice: now, subreddit moderators can challenge their quarantined status if they can prove their community has undergone a “sustained community transformation.” Appeals need to show new moderation techniques — like auto-modding, or adding more moderators and new rules — over a period “of at least one month,” Reddit wrote in its policy, “demonstrating meaningful reform of the community.”
“As of September 27, we have revised our existing quarantine policy to improve consistency in its application, as well as to add a comprehensive appeals process,” a Reddit spokesperson tells The Verge. The key part here is consistency — Reddit is famously opaque about how and why it takes actions across the site, although that does seem to have changed a bit since the site overhauled its content policy last October.
In the comments beneath the announcement, users were skeptical about the promised changes, raising questions about whether the new policy would reinforce echo chambers, how exactly a subreddit is marked for exclusion from the main site, and whether quarantining subreddits makes them more likely to radicalize and spread their subscribers to other parts of Reddit.
The admin running the announcement, u/landoflobsters, didn’t seem to have the answers the commenters wanted. Those users pointed out an essential contradiction in the idea of excluding some toxic subreddits from the main body of users: quarantined subreddits are still held to the same content policy that governs the rest of Reddit. But if the content posted in these forums is vile enough for site admins to consider it toxic to the rest of the site, then why not just ban the offending subreddits entirely?
That’s a question Reddit has always struggled with. If your site is explicitly founded on the principle of absolute free speech, where can you draw the line between what is and what is not acceptable? And can you draw it without alienating your users? The addition of an appeals process suggests that, if nothing else, Reddit is ready to listen to its users as it charts a path forward.