Reddit is rolling out significant new policies that will compel a segment of its most active volunteer moderators to step down from some of their roles. These changes, set to unfold over the coming months, will bar individuals from moderating more than five subreddits that each draw over 100,000 monthly visitors. While Reddit asserts the move will foster “diverse perspectives,” many veteran moderators are voicing alarm, warning of a potential decline in content quality and an increase in harmful material across the platform.
Understanding Reddit’s New Moderation Framework
The core of Reddit’s new strategy, announced via the r/modnews subreddit, centers on limiting the number of large communities a single user can oversee. Moderators currently exceeding the five-subreddit cap will be required to relinquish some of their positions or apply for an exemption. Reddit is reportedly collaborating with moderators to finalize the exemption criteria.
According to a Reddit admin, known as Go_JasonWaterfalls, the rationale behind these changes is to reinforce the unique character of Reddit’s communities. “What makes Reddit reddit is its unique communities, and keeping our communities unique requires unique mod teams,” the admin explained. They emphasized the unsustainability of a system where one person can moderate an unlimited number of large communities, advocating for a “strong, distributed foundation” that supports varied experiences and viewpoints.
Further platform adjustments include altering how subreddit traffic is displayed. Subscriber counts will be replaced by the number of unique visitors over the last seven days, based on a rolling 28-day average. Additionally, subreddits will showcase “contributions per week,” reflecting the total number of posts and comments within the last seven days. Notably, the old.reddit.com interface will not display these new statistics but will still lose subscriber counts.
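Reddit has not published the exact formula behind the new traffic figure, and “based on a rolling 28-day average” is ambiguous. One plausible reading is that each day’s seven-day unique-visitor count is averaged over a trailing 28-day window; the minimal Python sketch below illustrates only that interpretation (the function name, input format, and sample numbers are hypothetical, not Reddit’s actual implementation).

```python
from collections import deque

def rolling_visitor_metric(daily_7day_uniques):
    """Average each day's seven-day unique-visitor count over a rolling
    28-day window. Purely illustrative; Reddit has not published its
    actual formula, and the input format here is assumed."""
    window = deque(maxlen=28)  # keeps only the most recent 28 daily readings
    averages = []
    for count in daily_7day_uniques:
        window.append(count)
        averages.append(sum(window) / len(window))
    return averages

# Hypothetical readings for a subreddit hovering around 100,000 weekly visitors
readings = [95_000, 98_000, 102_000, 101_000, 99_500]
print(rolling_visitor_metric(readings)[-1])  # most recent displayed value
```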
The implementation timeline begins December 1, preventing users already at or above the five-subreddit limit from accepting new moderation invitations for large communities. By March 31, Reddit expects full compliance. Moderators who remain above the threshold will be systematically transitioned out, starting with communities where their activity is lowest, until they meet the new criteria. Options for affected mods include seeking an exemption, opting for “Alumni status” (which offers no mod abilities), or becoming an “advisor” with read-only permissions.
Reddit estimates that only “0.1 percent of our active mods” will be affected. Given that Reddit reported over 60,000 daily active moderators in December 2023, this figure translates to approximately 60 individuals.
Moderators React: Fears of Expertise Loss and Site Degradation
Despite Reddit’s assurances, many moderators are expressing profound concern, anticipating a significant loss of invaluable volunteer expertise. One anonymous moderator, who faces the prospect of leaving several subreddits, told Ars Technica that the change means “many subreddits will become isolated and will no longer have expert moderators who know how the site works [and] who know how to moderate efficiently and skillfully.” They also highlighted the potential for reduced minority representation within moderation teams.
These volunteers often dedicate countless hours, possessing rare expertise, deep connections to their communities’ subject matter, and extensive histories with the subreddits they oversee. Finding adequate replacements for such dedicated individuals is expected to be a considerable challenge.
This isn’t Reddit’s first skirmish with its volunteer moderation force. A contentious period in 2023 saw Reddit forcibly remove protesting moderators who had taken their subreddits private to contest new API access fees. At the time, ousted mods warned of an impending surge in inaccuracies, insults, and misinformation, concerns that are now resurfacing with these new rules.
Another anonymous moderator, who views the changes as “bullshit,” suggested they are a punitive response to past mod protests. They asserted that Reddit “made a mountain out of a molehill,” framing the policy as a reaction to a small number of “abusive mods who moderated hundreds of subreddits” and a few who “held them hostage” during previous policy disputes.
However, not all sentiment is negative. Some users and moderators support the restrictions, viewing them as a necessary step to address the “power mods” issue—the perception that certain individuals wield excessive influence by moderating numerous communities. For instance, some moderators at r/Conservative have applauded the changes, citing concerns about Reddit being “overly moderated.”
When asked if the new limits target “power mods,” Reddit spokesperson Tim Rathschmidt stated, “This is not a judgment of any specific moderator or mod practices; it’s a structural change. Establishing limits sitewide creates a solution for both the present and the future that is not tied to any individual mods.”
Reporting Changes Ignite Further Concern Over Content Safety
Beyond the mod limits, several moderators worry about forthcoming changes to Reddit’s content reporting and review processes. A significant shift will see moderator-removed comments automatically disappear from user profiles. Previously, such content remained visible until reported and removed by Reddit itself. While Reddit maintains that all mod reports will still be reviewed, an admin stated that “most violative content is already caught by our automated and human review systems,” and “mods are empowered to remove it” if anything is missed.
This has sparked fear among moderators that problematic users may not face appropriate consequences, and serious issues could be overlooked. Tim Rathschmidt reiterated Reddit’s commitment: “Keeping Reddit safe and healthy is our highest priority, and a goal we share with mods. This will not change how we action users who break Reddit Rules.”
Yet skepticism persists. Moderator Gregory_K_Zhukov, who opted to be identified by their username, challenged Reddit’s claim that reports are actioned effectively. “Experience from our mod team suggests that isn’t the case,” they told Ars Technica, citing instances of Holocaust denial and “clearly aggressive and racist” content persisting on subreddits. The second anonymous mod echoed this, interpreting Reddit’s policy shifts as attempts to “eschew responsibility,” potentially allowing “xenophobia, slurs, transphobia, outright death threats and hate speech, calls for violence, disinformation, and spam… to run amok.”
Gregory_K_Zhukov also questioned the automatic deletion of mod-removed comments from user profiles, arguing it hinders moderation by reducing available information—such as whether a user has a history of similar violations. For this mod, reporting content to Reddit is primarily about prompting site-level warnings or bans, not merely content removal. They predict this new policy will lead to more “bad faith actors” because “fewer reports are going to be getting actioned,” discouraging mods from reporting.
An Uncertain Future for Reddit’s Communities
Many dedicated moderators are struggling to reconcile Reddit’s logic with the potential ramifications of these new rules. Despite Reddit’s assurances of continued review and enforcement, the company’s past disputes with its user base, particularly over API fees, have eroded trust. The 2023 protests underscored the deep commitment moderators have to their roles and communities.
Forcing out such devoted volunteers will undoubtedly impact subreddits, though the full extent and severity of this impact remain to be seen. Both Reddit and its moderators share a vested interest in maintaining a safe and valuable platform—a critical concern for Reddit, a public company heavily reliant on advertising revenue and licensing content to AI firms.
While Reddit likely believes these changes will enhance the site, the history of friction means moderators are understandably wary. Given Reddit’s track record of standing firm on controversial policy shifts, these new moderation rules are expected to proceed as planned, ushering in a new era for content governance on the platform.
Disclosure: Advance Publications, owner of Ars Technica parent Condé Nast, is the largest shareholder in Reddit.