Moderation: The Unsung Hero of Community

Tim McDonald and PJ Zaiac

community

May 9, 2023

Have you ever heard the expression, “One bad apple spoils the bunch”? When starting or scaling a community, this saying holds especially true. But how can you ensure that your community is free of any ‘bad apples’? Moderation! While moderators are vital members of any community team, they are often overlooked. The community manager gets the credit when things go well, and the moderators get all the blame from community members who disagree with actions taken against them. But effective moderation is the difference between a healthy community and one that turns toxic.

By establishing clear action options and escalation protocols, you enable your moderation team to help create a healthy community. Let’s look at how.

Moderation Protocol

One of our core missions at HomeRoom is to ensure that every community we support is secure and safe - both for our clients and their community members. Each community is unique with its own set of guidelines and governance structures, but we are often asked how to take action when moderation is needed to maintain a community’s safety. This blog post will walk through some core definitions relating to community moderation as well as provide examples of actions we’d take in various situations to help moderation teams evaluate and take proper action to keep a community safe.

Action Options

There are several common action options available to moderators. Which option to choose is best decided on a case-by-case basis and depends on your protocols (more on that in the next section). Here are some of the most common actions:

  • Warning: A recorded instance of a community team member giving a community member a written or verbal warning for their actions. Community guidelines commonly operate on a Single Warning or Three Strike model before further action is taken.

    • Single Warning: A clause typically included in community guidelines to set a precedent that a community member will be warned based on their behavior before higher-tiered moderation action is taken.

    • Three Strike: A clause included in community guidelines where a community member can accumulate three instances of recorded action (typically an umbrella that includes warnings as well as higher-tier actions) before being removed and barred from returning to a community.

  • Timeout: An action many community platforms support that allows a community member to remain in the community but silences any messages they try to send for an allotted period. We recommend setting Timeouts as low as 15 minutes, to give a community team member time to reach out privately and let the member cool off, or as high as 5-7 days, to drive home the severity of a member’s actions; we’ll go over both examples below.

  • Kick: The action of removing a community member from a community without preventing them from returning on their own. Typically used as a ‘soft’ ban for a community member who just needs some time away but could be a valuable and positive member in the future.

  • Ban: The action of removing a community member and preventing their account from returning to the community platform. A ban may last for a set amount of time or for the foreseeable future.

  • Unban: The action of removing a formerly banned community member’s account from the list of banned users, thus allowing them to return if they so choose. Some platforms allow banned users to request to be unbanned, but it is up to the community team to use discretion based on the member’s history of action within the community and the content of their unban request.

  • Probation: Much like probation in the legal system, community probation keeps a member on a ‘special watch’ for a period of time following moderation action, during which behavior outside of community guidelines results in heightened moderation action. If a community member stays aligned with guidelines during this period, they are typically welcome to participate fully moving forward, with the incident recorded in Mod Logs in case anything goes haywire in the future.
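The Three Strike model described above can be sketched as a simple tracker. This is a minimal illustration, not any real platform’s moderation API; the class, field, and action names are hypothetical, as is the choice to recommend a ban on the third recorded action.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a Three Strike tracker; names and the
# strike limit are illustrative, not tied to a real platform.
STRIKE_LIMIT = 3

@dataclass
class MemberRecord:
    member_id: str
    # Recorded actions against this member (warnings, timeouts, etc.)
    strikes: list = field(default_factory=list)

    def record_action(self, action: str) -> str:
        """Log a moderation action and return the recommended next step."""
        self.strikes.append(action)
        if len(self.strikes) >= STRIKE_LIMIT:
            return "ban"   # third strike: member is removed
        return "warn"      # under the limit: warn and keep the log

record = MemberRecord("member#1234")
print(record.record_action("warning"))  # first strike -> warn
print(record.record_action("timeout"))  # second strike -> warn
print(record.record_action("warning"))  # third strike -> ban
```

The point of the sketch is the Mod Log aspect: every action is recorded, so escalation decisions are based on a member’s full history rather than a moderator’s memory.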

Want some more common terminology, including: Temp or Perma, Mod Logs, Alt Profiles, Raid, VPN Protection and IP Address Bans, and clear definition of the Community Team? Please let us know!

Escalation Protocols

The first step in any community’s moderation protocols begins before a community’s launch, with writing clear Community Guidelines and setting proper permissions for a tiered governance structure to match. Without guidelines, there is no clear place to refer to for what is and is not acceptable within a community, and without governance a community has no method of accountability or of maintaining safety.

Once a community launches with proper guidelines and governance, however, it’s hard to implement meaningful moderation without a shared understanding across the team of which actions to take for which offenses. We recommend working directly with moderation teams to ensure that everyone on those teams knows their options when they flag an issue and is empowered to take appropriate action when applicable, including spam-related bans, action on explicit behavior against guidelines, and mass action in the event of a raid.

Sometimes the lines can get blurred, however, so we use the following escalation protocol when an answer is unclear:

  1. Refresh on Community Guidelines. The answer to whether a user’s behavior is acceptable will typically fall under the rules - this could be anything from a member making disparaging comments about others when there is a no-discrimination rule in place to a member refusing to listen to a community team member asking them to adjust their behavior.

  2. If the situation is still unclear and isn’t hostile or volatile, Warn the member that the community team will be discussing their behavior and that they should expect a follow-up as soon as a decision is reached.

  3. If the situation IS hostile or volatile, Timeout the member in question - set the duration long enough that the member cannot actively participate via voice or text while the moderation team reaches a decision. If the team decides the member may remain in the community before the Timeout period ends, the Timeout can always be removed ahead of the deadline.

  4. Loop in additional members of the Community Team - this could mean moderators working with Super Mods, or reaching out to the Community Manager directly if they are not already involved, to decide on a course of action.

  5. If a Timeout or other introductory action does not work (for example, a member using alts, or other members pushing messages through on their behalf), the situation typically reroutes back to the Community Guidelines - such evasion is against any well-written set of rules - and the case for more severe moderation options like a perma-ban becomes clear.
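The escalation steps above can be condensed into a simple decision function. This is a hypothetical sketch of the protocol’s branching logic only; the flag and action names are illustrative, and real cases would be judged by the community team rather than automated.

```python
# Hypothetical sketch of the escalation protocol; flags and action
# names are illustrative, not a real moderation API.
def next_step(breaks_guidelines: bool, hostile: bool, evading_action: bool) -> str:
    """Return the recommended next step for an unclear moderation case."""
    if evading_action:
        # Step 5: dodging a Timeout (alts, proxies) justifies severe action.
        return "perma-ban"
    if breaks_guidelines:
        # Step 1: the guidelines already answer the question.
        return "act per guidelines"
    if hostile:
        # Steps 3-4: volatile situation - Timeout while the team decides.
        return "timeout, then loop in the community team"
    # Step 2: unclear but calm - warn and promise a follow-up.
    return "warn and follow up"

print(next_step(breaks_guidelines=False, hostile=True, evading_action=False))
```

Note that the ordering of the checks mirrors the protocol: guideline violations and evasion short-circuit the softer options, so moderators never under-respond to the clearest cases.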

Moderation in Action

It’s important to remember that moderation is the most effective when a community’s moderators both have clarity on what instances within their community require action, and are empowered to take that action swiftly. Here are a few generalized examples of moderation in action:

Scenario #1: A long-standing community member who has never been warned posts a meme that blatantly breaks community guidelines or is reported as making other members uncomfortable.

Moderation in Action: The message containing the meme is deleted by mods. A written warning is provided with an added note in mod logs.

What’s next? By deleting the meme swiftly, the moderators prevent other community members who may be sensitive to its contents from finding it later. The member who posted the meme is also refreshed on community guidelines and can continue participating in the community without further offense.

Scenario #2: A new community member can clearly be described as a “troll”, meaning they’ve entered the space to drum up drama - they could be posting disparaging things about other members or spamming negative messages.

Moderation in Action: This one is incredibly clear - this is a ban! Given that this is a new member who has only exhibited trolling behavior, do not waste time and energy attempting to reprimand someone who will not listen.

What’s next? Once a troll is removed from a community, there’s a quick return to normal - no community members are stuck trying to wade through a troll’s behavior, scam, or spam.

Scenario #3: As a super-mod, admin, or community manager, you receive a direct report from a community member about a moderator abusing their power - anything from threatening or harassing other community members to documented behavior against community guidelines outside of the community - that makes this member concerned for others and feel unsafe in your community.

Moderation in Action: This one is tricky because many community members grow into moderators based on activity and passion for the community - this makes them great candidates for the moderator’s role, but not all of them will have good intentions or know how to best conduct themselves. In instances where it’s overwhelmingly clear that the moderator in question is behaving outside of your community’s guidelines, warnings, revocation of their moderator status, and even a ban from the community can be applied with discretion based on the severity of their actions.

What’s next? Make sure your moderators are deeply familiar with your community guidelines, both for themselves and for the community. As unfortunate as it is to have to warn, de-mod, or ban someone that many in your community are likely close with through regular interactions, choosing not to reprimand behavior outside of community guidelines is tantamount to condoning it. By taking action with consistency, even against community moderators, friends, or VIPs, your community members will continue to feel welcome, protected, and confident that the guidelines apply across the board.

As you can see, a lot goes into keeping your community safe and secure. Building a healthy, thriving community takes planning and proper moderation. After all, without it, the bad apples don’t just spoil the bunch - they destroy the entire orchard. Don’t let your community be destroyed by a bad apple. Get the insights, strategy, and tools you need to manage and moderate your Discord community with HomeRoom.