4. Moderation and safety

Image by ian boyd via Flickr

Why moderate?

“Why moderate?” says Sue John, Online Community Manager at Britishexpats.com. “Well mainly for the benefit of the community. It helps to keep things on topic, keeps information and conversation flowing, helps keep a lid on troublemakers and trolls. [Moderators] help and assist new members by welcoming them into the community.”

Alison Michalk, Editor & Community Manager at Fairfax Digital, agrees: “I think mods are akin to Traffic Control. They welcome and direct members to the best area… They have three roles - friendly participant, leader/rule enforcer and member advocate.”

Moderation is essential to a clean, healthy, vibrant community. A good moderator has a light touch - barely noticeable - and a well-moderated community is spam-free, troll-resilient and buzzing.

Moderation options

Most things in online community management are fluid, shades of grey (opinion, approach, even which metrics matter), but the options for moderation are fairly static (sketched in code after the list):

  • Pre-moderation: This is where content added to a community needs to be approved by a moderator before it goes live. This is particularly apt for communities aimed at children and vulnerable people. Webchats and ‘live’ Q&As will often be pre-moderated, with inappropriate questions or comments being weeded out before the chat subject ever sees them.
  • Post-moderation: This is where all content added to a community goes live straightaway but is then reviewed by a moderator and removed if necessary.
  • Reactive moderation: Where members and visitors flag up inappropriate content for moderators to review. This is more suitable for a community of adults, where topics aren’t particularly sensitive and the route for flagging content is simple and clear.
  • No moderation and self-moderation: Where no formal moderator reviews content and the community self-governs (or doesn’t as the case may be).
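
For the technically minded, here’s roughly how those four modes might route content. This is a minimal, hypothetical sketch - ModerationMode, publish and the queues are illustrative names, not any real platform’s API:

    from enum import Enum, auto

    class ModerationMode(Enum):
        PRE = auto()       # approved by a moderator before going live
        POST = auto()      # live straightaway, reviewed afterwards
        REACTIVE = auto()  # live straightaway, reviewed only if flagged
        NONE = auto()      # live straightaway, community self-governs

    def publish(post, mode, approval_queue, review_queue):
        """Route a new post according to the community's moderation mode."""
        if mode is ModerationMode.PRE:
            approval_queue.append(post)  # held back until a moderator approves
            return False                 # not visible yet
        post.live = True                 # everything else goes live at once
        if mode is ModerationMode.POST:
            review_queue.append(post)    # moderators sweep the lot later
        return True

Whatever your platform, the decision is the same: the mode determines whether content is held, swept or merely flaggable.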

Which is the best form of moderation?

“Reactive - ownership in hands of community,” tweets Mark Sheldon, Engagement Manager at Pluck, “requires less internal resource. Provide a ‘thumbs down’ to minimise abuse reports.”

He explains: “Thumbs down tends to weed out subjective abuse reports, but still gives the user the sense of having stated their dislike.”

It’s a valid point for a community of adults - seasoned online community users who are aware of their ability to self-regulate and of the methods for doing so.

I particularly like the idea of a thumbs down function, to stop people reporting posts just because they don’t like them rather than because they break any rules. However, balance is key, and in my opinion a complementary ‘thumbs up’ function should always accompany it.
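
To make the distinction concrete, here’s a minimal, hypothetical sketch (Post, react and report_abuse are my names, not any real platform’s): thumbs are counted on the post itself, while only claimed rule breaches ever reach the moderators’ queue.

    from dataclasses import dataclass, field

    @dataclass
    class Post:
        body: str
        thumbs_up: int = 0
        thumbs_down: int = 0
        abuse_reports: list = field(default_factory=list)

    def react(post, liked):
        """A taste judgement - counted, but no moderator is bothered."""
        if liked:
            post.thumbs_up += 1
        else:
            post.thumbs_down += 1

    def report_abuse(post, reporter, reason, mod_queue):
        """A claimed rule breach - this one does reach the moderators."""
        post.abuse_reports.append((reporter, reason))
        mod_queue.append(post)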

But for a community designed for children, for example, reactive moderation would be unsuitable.

So the type of moderation really does depend on what kind of animal your community is:

  • Who is the community aimed at?
  • Is it particularly at risk of malicious posting?
  • Does your membership feel comfortable with self-regulation?
  • Do you have the resources to pre-moderate quickly enough or will messages take too long to go live?
  • Is the subject matter particularly legally-sensitive?
  • Are children or vulnerable people going to be using it?
  • Is there a high chance of defamation e.g. a celeb gossip community?
  • How much control do you need rather than want?

Again, we come back to the importance of planning, and thinking strategically and honestly about why you are building a community, who you are building it for and how it should (and will) work.

To edit or not to edit

One potential tool in the moderator’s kit is ‘editing’. You have a fantastic, long, detailed post chock-full of conversation starters and open questions. And then one paragraph happens to include a couple of lines of pretty toxic accusations against another community member. You don’t want to lose the value of the whole post just because of this one paragraph, so you edit the bad bits out.

Do you?

If you do, you are running the risk of being held responsible for the content of the post as if you, yourself, wrote it.

Not only do you - as a moderator and the organisation you work for - become an editor, responsible for the content, but you run the risk of changing the meaning of someone else’s words and upsetting the community, making them feel invaded and trampled over.

If a post is fabulous, apart from one crucial bit, then it comes down to two options:

  1. If the dodgy content is time-sensitive i.e. it needs to come down NOW: take the whole post down.
  2. If it breaks your rules, but no laws, and you don’t feel it’s doing much damage, give the original author the option to edit it within a determined timeframe. If they don’t, take the whole thing down.

Good guidelines

  • Ensure that you have very clear, plain English guidelines, so that any moderation decisions are backed up by the rules that govern all members’ use of the site.
  • Contrary to myth, rules are there to be kept. Members agree to the rules of the site when they sign up, so don’t feel guilty or awkward about enforcing them.
  • Make sure the rules are clear - this makes it easier to be fair and consistent. It also stops it being personal i.e. as community manager you can legitimately say, “hey, it’s the rules, it’s not me!”
  • Situations will arise that aren’t covered in your guidelines. Use your intuition, talk it over with your team, then use the experience to inform adaptations and additions to your guidelines. Next time you’ll know what to do!
  • Record everything. Any warnings, any relevant contact with members, record it all - you never know when you’ll be asked to show your reasoning. Don’t worry if you have nothing more whizz-bang: just keep notes in a spreadsheet with a date and description (see the sketch after this list).
  • At FreshNetworks, we suggest a three strikes and you’re out policy, with immediate bans for serious offences. This is another reason notes and records are important. People will try to quibble!
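
As flagged in the list above, here’s a bare-bones, hypothetical version of that record-keeping with the strike count built in - a shared spreadsheet does the same job:

    import csv
    from datetime import date

    STRIKE_LIMIT = 3  # "three strikes and you're out"

    def log_action(member, description, logfile="moderation_log.csv"):
        """Append any warning or relevant member contact, with the date."""
        with open(logfile, "a", newline="") as f:
            csv.writer(f).writerow([date.today().isoformat(), member, description])

    def warn(member, reason, strikes, logfile="moderation_log.csv"):
        """Record a warning; return True once the member is out of strikes."""
        strikes[member] = strikes.get(member, 0) + 1
        log_action(member, "warning %d: %s" % (strikes[member], reason), logfile)
        return strikes[member] >= STRIKE_LIMIT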

Tough calls

Solid rules and guidelines help cut down grey areas, but tough calls will still present themselves. Often they come in the form of a new user who takes the time to write lots of very detailed, helpful, friendly posts that all contain a mega-whopping link to their eBay shop or affiliate program, or an active member who usually behaves impeccably but starts trying to agitate other members and slowly divide and conquer…

  1. Use your judgement - if a post doesn’t sit right, if you feel uncomfortable with someone’s language, the chances are that other community members will feel the same. Included with your judgement will be your recall of history and your records. If you’re unsure, check your records for previous activity like this, spend a little time looking at how the member is currently behaving, and how the community is reacting to their content.
  2. If you’re still unsure, ask for a second opinion. Sometimes a fresh pair of eyes and chatting it through will make all the difference. If it’s not time sensitive, give yourself five minutes, do something else, make a brew, and then come back to it calmer.

Fights and feuds

An arbiter of good sense in community management, Rich Millington blogs at FeverBee. He says that fights are good.

Wha…?

No, really. Fights (not malicious activity, but clashes of personality), he says, show that you’re doing a good job:

“Fighting is good for your community. It means that members care what other members think of them. You’re doing a good job. Seriously. If members are fighting you’ve created a close community…

“Remember why most people leave communities. Few leave a community because they get into a fight, most leave a community because it’s gotten boring.”

While I wouldn’t recommend provoking fights (nor taking part - an absolute no-no, these are not your fights), encouraging healthy debate and highlighting vibrant discussions isn’t something to shy away from. While the debate rages within the confines of the rules, your community is functioning well.

Dealing with feuds on an online community will test anyone’s mettle. They can go back years (particularly in mature communities), can be brought in from real-life relationships, can be the result of online relationships becoming offline relationships, and may involve cliques, troublemakers, deliberate campaigns… We didn’t say being a community manager was easy!

Big fat no-nos… there are a few

There are certain situations and certain content that undeniably must be moderated. Common sense will largely prevail, and almost any community manager or moderator would remove:

  • Illegal content
  • Explicit content (unless this suited the nature of the community)
  • Blatant spam
  • Clear defamation of a celebrity or known person

But what about repeating a widely repeated rumour about a celebrity? Or accusing a TV expert of not knowing enough about a subject?

Posting it on your community is not the same as repeating well-worn gossip to a friend in a bar. This is a no-no too.

Case in point: In 2007, Mumsnet.com, an online community started and managed by a group of mums in North London, paid author Gina Ford a five-figure sum to settle a libel claim.

Gina Ford, a well-known figure in the baby book market, advocates strict, routine-based methods that some members of the Mumsnet community took exception to, and allegedly defamatory comments were posted.

A legal fight ensued, with Justine Roberts, Mumsnet’s founder, telling the press the site’s 15,000 daily comments were “impossible to monitor unless you have eyes and ears everywhere”.

In this case, the reactive moderation was not reactive enough and it proved costly.



Comments

  1. Richard Caelius:

    Fantastic advice on the intricacies of community moderation. The article strikes a great balance between detail and overview and leaves the reader, presumably people that are responsible for UGC moderation within organizations, with actionable items that can be implemented and tracked.

    There are many good tips here, but one piece of advice I thought of as being particularly useful was the “Record everything”. Tracking of all interactions with users, as well as edits or removal of content can be useful in court, should this information ever be required as evidence. (I might just add to always track the exact time and IP address as part of your dataset)

    The only small qualification of the article is that it does not mention or discuss automatic software tools for moderation. These can be useful in some instances, where certain types of content are totally unacceptable and even a short publication of this type of content puts the firm hosting the system at risk of litigation. In those cases the risk of post-moderation can be mitigated through tools that can help qualify content and flag risky postings, or can auto-moderate. (The latter, as is stated, should be clearly outlined in the guidelines published for the community and there should always be a method to re-instate auto-moderated postings).
    In terms of technical alternatives, there don’t seem to be that many and one of the leading vendors of such automatic moderation services is called Keibi Technologies. (Keibi was just acquired by Lithium Technologies, so it remains to be seen how this offering might adapt through this change in control)

  2. Holly Seddon:

    Hi Richard,

    Thanks for your insightful and detailed comment! You’re right, I hadn’t included any forms of auto-moderation and I think that warrants some further coverage/thoughts. I wonder if a blog post on ‘tool kits for Community Managers’ would be a good idea; I would suggest auto-moderation software forms part of a collection of options for automating/simplifying part of the work, a little like a spell-check does for a writer…

    Thanks again,
    Holly

  3. Itaquera:

    Great article, thanks for sharing with us.
