
What Community Moderation Is—and What It Isn’t

By OpenWeb

Most people are familiar with the concept of community moderation—right?

It’s generally assumed that moderation weeds out the ‘toxic’ comments from a community. But what does ‘toxic’ even mean? 

At OpenWeb, our mission is to help publishers build healthy online communities. Our partners rely on us to create safe and civil environments on their sites. We do that using multi-layered moderation technologies powered by proprietary AI and machine learning algorithms.
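
To make that layered approach concrete, here’s a minimal sketch of what a multi-layered moderation pipeline can look like. It’s an illustration only: the banned-term list, the score thresholds, and the stubbed toxicity_score() function are hypothetical stand-ins, not OpenWeb’s actual models or rules.

```python
# A minimal sketch of a multi-layered moderation pipeline.
# Hypothetical illustration only; the rules, thresholds, and stubbed
# classifier below are stand-ins, not OpenWeb's implementation.
from dataclasses import dataclass

BANNED_TERMS = {"spamlink.example"}     # hypothetical publisher rule list
APPROVE_BELOW, REJECT_ABOVE = 0.3, 0.8  # hypothetical score thresholds

@dataclass
class Decision:
    action: str  # "approve", "reject", or "human_review"
    reason: str

def toxicity_score(comment: str) -> float:
    """Stand-in for an ML toxicity classifier returning a score in [0, 1].
    A real system would call a trained model here."""
    hostile_markers = ("idiot", "hate you")  # toy heuristic for the sketch
    hits = sum(marker in comment.lower() for marker in hostile_markers)
    return min(1.0, 0.5 * hits)

def moderate(comment: str) -> Decision:
    # Layer 1: deterministic rules are cheap and unambiguous, so run them first.
    if any(term in comment.lower() for term in BANNED_TERMS):
        return Decision("reject", "matched banned term")
    # Layer 2: score the comment and act on the confident ends of the scale.
    score = toxicity_score(comment)
    if score >= REJECT_ABOVE:
        return Decision("reject", f"toxicity score {score:.2f}")
    if score <= APPROVE_BELOW:
        return Decision("approve", f"toxicity score {score:.2f}")
    # Layer 3: the gray zone goes to a human moderator.
    return Decision("human_review", f"ambiguous score {score:.2f}")

if __name__ == "__main__":
    print(moderate("I strongly disagree with this article."))  # approve
    print(moderate("You idiot, I hate you."))                  # reject
```

The point of ordering the layers this way is that cheap, deterministic checks filter out the obvious cases, so the more expensive model and the human reviewers only ever see the ambiguous middle.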

We have a proven approach to using moderation to build thriving communities. If you’re a publisher, your ability to grow and sustain a community may depend on it.

So let’s break down what community moderation is, what it isn’t, and why it’s important.

What community moderation is

When moderation is done right, it provides a safe environment where everyone can have a voice online. It gives your users the opportunity to disagree, be opinionated, and speak freely and openly. Arguments can happen—in fact, they always do (and should). Disagreements, even stark ones, are a healthy part of civil discourse because they help move the conversation forward.

Moderation protects against toxicity. Toxic online behavior includes racism, hate speech, language that incites violence, trolling, and spam, to name just a few examples.

It’s important to know that disagreement and discord alone don’t equate to toxicity; a debate only crosses the line when it turns abusive.

Publishers who use community moderation can guard against toxicity in their communities, generating quality conversations. At OpenWeb, we’ve observed that users who are exposed to quality conversations are more likely to contribute productively to the conversation and to spend more time on-site.

What community moderation isn’t

Moderation is intended to prevent toxic behavior—not remove discordant comments. Users can and should criticize each other’s ideas, criticize the writers on your website, and, yes, criticize even you, the publisher brand. As long as they don’t violate your guidelines or use abusive language, they can voice positive thoughts, negative sentiment, and everything in between.

Walking that line is the key difference between moderation and censorship. Censorship is the suppression of free speech. Moderation, by contrast, allows free speech to flourish, producing productive conversations and furthering the growth of your community.

A baseline of moderation is necessary in all communities

Moderation isn’t about shutting down conversations; it’s about making them possible. Effective moderation leads to healthy communities that drive better results for publishers, including increased user engagement, time spent on-site, and retention.

At OpenWeb, we approach quality as a framework where publishers can set their own rules, enforce those rules using moderation tools, and incentivize users to create quality comments through exposure, rewards, and community privileges. Learn more about our approach to moderation.
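
As a rough illustration of that framework, the sketch below shows how publisher-defined rules and incentives might be represented and applied. The field names and values are hypothetical, not OpenWeb’s actual configuration format or API.

```python
# A toy sketch of the rules, enforcement, and incentives split described
# above. All field names and values are hypothetical illustrations, not
# OpenWeb's actual configuration format or API.

community_policy = {
    "rules": {
        "max_links_per_comment": 2,         # publisher-defined anti-spam rule
        "banned_terms": ["buy followers"],  # publisher-defined banned phrases
    },
    "incentives": {
        "quality_comment_points": 10,       # reward for a constructive comment
        "featured_threshold": 50,           # points needed for extra exposure
    },
}

def passes_rules(comment: str, policy: dict) -> bool:
    """Enforce the publisher's own rules; sentiment is never checked."""
    text = comment.lower()
    if text.count("http") > policy["rules"]["max_links_per_comment"]:
        return False
    return not any(term in text for term in policy["rules"]["banned_terms"])

def award_points(user_points: int, policy: dict) -> int:
    """Incentive step: credit a user whose comment passed moderation."""
    return user_points + policy["incentives"]["quality_comment_points"]

if __name__ == "__main__":
    comment = "I strongly disagree with the author, and here is why: ..."
    if passes_rules(comment, community_policy):
        points = award_points(40, community_policy)
        featured = points >= community_policy["incentives"]["featured_threshold"]
        print(f"approved; user has {points} points; featured: {featured}")
```

Note that passes_rules() only checks the publisher’s stated rules; whether the comment agrees or disagrees with the article never enters the decision. That is the moderation-versus-censorship distinction in practice.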

Let’s have a conversation.

Right now, OpenWeb works with a limited number of partners so we can provide the highest-quality service to each and every one. Let us know you’re interested, and stay informed about how OpenWeb is empowering publishers and advertisers to change online conversations for good.