
Changing how we talk online: How we’re fighting toxicity and promoting quality conversations

By Mitch Hansen and Kristen Dunleavy

It’s no secret: we’re facing a crisis of incivility. Racism, hate, personal attacks, and other toxic behaviors flow too freely online. This stifles free expression, making it difficult to have the important conversations that let us grow, learn, and move society forward. And, as we’ve seen all too frequently, this online hate can spill into the offline world with disastrous and tragic effects.

It’s time to change the way we talk online.

At OpenWeb, we’ve developed powerful, multi-layered moderation technology that combats toxicity and promotes quality conversations. You may have heard about our work with Jigsaw’s Perspective API to develop AI-enhanced moderation, or how we incentivize quality through interventions in the user experience.

But what exactly makes a “quality conversation”?

Most would agree that a quality conversation is thoughtful, interesting, and productive: the kind of conversation that helps us learn through exposure to different perspectives (imagine a respectful debate). That’s correct – but there’s more to it.

Quality conversations are good for society, and they’re also good for publishers.

Publishers who leverage effective moderation and host high-quality conversations build strong communities – which means audience growth, retention, registrations, and the creation of valuable first-party data. This empowers publishers to forge a clear path forward in an era when traditional social media platforms still monopolize most users’ time online.

In this post, we’ll explore what quality is, how OpenWeb’s multi-layered moderation technology works, and how we create quality conversations across 1,000+ publishers.

How do we define quality?

Through our extensive experience building thousands of online communities and collecting feedback from millions of users, we know that quality conversations have these four main attributes:

  • Fact-based
  • On-topic
  • Informative
  • Focused on ideas, not people

Our hybrid moderation engine promotes quality by surfacing the most productive comments. Let’s take a look at how it works below.

How we combat toxicity

At OpenWeb, we approach quality as a framework where publishers are empowered to set their own rules for their community and enforce those rules using our moderation. Our moderation tech is both scalable and customizable, and our partners determine what behavior is and isn’t permitted in their community.

But how do we fight toxicity, practically speaking? 

Our AI/ML-powered moderation tech kicks in as soon as a user submits a comment, with toxicity filters that can detect semantic patterns, incivility, author attacks, spam, and more.

Our OS tracks behavior at the user level to build civility profiles, tailoring moderation controls based on each user’s history of contributions.

Community members can also flag any comments they feel are out of bounds. Finally, OpenWeb’s staff of human moderators are available 24/7 to provide manual moderation when needed.
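To make the layering concrete, here is a minimal sketch of how such a pipeline could fit together: an automated toxicity score runs first, a user’s civility history nudges the threshold, and borderline comments are escalated for human review. All names, thresholds, and the keyword-based scorer are illustrative stand-ins, not OpenWeb’s actual implementation.

```python
from dataclasses import dataclass

TOXICITY_THRESHOLD = 0.8  # illustrative cutoff; scores above this are rejected

@dataclass
class UserProfile:
    """Tracks a user's moderation history to build a civility profile."""
    user_id: str
    approved: int = 0   # past comments approved
    rejected: int = 0   # past comments rejected

    @property
    def civility(self) -> float:
        """Fraction of past comments approved; new users start neutral."""
        total = self.approved + self.rejected
        return self.approved / total if total else 0.5

def toxicity_score(text: str) -> float:
    """Stand-in for an ML classifier (e.g. a Perspective-style API call)."""
    flagged_terms = {"idiot", "stupid"}
    words = text.lower().split()
    hits = sum(w.strip(".,!?") in flagged_terms for w in words)
    return min(1.0, hits / max(len(words), 1) * 5)

def moderate(text: str, user: UserProfile) -> str:
    """Return 'approve', 'reject', or 'review' (human escalation)."""
    score = toxicity_score(text)
    # A strong track record earns a slightly higher bar before rejection.
    threshold = TOXICITY_THRESHOLD + 0.1 * (user.civility - 0.5)
    if score >= threshold:
        return "reject"
    if score >= threshold * 0.5:
        return "review"   # borderline: hand off to a human moderator
    return "approve"
```

A real system would replace the keyword scorer with a trained model and feed user flags into the same escalation path; the point here is only the layering of automated, history-aware, and human steps.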

Quality creates more quality

Reducing toxicity is critical for quality conversations, but it’s just the beginning of our approach to moderation. We believe that anyone can contribute to quality conversations in the right environment. One way we do this is by incentivizing users for their productive contributions to the community. Users can earn reputation points based on the thoughtful comments they post, and publishers have the ability to recognize these power users by featuring their comments at the top of the conversation.

But that’s not all: at OpenWeb, we know that when users are exposed to quality comments, they are more likely to create quality comments themselves. Our moderation tech recognizes the highest-quality comments, and our “Best” Sorting algorithm automatically elevates them to the top of the conversation, where they are exposed to more people.

When we compared “Best” Sorting to “Newest” sorting (or, sorting by the newest comments as opposed to the highest quality comments), we saw a 12% lift in reading time and a 5% lift in safety rate.
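The difference between the two sort orders can be sketched in a few lines. The quality score below – engagement discounted by toxicity – is a hypothetical formula for illustration only, not OpenWeb’s actual ranking algorithm.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Comment:
    text: str
    likes: int
    replies: int
    toxicity: float       # 0.0 (civil) .. 1.0 (toxic)
    posted_at: datetime

def quality_score(c: Comment) -> float:
    """Illustrative score: engagement, weighted down by toxicity."""
    engagement = c.likes + 2 * c.replies  # replies weighted more: they seed discussion
    return engagement * (1.0 - c.toxicity)

def sort_best(comments: list[Comment]) -> list[Comment]:
    """'Best' sorting: highest quality score first."""
    return sorted(comments, key=quality_score, reverse=True)

def sort_newest(comments: list[Comment]) -> list[Comment]:
    """'Newest' sorting: most recent comment first."""
    return sorted(comments, key=lambda c: c.posted_at, reverse=True)
```

Under this toy scoring, a heavily liked but toxic comment that tops the “Newest” feed sinks in the “Best” feed, while a civil comment that sparked replies rises – the behavior the reading-time and safety-rate lifts suggest.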

Quality is good for publishers

Quality conversations are essential for growing a loyal, registered user base. At OpenWeb, users who take part in quality conversations spend more time on-site and come back to the publisher’s site more frequently. If you’re curious about what quality conversations can do for you, we have a tool for that: our Impact Calculator will tell you what you can achieve with OpenWeb. Try it now.
