Learning from the Most Disastrous Experiment in Content Moderation

By Kristen Dunleavy

When Parler recently announced its return to Apple, it was on the condition that the app improve its moderation efforts to remove harmful content from its platform. Parler previously used very little moderation—and as a result, it was booted by both Apple and Google earlier this year for allowing the spread of violent rhetoric around the attacks on the US Capitol.

Parler touted itself as the “free speech” alternative to Twitter and Facebook—but there’s a difference between promoting free speech and enabling harmful online behavior like hate speech, trolling, misinformation, and more. Free speech shouldn’t be a free-for-all; it requires protections to ensure participants can freely express themselves without fear.

To protect free speech online, we need to create environments where all people feel invited to speak freely and engage in the types of debates that help us move forward as a society. That is, we need a baseline of safety to make free expression possible.

Effective content moderation—like OpenWeb’s AI/ML-powered tech—allows free speech to flourish. How? Moderation reduces toxicity, creating safer online spaces. Even more importantly, content moderation surfaces the best and most thoughtful comments, leading to higher quality conversations and healthier debates.

This post will explore how moderation creates safe environments where quality conversations can flourish—and why these environments are critical for free speech online.

Safety is a bare necessity for free speech online

Toxicity can quickly spin out of control in any online community. In fact, 38% of adults say they see trolling on social media every day.

No one wants to become bait for trolls simply for voicing their opinion. That’s why safe, civil, online environments are critical for encouraging free speech—and effective moderation makes these environments possible.

All online platforms can—and should—set their own standards for content moderation. They can remove bad actors and flag any speech that goes against their guidelines. It’s important to note that moderation and censorship are not the same. Censorship is the complete suppression of free speech, while moderation makes free speech possible by weeding out toxicity.

In environments with less toxicity, meaningful conversations aren’t competing with trolling and harassment—and it’s a lot easier for more people to join the discussion.

Quality allows free speech to flourish

Safety is the baseline for creating environments that encourage free speech online. Content moderation not only creates safer online spaces, it promotes higher quality conversations.

Reducing toxicity is only part of moderation’s job. Since moderation surfaces the most thoughtful comments, it leads to more productive conversations and fewer dead-end, off-topic musings. It’s also less likely that questionable content will take center stage, distracting attention away from the most meaningful conversations.

Online spaces with higher standards of quality—where rational, thought-provoking discussions can take place—encourage more people to join in and lead to the healthy debates that we need as a society.

Both safety and quality are essential for free speech online

Parler is just one example of what can happen when a platform uses too little content moderation. Instead of providing a safe haven for free speech, the platform let everything in—including the harmful content that ultimately led to its ban. 

A baseline of moderation is essential for free speech online. Communities that use effective moderation can reduce toxicity, leading to higher quality conversations where more people can have their voices heard.

Healthier online environments aren’t a pipe dream, either: there are already thousands of publishers growing thriving online communities where their users can freely exchange ideas, without toxicity taking over.

How are they doing it? OpenWeb helps publishers become the hosts of online discourse using our quality conversation platform. We do this using multi-layered, AI/ML-driven moderation technology to create healthier online conversations. Learn more about our approach to quality.

Let’s have a conversation.

Right now OpenWeb has a limited number of partners we can work with in order to provide the highest quality service to each and every one. Let us know you’re interested and stay informed about how OpenWeb is empowering publishers and advertisers to change online conversations for good.