
What Happens When Algorithms Favor Engagement Over Quality?

By Kristen Dunleavy

Debating ideas is essential to a functioning democracy. We argue over theories of what to do and how to do it, over interpretations of a body of information, eventually arriving at a conclusion or a compromise. Over time, through this process, we uncover more truth. The sum of our actions helps move the whole of society forward.

Today, an increasing number of our conversations happen online, within social media platforms. These environments that favor clicks over quality—where algorithms often reward the worst behavior and it’s difficult to distinguish fact from fiction—are filtering how we interpret the world and changing how we communicate, corrupting the basis of what allows a healthy society to function.

In Anne Applebaum’s Atlantic piece, “How to Put Out Democracy’s Dumpster Fire,” she writes, “We don’t have an internet based on our democratic values of openness, accountability, and respect for human rights. An online system controlled by a tiny number of secretive companies in Silicon Valley is not democratic but rather oligopolistic, even oligarchic.”

So, where did we go wrong? This post will examine why civil discourse is so difficult in our existing social spaces, and why all hope is not lost for the future of the internet.

Social media’s algorithms are making productive discourse impossible

The algorithms that the social media giants have created are designed to capture attention by drawing forward the content that elicits the strongest emotional response—it could be puppies, but it could just as easily be hateful speech.

These algorithms manipulate our emotions by validating and exploiting our biases, feeding us more of what they think we want. And they’re quite good at this, driving engagement so targeted that it leads to the “echo chambers” we’ve all become familiar with.

In this environment, people who aren’t “like” us in one way or another begin to seem further and further away. That distance makes the assumption of good faith, which the kind of debate a functioning democracy depends on requires, harder and harder to maintain.

“If one half of the country can’t hear the other, then Americans can no longer have shared institutions, apolitical courts, a professional civil service, or a bipartisan foreign policy. We can’t compromise. We can’t make collective decisions—we can’t even agree on what we’re deciding,” Applebaum writes.

Dangerous content spreads quickly thanks to the scale of these platforms 

When a handful of platforms control the majority of online conversations, dangerous content can spread far and wide.

Healthy discourse can’t survive in a place where it’s often difficult to distinguish fact from fiction: on most social platforms, there is no accountability for bogus content that is created, shared, remixed, and shared again and again. Facebook alone has billions of monthly active users, creating an environment where misinformation and extremist content have gained serious traction in the past.

In his New Republic piece, “Making Sense of the Facebook Menace,” Siva Vaidhyanathan writes, “It’s as if no one working there in its early years considered the varieties of human cruelty—and the amazing extent to which amplified and focused propaganda might unleash it on entire ethnicities or populations.”

And since these algorithms favor popularity and controversy over quality, many users are continuously exposed to misinformation. Over time, it becomes more likely that they’ll adopt and share problematic content, continuing an ugly cycle.

Social media platforms aren’t healthy for journalism, either

Journalism plays a critical role in keeping the public informed on the facts and issues required to partake in healthy discourse. 

But today, journalists are increasingly required to play the algorithm’s game as well. Platforms like Substack and Patreon offer journalists independence, but they also push journalists further into the role of digital influencer, endlessly competing for followers and likes. In this model, the same algorithms affect journalists’ livelihoods by deciding what content is—or is not—surfaced to their hard-won social audiences.

Facebook’s recent announcement that they will allow independent journalists to host websites and newsletters threatens to accelerate this trend.

In his New Republic piece, “Facebook Has Found a New Way to Ruin Media,” Jacob Silverman writes, “Imagine Facebook—with its data-driven, amoral attitude toward publishing, where everything truly is just ‘content’ coming down the production line—deciding to give deals to some of its most popular media personalities.”

The solution? Social platforms that respect and empower journalists, publishers, and users 

Can social media giants do more to encourage healthy discourse on their platforms? Of course they can. But those in favor of a healthy democratic society need to seek out other environments where civil discourse can thrive.

“We must alter the design and structure of online spaces so that citizens, businesses, and political actors have better incentives, more choices, and more rights,” said Applebaum.

None of this can happen within a small handful of major, powerful platforms. We don’t need a new Facebook, or a better Twitter: we need platforms designed with the public’s best interests in mind, where all voices are heard and respected.

Algorithms that encourage civil discourse are part of the solution—and there is already progress being made toward more respectful social spaces.

At OpenWeb, we believe that quality conversations are a critical part of the path forward. We help publishers build safe and civil communities where people can debate, listen, and learn—where quality and accountability come first. In these environments—where everyone has a voice—we can work together to support a healthier democracy. Learn more about us.
