Face it. You are one of the many people who mindlessly click “agree” when you sign up for online services. We all are. Most, if not all, of us have skimmed or skipped over the terms of service agreements that accompany our favorite websites and apps.
Why did you agree to something you didn’t read? Too long, right?
Take the example of iTunes, the popular online platform for purchasing and downloading music, movies, and other media. If you were to read through the entire iTunes terms of service agreement, you’d be faced with a document that’s over 20,000 words long and contains 56 sections. That’s getting into Game of Thrones territory.
As the world begins to turn away from toxic social feeds, the internet is starting to prioritize data privacy and transparency. As publishers, we want our users to trust our content, and in exchange they entrust us with their user data. But is that a fair exchange if users don’t really understand what they are agreeing to?
It doesn’t take long to find endless examples of websites interested only in protecting themselves at the expense of their users’ data.
In 2010, a company called GameStation included a clause in its terms of service agreement that required users to give up their immortal souls. Most users didn’t notice the clause, and over 7,500 people unknowingly agreed to the terms. While this was obviously a joke, it highlights just how absurd and exploitative some terms of service agreements can be.
But it’s not just companies trying to be funny – there have been examples of companies using their terms of service to do things that are decidedly less funny.
For example, in 2012, Instagram changed its terms of service to allow the company to sell users’ photos to advertisers without compensating the users. The change sparked outrage among Instagram users, and the company eventually backed down and reverted to its previous terms of service.
So what can we do about this problem?
One solution is for governments to step in and regulate companies – in this case, the terms of service agreements they are allowed to use.
One example of this is the recently proposed TLDR Act. This bill would require companies to provide summaries of their terms of service agreements in plain language, so that users can quickly and easily understand what they’re agreeing to.
And the TLDR Act is not unique to the US – similar attempts have been made in other countries as well. In 2018, for example, the European Union began enforcing the General Data Protection Regulation (GDPR), which includes provisions related to the transparency and readability of terms of service agreements. The GDPR requires companies to present their terms of service in a clear and understandable way, and also includes provisions related to user consent and the right to be forgotten.
Great – the governments have solved it – right?
Well, not exactly.
Often, where tech is concerned, regulators rely on the loudest voices in the room – concerned citizens, lobbyists, companies – to guide them on the details of their legislation, all with varying degrees of understanding of and investment in the issue. While this may result in sound laws being passed, the legislative process and advances in technology move at decidedly different paces. So in many cases the laws become outdated soon after taking effect, because the tech has already moved past the scenario originally in question.
Some jurisdictions legislate more heavily than others, too. There are currently over a dozen privacy regulations governing websites internationally that site operators need to comply with, many with overlapping or even contradictory requirements.
And then there is the issue of sanctions and oversight. Even when a valid law is passed, some government agency needs to be in charge of enforcing and policing it. These agencies are notoriously under-resourced, so reviews of violations can take years in some cases, allowing offenders to carry on blissfully until formal sanctions are levied.
A better solution for most tech issues that require oversight is for the industry to step up and figure out on its own how to identify and regulate the bad actors on specific issues – like creating better terms of service agreements that are easier for users to understand.
Since we all love a checks-and-balances moment, it’s even better if trade organizations can agree on standards and best practices by which to govern themselves. This way, those who are most in the know are helping check their peers, rather than relying on the googling abilities of a legislator’s intern.
For terms of service, that might mean having a recognized industry best practice of simplifying the language used in these documents, presenting key information more clearly, and perhaps even breaking up the agreement into shorter sections that are easier to digest.
As a Trust &amp; Safety professional, now appointed OpenWeb’s new General Manager of Trust &amp; Safety, I have spent my career navigating issues like these, helping brands with their privacy compliance and making their Privacy Policies (and Terms of Service) easier to read and understand.
This movement started with children’s and family brands, where we worked to help young users and their parents understand the sometimes complicated legislation aimed at protecting their personally identifiable information. I guided a couple of these children’s brands to lead by example: separating legal concepts into smaller chunks of information, using non-legal, conversational language where possible, and calling out the parts (with links) that the reader might need further information on.
Soon other children’s and family publishers started following our lead, choosing to partner with their audience members and share the accountability, rather than simply protecting themselves legally with those ubiquitous website footer links to eye-dizzying legal contracts.
And this is one of the main reasons I joined OpenWeb: its commitment to helping craft and promote best practices and set standards for both user communities and publishers – all in an effort to help build an open, healthier web for everyone.
At OpenWeb we believe that the future of the internet requires robust and vibrant online communities where publishers have direct relationships with their audience. By facilitating healthy conversations and experiences, audiences can feel safe and trust that their data isn’t going to fuel an attention-at-all-costs toxic social algorithm.
This core mission of OpenWeb resonates deeply with me. By taking the responsibility to lead in improving data transparency and user trust in the online world, we as companies can bring positive benefits for users, businesses, and publishers alike.
Joi Podgorny is a Trust &amp; Safety expert and Metaverse veteran who has spent the better part of the past two decades working on the bleeding edge of the technology and entertainment industries, from product management to content/brand development to leading international data privacy compliance, community management, and social media teams.
Joi is currently working with OpenWeb as the GM of Trust &amp; Safety, helping our publisher partners reach their goals with our products while helping define, create, and promote an internet with healthier conversations.