Santa Clara Principles 2.0: Updating standards for transparency in content moderation


By Zak Rogoff & Nathalie Maréchal

Identifying content moderation solutions that protect users’ rights to free expression and privacy is one of the toughest challenges we face in the digital era. Around the world, digital platforms are getting due scrutiny of their content moderation practices from lawmakers and civil society alike, but these actors often disagree on how companies might do better. They also routinely fail to consult the people and communities most affected by companies’ failures to moderate content fairly.

But there is agreement in some areas, such as the need for more transparency. Indeed, there is a growing global consensus that companies should be far more transparent and accountable about how they create and enforce content rules than they are at present.

Today, we’re excited to join colleagues from around the world for the launch of the second edition of the Santa Clara Principles on Transparency and Accountability in Content Moderation, a civil society initiative to provide clear, human rights-based transparency guidelines for digital platforms.

Launched in 2018, the original Santa Clara Principles laid out essential transparency practices that companies could adopt in order to enable stronger accountability around their content moderation practices. The second edition of the principles builds on this work by acknowledging the particular challenges that companies must confront around the world, and by explicitly extending the principles to apply to paid online content, including targeted advertising.

To do this work, Ranking Digital Rights joined more than a dozen civil society organizations in seeking feedback on the original set of principles from a range of stakeholders around the world, to ensure the revised edition would reflect the needs of the diverse and growing body of people who use digital platforms. Our goal was to share our expertise in human rights benchmarking and to encourage the coalition to publish standards that align with our own on governance and freedom of expression, which we have used to evaluate the world’s most powerful tech and telecom companies since 2015.

In particular, we made the case that when it comes to targeted advertising, companies should be held to a level of scrutiny and transparency equal to or higher than that applied to the moderation of user-generated content. Beyond protecting the freedom of expression of advertisers themselves, this will help digital platforms take steps to prevent advertising that discriminates or that misleads, harasses, or otherwise interferes with users’ freedom of expression and information rights.

Independent research and reporting has shown that platforms do not adequately enforce national advertising laws, and that they sometimes even violate their own consumer protection-oriented rules. Transparency reporting is a necessary first step toward accountability in this area. Since our 2020 methodology revision, RDR’s indicators have advanced clear standards for advertising transparency that have influenced this and other important policy advocacy efforts.

Read the revised Santa Clara Principles.

