The RADAR: What really caused Facebook’s latest censorship wave?

Human rights defenders document protests in Medellín, Colombia. Photo by Humano Salvaje (CC-BY-SA-2.0)

This is the RADAR, Ranking Digital Rights’ newsletter. This special edition was sent on May 26, 2021. Subscribe here to get The RADAR by email.

It was not just a glitch. In recent weeks, incidents of censorship on Facebook and Instagram spiked, frustrating activists and journalists working to document protests in Colombia, violence in occupied Palestine, and the public health crisis in India. Similar patterns have emerged for Twitter users in the latter two locales. Facebook also removed a swathe of posts referencing the Al-Aqsa mosque in the Old City of Jerusalem. Its systems then added insult to injury, notifying users that posts were removed because they were associated with “violence or dangerous organizations.”

RDR joined statements issued by SMEX and ARTICLE 19 denouncing this wave of censorship, demanding that the companies be more transparent about how and why they carry out these types of removals, and offering recommendations drawing on RDR’s corporate accountability standards.

While some content has been restored, many questions remain. Spokespeople for both companies have attributed the problems to technology, not human decision-making, using terms like “technical error” and “glitch.” A tweet from Instagram Comms explained that “…it took us such a long time to figure out what was taking place…because this [was] an automated deployment…” In other words, a machine did it.

What kinds of machines have such supreme decision-making powers that they can cause thousands of pieces of evidence of human rights violations to disappear from a platform, without any human involvement? We know that algorithmic systems lie at the center of this story, but we don’t know much more.

In the 2020 RDR Index, we showed how some of the world’s most powerful tech companies offer no actionable public information about how their algorithms are built or how they’re meant to work, and we pointed to examples of just how harmful this can be for people’s rights. This major wave of recent takedowns proves our point.

In the coming months, we will release a bite-sized report on one of everyone’s favorite new companies to scrutinize: ByteDance! This will mark the launch of an expanded research agenda at RDR, where we’ll be using new methods to investigate and report on decisions made by algorithms.

Google data center pipes. Photo by Jorge Jorquera (CC BY-NC-ND 2.0)

Putting a check on Big Tech: Access Now letter campaign

What can companies do to prevent such consequences in the future and improve their human rights records? We’ve got answers to this question for every company we rank! Last week, we launched our annual joint campaign with Access Now and the Business & Human Rights Resource Centre to pressure each of the 26 companies in the RDR Index to make one single change to its policies or practices.

Read our recommendations

The new oil: Google’s data center deal with Saudi Arabia
Alongside Human Rights Watch, SMEX, and other NGO partners, RDR co-authored a public letter asking Google to go on the record about a pending agreement to build a massive new data center in Saudi Arabia.

The center will store troves of personal data belonging to organizations in the Kingdom and throughout the Arab region, including media and human rights groups. We are pushing Google to publish evidence that it has carried out sufficient human rights due diligence on this deal, given Saudi Arabia’s notorious use of digital tools to spy on and persecute critics like Jamal Khashoggi. Evidence has shown that the late Washington Post contributor was heavily surveilled by Saudi authorities prior to his 2018 assassination at the Saudi consulate in Istanbul.

Farewell, Rebecca!

It is with heavy hearts that we bid farewell to our founder, Rebecca MacKinnon, who will conclude her work with RDR at the end of this month. A leading advocate for freedom of expression and privacy online since 2004, Rebecca conceived and established RDR in 2013 and led our program until 2019, when she elected to step back and seek new leadership for the organization. She found this in Jessica Dheere, who became our director in September 2020.

Read our tribute post →

RDR media hits

Columbia Journalism Review: RDR Senior Policy Analyst Nathalie Maréchal contributed to a CJR Galley discussion about the Facebook Oversight Board, alongside legal scholars Kate Klonick and Evelyn Douek, journalist Alan Rusbridger, and CJR’s Mathew Ingram. “I do not think it is possible to adequately govern online expression without examining the business model,” argued Maréchal. Read via Columbia Journalism Review

Slate: Rebecca MacKinnon wrote an op-ed arguing that Facebook’s biggest problem lies in its actual board of directors, not the much-publicized Oversight Board. This week, Facebook shareholders will vote on a proposal urging Zuckerberg to relinquish his role as board chair to an independent chair. “Ready or not, Facebook might then have to experience real oversight by its actual governing board,” she wrote. Read via Slate

Consumer Reports: New research shows that advertisers have been targeting teenage Facebook users with promotions for subjects that the company explicitly prohibits in its own advertising policy, including gambling and eating disorders. “Enforcing its own rules for advertising is the bare minimum Facebook should be doing,” said RDR’s Nathalie Maréchal, speaking with CR author Kaveh Waddell. Read via Consumer Reports

Drzavljan D: RDR Research Director Amy Brouillette spoke on Drzavljan D, a podcast focusing on the information society and the media. “Companies have had a longer history of interacting with regulators and not really confronting demands from the public, particularly telecommunications companies,” said Brouillette. Listen via Drzavljan D

Where to find us

RightsCon 2021 | Maximizing company transparency in the majority world
June 11 at 8:30 AM ET | Register here
How can tech companies break down barriers to transparency and how can the human rights community motivate them to do so? Join RDR Company Engagement Lead Jan Rydzak alongside company representatives, civil society members, investors, and researchers to identify strategies to hold companies accountable to users.

Global Solutions Summit | Liberal discourse and values on the internet
May 27 at 16:00 CET/10:00 AM ET | Register here
Rebecca MacKinnon will join a panel at the Global Solutions Summit, an event hosted in cooperation with the German Federal Ministry of Justice and Consumer Protection. Fellow panelists will include German Minister Christine Lambrecht and European Commission Vice President Věra Jourová.

Put us on your radar! Subscribe to The RADAR to receive our newsletter by email.
