The RADAR: How do we solve the Facebook problem?

"Social Decay" artwork by Andrei Lacatusu, licensed for reuse (CC BY-NC-ND 2.0)

“Social Decay” artwork by Andrei Lacatusu, licensed for reuse (CC BY-NC-ND 2.0)

This is the RADAR, Ranking Digital Rights’ newsletter. This special edition was sent on September 23, 2021. Subscribe here to get The RADAR by email.

A bombshell series published last week by the Wall Street Journal shows how Facebook’s insatiable thirst for user data, and its consequent obsession with keeping users engaged on the platform, take priority over the public interest again and again, even when people’s lives and fundamental rights are at imminent risk. It also provides new evidence that Facebook does not follow its own rules when it comes to moderating online content, especially outside the U.S.

In a September 18 blog post, Facebook’s VP of Global Affairs Nick Clegg wrote that the stories contained “deliberate mischaracterizations” of the company’s work and motives, but he cited no specific examples of anything the stories got wrong. The evidence — internal documents, emails, and dozens of interviews with former staff — is difficult to refute. And it is not especially surprising. It builds on a pattern that journalists, researchers, and civil society advocates have been documenting for years.

One story in the series offers a litany of instances in which Facebook employees tried to alert senior managers to brutal abuses of the platform in developing countries, only to have their concerns pushed aside. Former employees told WSJ of company decisions to “allow users to post videos of murders, incitements to violence, government threats against pro-democracy campaigners and advertisements for human trafficking,” despite all these things going against Facebook’s Community Standards. A former executive said that the company characterizes these issues as “simply the cost of doing business.”

Consistent with years of reports and grievances from global civil society, and with more recent accounts from whistleblowers, these stories shed new light on Facebook’s long-standing neglect of human rights harms that stem from its platform but occur far from Menlo Park. One internal document showed that of all the time staff spent combating disinformation, only 13 percent was devoted to campaigns outside the U.S.

The company likely prioritizes content moderation in the U.S. because it faces the greatest regulatory and reputational risks on its home turf. But the U.S. is no longer the epicenter of its user base. Thanks to Facebook’s ruthless pursuit of growth in the global south, most of its users today do not live in stable democracies or enjoy equal protection before the law. As the platform endlessly connects users with friends, groups, products, and political ads, it creates a virtual minefield — with real life-or-death consequences — for far too many people worldwide.

Think back to late August, when social media lit up with messages from Facebook and Instagram users in Afghanistan frantically working to erase their histories and contact lists. The company offered some “emergency response” measures, allowing Afghan users to limit who could see their feeds and contacts. But on a platform that constantly asks you to share information about yourself, your friends, your activities, and your whereabouts, this is a band-aid solution at best.

In situations of violent conflict, contestation of political power, or weak rule of law, the protection of a person’s privacy can mean the protection of their safety, their security, their right to life. Matt Bailey underlined this in a piece for PEN America:

…in a cataclysm like the one the Afghan people are experiencing, this model of continuously accruing data—of permanent identity, publicity, and community—poses a special danger. When disaster strikes, a person can immediately change how they dress, worship, or travel but can’t immediately hide the evidence of what they’ve done in the past. The assumptions that are built into these platforms do not account for the tactical need of Afghan people today to appear to be someone different from who they were two weeks ago.

But this is not just a problem for people in Afghanistan, or Myanmar, or India, or Palestine, where some of the company’s more egregious acts of neglect have played out and received at least some attention in the West. The problem is systemic.

Facebook employees often cite “scale” as a reason why the company will never be able to consider every human rights violation or scrub all harmful content from its platform. But how exactly did Facebook come to operate at such an awesome scale? Perhaps more than any other social media platform, Facebook has cannibalized competitors and collected and monetized user data at an astonishing rate, putting both ahead of all other interests, including the human rights of its 3 billion users.

Our work at Ranking Digital Rights rests on the principle that regardless of scale, companies have a responsibility to respect human rights, and that they must carry this out (as written in the UN Guiding Principles on Business and Human Rights) “to avoid infringing on the rights of others and address adverse impacts with which they are involved.” We push companies to commit to respecting human rights across their business practices, and then push them to implement these commitments at every level of their organization. Facebook made such a commitment earlier this year. But to what end?

As evidence of its disregard for people’s rights continues piling up, Facebook’s promises ring hollow, as do its lackluster efforts to improve transparency reporting and carry out (and actually act upon) human rights due diligence. Today, leaks from whistleblowers and former employees seem like the only reliable source of information about how this company actually operates.

For us, this raises the question: How valuable is it to assess Facebook’s policies alone? In this case, and with some of the other tech giants we rank, would it be more effective to expand our focus to include leaks and other hard evidence of corporate practice?

We don’t have all the answers yet, but as revelations like these become more and more frequent, we will continue asking these questions of ourselves, our peers, and the companies we rank. If tech companies do not want to tell the world how they work, how they profit, and how they factor the public interest into their bottom line, we will need to find new ways to force their hand.

Facebook is an ad tech company. That’s how we should regulate it.

RDR Senior Policy and Partnerships Manager Nathalie Maréchal is calling on platform accountability advocates to start following the money when it comes to regulating Facebook. In a recent piece for Tech Policy Press, Maréchal wrote:

[We] need to reframe the ‘social media governance’ conversation as one about regulating ad tech. Facebook, Twitter, YouTube, TikTok and the rest exist for one purpose: to generate ad revenue. Everything else is a means for producing ad inventory.

Maréchal also spoke with The Markup’s Aaron Sankin about Facebook’s claim that it supports internet regulations that would mandate specific approaches to content moderation. We think content moderation is important and raises really difficult questions, but we can’t let this distract us from ads, which are the main driver of Facebook’s profits.

“…[As] long as everyone is focused on user content and all of its discontents, we are not talking about advertising. We are not talking about the money,” Maréchal said. Read via The Markup

Telenor mobile shop in Yangon, Myanmar. Photo by Remko Tanis via Flickr (CC BY-NC-ND 2.0)

Another company in crisis: Telenor’s fraught departure from Myanmar

In July, Norwegian telecommunications firm Telenor announced plans to sell its subsidiary in Myanmar to M1 Group, a Lebanese conglomerate with a record of corrupt practices and human rights abuses. Since then, it has come to light that the Myanmar military, which took control of the country in a February 1 coup, ordered telecommunications providers to install surveillance technology on their networks to help boost the military’s snooping capacity. The sale has yet to be approved by the military regime, and industry sources cited by Nikkei Asia suspect the deal may be rejected.

Human rights advocates in Myanmar and around the world have been pushing Telenor to uphold its human rights obligations and stand up to the military’s demands. In August, RDR joined a coalition letter to Telenor Group board chair Gunn Wærsted calling on the company to either cancel or pause the sale in order to carry out robust due diligence, including consultation with local civil society and publication of human rights impact assessments on the effects of the sale.

What’s new at RDR?

Changes are coming to the RDR Index! This spring, we looked back on five years of the RDR Corporate Accountability Index and made a major decision: In 2022, we will split our flagship research product into two separate rankings. Next April, we will release a new ranking of digital platforms. In October 2022, we expect to publish a new ranking of telecommunications companies. This approach will allow us to dedicate more time to studying the contexts in which these companies operate and to streamline our engagement efforts around all of the companies we rank.

The 2020 RDR Index, now in translation: The executive summary of the 2020 RDR Index is now available in six major languages: Arabic, Chinese, French, Korean, Russian, and Spanish! As in years past, we partnered with Global Voices Translation Services to translate these key components of our research. Check them out.

#KeepItOn: Campaign letters to prevent network shutdowns in Russia, Zambia, Ethiopia

As members of Access Now’s #KeepItOn campaign coalition to prevent network shutdowns worldwide, we have supported advocacy letters in recent months urging authorities in Russia, Zambia, and Ethiopia to keep networks on.

EVENTS
Tech Policy Press symposium | Reconciling Social Media & Democracy
October 7 at 1:00 PM ET | Register here
At this convening to discuss various proposals to regulate the social media ecosystem, Nathalie Maréchal will join panelists including Francis Fukuyama, Cory Doctorow, and Daphne Keller to promote an approach to corporate governance that can advance human rights.
