Hate speech. Viral disinformation campaigns. Political polarization propelled by targeted ads.

Pressure is mounting on policymakers to hold internet platforms liable for these kinds of online speech. So far, most regulatory options focus on curtailing free expression in a way that could threaten protections offered by the First Amendment and Article 19 of the Universal Declaration of Human Rights.

On March 17, RDR will publish our first major policy report, “It’s Not Just the Content, It’s the Business Model: Democracy’s Online Speech Challenge.” The report will point to ways to regulate companies while protecting freedom of expression online.

Note: Our March 17 event, “It’s Not Just the Content, It’s the Business Model: Democracy’s Online Speech Challenge,” is cancelled due to evolving concerns around coronavirus (COVID-19) and changes in speaker availability. We apologize for any inconvenience this may cause.

Download: Ranking Digital Rights’ response to Facebook on the Oversight Board bylaws, trust, and human rights review

Today, Facebook released the highly anticipated bylaws for its Oversight Board, the soon-to-launch independent body that will allow users to appeal the company’s content moderation decisions before independent panels of policy experts.

We at RDR think this experiment in internet governance shows real progress toward new models of content moderation that protect and promote human rights. The bylaws reflect improved remedy with binding results, establish commitments to disclose data, and implement some of the recommendations of a third-party human rights review commissioned by Facebook. At the same time, they reveal that much work remains to be done for these models to succeed and endure. Universal human rights principles should play a central role in the Oversight Board’s processes and structures, and its scope should extend to Facebook’s due diligence mechanisms and algorithmic oversight.

The release of the bylaws follows last month’s announcement that the Oversight Board will operate under an independent trust and the publication of a third-party human rights review of its creation and prospective operations, conducted by BSR. Today, we are publishing a full response to all three developments.

Facebook’s Oversight Board (sometimes referred to as Facebook’s ‘Supreme Court’) has been a long time coming. The company has faced a barrage of criticism in recent years for its lackluster responses to hate speech, disinformation campaigns, and attempts to incite violence through the platform, among other content issues. RDR itself has pushed hard for greater transparency around the company’s Community Standards, which govern what can and cannot be expressed on the platform. There have also been calls for greater transparency around the mechanisms controlling how some voices are amplified on Facebook while others are silenced or obscured.

The Oversight Board, first announced by CEO Mark Zuckerberg in a 2018 blog post, will seek to address some of these shortcomings by offering a binding grievance mechanism unswayed by Facebook’s influence.

In May 2019, reflecting on the Oversight Board’s draft Charter, we argued that the creation of an independent governance and appeals mechanism for content moderation is both critical and timely. The 2019 RDR Index revealed profound gaps in Facebook’s remedy and grievance mechanisms, which were among the weakest of any ranked company. RDR has strongly advocated for Facebook to incorporate the Santa Clara Principles on Transparency and Accountability in Content Moderation into its appeals processes, thus embracing a roadmap to a system of remedy grounded in human rights principles.

The Oversight Board’s newly released bylaws show signs of progress toward an appeals mechanism – a way for users to formally appeal Facebook’s decisions to remove or preserve controversial pieces of content – that may really work in practice. Targeting some of the weaknesses RDR has identified in Facebook’s existing remedy processes, they provide clear timeframes for most aspects of the Oversight Board’s operations and elaborate on the data that will be disclosed as the Oversight Board carries out its mandate. These are promising developments in the direction of transparency and accountability.

But there is significant room for improvement. First and foremost, human rights norms could play a much larger role in both the bylaws and the Charter. From the inception of Facebook’s public consultations on the Oversight Board, RDR has pressed for the Board to be anchored in universal human rights principles, which apply to companies through the UN Guiding Principles on Business and Human Rights. These norms should be a core component of the Oversight Board’s work and permeate its operations, as independent experts have argued repeatedly. We recognize the progress from the draft Charter, which made no reference to human rights norms, to the final Charter and bylaws, where the Board’s decision-making process includes assessing the impact of content removal on the right to free expression. But this neither covers the full spectrum of human rights nor equates to accepting human rights principles as the cornerstone. Facebook has made a commitment to human rights norms through its membership in the Global Network Initiative, whose Principles are grounded in them. The company should embrace and reiterate this commitment in every new endeavor, including the Oversight Board.

The bylaws also fail to adequately acknowledge the role of algorithms in promoting and amplifying problematic speech. In our present online reality, where platforms are no longer limited to keeping content up or taking it down, the Oversight Board should also have input on other decision-making options available to Facebook, including demotion and other algorithmic changes to the visibility of content. The Oversight Board should be able to issue these and other advisory opinions without having to be prompted by Facebook.

In December, Facebook also announced the creation of an independent trust tasked with supporting the regular operations of the Oversight Board, and shared an independent human rights review of the emerging body. RDR welcomes both announcements. We have long advocated for mechanisms that would ensure the Oversight Board’s independence from Facebook. Yet the risk of bias remains, as Facebook alone is responsible for selecting and appointing the trustees as well as the initial officers of the Oversight Board.

We also commend Facebook for commissioning a human rights review – and for sharing it publicly prior to the launch of the Oversight Board. Human rights impact assessments and similar structured due diligence mechanisms are in short supply across the industry. This publication has the potential to change that and lay a foundation for best practice. We also encourage Facebook to imbue the Oversight Board with the authority to provide advice on the company’s broader due diligence processes as they develop.

Facebook has accepted a great challenge in setting up the Oversight Board – it is putting forth a structure with the potential to set new norms for the governance of content moderation, not only on its own platform but across the internet. Given its dominant role in the industry as an enabler of online speech for billions of people across the globe, it is critical to get it right the first time. We welcome Facebook’s increased commitment to transparency and accountability. At the same time, the company should take note that this commitment will only take the Oversight Board so far in the absence of an explicit anchoring in universal human rights, which should underpin its design, launch, and evolution.


At a time of regulatory and geopolitical uncertainty, investors should seek tech companies that are using human rights standards to guide their work and build trust with users. Look for companies with policies, practices, and governance that go above and beyond baseline legal compliance. 

Today we release our Winter 2020 Investor Update. Our latest special edition for investors uses RDR’s 2019 Corporate Accountability Index results to show how leading companies are handling artificial intelligence, targeted advertising, content moderation, and other burning industry issues around which regulatory consensus has yet to form. We argue that to get ahead of regulatory risks, CEOs and boards need to take responsibility for the human rights risks and negative social impacts associated with their business models.

Digital rights issues have become increasingly important to investors. The number of shareholder resolutions addressing issues covered by the RDR Index has risen over the years, from just 2 in 2015 to 12 in 2019.

See this interactive table for a list of resolutions cross-referenced to RDR Index indicators. 

A strong theme across many of the proposals that made it onto proxy ballots in 2019 is the need for more responsible and accountable governance—particularly in relation to online speech, artificial intelligence, and privacy. While these resolutions lacked enough votes to pass (with some companies’ dual-class share structures making passage impossible), the sharpened focus and growing number of such resolutions point to a clear increase in investor concern about digital rights issues. Related resolutions are already being filed for 2020: the advocacy group SumOfUs cited RDR data in a resolution calling on Apple to promote freedom of expression, and the corporate responsibility organization As You Sow filed a resolution calling on Facebook to address disinformation and hate speech. We anticipate that shareholders will be at least as active on these and related issues in 2020 as they were last year.

Our recommendations to investors:

  • Look for companies that go beyond legal compliance to proactive stewardship. Rather than simply looking for how well companies are preparing to comply with anticipated regulation, investors should focus on companies that demonstrate good data stewardship, and proactively work to protect users’ human rights, whether or not the law compels them to do so.
  • Look for companies that conduct comprehensive oversight and impact assessments. By examining company performance on specific RDR Index indicators, investors can gain a more granular picture of specific types of risk. For example: The 2019 RDR Index highlighted the failure of Facebook, Google, and Twitter to conduct human rights impact assessments, leaving them ill-equipped to understand and mitigate the risks their practices pose to users.
  • Reward companies that take responsibility for their human rights impact on issues that lack regulatory consensus, like online speech. The media is awash with headlines about online extremism, hate speech, and disinformation. Debates about appropriate regulatory responses – from increasing intermediary liability to antitrust – make it harder to predict the regulatory future for online speech than for privacy. Under such circumstances, look for efforts by companies to be accountable to users and affected communities despite the absence of clear regulation. A key first step will be for companies to be more transparent about how they formulate and enforce rules for paid as well as organic user content. Greater disclosure will contribute to a more informed policy discussion about what types of rules will be most effective.
  • Hold companies accountable for the part they play in shaping our shared future. While the spotlight on the world’s most powerful tech giants is already strong, scrutiny of how their operations affect the public interest will only intensify in a highly volatile U.S. election year. At a time like this, corporate responsibility and accountability around advertising business models and algorithmic decision-making systems becomes even more important.

For more analysis and resources, see RDR’s investor resource page. If you are an investment professional, please consider participating in our investor survey.

RDR and Access Now pushed tech companies to adopt one human rights recommendation each. Here’s how they responded.

Ellery Biddle

India’s Jammu and Kashmir region has seen 55 internet shutdowns so far in 2019. These shutdowns, often triggered by conflict on the ground, have left residents unable to communicate, access information, exchange money, or carry out any other activity that the internet enables. While shutdowns happen at the behest of the Indian government, it is up to companies like Bharti Airtel—a dominant internet and mobile service provider in India—to actually cut off services.

This might lead Kashmiris to wonder: What exactly happens when Bharti Airtel receives a shutdown request from the government? Which government agency ordered the shutdown and for what purpose? In what areas is it taking place? How long is it expected to last?

If the company would make just one change—to publicly disclose all information about shutdown requests—customers would at least know what to expect, and where to begin when considering strategies for advocating change.

If you could ask a technology company to make one change in the way it treats user rights, what would it be? Our partners at Access Now asked this very question, focusing on the 24 companies that we score here at Ranking Digital Rights (see the table of recommendations).

This month, Access Now launched a campaign urging each company evaluated in our 2019 RDR Corporate Accountability Index to make just one public commitment to improve its human rights practices.

On October 3, open letters were sent to top executives at each company, asking them to commit to fulfilling one key recommendation drawn from the RDR Index. The letters also call for greater transparency and urgency in safeguarding privacy and freedom of expression for users.

Internet shutdowns and content censorship were major drivers of several recommendations. Letters to América Móvil, Bharti Airtel, MTN, Ooredoo, Orange, Telefónica, Telenor, Vodafone, and Yandex call for transparency around government and third-party requests to shut down networks, hand over user information, or block content.

Some companies in the RDR Index do not make explicit commitments to protect or promote human rights at all. The letters advise these companies—Axiata, Kakao, Mail.Ru, and Tencent—to make and publish formal commitments to uphold freedom of expression or privacy.

Other letters targeted more specific areas, such as the handling of user information (Etisalat), protecting net neutrality (AT&T), articulating data breach policies (Verizon Media), and notifying users when restricting access to content (Microsoft).

The letters also urge certain companies to strengthen governance and oversight (Apple and Deutsche Telekom), provide grievance mechanisms (Baidu, Google, and Samsung), and conduct human rights impact assessments of targeted advertising practices (Facebook and Twitter).

Without the kind of transparency that we and our partners at Access Now ask of these companies, they cannot truly be held accountable to the public. RDR is proud to renew its partnership with Access Now in engaging with these companies to advance rights-respecting policies and products. In our previous letter campaigns, 15 of 22 companies responded to our recommendations in 2018, and 12 of 16 responded in 2016.

This year, Access Now gave the companies two weeks to respond to the letters. You can find company responses to this year’s letters in our recommendations table and at the Business and Human Rights Resource Centre as they become available.

Earlier this year, we began a process of expanding the RDR Index methodology to address human rights harms associated with companies’ targeted advertising policies and practices, and with their use and development of algorithmic decision-making systems. Today, we are thrilled to publish draft indicators and to invite feedback from companies, policy experts, internet users, and other stakeholders.

Our goal with these new indicators, which will be piloted in the coming weeks, is to set global accountability and transparency standards, grounded in the Universal Declaration of Human Rights (UDHR), for how major, publicly traded internet, mobile, and telecommunications companies can demonstrate respect for human rights online as they develop and deploy these new technologies. A description of the methodology development process can be found on our website.

In the coming weeks, we will be assessing these draft indicators by conducting a pilot study, as well as seeking input from the companies ranked in the RDR Index. Company feedback is a key component of our approach, and will complement the input we received from other types of stakeholders throughout the year.

The draft indicators can be downloaded at the following link: “RDR Corporate Accountability Index: Draft Indicators: Transparency and accountability standards for targeted advertising and algorithmic decision-making systems.”

Please send feedback to methodology@rankingdigitalrights.org. We look forward to hearing from you. 

To stay informed about our progress and plans, please subscribe to our newsletter.