Improving Corporate Transparency Reporting

Technology companies face mounting pressure to address certain types of content on their platforms. And while some content can be problematic and deserves to be addressed, Ranking Digital Rights emphasizes the need for companies to develop accountable, fair, and consistent practices when doing so.

To that end, RDR’s 2015 Corporate Accountability Index includes several indicators on companies’ transparency reporting practices with regard to free expression and privacy. These indicators examine the extent to which companies explain their processes for evaluating third-party requests for content restriction or for access to user data, report data about the volume and nature of those requests, and report on enforcement of their terms of service.


RDR has written a white paper, “Ranking Digital Rights Findings on Transparency Reporting and Companies’ Terms of Service Enforcement,” summarizing the Index findings on these indicators and providing recommendations on how companies can improve their disclosure and reporting, particularly with respect to content restriction. These recommendations include:

  • Companies should specify what services or platforms their transparency reporting covers.
  • Companies should expand their transparency reporting to include requests from private parties as well as those from governments.
  • Companies should provide enough granularity in their reporting to give the public a clear picture of the scope and implications of company actions.
  • Through terms of service and other community standards-type documents, companies already disclose information about the circumstances in which they restrict content; they should take the next step and report data about the volume of actions they take to enforce these rules with respect to different types of content.

Since the Index data was finalized, companies have taken steps in the right direction. Last October, Microsoft released its first content removal requests report, which includes data on government requests, copyright infringement requests related to Bing search results, and requests received under the European Court of Justice’s “right to be forgotten” ruling. Shortly after the Index was released in November, Facebook updated its transparency report to specify that it covers requests related to Facebook, Messenger, WhatsApp, and Instagram.

In February, Twitter became the first company ranked in the Index to disclose some information on the actions it takes to enforce its Terms of Service. It disclosed that it has suspended more than 125,000 accounts “for threatening or promoting terrorist acts,” which violates the Twitter Rules. Twitter’s content removal transparency report covering the second half of 2015 also disclosed the number of times the company received legal requests to restrict content and complied because the content violated its terms of service. (Twitter’s reporting does not include content removal requests received by the company’s customer support team through online forms.)

These company actions demonstrate that there is momentum toward disclosing more information related to content restriction. We hope RDR’s findings and recommendations can help those who advocate for greater transparency reporting from companies.
