F7. Data about private requests for content or account restriction

The company should regularly publish data about private requests to remove, filter, or restrict access to content or accounts.

Elements
  1. Does the company break out the number of requests it receives by country?
  2. Does the company list the number of accounts affected?
  3. Does the company list the number of pieces of content or URLs affected?
  4. Does the company list the reasons for removal associated with the requests it receives?
  5. Does the company describe the types of parties from which it receives requests?
  6. Does the company list the number of requests it complied with?
  7. Does the company publish the original requests or disclose that it provides copies to a public third-party archive?
  8. Does the company report this data at least once a year?
  9. Can the data be exported as a structured data file?
  10. Does the company clearly disclose that its reporting covers all types of private requests that it receives?

Research guidance

Companies frequently receive requests from private parties, via non-governmental and non-judicial processes, to remove, filter, or restrict access to content or accounts. We expect companies to regularly publish data about the number and type of requests received through such private processes, and the number of those requests with which they comply. This indicator focuses on requests that come through a defined or organized process. This can be a process established by law (e.g., requests made under the U.S. Digital Millennium Copyright Act or the European Right to be Forgotten ruling) or a self-regulatory arrangement (e.g., company agreements to block certain types of images). This indicator does not examine company reporting on content or accounts restricted through terms of service enforcement mechanisms; that reporting is evaluated in indicator F4.

Potential sources:

  • Company transparency report