1. The company breaks out the number of requests it receives by country.
2. The company lists the number of accounts affected.
3. The company lists the number of pieces of content or URLs affected.
4. The company lists the subject matter associated with the requests it receives (e.g., copyright violation, hate speech, incitement to violence, child abuse images, etc.).
5. The company describes the types of parties from which it receives requests (e.g., requests made under a notice-and-takedown system, requests from a non-governmental organization, requests from a voluntary industry self-regulatory body, etc.).
6. The company lists the number of requests it complied with.
7. The company either publishes the original requests or provides copies to a third-party archive such as Chilling Effects or a similar organization.
8. The company reports this data at least once a year.
9. The data reported by the company can be exported as a structured data file.
Guidance: This indicator examines a company's disclosure of data on the requests it receives from private parties (i.e., non-governmental and non-judicial parties) to remove content. We expect companies to regularly publish data about the private requests they receive to remove content.
Evaluation: This indicator is scored using a checklist, meaning a company can receive full credit only if its disclosure meets all elements of the checklist.
Potential sources:
For more information, click here for a glossary of terms.