Please make sure you have read the above note on transparency reporting indicators before using this indicator.
The company should regularly publish data about requests to remove, filter, or restrict access to content or accounts that come through private processes.
Elements:
- Does the company break out by country the number of requests to restrict content or accounts that it receives through private processes?
- Does the company list the number of accounts affected?
- Does the company list the number of pieces of content or URLs affected?
- Does the company list the reasons for removal associated with the requests it receives?
- Does the company clearly disclose the private processes through which the requests were made?
- Does the company list the number of requests it complied with?
- Does the company publish the original requests or disclose that it provides copies to a public third-party archive?
- Does the company report this data at least once a year?
- Can the data be exported as a structured data file?
- Does the company clearly disclose that its reporting covers all types of requests that it receives through private processes?
Definitions:
Account / user account — A collection of data associated with a particular user of a given computer system, service, or platform. At a minimum, the user account comprises a username and password, which are used to authenticate the user’s access to their data.
Account restriction – The company limits, suspends, deactivates, deletes, or removes a specific user account or permissions on a user’s account.
Clearly disclose(s) — The company presents or explains its policies or practices in its public-facing materials in a way that is easy for users to find and understand.
Content – The information contained in wire, oral, or electronic communications (e.g., a conversation that takes place over the phone or face-to-face; the text written and transmitted in an SMS or email; photos, links, or text posted on social media).
Content restriction – An action the company takes that renders an instance of user-generated content no longer visible on the platform or service. This action could involve hiding or withholding content (for example, when a company deems that the content violates the local laws of a given country and decides to remove it from view for users located in that country), or deleting the content altogether, meaning the content has been removed from the platform or service and is no longer visible to anyone.
Public third-party archive — Ideally, companies publish information about the requests they receive so that the public has a better understanding of how content gets restricted on the platform. Companies may provide information about the requests they receive to a third-party archive, such as Lumen (formerly called Chilling Effects), an independent research project that manages a publicly available database of requests to remove online content. This type of repository helps researchers and the public understand the types of content that are requested for removal and distinguish legitimate from illegitimate requests.
Indicator guidance: Companies frequently receive requests through private processes to remove, filter, or restrict content or accounts, such as requests made under the U.S. Digital Millennium Copyright Act or the European Right to Be Forgotten ruling, or through self-regulatory arrangements (e.g., company agreements to block certain types of images).
We expect companies to regularly publish data about the number and type of requests received through these private processes, and the number of such requests with which they comply.
Potential sources:
- Company transparency report