Please make sure you have read the note on transparency reporting indicators before using this indicator.
The company should clearly disclose its process for responding to requests to remove, filter, or restrict content or accounts that come through private processes.
Elements:
- Does the company clearly disclose its process for responding to requests to remove, filter, or restrict content or accounts made through private processes?
- Do the company’s explanations clearly disclose the basis under which it may comply with requests made through private processes?
- Does the company clearly disclose that it carries out due diligence on requests made through private processes before deciding how to respond?
- Does the company commit to push back on inappropriate or overbroad requests made through private processes?
- Does the company provide clear guidance or examples of implementation of its process of responding to requests made through private processes?
Definitions:
Account / user account – A collection of data associated with a particular user of a given computer system, service, or platform. At a minimum, the user account comprises a username and password, which are used to authenticate the user’s access to his/her data.
Account restriction / restrict a user’s account — Limitation, suspension, deactivation, deletion, or removal of a specific user account or permissions on a user’s account.
Clearly disclose(s) — The company presents or explains its policies or practices in its public-facing materials in a way that is easy for users to find and understand.
Content – The information contained in wire, oral, or electronic communications (e.g., a conversation that takes place over the phone or face-to-face; the text written and transmitted in an SMS or email; photos, links, or text posted on social media).
Content restriction — An action the company takes that renders an instance of user-generated content invisible or less visible on the platform or service. This action could involve removing the content entirely or take a less absolute form, such as hiding it from only certain users (e.g., inhabitants of a particular country or people under a certain age), limiting users’ ability to interact with it (e.g., making it impossible to “like”), adding counterspeech to it (e.g., corrective information on anti-vaccine posts), or reducing the amount of amplification provided by the platform’s curation systems.
Private requests — Requests made through a private process rather than a judicial or governmental process. Private requests for content restriction can come from a self-regulatory body such as the Internet Watch Foundation, or a notice-and-takedown system, such as the U.S. Digital Millennium Copyright Act. For more information on notice-and-takedown, as well as the DMCA specifically, see the recent UNESCO report, “Fostering Freedom Online: The Role of Internet Intermediaries” at http://unesdoc.unesco.org/images/0023/002311/231162e.pdf (p. 40-52 of 211).
Indicator guidance: In addition to demands from governments and other types of authorities, companies can receive requests to remove or restrict access to content and accounts through private processes. These types of requests can come through formal processes established by law (e.g., requests made under the U.S. Digital Millennium Copyright Act, the European Right to be Forgotten ruling, etc.) or via self-regulatory arrangements (e.g., company agreements to block certain types of materials or images, such as under the EU’s Code of Practice on Disinformation). Note that this indicator does not treat requests that come through any kind of court or judicial process as private requests; those are considered “government” requests (F5a).
Previous RDR research found that telecommunications companies also have in place processes enabling private entities to submit requests to block websites or URLs. For example, in the 2020 RDR Index, Telenor disclosed a private partnership through which it restricts access to “child pornography.”
This indicator evaluates whether the company clearly discloses how it responds to requests to remove, filter, or restrict content or accounts that come through these types of private processes (Element 1). The company should disclose the basis for complying with these types of requests (Element 2) and whether it conducts due diligence on these requests before deciding how to respond (Element 3). We also expect companies to commit to push back on inappropriate or overbroad requests to remove content or accounts that come through private processes (Element 4), and to publish clear examples that illustrate how the company handles these types of requests (Element 5).
Potential sources:
- Company transparency report
- Company help or support center
- Company blog posts
- Company policy on copyright or intellectual property