6. Questions for investors

The RDR Index methodology provides a clear standard for investors to use in evaluating company respect for users’ digital rights.[114] How comprehensive are companies’ efforts to mitigate risks to their business? How clearly do they show that they are working to anticipate and reduce potential privacy or freedom of expression risks faced by those who use their technologies, platforms, and services?

Shareholder value is put at risk not only by security breaches, but also when companies fail to identify and mitigate broader risks to user privacy across their business operations. Companies also face risks when they fail to anticipate and address content-related issues, ranging from incitement to violence and targeted disinformation campaigns, to government censorship and network shutdowns.

Over the past two years, many government regulatory initiatives have emerged quickly in response to breaches, scandals, and tragedies. Even a year ago, analysts and pundits did not anticipate the current momentum in the United States for national privacy regulation. In response to recent terror attacks and continued concerns about cross-border disinformation campaigns during sensitive election periods, efforts to regulate information flows through telecommunications networks and content appearing on internet platforms are also proliferating in a range of countries. Companies that focus merely on compliance with existing and widely anticipated regulations are clearly not doing enough to protect themselves from long-term regulatory risk.

The RDR Index indicators offer companies a concrete standard not only for meeting their normative responsibility to respect human rights, but also for moving beyond compliance and getting ahead of regulatory risks. Companies in the sector that build their policies and practices around transparency, accountability, and respect for users’ human rights will be better positioned to identify and mitigate harms to individuals and communities that regulators will eventually be compelled to address. By the time regulatory intervention becomes necessary to address a problem, that problem has usually become entrenched and widespread, making compliance far more costly.

The following 12 categories of questions are offered as guidance for investor due diligence on whether companies are making adequate efforts to respect users’ rights and thereby mitigate individual harms and broader business risks. These questions are also a useful starting point for investor engagement with companies, particularly when combined with key findings and recommendations from the individual company report cards.

1. Oversight: Does the board of directors exercise direct oversight over risks related to user security, privacy, and freedom of expression? Does board membership include people with expertise and experience on issues related to digital rights? (Indicator G2)

2. Risk assessment: Has the company’s management identified digital rights risks that are material to its business, or risks that may become material in the future? Does the company carry out human rights impact assessments on the full range of ways that its products and services may affect users’ human rights, including risks associated with the deployment of algorithms and machine learning? Does it disclose any information about whether and how the results of these assessments are used? Are the assessments assured by an independent third party? (Indicator G4)

3. Business model: Does the company evaluate and disclose risks to users’ human rights that may result from its business model, particularly targeted advertising? Does it evaluate tradeoffs being made between profit and risk, such as sharing of user data with commercial partners versus strong data controls? (Indicator G4)

4. Stakeholder engagement and accountability: Is the company a member of the Global Network Initiative (GNI), and if not, why not? Does it engage with vulnerable communities when developing and conducting its risk assessment processes, developing and enforcing terms of service, and developing and implementing grievance and remedy mechanisms? (Indicator G5)

5. Grievance and remedy: Does the company disclose accessible and meaningful mechanisms for users to file grievances and obtain remedy when their freedom of expression or privacy rights are infringed in relation to the company’s product or service? (Indicator G6)

6. Transparency about data collection and use: Regardless of whether a company claims to be compliant with relevant law, does it disclose clear information about its policies and practices regarding collection, use, sharing, and retention of information that could be used to identify, profile, or track its users? (Indicators P1-P12)

7. Transparency about handling of government demands and other third-party requests affecting users’ freedom of expression and privacy rights: Does the company disclose policies for how it handles all types of third-party requests to provide access to user data, restrict content, restrict access, or shut down service? (Indicators F5-F7, and P10-P12)

8. Publication of transparency data: Does the company publish regular data about the volume and nature of the requests it receives, and responds to, for sharing user data, restricting content or accounts, or shutting down networks? Does it also publish data about the volume and nature of content and accounts restricted in the course of enforcing its own terms of service? (Indicators F6, F7, and P11)

9. Evidence of strong policies for addressing security vulnerabilities: Does the company disclose clear information about policies for addressing security vulnerabilities, including the company’s practices for relaying security updates to mobile phones? (Indicator P14)

10. Encryption: Does the company commit to implement the highest encryption standards available for the particular product or service? If not, why not? (Indicator P16)

11. Mobile security: Do companies that operate mobile ecosystems disclose clear policies about privacy and security requirements for third-party apps? (Indicators P1-P8)

12. Telecommunications transparency about network management: Do telecommunications companies disclose whether they prioritize, block, or delay applications, protocols, or content for reasons beyond assuring quality of service and reliability of the network? If yes, do they disclose the purpose for doing so? (Indicator F9)

To view each company’s “report card”:
rankingdigitalrights.org/index2019/companies

Footnotes

[114] See the 2019 RDR Index methodology at: rankingdigitalrights.org/2019-indicators