New support helps add depth to senior management and raise visibility

RDR is growing! 

Since 2015, four iterations of the RDR Corporate Accountability Index have made a clear impact on some of the world’s most powerful internet, mobile, and telecommunications companies. Many have improved disclosures, policies, and practices affecting users’ freedom of expression and privacy in response to RDR’s findings. Thanks to the generous support of our long-standing funders, RDR is setting clear standards for how tech companies should respect their users’ freedom of expression and privacy. 

Now, with support from three new funders in 2019—Luminate, Mozilla, and Craig Newmark Philanthropies—we are taking our first steps toward scaling up and achieving our strategic priorities. The new funding enables us to develop research publications for a broader range of audiences, deepen our engagement with companies, scale up RDR’s operations, strengthen our senior management, and increase our visibility in U.S. and international media. 

To that end, in September RDR welcomed three new members to the team, including our first deputy director, Jessica Dheere. A former managing editor and the founder and former executive director of SMEX, a leading Middle East–based digital rights organization, Jessica brings more than two decades of experience managing operations and communications in media and digital rights. In addition to overseeing RDR’s day-to-day management, she will lead the implementation of RDR’s strategic priorities, build out the RDR team, and apply her experience as an early innovator in the digital rights field to identify new opportunities to expand and meet the diverse needs of our stakeholders.

In addition, Hailey Choi has joined RDR as communications associate. With experience leading projects for the Digital Impact Alliance at the United Nations and the Center for International Policy, Hailey is the anchor member of our growing communications team. 

Finally, Zak Rogoff, who has been integral to developing RDR’s new draft indicators on targeted advertising business models and algorithmic decision-making—to be published later this month and piloted this fall—has joined RDR as a full-time research analyst. Before turning to research, he campaigned for Access Now, the Free Software Foundation, and Fight for the Future. 

These new additions—and several more to come before the end of the year—position RDR not only for more growth, but also for greater impact. 

If you believe that benchmarking companies on the quality and transparency of their policies is essential to protecting and respecting digital rights, come work with us. RDR currently seeks an Editorial and Communications Manager who can raise our visibility among diverse stakeholders in both U.S. and international media (apply by Oct. 7) and a Company Engagement Lead & Research Analyst who will strengthen companies’ understanding of why and how they are evaluated by the RDR Index, and what specific steps they need to take to better align their policies and practices with human rights standards (apply by Oct. 14).


Governments committed to a free and open internet must do more than sign statements to ensure that the internet supports and sustains human rights for future generations.

This week, world leaders descended on New York City for the UN General Assembly’s annual debate. While issues like climate change, migration, military conflict, and economic inequality have dominated the news, many government delegations also came to talk about the problems and opportunities of globally networked digital technologies.

On Monday, 27 countries issued a joint statement on “Advancing Responsible State Behavior in Cyberspace.” It affirms that an “international rules-based order should guide state behavior in cyberspace,” including a commitment to universal human rights standards:

“We reiterate that human rights apply and must be respected and protected by states online, as well as offline, including when addressing cybersecurity. …As responsible states that uphold the international rules-based order, we recognize our role in safeguarding the benefits of a free, open, and secure cyberspace for future generations.” 

We appreciate this public commitment, but governments must do more than sign statements. Not only must they uphold their own constitutional and treaty obligations to protect and respect human rights, but they must also ensure that the companies providing the infrastructure, content moderation, and other services that more than half the world now relies on do not corrode fundamental human rights, especially the rights to freedom of expression and privacy.

In order for people to exercise their rights and hold power accountable in our digitally networked age, internet users must be able to know who controls their ability to connect, speak online, or access information; and who has the ability to access and share their personal information. Four iterations of the RDR Corporate Accountability Index have underscored how people around the world still lack basic information about how private and government entities exercise power over their digital lives. 

Meanwhile, governments are responding to serious national and public security threats perpetrated through networked communications technologies. Some regulations have improved company disclosures, policies, and practices. Far more have made it harder for companies to meet global human rights standards for transparency, responsible practice, and accountability in relation to freedom of expression and privacy. 

In our 2019 RDR Index report we cite some concrete examples: Many regulatory efforts targeting hate speech and disinformation on social media are pushing companies to over-censor journalism and activism even as they fail to address the harms caused by the abuse of their rules. Government surveillance via internet, mobile, and telecommunications companies is growing less accountable and transparent in much of the world. And while data protection laws in some places are helping to protect users’ rights, the lack of coherent regulation in other parts of the world enables threats to proliferate.

Our analysis of these challenges shows that governments could be doing much more to optimize their legal, regulatory, and policy frameworks both to raise the costs of corporate behavior that infringes on people’s freedom of expression and privacy, and to incentivize greater respect for human rights. While we have included recommendations for governments in previous years, in the 2019 RDR Index we expanded the recommendations for governments in each chapter, and we have now summarized all of them in one reference document to provide more concrete, actionable advice for policymakers. 

Measures and commitments should include: Conducting human rights impact assessments on proposed legislation; maintaining limitations on liability for third-party content; instituting comprehensive privacy law; reforming surveillance law and practices to comply with human rights standards; and protecting the right to encrypt. Most important, just as companies should be subjected to more robust oversight to prevent abuses of users’ rights, governments must commit to enabling independent and credible oversight to prevent abuse of their own censorship and surveillance powers. 

Other recommendations emphasize the importance of accountability and transparency by governments as well as by companies. Specifically, we recommend that companies should be required by law to implement board oversight, systematic internal and external reporting, and impact assessments to identify, evaluate, and mitigate potential human rights harms, including violations of users’ freedom of expression and privacy. As we note in our 2019 RDR Index, such laws are starting to emerge in Europe. 

Government transparency must also be strengthened. As the human rights community pushes companies to be more transparent about what content is removed or restricted, and about who has access to people’s personal data, governments should also publish regular reports revealing the volume, nature, and purpose of requests their agencies and branches make to companies.

Access to remedy is also vital. Some countries already require that companies provide grievance and remedy mechanisms, but where they exist, such laws tend to focus on physical harms or on commercial and service issues rather than on digital rights issues like freedom of expression and privacy. Laws could do much to improve the quality and availability of grievance and remedy mechanisms for internet users.

Finally, global collaboration is essential. Governments committed to advancing a free and open internet that supports and sustains human rights should work proactively with one another, as well as with civil society and the private sector, to establish a positive roadmap for addressing threats to individuals and communities without causing collateral violations of human rights.

The 27 signatories should now hold one another accountable for translating words into action. Only then will they deserve applause from internet users around the world. 


Since the publication of the inaugural Ranking Digital Rights Corporate Accountability Index in 2015, digital rights advocates in non-English-speaking contexts have asked for versions of our materials in their languages. Early on, in 2015 and 2017, we provided Spanish translations, and last year we expanded this effort to make the RDR Index more accessible, adding five more languages: Arabic, Chinese, French, Korean, and Russian. This year, we added German, marking Deutsche Telekom’s inclusion as the first German company in the RDR Index.

This means that four-page summaries of our key findings are now available in the primary languages of the majority of the 24 internet, mobile, and telecommunications companies we rank—17 of which are headquartered outside the U.S. In addition to translating the key findings and charts of the 2019 RDR Index, we also translated the company report cards of 12 of the non-U.S. companies.

Our hope is that these multilingual materials, provided by Global Voices Translation Services, will facilitate access for civil society actors to our unique datasets, benchmarks, analysis, and recommendations and help them engage more directly and effectively with company representatives and government officials.

Making sure that RDR’s work reflects the global reach and geographic distribution of the companies we rank is key to our quest to motivate change that improves corporate respect for internet users’ human rights both locally and worldwide.

You can find all our non-English-language materials on our Translations page, and below, you will find the 2019 translations listed by type and language:

Four-page summary of the key findings of the 2019 RDR Index:

Company report cards from the 2019 RDR Index:

An overview of the methodology is also available in select languages:

If you have any questions regarding the methodology, translations, or company report cards, please contact us at info@rankingdigitalrights.org.


Ranking Digital Rights (RDR) seeks input on our work to expand the RDR Corporate Accountability Index to address human rights harms from companies’ use of algorithms, machine learning, and automated decision-making. We also seek feedback on our work to incorporate services offered by Amazon and Alibaba into the RDR Index ranking. 

In February 2019, we announced plans to develop the RDR Index methodology to address the evolving, increasingly complex human rights threats that internet users face. We also opened public consultations soliciting feedback for our ongoing work to develop new indicators that set accountability and transparency standards for company policies and practices related to targeted advertising. 

This week, we are releasing a set of consultation documents (see below) summarizing RDR’s work aimed at encouraging corporate accountability and transparency regarding the use of algorithms, machine learning, and automated decision-making. We are also releasing consultation documents (see below) summarizing our work to include Amazon and Alibaba, and specifically their e-commerce platforms and digital personal assistants, in the RDR Index.

Stakeholder feedback: We welcome feedback on these documents by September 13, 2019. Feedback from a wide range of experts and stakeholders is essential to developing a methodology that is credible, rigorous, and effective. It will also help to inform further research as well as in-person stakeholder and expert consultations, which in turn will inform the drafting of pilot indicators that will be published and pilot-tested later in 2019. Please send comments and input to: methodology@rankingdigitalrights.org

Algorithms, machine learning, and automated decision-making

The use of automation, for both content curation and data processing, poses a range of human rights risks to internet users, particularly to the right to freedom of expression and information and to the right to privacy. The failure by companies to respect these fundamental human rights also causes or contributes to violations of other human rights, such as the right to non-discrimination. The following materials outline our rationale and approach for developing new indicators addressing these issues:

  1. Rationale: for why and how RDR plans to expand the RDR Index methodology to address algorithms, machine learning, and automated decision-making.
  2. Human rights risk scenarios: a list of “risk scenarios,” each describing human rights harms directly or indirectly related to privacy and expression that can result from companies’ use of algorithms, machine learning, and automated decision-making.
  3. Best practices: a number of best practices for company disclosure and policy that could help prevent or mitigate these risks.

Our goal in developing new indicators that address human rights harms posed by the use of algorithms, machine learning, and automated decision-making is to help set global accountability and transparency standards for how major, publicly traded internet, mobile, and telecommunications companies can demonstrate respect for human rights online as they develop and deploy these new technologies. 

New companies: Amazon and Alibaba

Amazon and Alibaba are two of the world’s largest digital platforms, and their absence from the RDR Index represents a key gap in our current ranking. There have been growing concerns about both companies’ privacy practices and respect for human rights in general, particularly in relation to e-commerce platforms and personal digital assistants (PDAs), which collect enormous amounts of information about users. The following consultation materials summarize our rationale and approach for integrating these companies and services into the RDR Index.

  1. Rationale: for why we are expanding the RDR Index to include Amazon and Alibaba.
  2. Human rights risk scenarios: a list of “risk scenarios” describing privacy and freedom of expression-related risks associated with e-commerce platforms and personal digital assistants.
  3. Best practices: a number of best practices for company disclosure and policy that could help prevent or mitigate these risks.

Our goal in expanding the RDR Index to include Amazon and Alibaba is to apply RDR’s global accountability and transparency standards to two companies that have enormous influence over the rights of people around the world who use their products and services. RDR’s work in this area can inform the work of other stakeholders, including investors conducting due diligence on portfolio risk, policymakers seeking to establish regulatory frameworks to protect the rights of internet users, and advocates looking to encourage these companies to adopt policies and practices to mitigate the human rights harms associated with their services.

Please send feedback to methodology@rankingdigitalrights.org. We look forward to hearing from you. 

To stay informed about our progress and plans, please subscribe to our newsletter here.


On June 4, which coincided with the 30th anniversary of the Tiananmen Square massacre, a user on the Chinese microblogging platform Sina Weibo posted the word “candle” in Chinese. Two hours later, the post disappeared.

The post was yet another attempt by Chinese internet users to outsmart censors that ban references to the massacre that followed the 1989 student-led democracy movement in China. In the days leading to this year’s anniversary, platforms like Weibo, LINE, TOM-Skype, and others actively monitored and removed posts referencing and remembering the massacre.

Chinese companies did the same for coverage of memorial activities taking place in Hong Kong, where thousands of people joined a vigil at the city’s Victoria Park to honor the victims. For example, popular live streaming platform YY updated its list of banned keywords to include references to Hong Kong memorial activities, their locations, and names of groups and advocates organizing them.

These cases of content takedowns by Chinese social media platforms at the behest of the government are but the latest examples of how privately owned internet companies in China are an integral part of the country’s censorship and surveillance regime. Chinese law requires local platforms, as well as foreign companies like Apple and LinkedIn doing business in the country, to proactively monitor and take down objectionable content.

Overall ranking and scores of internet and mobile ecosystem companies.

It is therefore not surprising that China’s largest tech companies, Baidu and Tencent, continued to perform poorly in the 2019 Ranking Digital Rights (RDR) Corporate Accountability Index. The RDR Index evaluates how transparent companies are about their policies and practices affecting human rights, specifically users’ freedom of expression and privacy.

Baidu and Tencent made notable improvements to policies and disclosures that are not directly related to government censorship and surveillance demands, like how they secure user data from breach or theft, and how they handle user information for commercial purposes. They revealed barely anything, however, about their policies and practices that pose the greatest threats to internet freedom and digital rights in China: censorship and government surveillance. Their inability to disclose commitments, policies, or practices related to government demands to take down content or provide access to user information kept Tencent and Baidu near the bottom of the 2019 RDR Index, ranking 10th and 11th respectively among the 12 internet and mobile ecosystem companies evaluated.

Baidu and Tencent were among the companies that improved their overall scores in the 2019 RDR Index.

 

Freedom of expression blackout

China’s cybersecurity law bans internet users from publishing information that damages “national honor,” “disturbs economic or social order,” or is aimed at “overthrowing the socialist system.” Platforms and search engines automatically filter politically sensitive keywords such as “human rights” and “Tiananmen Square.” They are also required to comply with an ever-evolving list of censorship requests from authorities, driven by current events and hot topics on social media.

For example, censors last year banned phrases like “anti-sexual harassment” in an effort to prevent the #MeToo movement from spreading to China. According to Wechatscope, a research initiative that monitors censorship on the Tencent-owned messaging and social media app WeChat, allegations of sexual harassment and sexual misconduct were among the most heavily censored topics on the service in 2018.

Chinese internet companies that fail to comply with regulations risk fines or even revocation of their business license, prompting them to invest substantial financial and human resources to keep objectionable content off of their sites.

In September 2017, the Cyberspace Administration penalized Baidu, Tencent, and Weibo with maximum fines under the country’s cybersecurity laws for failing to detect and take down banned content, including “pornography” and “false rumors.” A month later, Weibo hired 1,000 additional content moderators to monitor and remove “pornographic, illegal and harmful content.”

These companies are also increasingly deploying artificial intelligence technologies to help moderators monitor objectionable content.

The Freedom of Expression category of the RDR Index applies 11 indicators to evaluate how transparent companies are about their rules and how they are enforced, how they deal with government demands to censor content, and how they respond to government orders to shut down access to the internet or to certain services or applications. Baidu and Tencent performed poorly in this category.

The government’s constant crackdown on freedom of expression, through censorship demands and draconian laws, prevents companies from being transparent about how they moderate content on their platforms and how they respond to the Chinese government’s censorship orders. In the Freedom of Expression category of the RDR Index, Baidu and Tencent received the two lowest scores of all internet and mobile ecosystem companies, disclosing hardly anything about these policies. Both companies revealed limited information about what types of content and activities are prohibited on their services (F3), but they disclosed nothing about how they respond to government censorship demands (F5). They also did not commit to notifying users when restricting their access to content or accounts (F8).

Privacy progress remains inadequate

In the Privacy category, both Baidu and Tencent made improvements mainly on indicators related to how they handle user information and their security policies.

The Privacy category of the RDR Index applies 18 indicators to evaluate how transparent companies are about policies and practices affecting users’ privacy and security, including how clearly they disclose what types of user information they collect and share, with whom, and why.

Improvements made by Baidu included disclosing more detailed information about the types of user information it shares, with whom, and why (P4, P5). The company also disclosed more about its security policies, including limits on employees’ access to user data (P13), its process for responding to data breaches (P15), and its use of encryption technologies (P16).

These positive changes appear to have been influenced by new data protection guidelines, the Personal Information Security Specification, issued by TC260, China’s national information security standards-setting body. The specification clarifies the definition of personal information and sets guidelines for how organizations should handle it, including the collection, retention, use, sharing, and transfer of personal data.

However, this progress remains inadequate to safeguard Chinese users’ privacy from government surveillance in a regime where political dissent can be defined as a crime and where ethnic Muslims who have not been convicted of any crime are held in internment camps against their will.

China’s cybersecurity law requires internet companies to collect and verify users’ identities whenever they use major websites or services and to “provide technical support and assistance” to security agencies in their criminal investigations. Internet companies are also required to keep user activity logs and relevant data for six months and to hand them over to the authorities when requested, without due process.

Authorities also have direct access to user data and communications. Internet users have been arrested for the content of private conversations. WeChat has come under considerable scrutiny from activists and dissidents who believe their accounts and conversations are monitored, which the company denies. In April 2018, the internet policing department in Zhejiang Province ordered an investigation of an individual who criticized President Xi Jinping in a WeChat group with only eight members. A leaked police directive identified the user, who posted under a pseudonym, by real name, phone number, ID number, and location. In 2017, several WeChat users were arrested after making politically sensitive jokes in a private chat room.

Laws giving the Chinese government direct access to user communications prevent Baidu and Tencent from being transparent about how they handle government requests to hand over user data. Neither company published any information about how it responds to third-party requests for user data (P10), and neither revealed any data about such requests (P11). Both also failed to disclose any commitment to notify users about requests to access their data (P12). Baidu, however, did disclose the circumstances under which it may not notify users of requests for their information.

Opportunities for further improvement

The Chinese censorship and surveillance regime requires internet companies to play a proactive role in monitoring and removing objectionable content and in surveilling users. Companies that fail to comply with government orders and regulations risk fines and even closure. As a result, it is unrealistic to expect Chinese companies to commit to challenging government demands to censor content or hand over user data, or to be very transparent about such demands. In fact, China’s state security laws prevent the disclosure of information related to national security and criminal investigations. However, even in the absence of regulatory changes, both Baidu and Tencent can take immediate steps to improve their disclosure of policies and practices affecting users’ freedom of expression and privacy.

Specifically, both companies could:

  • Increase transparency about private requests: Both companies should improve their disclosure of how they respond to private requests to restrict content or accounts, and to private requests for user information.
  • Give users more control over their information: Tencent and Baidu should provide users with more options to access and control their own information.
  • Improve transparency about the handling of user data for commercial purposes: The two companies could further clarify their policies for collecting, sharing, and retaining user information.