Work with RDR as an OTF Information Controls Fellow!

Ranking Digital Rights is an official host organization for the Open Technology Fund’s 2017 Information Controls Fellowship, which invites applicants from a broad range of specializations and approaches to propose projects that would help “increase understanding of tactics used by repressive governments to censor and surveil the internet and mechanisms to overcome them.”

As a fellowship host, RDR welcomes applications from graduate students and seasoned researchers with backgrounds in computer science, engineering, internet and telecommunications law, or communications studies with a strong technical grounding. Projects should be directly related to RDR’s core mission, which is to evaluate and benchmark ICT sector companies on their respect for freedom of expression and privacy. At the same time, fellows should help address new research questions, methodological problems, or advocacy opportunities not presently covered by the work of RDR’s core full-time staff. For example:

  • Carry out a research project designed to help us identify, develop, and test changes to the Index research methodology to accommodate new types of companies or technologies;
  • Carry out a research project to determine how the Index research methodology should be adapted to companies operating in a single country or region;
  • Support a regional research partner in developing and piloting a national or regional version of the Index;
  • Develop a research or technical testing project, using the Index data and findings as a starting point, to examine the impact of specific company policies on particular user communities, or to verify whether companies’ actual practices are consistent with their disclosed policies. Ideally, the project would serve as a proof of concept for other researchers to emulate or expand upon;
  • Work with NGO partners in a particular country or region to develop advocacy strategies using Index data;
  • Develop new ways of sharing and using the Index data for advocacy through visualizations and other online tools.

Prior academic research or professional work related to freedom of expression, censorship, privacy, and surveillance in the ICT sector is important, as is experience working collaboratively with teams and meeting deadlines. International experience and the ability to read at least one language other than English are a major plus.

The successful applicant will demonstrate a thorough understanding of the Ranking Digital Rights Corporate Accountability Index: its purpose, research methodology, advocacy goals, etc., and will clearly articulate how his or her skills and interests can concretely build upon the project’s methodology, research community, and outputs. We are a global project, and proposals to work remotely for all or part of the fellowship are welcome when accompanied by evidence that the applicant has prior experience working remotely with people in other countries. Applicants should include with their CV a list of references who can attest to their experience and track record.

Click here to learn more about the fellowship (including start dates, fellowship length, fellowship types, stipend, etc.) and to apply through the OTF website.

New, global accountability mechanisms needed for a free and open internet

As governments around the world adopt internet regulations that clash with international human rights norms, new and more innovative mechanisms are needed to hold tech companies accountable to these standards, according to a new paper by Ranking Digital Rights (RDR) team members published by the Centre for International Governance Innovation (CIGI).

In the paper, “Corporate Accountability for a Free and Open Internet,” authors Rebecca MacKinnon, Nathalie Marechal, and Priya Kumar make the case for how global human rights benchmarking and evaluation projects like RDR’s Corporate Accountability Index help fill “governance gaps” caused by the failure of traditional governance institutions to hold governments and companies accountable for protecting and respecting the rights of internet users around the world.     

“Private Internet intermediaries increasingly find themselves at odds with governments, with serious implications for human rights,” according to the authors. “Even where law does not compel companies to violate users’ rights, companies generally lack sufficient market and regulatory incentives to protect the human rights of all of their users.”

The authors therefore call for new cross-border accountability initiatives outside existing governance institutions that will strengthen and enforce corporate accountability in upholding international freedom of expression and privacy standards: “If international legal and treaty frameworks cannot adequately protect human rights, then other types of governance and accountability mechanisms are urgently needed to provide incentives to owners and operators of Internet platforms and services to respect human rights,” according to the authors.  

Ranking Digital Rights is one of several efforts that might serve as building blocks for such mechanisms and institutions, according to the authors. The inaugural Index, published in November 2015, ranked Internet and telecommunications companies on 31 indicators evaluating disclosed commitments, policies and practices affecting Internet users’ freedom of expression and right to privacy. These types of rankings, when combined with transparency and disclosure frameworks, can help foster greater accountability as well as respect for international human rights standards.

Why companies fail on privacy policies

Why are privacy policies so difficult to understand? Because they are vague and unclear, which prevents users from understanding what companies do with their information, according to new research by former Ranking Digital Rights (RDR) research analyst Priya Kumar.

In November 2016, Kumar presented a paper using data from RDR’s 2015 Corporate Accountability Index, in which she analyzed the privacy policies of 16 of the world’s largest tech companies evaluated in that year’s Index. Her research shows that these companies typically fail to convey to users what happens to their information, from the point it is collected to when it is (possibly) deleted. Kumar finds that along with vague or unclear language, the lack of uniform definitions for what companies consider “personal information” makes it difficult for users to get a complete and accurate picture of how companies handle their information.

The analysis also shows that companies are more transparent about the information they collect than about the information they share, and that companies are least transparent about what user information they retain, even after a user deletes their account or service. “People would expect a company to keep information they actively submit to the service (e.g., posts, messages, photos, videos, etc.), until they delete it themselves,” according to Kumar. “But companies collect several other types of user information, and they typically fail to disclose how long they retain those types of information.”

The paper was presented as part of the Privacy and Language Technologies track of the Association for the Advancement of Artificial Intelligence’s (AAAI) Fall Symposium Series held in Virginia. Click the link for a PDF of the paper: Privacy Policies and Their Lack of Clear Disclosure Regarding the Life Cycle of User Information

RDR @ the 2016 IGF

Last week, Ranking Digital Rights traveled to Guadalajara, Mexico for the 11th Internet Governance Forum. The theme this year was “Enabling Inclusive and Sustainable Growth.” In all of the workshops and panels we participated in, our message focused on a central concern: as the next billion people get connected to the internet, their human rights need to be protected and respected by governments and companies. We believe that our Corporate Accountability Index produces data and analysis that can help governments, businesses, and civil society work together to address the concrete challenges in protecting, respecting, and defending human rights in the digital age.

Many of the official IGF sessions and side meetings provoked thoughtful discussion and provided us with ample opportunity to share insights from our work researching and analyzing Internet and telecommunications companies’ human rights-related public disclosures.

Some of the issues we highlighted included:

It is crucial that people have control over how their identities are presented online, as Rebecca MacKinnon noted in the session Human Rights: Broadening the Conversation. Real identity policies are pernicious, particularly for gender minorities and members of marginalized groups, and companies should bear this in mind when determining the choices they provide users.

We stressed the importance of companies disclosing information relating to government requests for user data – both their processes for responding to these requests, including whether they push back against inappropriate or overly broad requests, and data about the number of government requests received and the number with which they complied. This issue was also highlighted by civil society activists from Mexico, who noted that Mexican authorities often obtain user information without oversight or judicial warrants.

We also pointed out that governments have an important role to play in ensuring that companies adequately respect human rights. In some instances, regulatory ambiguity can leave companies unsure whether the law prohibits disclosure on certain issues, and they therefore refrain from publishing information about their policies or practices. We’ve also seen in our research that in countries that haven’t passed data protection laws, companies tend not to adhere to best practices for the collection of user information.

Combatting online violent extremism was a recurring topic of discussion, particularly in light of the recent announcement that Google, Facebook, Microsoft, and Twitter planned to create “a shared industry database of ‘hashes’ — unique digital ‘fingerprints’ — for violent terrorist imagery or terrorist recruitment videos or images that [they] have removed from [their] services.” We again stressed the need for transparency and accountability in such a system, and for independent review of how images are included in it, as we’ve found that companies’ disclosure around their Terms of Service enforcement is often lacking. Companies have been under enormous pressure from governments to do something about this issue. At the same time, any new measures taken to facilitate the removal of content need to be carried out in a manner that is responsible, accountable, and respects users’ rights. It is vital that companies work closely with civil society to make sure that implementation of the new database system does not inflict new “collateral damage” on freedom of expression.
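
To illustrate the general mechanism being described here, the sketch below shows, in simplified form, how a shared hash database lets one service flag content that another service has already removed. This is an illustration only, not the companies’ actual system: the announced database reportedly relies on perceptual “fingerprints” that tolerate re-encoding and cropping, whereas this sketch uses an exact SHA-256 hash purely to keep the example self-contained.

```python
# Simplified illustration of matching uploads against a shared database of
# removed content. Real industry systems reportedly use perceptual
# fingerprints (robust to re-encoding and cropping); an exact SHA-256 hash
# is used here only as a stand-in.
import hashlib

shared_hash_db = set()  # fingerprints contributed by participating companies


def fingerprint(content: bytes) -> str:
    """Stand-in for a content fingerprint; here simply an exact hash."""
    return hashlib.sha256(content).hexdigest()


def register_removed_content(content: bytes) -> None:
    """A company adds the fingerprint of content it has removed to the shared set."""
    shared_hash_db.add(fingerprint(content))


def matches_known_removed(content: bytes) -> bool:
    """Another company checks a new upload against the shared set."""
    return fingerprint(content) in shared_hash_db


register_removed_content(b"bytes of a removed video")
print(matches_known_removed(b"bytes of a removed video"))    # True
print(matches_known_removed(b"bytes of unrelated content"))  # False
```

The transparency and accountability questions we raised apply to exactly these steps: who decides what gets added to the shared set, and what independent review exists when a match leads to removal.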

Although many of the human rights trends highlighted by civil society at the conference were negative, as censorship and surveillance are on the rise around the world, there is also some cause for optimism. Many ICT companies, particularly those that are members of the Global Network Initiative, are making commitments to respect human rights throughout their operations and carrying out due diligence to ensure these commitments are upheld. This may include instituting board-level oversight on privacy and free expression matters, creating mechanisms for grievance and remedy, and conducting human rights impact assessments. These practices, among others, are evaluated in the “Governance” section of our 2017 methodology (previously referred to as “Commitment” in the 2015 Index). As MacKinnon concluded in the session Implementing Human Rights Standards to the ICT Sector, “Despite all of our complaints, which are many and justified, I think things would be a lot worse if we hadn’t had this system where companies are being held accountable to whether or not they are implementing their [human rights] commitments and whether or not they have a system in place.”

In addition to the official IGF sessions, we met with representatives from civil society, governments, the private sector, academia, and others to discuss a wide range of issues. Several of RDR’s research partners also attended the IGF, and we had a productive meeting with them to share and receive feedback on our ongoing research process and also begin brainstorming plans around the 2017 Index launch this March.

We’re looking forward to continuing many of these conversations in the new year and in the lead-up to our 2017 Corporate Accountability Index, which will launch ahead of RightsCon in late March. Stay tuned!

#KeepItOn: Corporate Accountability for Network Shutdowns

Internet shutdowns are bad for human rights – as this YouTube video by RDR advocacy partner Access Now clearly illustrates, and as the UN Human Rights Council asserted in a landmark resolution this past summer. Shutdowns are also bad for business. A recent paper by the Brookings Institution found that between July 2015 and June 2016, 81 short-term shutdowns of the internet by 19 countries cost the global economy over $2.6 billion in GDP.

For both reasons, the UK-based investor advocacy group ShareAction and Access Now recently co-published an Investor Brief explaining why investors should be concerned, and suggesting questions they should be asking of the telecommunications companies in whose stock they invest. Last month ShareAction and UNPRI (Principles for Responsible Investment) hosted an investor briefing event in their London offices. RDR was asked to present at the meeting alongside Access Now and the Global Network Initiative, whose members have also been speaking out against the harms of network shutdowns. The Investor Brief cites RDR as a useful tool for investors in evaluating companies’ performance on digital rights including network shutdowns, and notes which companies that performed poorly in RDR’s 2015 Index have also been connected to internet shutdowns.

While our 2015 Index methodology did not have a dedicated indicator focusing exclusively on network shutdowns, specific elements within several of the 2015 “freedom of expression” indicators examined company policies and practices in relation to network shutdowns. These included F4: Reasons for account or service restriction; F5: Notify users of restriction; F6: Process for responding to third-party requests, which includes requests to restrict or shut down networks; and F7: Data about government requests, which includes data about requests to shut down networks. Other indicators in the Commitment section also looked for due diligence and accountability policies and mechanisms that would have an impact on how companies handle government demands to shut down networks.

For the 2017 Index, in response to the growing problem of network shutdowns and the need to highlight company policy and practice in relation to them, we have consolidated these elements into a single indicator, F10: Network shutdowns, which states:

The company should clearly explain the circumstances under which it may shut down or restrict access to the network or to specific protocols, services, or applications on the network.

To evaluate telecommunications companies on this indicator, we assess their disclosures against eight “element” questions (a simplified scoring sketch follows the list):

  1. Does the company clearly explain the reason(s) why it may shut down service to a particular area or group of users?
  2. Does the company clearly explain why it may restrict access to specific applications or protocols (e.g., VoIP, messaging) in a particular area or to a specific group of users?
  3. Does the company clearly explain its process for responding to requests to shut down a network or restrict access to a service?
  4. Does the company commit to push back on requests to shut down a network or restrict access to a service?
  5. Does the company clearly disclose that it notifies users directly when it shuts down the network or restricts access to a service?
  6. Does the company list the number of network shutdown requests it receives?
  7. Does the company clearly identify the specific legal authority that makes the request?
  8. Does the company list the number of requests with which it complied?
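
To make the structure of this evaluation concrete, here is a minimal illustrative sketch of how a company’s disclosure could be recorded against the eight elements and rolled up into an indicator score. The full/partial/none point values and the simple averaging are assumptions made for illustration, not a statement of RDR’s exact scoring rules.

```python
# Illustrative sketch only: recording a company's disclosure against the eight
# F10 elements and averaging them into an indicator score. The 100/50/0
# full/partial/none scheme is an assumption for illustration, not necessarily
# RDR's exact scoring rules.

F10_ELEMENTS = [
    "Explains reasons it may shut down service to an area or group of users",
    "Explains why it may restrict access to specific applications or protocols",
    "Explains its process for responding to shutdown or restriction requests",
    "Commits to push back on shutdown or restriction requests",
    "Discloses that it notifies users directly of shutdowns or restrictions",
    "Lists the number of shutdown requests it receives",
    "Identifies the legal authority that makes the requests",
    "Lists the number of requests with which it complied",
]

SCORES = {"full": 100, "partial": 50, "none": 0}


def indicator_score(element_ratings):
    """Average the per-element scores for one company on indicator F10.

    element_ratings: a list of "full"/"partial"/"none", one entry per element.
    """
    if len(element_ratings) != len(F10_ELEMENTS):
        raise ValueError("expected one rating per element")
    return sum(SCORES[r] for r in element_ratings) / len(element_ratings)


# Hypothetical telecom: explains its reasons and process, partially discloses
# its restriction policy, but publishes no data about requests.
ratings = ["full", "partial", "full", "none", "none", "none", "none", "none"]
print(indicator_score(ratings))  # 31.25
```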

Stay tuned for the launch of the 2017 Corporate Accountability Index in March 2017 to find out which companies do best and worst on this indicator.