Archives for February 2016

RDR @ the 2016 Internet Freedom Festival

Internet Freedom Festival. Come celebrate the free internet with us! 1-6 March 2016, Valencia, Spain

Ranking Digital Rights is organizing a full day of sessions on Saturday, March 5 as part of the Internet Freedom Festival held at Las Naves in Valencia, Spain. The full schedule is available here.

If you want to learn more about how NGOs are encouraging ICT companies to respect human rights, come to our first session, “Holding Companies to Account: Advocating for Corporate Respect for Human Rights” at 10am in the Auditorium. Allon Bar and Nathalie Maréchal will join Jillian York and Sarah Myers West to compare the experiences of our two projects and discuss methods to push tech companies to better respect human rights. The session will be structured as a conversation between members of each project, followed by a Q&A with the community.

Interested in ranking technology companies in your country? If so, come to Taller 2 (workshop 2) for “Ranking ICT companies on digital rights: A ‘how to’ guide” from 11am to 1pm on Saturday. Led by Nathalie and Allon, this interactive workshop will guide participants through the initial steps of launching a ranking similar to RDR’s Index, but on the national or local level. Interested participants are encouraged to RSVP to Nathalie (marechal [at] rankingdigitalrights [dot] org).

Do you have ideas to share? Come to Taller 2 on Saturday from 3 to 5 pm for “Ranking tech companies part 2: software, devices and networking equipment.” We are hard at work revising the methodology for the next iteration of the Index, and we need your input! In this session we invite privacy and freedom of expression experts, technical specialists, and other participants to discuss how best to incorporate companies that make and sell software, devices, and networking equipment into RDR’s methodology.

Ranking such companies brings challenges, such as ensuring that indicators are comparable across diverse product ranges, comprehending dense company documents, and dealing with the fact that these types of companies may have more limited public disclosure. At the same time, it is clear that people who use these products may suffer because of how the products are configured and what operational decisions companies make. Devices and software may have access to location data or biometric information about their users; they may restrict certain types of web visits, encrypt device storage, and so on. These features affect users’ rights to freedom of expression and privacy, which makes it especially important to devise an approach to benchmark software producers and device and network equipment manufacturers.

Some of the specific questions we’d like to brainstorm about include:

  • what specific products should be included?
  • what indicators of the 2015 Corporate Accountability Index can be used directly for these other types of companies?
  • what indicators should be adapted?
  • what indicators should be added?

This session is focused on ensuring that privacy and free expression issues of concern to attendees can be incorporated into the Index. Here again, we’d appreciate it if interested attendees could RSVP to Nathalie (marechal [at] rankingdigitalrights [dot] org).

At least part of the team will be present for the entire Festival and we’d love to connect with you, so please reach out!

Fellowship Opportunities with RDR

Ranking Digital Rights is pleased to announce two fellowship opportunities for 2016:

Information Controls Fellowship Program

COMPASS Summer Fellows Program

Ranking Digital Rights will also consider fellows sponsored through other funding schemes. All prospective fellows are encouraged to contact Nathalie Maréchal with any questions not covered by this web page or by the links provided.

Ideal candidates have the following qualifications:
  • Graduate student or seasoned researcher with background in computer science, engineering, Internet and telecommunications law, communication studies, or other relevant field
  • Strong technical background
  • Prior academic research experience or professional work related to freedom of expression, censorship, privacy, and surveillance in the ICT sector
  • Prior experience working collaboratively with teams and meeting deadlines
  • International experience and ability to read at least one language other than English is a major plus

Information Controls Fellowship Program

The Open Technology Fund’s Information Controls Fellowship Program (ICFP) cultivates research, outputs, and creative collaboration at different levels and across institutions on the topic of information controls – specifically repressive Internet censorship and surveillance. Applicants develop their own research plan, which must adhere to the parameters described on the program website.

Senior Fellowship
Start date: Flexible; Summer or Fall 2016
Duration: 6 months or 12 months
Eligibility: Graduate students and seasoned researchers with backgrounds in computer science, engineering, Internet and telecommunications law, or communications studies with a strong technical background
Application deadline: March 25, 2016
Stipend: $4,200 per month, plus travel allowance ($2,500 for 6 months, $5,000 for 12 months)

A senior fellow might focus their work in one of several ways:

  • Support a regional research partner in developing and launching a national or regional version of the Index;
  • Work with NGOs or research partners to develop related research and/or technical testing projects, using the Index data as a starting point, or the fellow might develop such a project themselves as proof of concept for other researchers to emulate or expand upon;
  • Carry out a year-long research and pilot testing project that would produce a proposal for how we might modify the Index methodology to add new technologies and/or company types.

Seasonal Fellowship
Start date: Flexible; Summer or Fall 2016
Duration: three months
Eligibility: Graduate students and seasoned researchers with backgrounds in computer science, engineering, Internet and telecommunications law, or communications studies with a strong technical background.
Application deadline: March 25, 2016
Stipend: $2,500 per month

A seasonal fellow might focus their work in one of several ways:

  • Carry out a research project designed to help us to identify, develop, and/or test out changes to the Index research methodology to accommodate new types of companies or technologies.
  • Specific responsibilities to be assigned based on the fellow’s skills and interests, and on the needs of the project.

For more information, see the Information Controls Fellowship Program website.

COMPASS Summer Fellows Program

Start date: June 2016
Duration: Eight weeks
Eligibility: Must be a PhD student in Communication at a participating university
Application deadline: Varies by school
Stipend: Varies by school

The Consortium on Media Policy Studies (COMPASS) supports eight-week fellowships for PhD students in Communication Studies from participating universities, providing them with valuable experience and insight into the world of media policy-making in Washington, DC.

Fellows must first apply for funding through their sponsoring university before applying for placement with Ranking Digital Rights. They commit to eight weeks of full-time work at Ranking Digital Rights’ Washington, DC offices, and to participating in other activities organized by the COMPASS program.

The 2016 COMPASS Fellow will primarily conduct research for the 2017 Corporate Accountability Index. Other responsibilities to be assigned based on the fellow’s skills and interests, and on the needs of the project.

For more information, see the COMPASS program website.

RDR’s Comments to the UN on the ICT Sector’s Role in Free Expression Online

RDR recently submitted comments to a project looking at the role of companies in promoting freedom of expression led by the U.N. Special Rapporteur on Freedom of Opinion and Expression.

The Special Rapporteur’s study aims to identify the main actors in the information and communications technology (ICT) sector that affect freedom of expression, the legal issues at play, and the frameworks for corporate responsibility that exist in this space. RDR’s submission highlights that while companies face many legal and regulatory obstacles to fully disclosing information about their impact on freedom of expression, even companies that operate in restrictive environments can take steps to improve their respect for freedom of expression.

Broadly speaking, three types of company actions can directly restrict or otherwise affect freedom of expression:

  • Actions resulting from requests made by governments, or other government requirements;
  • Actions resulting from requests made by private parties for legal, commercial, or other reasons, or other private-party requirements;
  • Actions taken by companies on their own initiative when setting and enforcing private terms of service, making design and engineering choices, or carrying out commercial and business decisions.

In many countries, law, policy, or regulation can limit companies’ ability to disclose information about these types of actions. For example, “transparency reporting,” or the disclosure of data related to the volume and nature of requests, is becoming a standard practice. Six of the 16 companies ranked in RDR’s Corporate Accountability Index published some type of transparency report related to freedom of expression concerns. (A seventh company released its inaugural report of this type shortly after RDR finalized the Index data.)

This reporting varies in clarity and granularity, but in some countries, companies are legally barred from disclosure. For example, Chinese laws on state secrets and national security prohibit disclosure of information on government requests to restrict content, and Indian law prevents companies from disclosing information about specific requests (though this does not preclude reporting of aggregate data).

In other cases, ambiguity in the law leaves companies unsure of what they can and cannot publish. For example, the Malaysian Official Secrets Act 1972 may prevent companies from disclosing some information about government requests, although according to local legal experts consulted during RDR’s Index research, it would be unrealistic to conclude that this law affects every restriction request that companies receive.

RDR’s Index and its prior work on the role of Internet intermediaries demonstrate that while legal and policy environments significantly influence ICT companies, such companies can nevertheless take steps toward respecting freedom of expression, regardless of where they operate.

Companies should clearly commit to respect human rights, including freedom of expression. They should consider their effect on freedom of expression as part of their corporate governance mechanisms and conduct due diligence to understand how their business decisions affect freedom of expression. RDR’s Corporate Accountability Index found that while a number of companies take such steps with regard to privacy, similar oversight of freedom of expression is lacking. For example:

  • Oversight: Researchers examining the Korean company Kakao—which performed competitively in the Index overall—found clear disclosures of executive and management oversight on privacy issues, but they did not find similar evidence of oversight on freedom of expression.
  • Employee training: Among the companies that disclose information about employee training on freedom of expression and/or privacy, Kakao’s public materials mention only privacy-related training. At AT&T (USA) and Vodafone (UK), training programs focused on privacy issues appeared to be more common than trainings covering freedom of expression.
  • Whistleblower programs: Twitter (USA), Bharti Airtel (India) and América Móvil (Mexico) maintain employee whistleblower programs that clearly cover privacy issues, but there is no evidence that these companies’ programs also cover freedom of expression.
  • Due diligence: Impact assessment and related human rights due diligence processes carried out by Vodafone appeared to be more thorough for privacy than for freedom of expression.

RDR’s full comments are available here.

The Special Rapporteur’s study will be presented to the U.N. Human Rights Council in June. The report as well as submissions from stakeholders will be publicly available on the website of the U.N.’s Office of the High Commissioner for Human Rights.

RDR in Slate’s Future Tense blog

The following story, written by Priya Kumar, originally appeared in Slate’s Future Tense blog.

When Was the Last Time You Read a Privacy Policy?

Tech companies know that everyone skips the fine print. It’s time for them to change approaches.

At one point last fall, I had 16 tabs open on my Web browser, each displaying the privacy policy of a global Internet or telecommunications company. While conducting research for the Ranking Digital Rights Corporate Accountability Index, I read and re-read each policy, trying to figure out what companies said they did with the vast stores of user information that lived on their servers.

The end result? While the policies were lengthy, they frequently glossed over the amount of information being collected and the ways in which it is used. Sure, Google and Facebook each mentioned in their policies that they collected user information through third-party trackers. But that in no way reflects the scope of the third-party trackers: According to UC–Berkeley researcher Ibrahim Altaweel and his team, Google’s tracking mechanisms cover 85 percent of the most popular websites, and Facebook’s tracking reaches 55 percent of the most popular websites, giving both companies extraordinary visibility into the Web browsing habits of millions of people. So even those who actually read company policies don’t get a full picture of what’s going on, making it difficult to protect their personal information.

Altaweel’s was one of many findings discussed at the Federal Trade Commission’s PrivacyCon on Jan. 14 and the Future of Privacy Forum’s Privacy Papers for Policymakers event the day before. Both events highlighted the disconnect between people’s expectations and corporate practices related to the collection and use of personal information, echoing the findings of the Corporate Accountability Index.

Researchers challenged the “notice-and-choice” approach that drives privacy protection in the United States. They emphasized that yes, corporate practices related to user information need additional transparency—but simply understanding what companies do isn’t enough if people lack meaningful alternatives or mechanisms for recourse.

Historically, the United States has approached privacy as a commercial matter while countries in Europe and elsewhere typically view privacy as a fundamental right of their citizens. Consequently, other countries tend to have data protection laws that companies must follow, while American companies protect people’s privacy by providing notification of the company’s privacy practices (like the privacy policy) and giving people a choice about whether to give their data to the company (like the “I agree” button everyone clicks without thinking).

The problems of this notice-and-choice approach are evident to anyone who uses the Internet: No one reads privacy policies, and no one really feels like he or she is exercising a choice when clicking “I agree.” Furthermore, privacy policies are written in a way that satisfies regulators, not regular people. They lack clear explanations of company practices, and companies can change the policies at any time.

As usage of mobile apps, wearable technology, and smart devices for inside and outside the home continues to rise, the notice-and-choice model will become even more obsolete given that user information flows almost constantly between such devices and company servers. Consider that UC–Berkeley researcher Serge Egelman and his team found that applications on an Android phone issue nearly 100,000 requests to access a person’s sensitive data per day. No one wants to receive a notification for each of those requests. Yet 80 percent of the participants in Egelman’s study would have denied at least one of those permission requests, had they been given the opportunity to do so.

The argument that individuals rationally weigh the pros and cons of giving their data to companies doesn’t reflect reality considering the vast amount of data users generate and the number of companies that access that data. Joseph Turow, a professor at the University of Pennsylvania, calls this the “tradeoff fallacy.” He and his team found that more than half of Americans want control over their personal information, but they feel powerless to exert that control.

Instead, people use other shortcuts when deciding whether to interact with a given product or service online. Carnegie Mellon University researcher Ashwini Rao and her team found that people use websites based on their expectations of what the companies do with user information, rather than what the privacy policies say companies actually do. For example, users expect a banking website to collect financial information, not health information. Yet Rao and her team found that Bank of America’s policy says it collects and shares health information from its registered users, meaning people could be using the website under mistaken assumptions about the privacy of certain pieces of information.

In addition, professors Heather Shoenberger from the University of Oregon and Jasmine McNealy from the University of Florida found that people are more likely to accept a website’s terms if the site has a privacy policy—regardless of what the policy actually says—or if it has appealing design aesthetics. These behaviors suggest that people may be using websites under mistaken assumptions of what they believe companies are doing with their information. Additional transparency could ameliorate the situation, but what’s truly needed is for companies to act in ways that respect users’ privacy.

Consider this: Columbia University professor Roxana Geambasu and her team found evidence that an advertising tool used by Gmail until November 2014 targeted ads based on sensitive personal information—something its policies, both those in place in 2014 and those in place now, say it does not do. (Google refused to comment on Geambasu’s research, but a representative said, “We have an extensive set of policies that guide what ads can be shown, and in Gmail we manually review all ads that are shown.”) And while Geambasu and other researchers emphasize that results like these do not imply that such targeting is intentional, such work does highlight the need for better understanding of how systems like Google’s algorithms determine who sees what information.

Users currently lack adequate frameworks to seek redress if they believe a company’s actions violate their privacy. The Corporate Accountability Index found a dearth of clear processes for remedy among companies. India’s Bharti Airtel scored highest in the index among telecommunications companies on remedy, and South Korea’s Kakao scored highest among Internet companies. Both companies are headquartered in countries that maintain certain legal requirements for remedy, which boosted their performance.

Fordham University law professor Joel Reidenberg suggests that a new international treaty will be needed to protect privacy in a global, networked era. Otherwise we risk allowing governments to gain greater insight into the lives of their citizens, while the ways that governments use such information become more opaque. Indeed, while a significant portion of user information lives on company servers, privacy policies typically state that companies may turn information over to the government, though, of course, details about the circumstances in which they would do so are scant.

What else can be done? Turow encouraged public interest organizations to examine company policies and highlight how well, or not so well, companies perform. Ranking Digital Rights’ Corporate Accountability Index does just that, by evaluating a group of technology companies according to standards related to freedom of expression and privacy. The Index found that, overall, company disclosure about the collection, use, sharing, and retention of user information is poor. But companies can take steps in the right direction by, for example, providing a list of the third parties with which they share user information, as Yahoo does, or specifying how long they retain user information when people delete their accounts, as Twitter does. Change won’t happen overnight, but if the robust conversations at events such as PrivacyCon are any sign, many people are working to achieve that change.


RDR @ PrivacyCamp and CPDP in Brussels

Last week, Ranking Digital Rights participated in PrivacyCamp and the Computers, Privacy and Data Protection (CPDP) conference in Brussels. Two issues dominated the discussions: government mass surveillance, especially in light of the Schrems Safe Harbor decision, and the new EU General Data Protection Regulation. Participants also discussed corporate practices and their impact on privacy.

At CPDP, the panel “Appfail or Appwin” discussed how mobile apps may or may not respect users’ right to privacy. For example, Finn Myrstad of the Norwegian Consumer Council investigated apps’ terms and conditions, which may sometimes change without notice and sometimes impose perpetual, worldwide, and irrevocable licenses on users. The Council conducted a fun experiment to see how that plays out in the streets of Oslo, and will soon release a report about apps’ terms. In turn, Richard Tynan of Privacy International explained that even if apps don’t demand your real identity, your device collects enough data to compile a reliable picture of who you are. An audience member working for the Dutch Data Protection Authority called on Google to require that app developers publish a privacy policy, and to add an option for users to grant permissions temporarily, so that apps can do certain things, such as access location, only at specific times rather than in general.

Other sessions also touched on how companies should address privacy concerns. We learned that more and more companies are considering privacy impact assessments as ways to mitigate risks and ensure legal compliance, even though, according to one speaker, the individuals conducting such assessments within companies may lack sufficient awareness of the range of privacy risks people might face when using the company’s services. Accountability frameworks are also emerging as instruments for companies to go beyond compliance with data protection regulations and credibly demonstrate their practices to external stakeholders. For example, to meet RDR’s standard for corporate policy and practice, companies should be accountable not only to business partners and regulators, but also to affected individuals and the wider public.

In a panel on transparency reporting, Artur Alves of Concordia University observed that transparency reports have grown more substantial over the past years, but could still improve in uniformity and in transparency about internal processes. Microsoft’s Mark Lange highlighted the company’s Transparency Hub and discussed national security-related requests, explaining that U.S. legal restrictions limit what Microsoft can disclose about receiving them. Niels Huijbregts of Dutch ISP XS4ALL said that customers welcome the company’s transparency reports, but he had to overcome hesitation within XS4ALL’s parent company KPN, where some feared harming government relations. Nate Cardozo discussed EFF’s Who’s Got Your Back report and said that companies are still deficient when it comes to national security-related reporting, reporting on government requests for terms of service enforcement, as well as reporting on informal processes.

Further discussions at CPDP covered everything from interactive toys, and the need for granular control over what data they collect from children, to data minimization, where a European Commission panelist called for treating privacy compliance as a competitive advantage.

Ranking Digital Rights supports the view that improved privacy practices provide business opportunities for companies. The many conversations at CPDP confirmed that a human rights-centered approach to privacy accountability is necessary to improve companies’ practices.