New, global accountability mechanisms needed for a free and open internet

As governments around the world adopt internet regulations that clash with international human rights norms, new and more innovative mechanisms are needed to hold tech companies accountable to these standards, according to a new paper by Ranking Digital Rights (RDR) team members published by the Centre for International Governance Innovation (CIGI).

In the paper, “Corporate Accountability for a Free and Open Internet,” authors Rebecca MacKinnon, Nathalie Marechal, and Priya Kumar argue that global human rights benchmarking and evaluation projects like RDR’s Corporate Accountability Index help fill “governance gaps” caused by the failure of traditional governance institutions to hold governments and companies accountable for protecting and respecting the rights of internet users around the world.

“Private Internet intermediaries increasingly find themselves at odds with governments, with serious implications for human rights,” according to the authors. “Even where law does not compel companies to violate users’ rights, companies generally lack sufficient market and regulatory incentives to protect the human rights of all of their users.”

The authors therefore call for new cross-border accountability initiatives, outside existing governance institutions, to strengthen and enforce corporate accountability in upholding international freedom of expression and privacy standards: “If international legal and treaty frameworks cannot adequately protect human rights, then other types of governance and accountability mechanisms are urgently needed to provide incentives to owners and operators of Internet platforms and services to respect human rights,” they write.

Ranking Digital Rights is one of several efforts that might serve as building blocks for such mechanisms and institutions, according to the authors. The inaugural Index, published in November 2015, ranked Internet and telecommunications companies on 31 indicators evaluating disclosed commitments, policies and practices affecting Internet users’ freedom of expression and right to privacy. These types of rankings, when combined with transparency and disclosure frameworks, can help foster greater accountability as well as respect for international human rights standards.

Why companies fail on privacy policies

Why are privacy policies so difficult to understand? Because their language is vague and unclear, which prevents users from understanding what companies do with their information, according to new research by former Ranking Digital Rights (RDR) research analyst Priya Kumar.

In November 2016, Kumar presented a paper using data from RDR’s 2015 Corporate Accountability Index, in which she analyzed the privacy policies of 16 of the world’s largest tech companies evaluated in that year’s Index. Her research shows that these companies typically fail to convey to users what happens to their information, from the point it is collected to when it is (possibly) deleted. Kumar finds that along with vague or unclear language, the lack of uniform definitions of what companies consider “personal information” makes it difficult for users to get a complete and accurate picture of how companies handle their information.

The analysis also shows that companies are more transparent about the information they collect than about the information they share, and that companies are least transparent about what user information they retain, even after a user deletes their account or stops using the service. “People would expect a company to keep information they actively submit to the service (e.g., posts, messages, photos, videos, etc.), until they delete it themselves,” according to Kumar. “But companies collect several other types of user information, and they typically fail to disclose how long they retain those types of information.”

The paper was presented as part of the Privacy and Language Technologies track of the Association for the Advancement of Artificial Intelligence’s (AAAI) Fall Symposium Series held in Virginia. The paper, “Privacy Policies and Their Lack of Clear Disclosure Regarding the Life Cycle of User Information,” is available as a PDF.

RDR @ the 2016 IGF

Last week, Ranking Digital Rights traveled to Guadalajara, Mexico for the 11th Internet Governance Forum. The theme this year was “Enabling Inclusive and Sustainable Growth.” In all of the workshops and panels we participated in, our message focused on a central concern: as the next billion people get connected to the internet, their human rights need to be protected and respected by governments and companies. We believe that our Corporate Accountability Index produces data and analysis that can help governments, businesses, and civil society work together to address the concrete challenges of protecting, respecting, and defending human rights in the digital age.

Many of the official IGF sessions and side meetings provoked thoughtful discussion and provided us with ample opportunity to share insights from our work researching and analyzing Internet and telecommunications companies’ human rights-related public disclosures.

Some of the issues we highlighted included:

It is crucial that people have control over how their identities are presented online, as Rebecca MacKinnon noted in the session Human Rights: Broadening the Conversation. Real-name policies are pernicious, particularly for gender minorities and members of marginalized groups, and companies should bear this in mind when determining the choices they provide users.

We stressed the importance of companies disclosing information about government requests for user data: both their processes for responding to such requests, including whether they push back against inappropriate or overly broad ones, and data about the number of requests received and complied with. This issue was also highlighted by civil society activists from Mexico, who noted that Mexican authorities often obtain user information without oversight or judicial warrants.

We also pointed out that governments have an important role to play to ensure companies adequately respect human rights. In some instances, regulatory ambiguity can leave companies unsure if the law prohibits disclosure on certain issues, and therefore they withhold publishing information on their policies or practices. We’ve also seen in our research that in countries that haven’t passed data protection laws, companies tend to not adhere to best practices for collection of user information.

Combating online violent extremism was a recurring topic of discussion, particularly in light of the recent announcement that Google, Facebook, Microsoft, and Twitter planned to create “a shared industry database of ‘hashes’ — unique digital ‘fingerprints’ — for violent terrorist imagery or terrorist recruitment videos or images that [they] have removed from [their] services.” We again stressed the need for transparency and accountability for such a system, and for independent review of how images are included in it, as we’ve found that companies’ disclosure around their Terms of Service enforcement is often lacking. Companies have been under enormous pressure from governments to do something about this issue. At the same time, any new measures taken to facilitate the removal of content need to be carried out in a manner that is responsible, accountable, and respects users’ rights. It is vital that companies work closely with civil society to make sure that implementation of the new database system does not inflict new “collateral damage” on freedom of expression.
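To make the stakes concrete, here is a minimal sketch of how a shared hash database of this kind can work. This is purely illustrative: the companies’ actual system is not public, and real deployments use perceptual hashes (such as Microsoft’s PhotoDNA) that match near-duplicate images, whereas the cryptographic SHA-256 hash used here for simplicity matches only byte-identical files. All function names are hypothetical.

```python
import hashlib

# Hypothetical shared set of fingerprints of content a member company removed.
shared_database = set()

def fingerprint(content: bytes) -> str:
    """Return a hex digest serving as the content's 'fingerprint'.
    (Real systems use perceptual hashing, not SHA-256.)"""
    return hashlib.sha256(content).hexdigest()

def report_removal(content: bytes) -> None:
    """A member company adds the hash of removed content to the shared set."""
    shared_database.add(fingerprint(content))

def flag_for_review(content: bytes) -> bool:
    """Check an upload against the shared database; a match should trigger
    human review under each company's own policies, not automatic removal."""
    return fingerprint(content) in shared_database

report_removal(b"example removed image bytes")
print(flag_for_review(b"example removed image bytes"))   # exact copy matches
print(flag_for_review(b"slightly altered image bytes"))  # any byte change defeats SHA-256
```

The sketch shows why independent review of database inclusions matters: once a hash enters the shared set, every participating platform can act on it, so an erroneous or overbroad inclusion propagates across services.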

Although many of the human rights trends highlighted by civil society at the conference were negative, as censorship and surveillance are on the rise around the world, there is also some cause for optimism. Many ICT companies, particularly those that are members of the Global Network Initiative, are making commitments to respect human rights throughout their operations and carrying out due diligence to ensure these commitments are upheld. This may include instituting board-level oversight on privacy and free expression matters, creating mechanisms for grievance and remedy, and conducting human rights impact assessments. These practices, among others, are evaluated in the “Governance” section of our 2017 methodology (previously referred to as “Commitment” in the 2015 Index). As MacKinnon concluded in the session Implementing Human Rights Standards to the ICT Sector, “Despite all of our complaints, which are many and justified, I think things would be a lot worse if we hadn’t had this system where companies are being held accountable to whether or not they are implementing their [human rights] commitments and whether or not they have a system in place.”

In addition to the official IGF sessions, we met with representatives from civil society, governments, the private sector, academia, and others to discuss a wide range of issues. Several of RDR’s research partners also attended the IGF, and we had a productive meeting with them to share and receive feedback on our ongoing research process and also begin brainstorming plans around the 2017 Index launch this March.

We’re looking forward to continuing many of these conversations in the new year, in the lead-up to the launch of our 2017 Corporate Accountability Index in advance of RightsCon in late March. Stay tuned!