Archives for April 2017

Uber breaks Apple’s rules, German court upholds WhatsApp user data sharing ban, local authorities in Kashmir order ISPs to block social media and messaging apps

Corporate Accountability News Highlights is a regular series by Ranking Digital Rights that highlights key news related to tech companies, freedom of expression, and privacy issues around the world.

Uber, Apple, and user privacy

The New York Times reported that in 2015 Uber ran afoul of Apple’s privacy rules by adding a feature to its iPhone app that allowed it to identify devices even after users had deleted the Uber app or erased all contents on the device. The practice, known as “fingerprinting,” tracks devices using their Unique Device Identifier (UDID), which Apple announced in 2013 it would no longer allow app developers to access. According to the article, Uber engineers “geofenced” Apple’s headquarters in Cupertino, California, in an effort to hide that portion of the code from Apple employees. After discovering the code in 2015, Apple CEO Tim Cook demanded that Uber stop fingerprinting devices or be banned from the App Store, according to The New York Times.
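The “geofencing” described above is, conceptually, just a distance check against a fixed set of coordinates: code behaves one way inside the fence and another way outside it. The following is a purely illustrative sketch, not Uber’s actual code; the center coordinates and the 5 km radius are assumptions chosen for the example.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Approximate coordinates for Cupertino, CA (illustrative values)
FENCE_CENTER = (37.3318, -122.0312)

def inside_geofence(lat, lon, center=FENCE_CENTER, radius_km=5.0):
    """True if the given location falls within radius_km of the fence centre."""
    return haversine_km(lat, lon, center[0], center[1]) <= radius_km

# A device in San Francisco (~60 km away) is outside a 5 km fence:
print(inside_geofence(37.7749, -122.4194))  # False
```

An app could branch on a check like this to show different behavior to users inside the fenced area, which is why reviewers testing only from one location can be misled.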

This issue puts a spotlight on the need for mobile ecosystem companies like Apple, Google, and Samsung to have clear and transparent user-information collection and retention policies for third-party apps hosted on their app stores. Findings of the 2017 Corporate Accountability Index showed that all three mobile ecosystems evaluated fell short in this regard. While all three companies disclosed that they require third-party apps that collect user data to have privacy policies, none disclosed that they review the content of these policies for compliance with app store rules.

German court bans WhatsApp from sharing user data with other Facebook services

A German court has upheld an order banning Facebook from collecting data on WhatsApp users in Germany. The court ruled that Facebook, which owns WhatsApp, must obtain user consent before its other services can process user information obtained from WhatsApp. WhatsApp updated its terms of service and privacy policy in August 2016 to state that it could share certain user data, such as a user’s phone number, with Facebook in order to improve ad targeting. The German case is one of several ongoing legal challenges the company faces in the EU over its WhatsApp data-sharing practices.

Of the 12 internet companies evaluated in the 2017 Corporate Accountability Index, Facebook received the lowest score on our indicator evaluating disclosure of the options users have to control what information the company collects, retains, and uses. Our research found that WhatsApp did not fully disclose the options users have to control what information is collected or how their information is used for targeted advertising.

ISPs in Kashmir ordered to block social media and messaging services

Authorities in the northern Indian state of Jammu and Kashmir have ordered all ISPs to block 22 social networks and messaging apps for one month or until further notice. The services include the social networks Facebook, Twitter, and QZone, and the messaging and VoIP services Skype, WhatsApp, and WeChat, which authorities claim were “being misused by anti-national and anti-social elements” in the Kashmir Valley to disturb “peace and tranquility.” Authorities previously ordered telecommunications companies to suspend 3G and 4G mobile internet services after several videos of security forces abusing civilians circulated online, drawing outrage from Kashmiris.

The rise of network shutdown orders by governments has sparked growing concern among human rights groups and policymakers around the world. In 2016, India had more internet shutdowns than any other country, and Jammu and Kashmir alone has experienced 31 shutdowns since 2012, according to the Software Freedom Law Centre. The UN Human Rights Council in 2016 condemned network shutdowns as a violation of international human rights law and called on governments to refrain from taking these actions. Companies, for their part, should push back on government demands to shut down networks, and clearly explain the circumstances under which they comply with such requests. Findings of the 2017 Corporate Accountability Index showed that all telecommunications companies evaluated fell short in this regard to varying extents, and none disclosed sufficient information about their policies for responding to network shutdown requests.

Hungarian government entrusts Russian company with user data, social media crackdowns in France and China, and new smartphone security concerns emerge

Corporate Accountability News Highlights (we are still experimenting with the name) is a new series by Ranking Digital Rights that highlights key news related to tech companies, freedom of expression, and privacy issues around the world.

Hungarian Government in Hot Water Over Data Privacy

Hungarian Prime Minister Viktor Orbán and Russian President Vladimir Putin (Image via Kremlin.ru, licensed under a Creative Commons Attribution 4.0 International license)

The Hungarian government’s recent national consultation about EU policies on immigration and economic issues, “Let’s Stop Brussels!,” has come under fire not just for its skewed survey design, but also for the way its website originally handled individuals’ data. As reported by the Hungarian investigative reporting outlet 444, the online survey portal originally included code for Yandex Metrika, a website analytics tool offered by Russian internet company Yandex (the code was removed from the site after the 444 story was published). The choice of a Russian website analytics tool is notable in light of Hungarian Prime Minister Viktor Orbán’s moves for closer ties with Russia, which also prompted an opposition party campaign to place stickers on top of the government’s billboards about the consultation so they instead read “Let’s Stop Moscow!”

In addition to raising eyebrows over the potential geopolitical significance, the Hungarian government’s use of Yandex’s code also raised significant privacy concerns. Yandex Metrika includes a feature called “webvisor” which, when enabled, allows administrators to track mouse movements, clicks, keystrokes, entries, and other data to monitor how users interact with their sites. According to 444, not only was this feature enabled on the consultation website, but it was also set up to capture the information a user typed into all fields on the website—including name, age, and email address—potentially violating the site’s privacy policy, which stated that users’ personal data would not be shared with any third parties.

Although the 2017 Corporate Accountability Index did not evaluate Yandex Metrika as a service, we did evaluate Yandex as a company, along with several of its other services. We found that overall, Yandex had limited disclosure of its policies for collecting, using, sharing, and retaining user data. As noted in the Index’s Russian company analysis, Russian law enforcement authorities may have direct access to communications data through a mass surveillance system known as SORM.

This incident also highlights the importance of writing a clear and specific privacy policy and of ensuring that all services used on a site comply with that policy, so that users know with whom they are sharing their data.

Facebook Cracks Down on Content

Facebook recently announced in a blog post that, as part of its efforts to combat spam, fake accounts, and “deceptive content,” it had taken action against over 30,000 accounts in France. This move comes shortly before the French presidential election, which, according to Reuters, was a key motivator for the company’s efforts to combat misinformation on the platform.

In the 2017 Index, while Facebook received credit for disclosing some data about content it restricts in response to government requests, the company disclosed no information about content and accounts it restricts for violating its terms of service. Although the recent blog post is a step in the right direction, the company should include such information in its transparency report, along with data on actions it has taken to restrict content for other reasons.

We (can’t) Chat – Citizen Lab Research on WeChat and Weibo Content Filtering

New research from Citizen Lab examining content filtering on two Chinese messaging and social networking platforms, WeChat (operated by Tencent, which was included in the 2017 Index) and Sina Weibo (not included in the 2017 Index), found evidence of image-based filtering on WeChat. Although it is understood that WeChat, along with other Chinese internet platforms and apps, filters sensitive keywords, this is the first documented instance of similar filtering based on images deemed “sensitive” (in this case, content relating to the detention of Chinese lawyers and activists).

In our 2017 Index, we noted that Tencent had limited disclosure of the processes it uses to identify content or accounts that violate the company’s rules, and almost no disclosure of its processes for responding to third-party requests for content removals. Both Chinese companies in the Index, Baidu and Tencent, had more limited disclosure of policies relating to users’ freedom of expression than of those relating to privacy.

New study claims the angle users hold their phones can help hackers guess PINs

New research from Newcastle University reveals how motion sensor data recorded while a user types a PIN into their phone can help hackers identify that PIN. This data alone is not enough for a would-be hacker to gain access, especially without also knowing how an individual holds their phone when typing certain numbers. However, the study’s authors also noted that unlike a phone’s camera or microphone, many mobile apps and websites can access motion sensor data without asking a user’s permission, and that “people were far more concerned about the camera and GPS than they were about the silent sensors.”

This study is one example of why app permissions matter: many apps may have access to this type of user data, and information that is not treated as sensitive by permission systems can give away more private information than users realize. Mobile ecosystems should serve as better gatekeepers for user privacy in their app stores. The Index looks for companies to disclose that they review the privacy policies of apps in their app stores in a way that provides adequate privacy safeguards for users.

Groups adapt Corporate Accountability Index methodology for new research

Digital rights groups in India and Pakistan have adapted the Ranking Digital Rights Corporate Accountability Index methodology to evaluate if and how telecommunications and internet companies in those countries disclose commitments to users’ freedom of expression and privacy.

The Centre for Internet and Society, an internet research institute based in India, applied the 2017 Index methodology to evaluate eight telecommunications and internet companies operating in India. Findings showed that while companies demonstrated some commitment to users’ privacy, most fell short in key areas. The organization held an event in January 2017 to launch the report, as well as a rankathon for participants to learn more about the companies evaluated and to provide feedback on the methodology and ways to adapt it for future research.

In December 2016, Pakistan-based Digital Rights Foundation published a study examining privacy-related disclosures of five telecommunications companies, based on privacy indicators adapted from the Index methodology. Findings showed that not all privacy policies were available in Urdu or other languages commonly spoken in Pakistan, and that company policies for responding to government requests for user data were often unclear. The organization’s executive director, Nighat Dad, discussed some of these findings at our roundtable session at RightsCon on how to conduct research and advocacy focused on ICT companies.

These types of projects using the Index methodology allow for more in-depth analysis of how companies in different countries or regions commit to respecting users’ rights. We encourage researchers and civil society to adapt the Index methodology to launch research initiatives evaluating company disclosures of policies affecting users’ freedom of expression and privacy in their own countries and contexts.

Corporate Accountability News Roundup

The Corporate Accountability News Roundup is a new series by Ranking Digital Rights that highlights key news related to tech companies, freedom of expression, and privacy issues around the world.

Twitter takes on Trump

According to Twitter, the Trump administration last week withdrew its attempt to force the company to reveal the identity of one of its users. The account, @ALT_USCIS, is one of several “alt agency” accounts created after President Donald Trump took office, which tweet criticisms of the administration. In response, Twitter announced it was suing the US government on the grounds that the demand was “unlawful and unenforceable because it violates the First Amendment rights of both Twitter and its users by seeking to unmask the identity of one or more anonymous Twitter users voicing criticism of the government on matters of public concern.” Twitter dropped its lawsuit after the government withdrew the summons.

The Ranking Digital Rights 2017 Corporate Accountability Index looks for companies to disclose their processes for responding to government requests for user information, including if the company carries out due diligence on government requests before deciding how to respond, and commits to push back on inappropriate or overbroad government requests. This recent case is an example of Twitter implementing these commitments. Our research showed that Twitter clearly disclosed its processes for responding to government requests for user information (P10) and also topped all internet and mobile companies evaluated for its transparency reporting on the government and private requests it receives to hand over user information (P11).

In Europe: bans on encryption, hate speech

EU Justice Commissioner Věra Jourová has indicated the European Commission will propose new rules this June allowing law enforcement to access information from encrypted apps. This follows pressure from the governments of France, Germany, and the UK, including the recent call from UK Home Secretary Amber Rudd for police to be able to access encrypted chats on WhatsApp and similar services. However, as technical experts have continued to caution, allowing such access would prevent companies from being able to deploy secure end-to-end encryption and would put user privacy at risk. In our recommendations for companies, we note that, except where prohibited by law, companies should publicly commit to implementing the highest encryption standards available, including end-to-end encryption. The EU’s proposed rules could prevent companies from being able to do so.

Germany’s cabinet has approved a plan that would fine social media networks for failing to remove hate speech quickly enough. The plan raises numerous freedom of expression concerns, and puts companies, rather than courts, in the position of determining what speech is legally permissible. As noted in our recommendations for governments, authorities should limit the legal liability imposed on companies for their users’ speech and other activities, consistent with the Manila Principles on Intermediary Liability.

New rumors about Google’s return to China

The South China Morning Post reported that Google is in talks with the Chinese government to potentially re-enter the Chinese market with certain services, such as Google Scholar. Google has not confirmed or commented on these discussions or on whether it plans to re-enter China. However, the company did announce that users in China can now download the Translate app, which the Washington Post writes may be a signal that the company is slowly moving back into the Chinese market. In 2010, Google formally withdrew from China and shut down its Chinese-language search engine, citing concerns over censorship. Google’s products are currently blocked by the Chinese government’s “Great Firewall” and are not available to Chinese users without censorship circumvention technology.

Tech companies should conduct human rights impact assessments (HRIAs) before launching new services or entering new markets, to identify any risks to user free expression and privacy and take necessary steps to mitigate these risks. This is particularly important before launching any services in markets, such as in China, where the government has a record of human rights abuses.

As our research in the 2017 Index showed, Google expressed a commitment to carry out HRIAs, stating: “Prior to localizing in a new market, the company approach is to first examine the government’s record with respect to freedom of expression and privacy by consulting reports prepared by NGOs and analyzing the laws that are relevant for freedom of expression and privacy in that country.” Therefore, if reports that Google is considering re-entry into China are true, Google should live up to its stated commitments and be carrying out human rights impact assessments that will help it determine whether or how its presence in China may change in the future.

Our research indicates that although many aspects of Chinese companies’ poor performance in the 2017 Index can be attributed to China’s legal and regulatory environment, there are areas in which companies still have room to improve their disclosures on certain privacy and free expression issues. For more on this, check out our analysis comparing free expression and privacy disclosures from Baidu and Tencent.

Ranking Digital Rights at RightsCon

Corporate transparency is essential to building public trust, according to Annette Fergusson, head of Vodafone Group’s Sustainable Business unit, who spoke on a panel for the European launch of the Ranking Digital Rights 2017 Corporate Accountability Index at RightsCon in Brussels on March 29.

The event featured Ranking Digital Rights (RDR) project director Rebecca MacKinnon, who was joined by a group of panelists to discuss results of the 2017 Index, which ranked 22 of the world’s largest internet, mobile, and telecommunications companies on their disclosed commitments to users’ freedom of expression and privacy. Along with Vodafone’s Fergusson, panelists included Silvia Grundmann, head of the Media and Internet division at the Council of Europe, Adam Kanzer, managing director of Domini Impact Investments, a socially responsible mutual fund, and Afef Abrougui, a researcher with Beirut-based Social Media Exchange Network (SMEX) and an RDR research affiliate. The session was moderated by Malavika Jayaram, executive director of the Digital Asia Hub, an internet research center based in Hong Kong.

Panelists discussed why companies should be transparent about policies affecting users’ freedom of expression and privacy. According to Fergusson, companies need to be transparent in order to gain users’ trust: “Without trust, we don’t have a social license to operate,” she said. Vodafone tied with AT&T for the top spot among the ten telecommunications companies ranked in the 2017 Index. Vodafone earned the highest score among telecommunications companies on the Index’s governance indicators, which measure a company’s institutionalized commitments to human rights, including to freedom of expression and privacy.

While transparency is essential, companies should also work to ensure that human rights commitments made at the parent level are followed through at all levels of the company, according to Abrougui. Telecommunications companies, for instance, can have different policies, and varying levels of policy disclosure, in the different markets in which they operate, she said.  

Kanzer noted that while there is a difference between measuring company disclosure of their policies and measuring their actual practices, policy transparency is an important first step.

Talking so companies will listen, listening so companies will talk

Also at RightsCon, RDR senior research fellow Nathalie Maréchal led a roundtable discussion called “How to Talk So Companies Will Listen, and Listen So Companies Will Talk: Doing Company Advocacy and Research.” The session brought together researchers, advocates, and industry representatives to share best practices for communicating their research or advocacy initiatives to companies.

Participants shared their experiences and strategies for engaging with companies through their work on a variety of projects, including Fundación Karisma’s and the Digital Rights Foundation’s research evaluating the privacy policies of telecommunications companies in their respective countries, and OpenNet Korea’s work with Citizen Lab researching the Korean app “Smart Sheriff.” UCLA professor Sarah Roberts also offered insights from her experience engaging with companies as part of her research on commercial content moderation. Strategies for company engagement depend on the company and the political context, and can include building long-term relationships with human rights allies within companies, according to participants.

Michael Samway, former Vice President and Deputy General Counsel at Yahoo! Inc., who founded the company’s Business and Human Rights Program, noted that trust between advocates and companies is formed only through years of engagement, and that for advocates it is crucial to have practical solutions in mind before approaching a company.

Samway, who serves as an RDR advisory board member, was also interviewed at RightsCon for a podcast discussion about the evolution of the broader business and human rights movement, and how advocates and other stakeholders can achieve meaningful engagement with companies.

This year’s RightsCon event in Brussels brought together 1,500 participants from 100 countries, according to event organizer Access Now. We look forward to seeing everyone next year at the seventh annual RightsCon conference in Toronto!