Corporate Accountability News Highlights is a regular series by Ranking Digital Rights that highlights key news related to tech companies, freedom of expression, and privacy issues around the world.

UK Government calls for increased internet regulation following terror attack

UK Prime Minister Theresa May (Image via Number 10, licensed CC BY-NC-ND 2.0)

In response to the most recent terror attack in London, UK Prime Minister Theresa May is calling for tighter internet regulations in order to combat terrorism. May said she plans to pursue international agreements to regulate cyberspace and criticized internet companies for providing “safe spaces” that allow extremists to communicate.

Adding to May’s comments, UK Home Secretary Amber Rudd said the government wanted tech companies to do more to take down extremist content and to limit access to end-to-end encryption.

In response, some tech companies defended their efforts to identify and remove extremist content. “Using a combination of technology and human review, we work aggressively to remove terrorist content from our platform as soon as we become aware of it,” Facebook said.

Human rights advocates warn that government attempts to regulate the internet could threaten freedom of expression and that efforts to address online extremist content must be consistent with international human rights norms. According to David Kaye, UN Special Rapporteur on freedom of expression, efforts to combat violent extremism can be the “perfect excuse” for both authoritarian and democratic governments to restrict freedom of expression and control access to information.

Brazil’s Supreme Court holds hearings on WhatsApp blocking

Brazil’s Supreme Court has held hearings to decide whether it is legal for courts to direct telecommunications companies to block apps like WhatsApp. Lower courts on numerous occasions have ordered telecommunications companies to block WhatsApp (which is owned by Facebook), after the company refused to turn over user information requested as part of criminal investigations. Judges in these cases based their rulings on an interpretation of Brazil’s Marco Civil law, which some civil society groups argue is an improper application of the law.

WhatsApp co-founder Brian Acton, who testified at one of the hearings, explained why the company is unable to turn over user information: “Encryption keys relating to conversations are restricted to the parties involved in those conversations,” Acton said. “No one has access to them, not even WhatsApp.” The court is expected to issue a ruling on the matter in the next few weeks.

Of the eight messaging and VoIP services evaluated in the 2017 Corporate Accountability Index, WhatsApp earned the highest score for its disclosure of encryption policies. The company disclosed that transmissions of user communications are encrypted by default using unique keys, and that end-to-end encryption is enabled for users by default.
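
To make that point concrete, the sketch below shows in generic terms what end-to-end encryption means: the two endpoints derive a shared key from exchanged public keys, so the provider relaying the messages never holds anything capable of decrypting them. This is a minimal illustration using the Python cryptography library, not a description of WhatsApp’s actual implementation, which is built on the far more elaborate Signal protocol.

```python
# Minimal sketch of end-to-end encryption between two parties.
# Illustrative only: the shared key is derived on the endpoints from
# exchanged public keys and never exists on the server.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair; only the public halves are exchanged
# (for example via the service provider, which never sees the private keys).
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

def derive_key(own_private, peer_public) -> bytes:
    """Derive a 32-byte symmetric key from an X25519 key exchange."""
    shared = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"e2e demo").derive(shared)

alice_key = derive_key(alice_priv, bob_priv.public_key())
bob_key = derive_key(bob_priv, alice_priv.public_key())
assert alice_key == bob_key  # both endpoints arrive at the same key

# Alice encrypts; only someone holding the derived key (Bob) can decrypt.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(alice_key).encrypt(nonce, b"hello Bob", None)
print(ChaCha20Poly1305(bob_key).decrypt(nonce, ciphertext, None))
```

In this setup the relay only ever handles public keys and ciphertext, which is the property Acton was describing: there is nothing on the server that a court order could compel the company to decrypt.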

Weibo users censored on Tiananmen anniversary

Users of Chinese social network service Weibo faced restrictions when attempting to post photos or videos on the platform during the anniversary of the 1989 Tiananmen Square pro-democracy protests. Users outside of China were unable to post photos or videos, and users within China also reported that they were unable to change their profile information or post photos and videos as comments during this time.

Although references to Tiananmen Square are regularly censored in China, both by the government and by technology companies, censorship is often heightened in early June each year, as documented by groups such as Freedom House. In a post on June 3, Weibo said that some of the platform’s functions would not be available until the 5th due to a “systems upgrade.” Comments on the post were disabled, and according to a Mashable reporter, “when we attempted to post a comment saying: ‘I want to comment,’ a notice popped up saying that the comment was in violation of Weibo’s community standards.”

As research in the 2017 Corporate Accountability Index showed, Chinese companies are legally liable for publishing or transmitting prohibited content, and services that do not make a concerted effort to police such content can be blocked in China.

The Guardian publishes Facebook internal content guidelines, offering glimpse into opaque process

Image by Veluben (Licensed CC BY-SA 3.0)

An investigation by the Guardian has revealed new details about Facebook’s internal rules for policing and removing content. The “Facebook Files” series is based on internal training manuals for the company’s moderators that were leaked to the Guardian; the manuals outline rules for reviewing and removing content, including violent, sexually explicit, extremist, racist, and other sensitive material. Facebook moderators reported having little time to decide whether to remove a given item and voiced concern over confusing and inconsistent content removal policies, according to one report.

In response to the Guardian’s investigation, Facebook said: “We don’t always share the details of our policies, because we don’t want to encourage people to find workarounds – but we do publish our Community Standards, which set out what is and isn’t allowed on Facebook, and why.”

The Guardian’s investigation puts a spotlight on the lack of transparency by social network companies regarding their internal policies for evaluating and removing content. Findings of the 2017 Corporate Accountability Index showed that most companies evaluated failed to disclose any information about the volume and nature of content they remove for terms of service violations. Only three companies of the 22 evaluated—Google, Microsoft, and Twitter—received any credit for doing so. Facebook does not publish data on any type of content removed for violating its rules, and did not receive any credit on this indicator.

Apple discloses it received a now-declassified National Security Letter

In its latest transparency report, Apple revealed for the first time that a National Security Letter it received from the U.S. government has been declassified, though the company did not publish the letter itself. The USA Patriot Act allows the government to compel a company to disclose user information by issuing a National Security Letter, without a court order. National Security Letters also include a gag order preventing companies from disclosing any information about the request, including to affected users. Some companies have been able to publish some of the national security orders they have received, whether through legal challenges or following the passage of the 2015 USA Freedom Act.

Apple also reported it received double the number of national security orders in the second half of 2016 compared with the first half of the year. Apple defines “national security orders” as requests made under the Foreign Intelligence Surveillance Act (FISA) as well as National Security Letters it received. As the company notes, U.S. law allows companies to report national security requests only as a numerical range, and prohibits them from publishing the exact number.
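
As a small illustration of what banded reporting means in practice, the sketch below converts an exact count into the only figure a company in this position could publish: the range that contains it. The band width of 250 is a hypothetical value chosen for the example, not a claim about the specific ranges required by U.S. law or used in Apple’s report.

```python
# Illustrative only: turn an exact count of national security requests
# into a disclosure range. The band width of 250 is a hypothetical value.
BAND_WIDTH = 250

def reporting_band(exact_count: int, width: int = BAND_WIDTH) -> str:
    """Return the disclosure range that contains exact_count."""
    low = (exact_count // width) * width
    return f"{low}-{low + width - 1}"

print(reporting_band(123))   # -> "0-249"
print(reporting_band(5876))  # -> "5750-5999"
```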

In the 2017 Corporate Accountability Index, Apple received the fourth-highest score of the 12 internet and mobile companies evaluated for its disclosure of data about government and other third-party requests for user information, including the number of requests it received and complied with, the legal authority under which requests were made, and the types of user information requested.

WeChat launches new search feature

WeChat, a messaging app owned by Chinese internet company Tencent, recently launched a new search feature that has been called a “direct challenge” to Chinese search engine Baidu, which currently dominates the Chinese internet search market. Rather than aggregating search results from across the web, the new feature will retrieve content from within WeChat, including posts from a user’s friends and news articles published directly to WeChat. Articles published on WeChat have their own URLs, but Baidu’s search engine is unable to index them.

In the 2017 Corporate Accountability Index, Baidu’s search engine was the lowest-scoring search engine of the five evaluated due to its low levels of disclosure on policies affecting users’ rights to freedom of expression and privacy. Baidu was also the only company in the entire Index to receive no credit in the Governance category, which evaluates a company’s institutional commitments to freedom of expression and privacy principles. Overall, although Tencent disclosed more information than Baidu about its policies affecting users’ rights, both companies disclosed less about policies affecting users’ freedom of expression than about privacy-related policies.

WannaCry ransomware highlights importance of security updates

Screenshot of WannaCry infection map from MalwareTech

Since its outbreak on May 12, WannaCry, the largest global ransomware attack in history, has affected hundreds of thousands of computers in more than 150 countries. The ransomware exploited a vulnerability in certain versions of Microsoft’s Windows operating system. The National Security Agency (NSA) had developed an exploit targeting this vulnerability, which was stolen and then published in April 2017 by a hacker group calling itself the “Shadow Brokers.” The WannaCry developers used this exploit to create rapidly spreading malware that encrypted the hard drives of more than 300,000 computers, according to the White House. A ransom note instructed affected users that if they wanted to access their files, they would have to pay $300 in Bitcoin to receive the decryption key. Entities affected included the United Kingdom’s National Health Service, which shut down services at sixteen hospitals as a result, and Telefónica, the first company to report it had been impacted by the attack.

Microsoft had released a patch fixing the vulnerability in March 2017, a month before the exploit was published, but users who did not install the update, or who ran older operating systems such as Windows XP that no longer receive regular security updates, remained vulnerable. Microsoft also released an emergency patch for Windows XP shortly after the attack.
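
For administrators wondering whether a particular Windows machine received that fix, one rough way to check is sketched below: list the installed hotfixes and look for MS17-010-related KB identifiers. This is a minimal sketch rather than a complete audit; the KB numbers shown are an illustrative, non-exhaustive subset (they differ by Windows version), and the script assumes the built-in wmic utility is available.

```python
# Rough sketch: check a Windows machine's installed updates for
# MS17-010-related hotfixes. Illustrative only; the KB list below is a
# non-exhaustive subset and varies by Windows version.
import subprocess

MS17_010_KBS = {"KB4012212", "KB4012215", "KB4012213", "KB4012216", "KB4012598"}

def installed_hotfixes() -> set:
    """Return the set of KB identifiers reported by 'wmic qfe'."""
    output = subprocess.run(
        ["wmic", "qfe", "get", "HotFixID"],
        capture_output=True, text=True, check=True,
    ).stdout
    return {line.strip() for line in output.splitlines() if line.strip().startswith("KB")}

if __name__ == "__main__":
    if installed_hotfixes() & MS17_010_KBS:
        print("An MS17-010-related update appears to be installed.")
    else:
        print("No MS17-010-related update found; the system may be vulnerable.")
```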

The WannaCry ransomware attack highlights the need for companies to provide regular security updates and to clearly disclose their policies and timelines for responding to security vulnerabilities once they are discovered. As highlighted in the 2017 Corporate Accountability Index findings, users rely on software being up to date and resilient against malware, and companies should clearly communicate to users for how long after purchase (or until what date) they are guaranteed to receive software updates.

EU fines Facebook over data policies

The European Commission has fined Facebook €110 million for providing “misleading” information about the company’s technical capacity to match user data between WhatsApp and Facebook. According to the Commission, Facebook claimed at the time that it could not reliably match users’ WhatsApp phone numbers with their Facebook profiles, even though that technical capability already existed. Facebook submitted this information to EU competition authorities as part of the approval process for its 2014 acquisition of WhatsApp.

In a statement released in response to the Commission’s fine, Facebook said it did not intentionally mislead the Commission. “The errors we made in our 2014 filings were not intentional and the Commission has confirmed that they did not impact the outcome of the merger review,” the company said.

Facebook has confronted numerous legal challenges in Germany and other EU countries over its WhatsApp data-sharing practices. In the 2017 Corporate Accountability Index, Facebook received the lowest score of all internet and mobile companies for its lack of disclosure about how users can control what the company does with their information.

Chinese cybersecurity law raises concerns

International business groups are calling on the Chinese government to delay implementing a cybersecurity law set to take effect this June. The law has raised concerns over provisions requiring data about Chinese citizens to be stored within the country and over vague security certification requirements for companies. It also remains unclear whether companies will have to provide software source code to authorities, according to reports.

As also noted in the 2017 Corporate Accountability Index, this law requires companies to cooperate with crime and national security investigations, and could mean that companies will be obligated to comply with government requests for user information and other surveillance demands. Both Chinese companies evaluated in the Index, Baidu and Tencent, had low levels of disclosure relating to government requests for user data, and current laws make it unrealistic for Chinese companies to reveal this information.

RDR presents 2017 Index findings at Re:publica

On May 10, Ranking Digital Rights (RDR) team members Lisa Gutermuth and Ilana Ullman presented findings of the 2017 Corporate Accountability Index at Re:publica, an annual conference on technology and society held in Berlin.

In their session, the RDR team reviewed Index research showing that the world’s major internet, mobile, and telecommunications companies disclose too little about policies affecting users’ freedom of expression and privacy. To illustrate this, they discussed this year’s findings that companies failed to disclose enough about how they handle user information and did not clearly disclose policies for responding to data breaches, an indicator on which only three of the 22 companies evaluated received any credit. The RDR team also explained how differences in scores between the two Russian companies, and between the two Chinese companies, illustrate areas in which companies retain some degree of choice over policy disclosure despite restrictive legal environments. They also presented several of the Index’s recommendations for companies and for governments.

Several RDR partners and researchers also presented their work at the event. Vladan Joler and Djordje Krivokapic of the SHARE Foundation presented their work on mapping Facebook’s algorithm, Gisela Pérez de Acha of Derechos Digitales presented on the “Right to be Forgotten” in Latin America, and Tanya Lokot, a lecturer at Dublin City University, facilitated a meetup on digital storytelling.

More than 9,000 participants attended Re:publica 2017, according to event organizers.

You can view the full presentation here.

John Oliver emphasizes corporate role in net neutrality debate

The debate over government enforcement of net neutrality principles in the U.S. has re-emerged in full force. On April 27, the FCC released a notice of proposed rulemaking (NPRM) outlining its intention to deregulate the telecommunications industry and reverse the net neutrality provisions established by the 2015 Open Internet Order. This weekend, comedian John Oliver, in an echo of his hit 2014 net neutrality tirade, once again brought the topic of net neutrality to the masses with a feature segment. In addition to laying out his arguments in favor of net neutrality, Oliver highlighted a point about corporate responsibility, noting past examples in which ISPs used their networks to favor their own content or services over those of their competitors. He argued that without regulatory enforcement, companies have little incentive to voluntarily abide by net neutrality principles.

Oliver’s remarks highlight the importance of regulatory enforcement to protect consumer rights in the absence of other accountability mechanisms. RDR’s methodology is based on companies’ disclosure of commitments and policies that respect users’ rights: in the Freedom of Expression category we evaluate whether ISPs disclose that they do not block, prioritize, or delay content for reasons beyond assuring network quality and reliability. The results of the 2017 Index show that of the 10 ISPs evaluated, only U.K.-based Vodafone disclosed that it does not engage in these types of traffic management practices.

More mobile apps are “listening” for marketing beacons

Last week, scholars from the Technical University of Braunschweig presented new research at the IEEE European Symposium on Security and Privacy documenting a potentially growing privacy threat to mobile app users. The findings, which were covered by several media outlets, identified 234 Android apps that are “constantly listening for ultrasonic beacons in the background,” compared to 39 found in December 2015 and just six found in April 2015.

These apps are equipped with technology that, if users grant the app permission to access the device’s microphone, uses the microphone to “listen” for ultrasonic tones emitted by advertisers. Companies such as Signal360 market products that use ultrasonic beacons to track users for advertising purposes: for example, a sports stadium might partner with a mobile app developer to send promotions to users of the app when they walk into the stadium. These beacons are not only emitted at stadiums; they are found in brick-and-mortar stores, billboards, online ads, and television ads, and can be used to link multiple devices to a single owner. The companies behind them can then build profiles of users based on where they go and what they watch on TV or search for online. In 2015, the Center for Democracy and Technology filed comments with the FTC highlighting the privacy concerns presented by this type of cross-device tracking.
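
To make the mechanism concrete, the sketch below shows the kind of signal processing an app with microphone access could perform to spot such a beacon: take a short audio buffer, compute its spectrum, and check whether anything in the near-ultrasonic band stands well above the noise floor. The sample rate, band edges, and threshold are illustrative assumptions, not parameters taken from the Braunschweig study or any commercial SDK.

```python
# Minimal sketch of near-ultrasonic "beacon" detection: flag an audio
# buffer whose 18-20 kHz band contains a component far above the noise
# floor. Sample rate, band edges, and threshold are illustrative.
import numpy as np

SAMPLE_RATE = 44_100     # Hz; a common microphone sample rate
BAND = (18_000, 20_000)  # near-ultrasonic band used for this example

def has_ultrasonic_tone(samples: np.ndarray, threshold: float = 10.0) -> bool:
    """True if the strongest 18-20 kHz bin is well above the average magnitude."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    in_band = (freqs >= BAND[0]) & (freqs <= BAND[1])
    return spectrum[in_band].max() > threshold * spectrum.mean()

# Demo: one second of noise, with and without an 18.5 kHz tone mixed in.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
noise = np.random.normal(scale=0.1, size=SAMPLE_RATE)
beacon = 0.5 * np.sin(2 * np.pi * 18_500 * t)
print(has_ultrasonic_tone(noise))           # expected: False
print(has_ultrasonic_tone(noise + beacon))  # expected: True
```

Real detection schemes are more involved (correlating against known beacon encodings, for instance), but the basic idea is the same: once an app holds the microphone permission, inaudible tones become a covert tracking channel.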

Mobile applications should clearly disclose what types of user information they might collect, how they collect this information, and the third parties with whom they share it, so that users can make informed decisions about the apps they choose to download and use. This includes information conveyed and collected via ultrasonic signals. In addition, companies that operate mobile ecosystems should make an effort to protect users by disclosing whether and to what extent they evaluate the privacy policies of the third-party apps in their app stores. The 2017 Index evaluated three mobile ecosystems—Google’s Android, Apple’s iOS, and Samsung’s implementation of Android—and found that, while companies may have guidelines regarding app privacy policies, none of these companies disclosed whether they evaluate the content of these policies.

Russia blacklists mobile messaging app WeChat

“Communication tools / iOS” (Image via Microsiervos on Flickr, CC BY 2.0)

On May 4, Russia’s telecommunications regulator, Roskomnadzor, added the mobile messaging app WeChat to its list of banned websites and information outlets for failing to register with the government as an “organizer of information.” The regulator has reportedly required ISPs to block more than two dozen IP addresses associated with Tencent, WeChat’s parent company. Companies registered as “organizers of information” are required to comply with a set of amendments known as Yarovaya’s Law, passed last July, including requirements to store users’ metadata and communications content on servers located in Russia, hand over this data at the request of Russian authorities, and assist the government in decrypting encrypted data.

As Freedom House noted in its Freedom on the Net 2016 report, more governments are cracking down on communication apps than ever before. Companies around the world face pressure from governments trying to censor content or conduct surveillance. RDR’s methodology awards credit to companies that report on the requests they receive from governments to block access to online content or to restrict services. It also rewards companies for disclosing that user communications and content are encrypted and, if they are not, expects companies to disclose whether they share user information with government authorities.