Corporate Accountability News Highlights: Facebook internal documents highlight lack of transparency on content removals, Apple reveals it received a National Security Letter, and WeChat unveils new search feature

Corporate Accountability News Highlights is a regular series by Ranking Digital Rights that highlights key news related to tech companies, freedom of expression, and privacy issues around the world.

The Guardian publishes Facebook internal content guidelines, offering glimpse into opaque process

Image by Veluben (Licensed CC BY-SA 3.0)

An investigation by the Guardian has revealed new details about Facebook’s internal rules for policing and removing content. The “Facebook Files” is based on internal training manuals for the company’s moderators that were leaked to the Guardian, which outline rules for reviewing and removing content, including violent, sexually explicit, extremist, racist, and other types of sensitive material. Facebook moderators reported having little time to decide whether to remove a given item and voiced concern over confusing and inconsistent content removal policies, according to one report.

In response to the Guardian’s investigation, Facebook said they “don’t always share the details of our policies, because we don’t want to encourage people to find workarounds – but we do publish our Community Standards, which set out what is and isn’t allowed on Facebook, and why.”

The Guardian’s investigation puts a spotlight on the lack of transparency by social network companies regarding their internal policies for evaluating and removing content. Findings of the 2017 Corporate Accountability Index showed that most companies evaluated failed to disclose any information about the volume and nature of content they remove for terms of service violations. Only three companies of the 22 evaluated—Google, Microsoft, and Twitter—received any credit for doing so. Facebook does not publish data on any type of content removed for violating its rules, and did not receive any credit on this indicator.

Apple discloses it received a now-declassified National Security Letter

In its latest transparency report, Apple revealed for the first time that a National Security Letter it received from the U.S. government has been declassified, though the company did not publish the letter itself. The USA Patriot Act allows the government to compel a company to disclose user information by issuing a National Security Letter, without a court order. National Security Letters also include a gag order preventing companies from disclosing any information about the request, including to affected users. Some companies have been able to publish some of the national security orders they’ve received through legal challenges and following the passage of the 2015 USA Freedom Act.

Apple also reported that it received double the number of national security orders in the last half of 2016 compared to the first half of the year. Apple defines “national security orders” as requests made under the Foreign Intelligence Surveillance Act (FISA) as well as the National Security Letters it received. As the company notes, U.S. law allows companies to report national security requests only in broad ranges, and prohibits them from publishing the specific number.

In the 2017 Corporate Accountability Index, Apple received the fourth-highest score of the 12 internet and mobile companies evaluated for its disclosure of data about government and other third-party requests for user information, including the number of requests it receives and with which it complies, the legal authority requests are made under, and what types of user information are requested.

WeChat launches new search feature

WeChat, a messaging app owned by Chinese internet company Tencent, recently launched a new search feature that has been called a “direct challenge” to Chinese search engine Baidu, which currently dominates the Chinese internet search market. Rather than aggregating search results from across the web, the new feature will retrieve content from within WeChat, including posts from a user’s friends and news articles published directly to WeChat. Articles published on WeChat have their own URLs, but Baidu’s search engine is unable to index them.

In the 2017 Corporate Accountability Index, Baidu’s search engine was the lowest-scoring search engine of the five evaluated due to its low levels of disclosure on policies affecting users’ rights to freedom of expression and privacy. Baidu was also the only company in the entire Index to receive no credit in the Governance category, which evaluates a company’s institutional commitments to freedom of expression and privacy principles. Overall, although Tencent disclosed more information than Baidu about its policies affecting users’ rights, both companies disclosed less about policies affecting users’ freedom of expression than about privacy-related policies.

Corporate Accountability News Highlights: Wannacry ransomware infects hundreds of thousands of computers, EU fines Facebook over data policies, and international businesses urge China to delay cybersecurity law

Corporate Accountability News Highlights is a regular series by Ranking Digital Rights that highlights key news related to tech companies, freedom of expression, and privacy issues around the world.

WannaCry ransomware highlights importance of security updates

Screenshot of WannaCry infection map from MalwareTech

Since its outbreak on May 12, the largest global ransomware attack in history, WannaCry, has affected hundreds of thousands of computers in more than 150 countries. The ransomware exploited a vulnerability in certain versions of Microsoft’s Windows operating system. The National Security Agency (NSA) had developed an exploit targeting this vulnerability, which was stolen and later published in April 2017 by a hacker group known as the “Shadow Brokers.” The WannaCry developers used this exploit to create rapidly spreading malware that encrypted files on more than 300,000 computers, according to the White House. A ransom message instructed affected individuals that if they wanted to access their files, they would have to pay $300 in Bitcoin to receive the decryption key. Entities affected included the United Kingdom’s National Health Service, which shut down sixteen hospitals as a result, and Telefónica, which was the first company to report it had been impacted by the attack.

Microsoft had released a patch fixing the vulnerability in March 2017, but users who did not install the update, or who used older operating systems that no longer receive regular security updates, such as Windows XP, remained vulnerable. Microsoft also released an emergency patch for Windows XP shortly after the attack.

The WannaCry ransomware attack highlights the need for companies to provide regular security updates and to clearly disclose to users their policies and timelines for responding to security vulnerabilities once they are discovered. As highlighted in the 2017 Corporate Accountability Index findings, users rely on software that is up to date and resilient against malware, and companies should clearly communicate how long after purchase (or until what date) users are guaranteed to receive software updates.

EU fines Facebook over data policies

The European Commission has fined Facebook €110 million for providing “misleading” information about the company’s technical capacity to match user data between WhatsApp and Facebook. According to the Commission, Facebook stated that it could not reliably match users’ WhatsApp phone numbers with their Facebook profiles, when in fact it could. Facebook provided this information to EU competition authorities during the review of its 2014 acquisition of WhatsApp.

In a statement released in response to the Commission’s fine, Facebook said it did not intentionally mislead the Commission. “The errors we made in our 2014 filings were not intentional and the Commission has confirmed that they did not impact the outcome of the merger review,” the company said.

Facebook has confronted numerous legal challenges in Germany and other EU countries over its WhatsApp data-sharing practices. In the 2017 Corporate Accountability Index, Facebook received the lowest score of all internet and mobile companies for its lack of disclosure about how users can control what the company does with their information.

Chinese cybersecurity law raises concerns

International business groups are calling on the Chinese government to delay implementing a cybersecurity law set to take effect this June. The law has raised concerns over provisions requiring data about Chinese citizens to be stored within the country and over vague security certification requirements for companies. It is also unclear whether companies will have to provide software source code to authorities, according to reports.

As also noted in the 2017 Corporate Accountability Index, this law requires companies to cooperate with crime and national security investigations, and could mean that companies will be obligated to comply with government requests for user information and other surveillance demands. Both Chinese companies evaluated in the Index, Baidu and Tencent, had low levels of disclosure relating to government requests for user data, and current laws make it unrealistic for Chinese companies to reveal this information.

Corporate Accountability News Highlights: Net neutrality debate returns to the U.S., Ultrasonic beacons track users via mobile apps, Russia blocks messaging app WeChat

Corporate Accountability News Highlights is a regular series by Ranking Digital Rights that highlights key news related to tech companies, freedom of expression, and privacy issues around the world.

John Oliver emphasizes corporate role in net neutrality debate

The debate over government enforcement of net neutrality principles in the U.S. has re-emerged in full force. On April 27, the FCC released a notice of proposed rulemaking (NPRM) that outlined its intention to deregulate the telecommunications industry and reverse the net neutrality provisions established by the 2015 Open Internet Order. This weekend, comedian John Oliver, in an echo of his hit 2014 net neutrality tirade, once again brought the topic of net neutrality to the masses with a feature segment. In addition to laying out his arguments in favor of net neutrality, Oliver highlighted a point about corporate responsibility, citing past examples in which ISPs used their networks to favor their own content or services over those of their competitors. He argued that without regulatory enforcement, companies have little incentive to voluntarily abide by net neutrality principles.

Oliver’s remarks highlight the importance of regulatory enforcement to protect consumer rights in the absence of other accountability mechanisms. RDR’s methodology is based on companies’ disclosure of commitments and policies that respect users’ rights: in the Freedom of Expression category we evaluate whether ISPs disclose that they do not block, prioritize, or delay content for reasons beyond assuring network quality and reliability. The results of the 2017 Index show that of the 10 ISPs evaluated, only U.K.-based Vodafone disclosed that it does not engage in these types of traffic management practices.

More mobile apps are “listening” for marketing beacons

Last week, researchers from the Technical University of Braunschweig presented new research at the IEEE European Symposium on Security and Privacy documenting a potentially growing privacy threat to mobile app users. The findings, which were covered by several media outlets, identified 234 Android apps that are “constantly listening for ultrasonic beacons in the background,” compared to 39 found in December 2015 and just six found in April 2015. These apps are equipped with technology that, if users grant the app permission to access the device’s microphone, uses the microphone to “listen” for ultrasonic tones emitted by advertisers. Companies such as Signal360 market products that use ultrasonic beacons to track users for advertising purposes; for example, a sports stadium might partner with a mobile app developer to send promotions to users of the app when they walk into the stadium. These beacons are not limited to stadiums, though: they are also found in brick-and-mortar stores, billboards, online ads, and television ads, and can be used to link multiple devices to a single owner. Companies can then build profiles of users based on where they go and what they watch on TV or search for online. In 2015, the Center for Democracy and Technology filed comments with the FTC highlighting the privacy concerns presented by this type of cross-device tracking.
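
To make the mechanism concrete, here is a minimal, hypothetical sketch (in Python, using NumPy) of how an app with microphone access could check a block of audio samples for energy in the near-ultrasonic band. The sample rate, frequency band, and function names are our own illustrative assumptions, not code from the apps or SDKs the researchers studied.

```python
# Illustrative sketch only: estimates how much of a block of microphone
# samples' energy falls in the near-ultrasonic band used by beacon schemes.
import numpy as np

SAMPLE_RATE = 44_100          # Hz; a common phone microphone rate (assumption)
BAND = (18_000, 20_000)       # assumed near-ultrasonic beacon band, in Hz

def ultrasonic_energy_ratio(samples: np.ndarray) -> float:
    """Return the fraction of total signal energy inside the beacon band."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    in_band = (freqs >= BAND[0]) & (freqs <= BAND[1])
    total = spectrum.sum()
    return float(spectrum[in_band].sum() / total) if total > 0 else 0.0

if __name__ == "__main__":
    # Simulate 100 ms of audio containing a 19 kHz "beacon" tone plus noise.
    t = np.arange(0, 0.1, 1.0 / SAMPLE_RATE)
    signal = 0.1 * np.sin(2 * np.pi * 19_000 * t) + 0.05 * np.random.randn(t.size)
    print(f"in-band energy ratio: {ultrasonic_energy_ratio(signal):.2f}")
    # A real beacon listener would flag blocks where this ratio spikes, then
    # decode an identifier from the pattern of tones.
```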

Mobile applications should clearly disclose what types of user information they might collect, how they collect this information, and the third parties with whom they share it, so that users can make informed decisions about the apps they choose to download and use. This includes information conveyed and collected via ultrasonic signals. In addition, companies that operate mobile ecosystems should make an effort to protect users by disclosing whether and to what extent they evaluate the privacy policies of the third-party apps in their app stores. The 2017 Index evaluated three mobile ecosystems (Google’s Android, Apple’s iOS, and Samsung’s implementation of Android) and found that, while companies may have guidelines regarding app privacy policies, none of these companies disclosed whether they evaluate the content of these policies.

Russia blacklists mobile messaging app WeChat

“Communication tools / iOS” (Image via Microsiervos on Flickr, CC BY 2.0)

On May 4, Russia’s telecommunications regulator, Roskomnadzor, added the mobile messaging app WeChat to its list of banned websites and information outlets for failing to register with the government as an “organizer of information.” The regulator has reportedly required ISPs to block more than two dozen IP addresses associated with Tencent, WeChat’s parent company. Companies registered as “organizers of information” are required to comply with a set of amendments, known as Yarovaya’s Law, passed last July, including requirements to store users’ metadata and communications content on servers located in Russia, hand over this data at the request of Russian authorities, and assist the government in decrypting encrypted data.

As Freedom House noted in its Freedom on the Net 2016 report, more governments are cracking down on communication apps than ever before. Companies around the world face pressure from governments seeking to censor content or conduct surveillance. RDR’s methodology awards credit to companies that report on the requests they receive from governments to block access to online content or to restrict services. It also rewards companies for disclosing whether user communications and content are encrypted and, where they are not, for disclosing when user information is shared with government authorities.

Corporate Accountability News Highlights: Turkey blocks Wikipedia, UK lawmakers mull fines for social media companies, and US-based cloud services companies pledge compliance with EU data protection rules

Corporate Accountability News Highlights is a regular series by Ranking Digital Rights that highlights key news related to tech companies, freedom of expression, and privacy issues around the world.

Turkish government blocks Wikipedia

Image via Wikimedia Foundation (Licensed CC BY-SA 3.0)

The Turkish government blocked Wikipedia this week, citing a law that gives authorities the ability to block websites they deem obscene or a threat to national security. “Instead of coordinating against terrorism, it [Wikipedia] has become part of an information source which is running a smear campaign against Turkey in the international arena,” the government said. The Wikimedia Foundation issued a statement refuting the government’s claims and urging authorities to remove the block. “We strongly oppose censorship or threats that lead to self-censorship,” stated Wikimedia.

The Wikipedia block is the latest move in the Turkish government’s crackdown on freedom of expression on online platforms. Turkey was rated “Not Free” in Freedom House’s annual Freedom on the Net report, which noted that the government has, on numerous occasions, temporarily blocked social media services including Twitter, Facebook, WhatsApp, and YouTube. According to Twitter’s most recent transparency report, Turkey submitted the greatest number of government requests for content removal, both in court orders (844 requests) and in requests from government agencies, police, and other government authorities (2,232 requests).

UK Parliament: Social media companies should do more to police content

A new report by the UK Parliament’s Home Affairs Select Committee calls on social media companies like Twitter, Facebook, and Google to do more to monitor and remove illegal content. The report criticizes these companies for being “shamefully far” from addressing “illegal and dangerous content,” arguing that the technology they already use to identify and take down copyright-infringing content could equally be used to identify and remove hate speech and extremist content. The report recommends that the UK government consider fining companies that fail to remove illegal content quickly enough, referencing a law recently proposed in Germany.

Some privacy advocates warn that such efforts to curb extremist content could lead to increased government censorship and that automating the process could make it more likely that legal content is erroneously removed.

As highlighted in the 2017 Corporate Accountability Index, some companies, like Google, Microsoft, and Twitter, have started to publish data about content they remove for violating their rules. For instance, in a 2016 blog post Twitter published some information on these takedowns. The company’s most recent transparency report included data about content that was removed for terms of service violations following government requests.

Companies pledge compliance with EU data protection rules

Google, Microsoft, and Amazon have committed to ensuring that their cloud services will be compliant with new European Union rules on data privacy, which come into effect in May 2018. The General Data Protection Regulation (GDPR), which European lawmakers adopted in April 2016, specifies new, EU-wide privacy rules for handling personal information of EU citizens.

The GDPR requires companies to adhere to the principle of data minimization, to put accountability measures in place, which may include appointing a Data Protection Officer, and to abide by new requirements for reporting data breaches. Findings of the 2017 Corporate Accountability Index showed that the companies evaluated disclosed little information about their policies for responding to data breaches. Only three of the 22 companies we evaluated revealed any information about whether they notify authorities or users who might be affected by a data breach.

Corporate Accountability News Highlights: Uber breaks Apple’s rules, German court upholds WhatsApp user data sharing ban, local authorities in Kashmir order ISPs to block social media and messaging apps

Corporate Accountability News Highlights is a regular series by Ranking Digital Rights that highlights key news related to tech companies, freedom of expression, and privacy issues around the world.

Uber, Apple, and user privacy

The New York Times reported that in 2015 Uber ran afoul of Apple’s privacy rules by adding a feature to its iPhone app that allowed it to identify devices even after users had deleted the Uber app or erased all contents on the device. The practice, known as “fingerprinting,” tracks devices using their Unique Device Identifier (UDID), something Apple announced in 2013 it would no longer allow app developers to do. According to the article, Uber engineers “geofenced” Apple’s headquarters in Cupertino, California in an effort to hide that portion of the code from Apple employees. After Apple discovered the code in 2015, CEO Tim Cook demanded that Uber stop fingerprinting devices or be banned from the App Store, according to The New York Times.
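
As a rough illustration of what “geofencing” involves, the hypothetical sketch below (in Python) checks whether a device’s coordinates fall within a fixed radius of a target location. The coordinates, radius, and function names are illustrative assumptions, not details from Uber’s actual code.

```python
# Hypothetical geofence check: is a given (lat, lon) within RADIUS_KM of a target?
import math

TARGET = (37.3318, -122.0312)   # approximate Cupertino coordinates (assumption)
RADIUS_KM = 2.0                 # assumed geofence radius

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def inside_geofence(lat, lon):
    return haversine_km((lat, lon), TARGET) <= RADIUS_KM

# An app could branch on a check like this to change its behaviour near one
# specific location, which is what made the reported conduct hard to spot.
print(inside_geofence(37.3320, -122.0310))  # True: inside the assumed fence
print(inside_geofence(40.7128, -74.0060))   # False: New York
```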

This issue puts a spotlight on the need for mobile ecosystem companies like Apple, Google, and Samsung, to have clear and transparent user-information collection and retention policies for third-party apps hosted on their app stores. Findings of the 2017 Corporate Accountability Index showed that all three mobile ecosystems evaluated fell short in this regard. While all three companies disclosed they require third-party apps that collect user data to have privacy policies, none disclosed that they review the content of these policies for compliance with app store rules.

German court bans WhatsApp from sharing user data with other Facebook services

A German court has upheld an order banning Facebook from collecting data on WhatsApp users in Germany. The court ruled that Facebook, which owns WhatsApp, must obtain user consent before its other services can process user information obtained from WhatsApp. WhatsApp updated its terms of service and privacy policy in August 2016 to state that it could share certain user data, such as a user’s phone number, with Facebook in order to improve targeted advertising. The German case is one of several ongoing legal challenges the company is facing in the EU over its WhatsApp data-sharing practices.

Of the 12 internet and mobile companies evaluated in the 2017 Corporate Accountability Index, Facebook received the lowest score on our indicator evaluating disclosure of the options users have to control what information the company collects, retains, and uses. Our research found that WhatsApp did not fully disclose the options users have to control what information is collected or how their information is used for targeted advertising.

ISPs in Kashmir ordered to block social media and messaging services

Authorities in the northern Indian state of Jammu and Kashmir have ordered all ISPs to block 22 social networks and messaging apps for one month or until further notice. The services include the social networks Facebook, Twitter, and QZone, and the messaging and VoIP apps Skype, WhatsApp, and WeChat, which authorities claim were “being misused by anti-national and anti-social elements” in the Kashmir Valley to disturb “peace and tranquility.” Authorities previously ordered telecommunications companies to suspend 3G and 4G mobile internet services after several videos of security forces abusing civilians circulated online and drew outrage from Kashmiris.

The rise of government-ordered network shutdowns has sparked growing concern among human rights groups and policymakers around the world. In 2016, India had the highest number of internet shutdowns in the world, and Jammu and Kashmir alone has seen 31 shutdowns since 2012, according to the Software Freedom Law Centre. The UN Human Rights Council in 2016 condemned network shutdowns as a violation of international human rights law and called on governments to refrain from them. At the same time, companies should push back on government demands to shut down networks and clearly explain the circumstances under which they comply with such requests. Findings of the 2017 Corporate Accountability Index showed that all telecommunications companies evaluated fell short of this obligation to varying extents, and none disclosed sufficient information about their policies for responding to network shutdown requests.