Corporate Accountability News Highlights is a regular series by Ranking Digital Rights highlighting key news related to tech companies, freedom of expression, and privacy issues around the world.

European Commission gives tech companies 1 hour to remove terrorist content

The European Commission – Berlaymont Building. Photo credit: Glyn Lowe [CC BY 2.0].

Online platforms should remove terrorist content within one hour of being notified, the European Commission said in a new recommendation. On 1 March the Commission adopted a “Recommendation on measures to effectively tackle illegal content online,” proposing a “common approach” for platforms to “detect, remove and prevent the re-appearance of content online,” including terrorist content, hate speech, child sexual abuse material, and copyright infringement.

“Given that terrorist content is typically most harmful in the first hour of its appearance online and given the specific expertise and responsibilities of competent authorities and Europol, referrals should be assessed and, where appropriate, acted upon within one hour, as a general rule,” the Commission explained in the Recommendation.

Companies should also put in place “easy and transparent rules” to flag illegal content including “fast-track procedures for ‘trusted flaggers’,” the Commission said. It also advises companies to cooperate “through the sharing and optimisation” of technological tools that automatically detect terrorist content.

While not legally binding, the recommendation increases pressure on tech giants, already facing scrutiny in the EU, to act with speed to remove illegal content.

The latest move by the EU to regulate online platforms was met with criticism by the Computer & Communications Industry Association, which represents the tech industry. In a statement, the association said the one-hour limit “will strongly incentivise hosting services providers to simply take down all reported content.”

The Center for Democracy and Technology, which advocates for online civil liberties and rights, said the new rules “lack adequate accountability mechanisms,” adding that the recommendation’s “emphasis on speed and use of automation ignores limits of technology and techniques.”

Companies should be transparent about their process for enforcing their rules by disclosing information about the types of content or activities they do not allow, and the processes they use to identify infringing content or accounts. None of the internet and mobile ecosystem companies evaluated in the 2017 Corporate Accountability Index disclosed whether government authorities receive priority consideration when flagging content to be restricted. Companies should also disclose and regularly publish data about the volume and nature of actions taken to restrict content or accounts that violate their rules. Of the 22 internet, mobile, and telecommunications companies evaluated in the 2017 Corporate Accountability Index, only three (Microsoft, Twitter, and Google) published any information at all on their terms of service enforcement.


Corporate Accountability News Highlights is a regular series by Ranking Digital Rights highlighting key news related to tech companies, freedom of expression, and privacy issues around the world.

U.S. Supreme Court hears Microsoft privacy case

Microsoft Corporation headquarters in Redmond, Washington. Photo credit: user Coolcaesar [CC BY-SA 4.0] via Wikimedia Commons.

On Tuesday, the U.S. Supreme Court heard arguments in the U.S. v. Microsoft case, in which the Department of Justice is seeking to force Microsoft to hand over the content of emails stored in a data center in Ireland under the 1986 Stored Communications Act. The case could set a new precedent that allows governments to obtain data stored in other countries.

The case dates back to 2013, when a judge in New York issued a warrant requiring Microsoft to hand over Outlook email information belonging to a user who was the subject of a drug-trafficking investigation. While the company agreed to hand over metadata stored in the U.S., it refused to hand over the content of the emails, arguing that they are protected by Irish and EU privacy laws since they are stored in Ireland. The company says the government should instead try to obtain the sought-after information using the United States-Ireland Mutual Legal Assistance Treaty (MLAT). MLATs are bilateral, multilateral, or regional agreements that allow governments to exchange information related to an investigation.

The U.S. government argues that the MLAT process is “costly, cumbersome and time-consuming,” and is not needed since “the privacy intrusion occurs only when Microsoft turns over the content to the Government, which occurs in the United States.”

In court on Tuesday, Microsoft argued that the 1986 law is outdated and that the question should be resolved by Congress. Congress is considering new legislation, the Clarifying Lawful Overseas Use of Data (CLOUD) Act, which would clarify that warrants issued under the Stored Communications Act apply to data stored overseas, while allowing companies to challenge such warrants when they violate the privacy laws of the country where the data is stored.

While the bill is supported by tech companies including Microsoft, Facebook, Google, and Apple, privacy advocacy groups including the Electronic Frontier Foundation (EFF) and Access Now have slammed it because it would allow the U.S. government to access data stored in any foreign country without regard to that country’s privacy laws. The bill would also give the U.S. president the power to enter into “executive agreements” with other countries for cross-border access to data. Such agreements would allow foreign governments to request that U.S. companies hand over data stored in the U.S., as long as the user is not a U.S. citizen or based in the country, “without the procedural safeguards of U.S. law typically given to data stored in the United States,” EFF says.

A decision by the Supreme Court is expected by summer. If the court rules in favor of the U.S. government, it would set a new precedent allowing governments to obtain data stored in other countries. The European Union is already considering a bill that would allow law enforcement authorities of any member state to request data stored not only within the 28 EU countries but also overseas, Reuters reported.

Companies should disclose information about their process for responding to government requests for user data, including their processes for responding to non-judicial government requests and court orders, and the legal basis under which they comply with requests. In addition, companies should publicly commit to push back on inappropriate or overbroad government requests. Companies should also disclose and regularly publish data about these requests, including the number of requests received by country, the number of accounts and pieces of content affected, and the legal authorities making the requests.


Corporate Accountability News Highlights is a regular series by Ranking Digital Rights highlighting key news related to tech companies, freedom of expression, and privacy issues around the world.

Instagram complies with Russian censorship request

Russian opposition leader Alexei Navalny. Photo credit: Evgeny Feldman / Novaya Gazeta [CC BY-SA 3.0] via Wikimedia Commons.

Instagram has blocked posts by Russian opposition leader Alexei Navalny, in response to demands from Russian authorities. The posts blocked by Instagram are related to corruption allegations made by Navalny, which target billionaire Oleg Deripaska and Russia’s deputy prime minister Sergei Prikhodko.

Navalny also posted a video on YouTube and his website about the link between Deripaska and the Kremlin. Deripaska obtained an injunction, and Russia’s federal media regulator, Roskomnadzor, demanded the removal of the videos and reports about the allegations. Instagram, which is owned by Facebook, and Russian media sites complied. Russian ISPs also blocked Navalny’s website.

The video is still available on YouTube, which is owned by Google. However, the platform has warned Navalny and his team that if they do not take down the offending material, the company will do so.

Internet, mobile ecosystem, and telecommunications companies should be transparent about how they handle government requests for content restrictions and publish data about the number of requests received, the number they complied with, and the types of subject matter associated with these requests. Most companies evaluated in the 2017 Corporate Accountability Index lacked transparency about how they handle government requests to restrict content or accounts, and did not disclose sufficient data about the number of requests they received or complied with, or which authorities made these requests.  

Companies should also notify users when they restrict content. Services that host user-generated content should notify those who posted the content, as well as users trying to access it. The notification should include a clear reason for the restriction. The 2017 Index found that companies do not disclose sufficient information about their policies for notifying users when they restrict content or accounts.


Graphic credit: Ahmad Mazloum and Salam Shokor/SMEX (CC BY SA)

Mobile users in Arab states lack critical information about basic policies affecting their freedom of expression and privacy, according to new research by the Social Media Exchange (SMEX), a Beirut-based media development and digital rights organization.  

The report, “Dependent Yet Disenfranchised: The Policy Void that Threatens the Rights of Mobile Users in Arab States,” uses the Ranking Digital Rights Corporate Accountability Index methodology to analyze policies of all 66 mobile operators based in the 22 countries of the Arab region. Research showed that only 14 of the region’s 66 mobile operators publish terms of service and just seven operators publish privacy policies. Most mobile operators in the region also do not publish transparency reports providing data on government requests for user data and content blocking or removals. Apart from the local subsidiaries of MTN, Orange, and Vodafone, not a single operator made a commitment to respect users’ free speech and privacy rights in a publicly accessible human rights policy.

The research found that the terms of service published by these 14 operators fall far short of protecting users’ freedom of expression rights. The policies fail to clearly disclose the rules and how they are enforced, and some operators do not even publish their terms of service in the primary languages of their users. Companies also fail to provide users with remedy mechanisms for addressing free speech complaints.

The report was researched and written by Afef Abrougui, who currently serves as Corporate Accountability Editor at Ranking Digital Rights.  

Corporate Accountability News Highlights is a regular series by Ranking Digital Rights highlighting key news related to tech companies, freedom of expression, and privacy issues around the world.

Facebook in breach of German data protection law

Facebook Headquarters at 1 Hacker Way, Menlo Park, California. Photo by Anthony Quintano (CC BY 2.0)

A German court has ruled that Facebook is in breach of the country’s data protection law, in a lawsuit filed by a consumer advocacy group. The court found that five of Facebook’s default settings, such as the disclosure of a user’s location when chatting with others on the Facebook mobile application and the appearance of personal Facebook profiles in search results, violate the Federal Data Protection Act for failing to meet the requirement of informed consent. Under the act, German users should be provided with “clear and easy to understand information on the nature, scope and purpose of the intended use of [their] data.”

Facebook said that it would appeal the court’s decision.

The court also ruled against eight other clauses in Facebook’s terms of use such as “pre-formulated declarations of consent” allowing the company to use names and profile pictures of its users “for commercial, sponsored or related content” and to transfer their data to the United States.  

Companies should clearly disclose to users what options they have to control the collection, retention, and use of their personal information. Internet, mobile, and telecommunications companies evaluated in the 2017 Corporate Accountability Index did not disclose enough information about such options. Facebook disclosed less about these options than any other internet company evaluated. The company did not disclose options allowing users to control the company’s collection of their user information or how their information is used for targeted advertising.

The court also ruled that Facebook’s “authentic name” policy, which requires users to use a name that appears on their IDs, was in violation of the German Telemedia Act, which requires providers to allow users to use pseudonyms. Internet companies and providers of prepaid mobile services should not require users to verify their names with government-issued IDs. Research from the 2017 Index showed that while Facebook did not impose such a requirement for Instagram and WhatsApp, users of Facebook and the Messenger app are required to verify their accounts with information that can connect them to their offline identity.
