Instagram complies with Russian censorship request, Intel faces lawsuits over security flaws, EU warns Facebook and Twitter about content removal policies

Corporate Accountability News Highlights is a regular series by Ranking Digital Rights highlighting key news related to tech companies, freedom of expression, and privacy issues around the world.

Instagram appeases Russian censorship request

Russian opposition leader Alexei Navalny. Photo credit: Evgeny Feldman / Novaya Gazeta [CC BY-SA 3.0] via Wikimedia Commons.

Instagram has blocked posts by Russian opposition leader Alexei Navalny, in response to demands from Russian authorities. The posts blocked by Instagram are related to corruption allegations made by Navalny, which target billionaire Oleg Deripaska and Russia’s deputy prime minister Sergei Prikhodko.

Navalny also posted a video on YouTube and his website about the link between Deripaska and the Kremlin. Deripaska obtained an injunction, which Russia’s federal media regulator, Roskomnadzor, enforced by demanding removal of the videos and reports about the allegations. Instagram, which is owned by Facebook, complied, as did Russian media sites. Russian ISPs also blocked Navalny’s website.

The video is still available on YouTube, which is owned by Google. However, the platform has warned Navalny and his team that if they do not take down the offending material, the company will do so.

Internet, mobile ecosystem, and telecommunications companies should be transparent about how they handle government requests to restrict content, and should publish data about the number of requests received, the number complied with, and the types of subject matter involved. Most companies evaluated in the 2017 Corporate Accountability Index lacked transparency about how they handle such requests, and did not disclose sufficient data about the number of requests they received or complied with, or which authorities made them.

Companies should also notify users when they restrict content. Services that host user-generated content should notify both those who posted the content and users trying to access it, and the notification should include a clear reason for the restriction. The 2017 Index found that companies do not sufficiently disclose their policies for notifying users when they restrict content or accounts.

Intel faces lawsuits over security flaws

Intel is facing 32 lawsuits in U.S. and foreign courts over security flaws in its computer processors that could allow attackers to steal sensitive data, including passwords and encryption keys. The flaws, known as Meltdown and Spectre, affect nearly all computers and mobile devices manufactured in the past 20 years.

According to Intel’s annual report to the Securities and Exchange Commission (SEC), as of February 15, 30 customer class action lawsuits had been filed by plaintiffs seeking compensation for monetary damages related to the security vulnerabilities. The company faces two additional securities lawsuits for allegedly making statements about its “products and internal controls that were revealed to be false or misleading by the disclosure of the security vulnerabilities.”

Companies should clearly disclose what steps they take to keep user data secure. The 2017 Corporate Accountability Index found that “companies communicate less about what they are doing to protect users’ security than they do about what users should do to protect themselves.” Companies disclosed more to users about how to defend themselves against cyber risks than about what steps they take to keep users’ information secure or about what they do to address security vulnerabilities once they are discovered.

European Commission warns Facebook and Twitter about content removals

The European Commission has warned Facebook and Twitter that they should be more transparent about their processes for removing content in accordance with EU consumer protection laws. The Commission said that Facebook needs to update its terms to clarify how users can appeal content removals, while Twitter must renounce its power to remove “infringing content” at its “sole discretion,” as the company’s terms of service currently state.

Companies should be transparent about their process for enforcing their rules by disclosing the types of content or activities they do not allow, and the processes they use to identify infringing content or accounts. Companies should also regularly publish data about the volume and nature of actions taken to restrict content or accounts that violate their rules. Research from the 2017 Index showed that most companies, including Facebook, do not publish such data. Of the 22 internet, mobile, and telecommunications companies evaluated in the 2017 Index, only three (Microsoft, Twitter, and Google) published any information at all on their terms of service enforcement.
