09 Mar Brussels says online platforms should remove terrorist content in one hour, Washington state enacts net neutrality law, Cameroon restores internet access to its Anglophone regions
Corporate Accountability News Highlights is a regular series by Ranking Digital Rights highlighting key news related to tech companies, freedom of expression, and privacy issues around the world.
European Commission gives tech companies 1 hour to remove terrorist content
Online platforms should remove terrorist content within one hour of being notified, the European Commission said in a new recommendation. On 1 March the Commission adopted a “Recommendation on measures to effectively tackle illegal content online,” proposing a “common approach” for platforms to “detect, remove and prevent the re-appearance of content online,” including terrorist content, hate speech, child sexual abuse material, and copyright infringement.
“Given that terrorist content is typically most harmful in the first hour of its appearance online and given the specific expertise and responsibilities of competent authorities and Europol, referrals should be assessed and, where appropriate, acted upon within one hour, as a general rule,” the Commission explained in the recommendation.
Companies should also put in place “easy and transparent rules” to flag illegal content including “fast-track procedures for ‘trusted flaggers’,” the Commission said. It also advises companies to cooperate “through the sharing and optimisation” of technological tools that automatically detect terrorist content.
While not legally binding, the recommendation increases pressure on tech giants, already facing scrutiny in the EU, to remove illegal content quickly.
The EU’s latest move to regulate online platforms drew criticism from the Computer & Communications Industry Association, which represents the tech industry. In a statement, the association said the one-hour limit “will strongly incentivise hosting services providers to simply take down all reported content.”
The Center for Democracy and Technology, which advocates for online civil liberties and rights, said the new rules “lack adequate accountability mechanisms,” adding that its “emphasis on speed and use of automation ignores limits of technology and techniques.”
Companies should be transparent about how they enforce their rules by disclosing the types of content or activities they do not allow and the processes they use to identify infringing content or accounts. None of the internet and mobile ecosystem companies evaluated in the 2017 Corporate Accountability Index disclosed whether government authorities receive priority consideration when flagging content to be restricted.

Companies should also disclose and regularly publish data about the volume and nature of actions taken to restrict content or accounts that violate their rules. Of the 22 internet, mobile, and telecommunications companies evaluated in the 2017 Corporate Accountability Index, only three—Microsoft, Twitter, and Google—published any information at all on their terms of service enforcement.