Public submission to the Canadian government’s proposed approach to regulating online harms


RDR has contributed to the public consultation on the Canadian government’s proposed legislative and regulatory framework to address harmful content online. The framework sets out which entities would be subject to regulation, what types of content would be regulated, new rules and obligations for regulated entities, and two new regulatory bodies and an advisory body that would oversee the framework. We believe that efforts to address these harms must promote and uplift freedom of expression and information as well as our fundamental right to privacy. We commend the Canadian government’s objective to create a safe and open internet, and we offer several recommendations on how the government can tackle the underlying causes of online harms. Read the introduction of our submission below or download it in its entirety here.

Honorable members of the Department of Canadian Heritage:

Ranking Digital Rights (RDR) welcomes this opportunity for public consultation on the Canadian government’s proposed approach to regulating social media and combating harmful content online. We work to promote freedom of expression and privacy on the internet by researching and analyzing how global information and communication companies’ business activities meet, or fail to meet, international human rights standards (see www.rankingdigitalrights.org for more details). We focus on these two rights because they enable and facilitate the enjoyment of the full range of human rights comprising the Universal Declaration of Human Rights (UDHR), especially in the context of the internet.

RDR broadly supports efforts to combat human rights harms that are associated with digital platforms and their products, including the censorship of user speech, incitement to violence, campaigns to undermine free and fair elections, privacy-infringing surveillance activities, and discriminatory advertising practices. But efforts to address these harms need not undermine freedom of expression and information or privacy. We have long advocated for the creation of legislation to make online communication services (OCSs) more accountable and transparent in their content moderation practices and for comprehensive, strictly enforced privacy and data protection legislation.

We commend the Canadian government’s objective to create a “safe, inclusive, and open” internet. The harms associated with the operation of online social media platforms are varied, and Canada’s leadership in this domain can help advance global conversations about how best to promote international human rights and protect users from harm. As drafted, however, the proposed approach fails to meet its stated goals and raises a set of issues that jeopardize freedom of expression and user privacy online. We also note that the framework contradicts commitments Canada has made to the Freedom Online Coalition (FOC) and Global Conference for Media Freedom, as well as previous work initiating the U.N. Human Rights Council’s first resolution on internet freedom in 2012. As Canada prepares to assume the chairmanship of the FOC next year, it is especially important for its government to lead by example. Online freedom begins at home. As RDR’s founder Rebecca MacKinnon emphasized in her 2013 FOC keynote speech in Tunis, “We are not going to have a free and open global Internet if citizens of democracies continue to allow their governments to get away with pervasive surveillance that lacks sufficient transparency and public accountability.”

Like many other well-intentioned policy solutions, the government’s proposal falls into the trap of focusing exclusively on the moderation of user-generated content while ignoring the economic factor that drives platform design and corporate decision-making: the targeted-advertising business model. In other words, restricting specific types of problematic content misses the forest for the trees. Regulations that focus on structural factors—i.e., industry advertising practices, user surveillance, and the algorithmic systems that underpin these activities—are better suited to address systemic online harms and, if properly calibrated, more sensitive to human rights considerations.

In this comment we identify five issues of concern within the proposal and a set of policy recommendations that, if addressed, can strengthen human rights protections and tackle the underlying causes of online harms.

Download our entire submission here.
