Discussion guide

List of acronyms and abbreviations

BSI: Basic Subscriber Information
CSIS: Canadian Security Intelligence Service
The Mandatory Reporting Act: An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service
NCECC: National Child Exploitation Crime Centre
The Board: The Advisory Board
The Commissioner: The Digital Safety Commissioner of Canada
The Recourse Council: The Digital Recourse Council of Canada

Purpose

The Government believes in supporting a safe, inclusive, and open online environment. In partnership with the Ministers of Justice and Public Safety, the Minister of Canadian Heritage is publishing a detailed technical discussion paper that outlines the Government’s proposed approach to regulating social media and combating harmful content online.

This approach is based on extensive work that the Government has conducted over the last year. It reflects consultations with equity-deserving communities, Indigenous organizations, non-governmental organizations, and victims of hate speech. It draws on insights shared by civil society and advocacy groups across the country. And it balances perspectives and approaches developed and shared by Canada’s partners across the globe.

The Government intends to introduce a bill in the fall of 2021. This consultation is an important step to provide Canadians and stakeholders with the opportunity to better understand the proposed approach and for the Government to consider additional perspectives.

This bill will be part of an overall strategy to combat hate speech and other harms. As part of this overall strategy, the Government introduced Bill C-36 on June 23, 2021, to provide legal remedies for victims of hate speech and hate crimes.

Bill C-36 would complement the regulatory approach for online social media platforms that is proposed here.

Background

Social media platforms and other online services help connect families, friends, and those with common interests in Canada and around the world. They are key pieces of economic infrastructure that enable Canadian companies to reach domestic and foreign markets, and are particularly crucial for small and medium-sized enterprises. They provide space for people in Canada to participate in their democracy, and for activists and civil society organizations to organize and share their messages, and amplify the voices of underrepresented and equity-deserving communities, including Indigenous Peoples.

But a growing body of evidence shows that these benefits also come with significant harms.

Individuals and groups use social media platforms to spread hateful messaging. Indigenous Peoples and equity-deserving groups such as racialized individuals, religious minorities, LGBTQ2 individuals and women are disproportionately affected by hate, harassment, and violent rhetoric online. Hate speech harms the individuals targeted, their families, communities, and society at large. And it distorts the free exchange of ideas by discrediting or silencing targeted voices.

Social media platforms can be used to spread hate or terrorist propaganda, counsel offline violence, recruit new adherents to extremist groups, and threaten national security, the rule of law and democratic institutions. At their worst, online hate and extremism can incite real-world acts of violence in Canada and anywhere in the world, as was seen on January 29, 2017 at the Centre culturel islamique de Québec, and on March 15, 2019, in Christchurch, New Zealand.

Social media platforms are also used to sexually exploit children. Women and girls, predominantly, are victimized through the sharing of intimate images without the consent of the person depicted. These crimes can inflict grave and enduring trauma on survivors, which is made immeasurably worse as this material proliferates on the internet and social media.

Social media platforms have significant impacts on expression, democratic participation, national security, and public safety. These platforms have tools to moderate harmful content. Mainstream social media platforms operate voluntary content moderation systems that flag and assess content against their community guidelines. But some platforms take decisive action only in a largely ad hoc fashion. These responses by social media companies tend to be reactive and may not appropriately balance the wider public interest. In addition, social media platforms are not required to preserve evidence of criminal content or to notify law enforcement about it, outside of mandatory reporting for child pornography offences. More proactive reporting could make it easier to hold perpetrators to account for harmful online activities.

The Government of Canada is committed to confronting online harms while respecting freedom of expression, privacy protections, and the open exchange of ideas and debate online.

This discussion paper and the attached technical document outline the Government’s vision for a safe, inclusive and open online environment. They detail a proposal for a new legislative and regulatory framework, with rules for how social media platforms and other online services must address harmful content. They also outline possible ways to address some of the challenges that law enforcement and the Canadian Security Intelligence Service (CSIS) face when confronting illegal and/or national security-related content online.

This document is organized into two thematic modules, each speaking to an element of the Government’s vision. The accompanying technical document details how each module could be implemented in forthcoming legislation.

Module 1: A new legislative and regulatory framework for social media platforms

The new legislative and regulatory framework would target the most egregious and reprehensible types of harmful content online, including criminal content and content of national security concern. It would prioritize a safe, open, and inclusive Internet where Canadians feel they can express themselves without being victimized or targeted by certain kinds of harmful content. The proposed approach upholds and protects human rights, while also respecting fundamental freedoms, notably freedom of expression.

Who and what would be regulated

New legislation would apply to ‘online communication service providers’.

The concept of online communication service provider is intended to capture major platforms (e.g., Facebook, Instagram, Twitter, YouTube, TikTok, Pornhub) and to exclude products and services that would not qualify as online communication services, such as fitness applications or travel review websites.

The legislation would not cover private communications, telecommunications service providers, or certain technical operators. There would be specific exemptions for these services.

The legislation would also authorize the Government to include or exclude categories of online communication service providers from the application of the legislation within certain parameters.

The legislation would target five categories of harmful content:

  1. child sexual exploitation content;
  2. terrorist content;
  3. content that incites violence;
  4. hate speech; and
  5. the non-consensual sharing of intimate images.

While all of the definitions would draw upon existing law, including current offences and definitions in the Criminal Code, they would be modified in order to tailor them to a regulatory – as opposed to criminal – context.

These categories were selected because they are the most egregious kinds of harmful content. The Government recognizes that there are other online harms that could also be examined and possibly addressed through future programming activities or legislative action.

The Government is also proposing improvements to An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service (the Mandatory Reporting Act). The details of these changes are in Module 2: Modifying Canada’s Existing Legal Framework. The new legislation for harmful content online will not supersede or replace any of the definitions or obligations in the Mandatory Reporting Act, nor the implementation, institutions, or protocols for the reporting and investigation of child pornography.

New rules and obligations

The new legislation would set out a statutory requirement for regulated entities to take all reasonable measures to make harmful content inaccessible in Canada. This obligation would require regulated entities to do whatever is reasonable and within their power to monitor for the regulated categories of harmful content on their services, including through the use of automated systems based on algorithms.

Once platform users flag content, regulated entities would be required to respond to the flagged content by assessing whether it should be made inaccessible in Canada, according to the definitions outlined in legislation. If the content meets the legislated definitions, the regulated entity would be required to make the content inaccessible from their service in Canada within 24 hours of being flagged.

Regulated entities would also be required to establish robust flagging, notice, and appeal systems for both authors of content and those who flag content. Once a regulated entity makes a determination on whether to make content inaccessible in Canada, they would be required to notify both the author of that content and the flagger of their decision, and give each party an opportunity to appeal that decision to the regulated entity.
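
To illustrate how a regulated entity might operationalize the flagging, 24-hour assessment, notice, and appeal obligations described above, the following sketch outlines one possible workflow. It is a minimal illustration only: the names ContentFlag, assess_flag, and notify are hypothetical and are not drawn from the proposed legislation.

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone


    @dataclass
    class ContentFlag:
        content_id: str
        author_id: str
        flagger_id: str
        flagged_at: datetime


    def notify(user_id: str, message: str) -> None:
        # Stand-in for the platform's own notification channel.
        print(f"notify {user_id}: {message}")


    def assess_flag(flag: ContentFlag, meets_legislated_definition: bool) -> bool:
        """Assess a flag, apply the 24-hour rule, and notify both parties."""
        deadline = flag.flagged_at + timedelta(hours=24)
        if meets_legislated_definition:
            notify(flag.author_id,
                   f"Content {flag.content_id} was made inaccessible in Canada "
                   f"(decision due by {deadline.isoformat()}). You may appeal.")
        else:
            notify(flag.author_id,
                   f"Content {flag.content_id} does not meet the legislated "
                   f"definitions and remains accessible. You may appeal.")
        # The flagger receives the same decision and may also appeal it.
        notify(flag.flagger_id,
               f"A decision was made on content {flag.content_id}. You may appeal.")
        return meets_legislated_definition


    # Example: a flag raised now and assessed as meeting a legislated definition.
    flag = ContentFlag("c-123", "author-1", "flagger-9", datetime.now(timezone.utc))
    assess_flag(flag, meets_legislated_definition=True)

In practice, a regulated entity would also need to record appeal outcomes and the data needed for the transparency reporting described below.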

New proposed legislation would also compel regulated entities to be more transparent in their operations. Regulated entities would be required to publish information that they do not currently publish, with baseline transparency requirements set out in statute and further specified in regulation. This would include Canada-specific data on the volume and type of content dealt with at each step of the content moderation process, as well as information on how regulated entities develop, implement, and update their guidelines for the kinds of content they prohibit. Regulated entities would also be required to publish transparency reports on the Canada-specific use and impact of their automated systems to moderate, take down, and block access in Canada to harmful content.
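
As a rough illustration of the kind of Canada-specific data such baseline transparency requirements could cover, the sketch below models a reporting record. The structure and field names (TransparencyReport, ModerationStepCounts) are assumptions made for illustration; the actual requirements would be set out in statute and further specified in regulation.

    from dataclasses import dataclass, field
    from typing import Dict, List


    @dataclass
    class ModerationStepCounts:
        # Canada-specific volume of content handled at one step of the
        # moderation process, broken down by category of harmful content.
        step: str                          # e.g. "flagged", "reviewed", "made_inaccessible"
        counts_by_category: Dict[str, int]


    @dataclass
    class TransparencyReport:
        # One reporting period's Canada-specific transparency data.
        reporting_period: str
        moderation_steps: List[ModerationStepCounts] = field(default_factory=list)
        guideline_update_notes: List[str] = field(default_factory=list)  # how prohibited-content guidelines changed
        automated_systems_notes: str = ""  # use and impact of automated moderation in Canada


    report = TransparencyReport(
        reporting_period="2021-Q3",
        moderation_steps=[ModerationStepCounts("flagged", {"hate speech": 1200})],
    )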

Engaging law enforcement and CSIS

The construction of a regulatory framework with content removal requirements involves consideration of the interplay between new regulations and the role of law enforcement and CSIS in identifying public safety threats and preventing violence. Removal alone may push threat actors beyond the visibility of law enforcement and CSIS, toward encrypted websites and platforms with more extreme and unmoderated harmful content.

To balance these public interests, the regulatory framework would require regulated entities to notify law enforcement and CSIS of specific types of content to allow for appropriate investigative and preventive action. The Government is considering options to achieve the right balance on this mandatory notification requirement, specifically on its scope and on the thresholds that would trigger the notification obligations.

In addition, regulated entities would be required to preserve prescribed information that could support an investigation when sought by lawful means (e.g., a production order). The specific nature of the information to be preserved (including basic subscriber information, location data, the content itself, the alleged offence or national security threat) would be determined through regulations issued by the Governor in Council. The evidence-preservation obligation would be designed to prevent regulated entities from deleting evidence, such as identifying information, that could be lawfully obtained by law enforcement and CSIS through judicial authorizations including production orders and warrants.
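
The sketch below illustrates, under stated assumptions, what a preserved record might contain based on the examples listed above (basic subscriber information, location data, the content itself, and the alleged offence or national security threat). The PreservationRecord structure and its field names are hypothetical; the prescribed information would be determined through regulations issued by the Governor in Council.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional


    @dataclass
    class PreservationRecord:
        # Information a regulated entity might be directed to preserve so that
        # it remains available if sought by lawful means such as a production order.
        content_id: str
        content_snapshot: bytes          # the content itself
        basic_subscriber_info: dict      # e.g. account name and contact details
        location_data: Optional[dict]    # only if already held by the entity
        alleged_offence_or_threat: str   # category that triggered preservation
        preserved_at: datetime
        released_under_judicial_authorization: bool = False


    record = PreservationRecord(
        content_id="c-123",
        content_snapshot=b"...",
        basic_subscriber_info={"account": "user-42"},
        location_data=None,
        alleged_offence_or_threat="incitement to violence",
        preserved_at=datetime.now(timezone.utc),
    )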

Establishment of new regulators

The proposed legislation would create a new Digital Safety Commission of Canada to support three bodies that would operationalize, oversee, and enforce the new regime: the Digital Safety Commissioner of Canada, the Digital Recourse Council of Canada, and an Advisory Board.

The Digital Safety Commissioner of Canada (the Commissioner) would administer, oversee, and enforce the new legislated requirements noted above. It would also be mandated to lead and participate in research and programming, convene and collaborate with relevant stakeholders, and support regulated entities in reducing the five forms of harmful content falling under the new legislation on their services.

The Digital Recourse Council of Canada (Recourse Council) would provide people in Canada with independent recourse for the content moderation decisions of regulated entities like social media platforms. Once users have exhausted all avenues for appeals within regulated entities themselves, they would be able to take their case to the Recourse Council for decision. The Recourse Council would provide independent and binding decisions on whether or not content qualifies as harmful content as defined in legislation and should be made inaccessible.

If the Recourse Council finds that content constitutes harmful content and should be made inaccessible, it would issue a binding decision to the regulated entity on which the content was posted. Conversely, if it finds that the content does not constitute harmful content under the legislation, it would communicate this decision to the regulated entity, which would then decide whether to keep up, reinstate, block access to, or remove the content, subject to the entity’s own community guidelines.
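
The outcome of a Recourse Council decision, as described in the two preceding paragraphs, can be summarized in a short sketch. The function and return values below are illustrative only and do not appear in the proposal.

    def apply_recourse_council_decision(content_is_harmful: bool,
                                        platform_guidelines_allow: bool) -> str:
        # Binding outcome: harmful content must be made inaccessible in Canada.
        if content_is_harmful:
            return "made inaccessible in Canada (binding)"
        # Otherwise the regulated entity decides under its own community guidelines.
        if platform_guidelines_allow:
            return "kept up or reinstated by the platform"
        return "removed or blocked under the platform's own guidelines"


    print(apply_recourse_council_decision(content_is_harmful=False,
                                          platform_guidelines_allow=True))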

Legislation would require the consideration of diverse subject-matter expertise and membership that is reflective of the Canadian population when appointing members to the Recourse Council. The Recourse Council would be subject to public reporting obligations to ensure it remains transparent in its operations and decision-making processes.

An Advisory Board (the Board) would provide both the Commissioner and the Recourse Council with expert advice to inform their processes and decision-making, such as advice on emerging industry trends and technologies and content-moderation standards.

The Board would not be involved in the specific content-moderation decisions of the Recourse Council, nor in the regulation-making, compliance, and enforcement activities of the Commissioner. Legislation would provide for the composition of the Board, including the importance of diverse subject-matter experts and membership from civil society, legal experts, equity-deserving communities, Indigenous peoples, civil liberties groups, victim advocacy groups, industry, and academia.

The operating budgets of the Commissioner and the Recourse Council would, in the long term, be paid for by regulated entities themselves in the form of regulatory charges.

Compliance and enforcement

The Commissioner would have a range of compliance and enforcement powers to administer the new legislated requirements.

Module 2: Modifying Canada’s existing legal framework

In addition to the legislative amendments proposed under Bill C-36, further modifications to Canada’s existing legal framework to address harmful content online could include:

An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service (the Mandatory Reporting Act)

The Mandatory Reporting Act was enacted in 2011. The Government is proposing to amend it to better address the rapid pace of change in how material depicting child sexual exploitation is created and shared online today. Targeted reforms to the Mandatory Reporting Act and its regulations would enhance measures to address online child sexual exploitation, support investigations, and assist in rescuing children who are in abusive circumstances.

The Government would amend the Mandatory Reporting Act in the following ways:

  1. centralize mandatory reporting of online child pornography offences through the Royal Canadian Mounted Police’s National Child Exploitation Crime Centre (NCECC);
  2. clarify that the Mandatory Reporting Act applies to all types of internet services, including social media platforms and other application-based services;
  3. enhance transparency by requiring an annual report to the Ministers of Public Safety and Emergency Preparedness and Justice from the NCECC;
  4. impose a 12-month preservation requirement for computer data (as opposed to the current 21 days);
  5. designate a person in regulations for the purpose of collecting information to determine the application of the Mandatory Reporting Act; and
  6. add a requirement for persons who provide an internet service to provide, without a requirement for judicial authorization, additional information to the NCECC where a child pornography offence is identified.

Providing transmission data or Basic Subscriber Information

Law enforcement requires specific information when it receives reports of content in which a child pornography offence has clearly been committed, so that it may initiate investigations, identify offenders, remove victims from continued sexual exploitation, and prevent additional children from being victimized. Under existing law, when content is reported to it, law enforcement pursues this information by seeking a court order.

Currently, the information provided in reports under the Mandatory Reporting Act is not required to include transmission data (i.e., Internet protocol address (IP address), date, time, type, origin, destination of the material, as defined in the Criminal Code) or basic subscriber information (BSI) (i.e., customer’s name, address, phone number, billing information associated with the IP address). The Government is considering whether transmission data or BSI should be included in reports to law enforcement without judicial authorization.
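
To make the distinction between these two kinds of information concrete, the sketch below models the fields named in the paragraph above as simple data structures. The class and field names are illustrative; the governing definitions are those in the Criminal Code and the Mandatory Reporting Act.

    from dataclasses import dataclass
    from datetime import datetime


    @dataclass
    class TransmissionData:
        # Routing details about the material, per the description above.
        ip_address: str
        date_time: datetime
        transmission_type: str
        origin: str
        destination: str


    @dataclass
    class BasicSubscriberInfo:
        # Identifying details associated with the IP address.
        name: str
        address: str
        phone_number: str
        billing_information: str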

The purpose of including such information would be to expedite the police response in cases where a child pornography offence is clearly evident.

If reporting organizations were required to include transmission data in their mandatory reporting to law enforcement when a child pornography offence has been committed, this would provide police with the information they need to identify the applicable person who provides an internet service and the correct jurisdiction. Police would still require a production order to obtain BSI, the identifying information, from persons who provide an internet service in order to advance an investigation.

Alternatively, if reporting organizations were required to include any held BSI in their mandatory reporting to law enforcement, police would not require a production order to obtain this information from persons who provide an internet service.

Additional information beyond what is required by the Mandatory Reporting Act would continue to be obtained from providers by police pursuant to a court order.

Amendment to the Canadian Security and Intelligence Service Act (CSIS Act)

Online Ideologically Motivated Violent Extremist communities can number in the tens of thousands of members, acting as echo chambers of hate for adherents from all over the world. Mobilization to violence can occur rapidly, often within a period of weeks or months. More timely access to BSI could help mitigate the threat of online violent extremism and its real-world impacts.

Amendments to the current judicial authorization process, while maintaining Federal Court and Ministerial oversight, could enable CSIS to identify online threat actors more quickly, and to investigate and mitigate the spread of violent extremist narratives that may inspire real-world acts of violence. Currently, CSIS has only one warrant option, which is designed for seeking intrusive powers from the Federal Court. It takes 4 to 6 months to develop the application and seek the Federal Court’s approval. Canadian law enforcement, by contrast, is able to obtain BSI in 8 to 10 days.

The potential new authorization for BSI would be issued by an independent judge of the Federal Court and be subject to Ministerial oversight. It would not replace or eliminate CSIS’ requirement to obtain full warrant powers from the Federal Court should further investigation into the threat be necessary. As with all CSIS activities, requests for BSI could be reviewed by the National Security and Intelligence Review Agency and the National Security and Intelligence Committee of Parliamentarians.
