Proposed Bill to address Online Harms
The internet is an exceptional tool for people of all ages to learn, play and connect with family, friends and those with similar interests. However, just like the outside world, the digital world can pose significant dangers. Social media can be used to sexually exploit children, promote self-harm to children, incite violence, put people’s safety at risk and foment hate. Online harms have real-world impacts with tragic, even fatal, consequences.
The Government of Canada has introduced legislation to hold social media platforms accountable for addressing harmful content on their platforms and for creating a safer online space that protects all people in Canada, especially kids.
On this page:
- About the proposed Online Harms Bill
- Expected outcomes of the proposed legislation
- Next Steps
- Engagement and consultations
- Related links
About the proposed Online Harms Bill
On February 26, 2024, the Government of Canada introduced Bill C-63 to create a new Online Harms Act—a baseline standard for online platforms to keep Canadians safe—to hold online platforms accountable for the content they host. Bill C-63 will create stronger protections for kids online and better safeguard everyone in Canada from online hate. The bill sets out a new vision for safer and more inclusive participation online.
The proposed Online Harms Act would specifically target seven types of harmful content:
- Content that sexually victimizes a child or revictimizes a survivor;
- Intimate content communicated without consent;
- Content used to bully a child;
- Content that induces a child to harm themselves;
- Content that foments hatred;
- Content that incites violence; and
- Content that incites violent extremism or terrorism.
Under the Act, social media services would be subject to three duties:
- A duty to act responsibly;
- A duty to protect children; and
- A duty to make certain content inaccessible, specifically (1) content that sexually victimizes a child or revictimizes a survivor and (2) intimate images posted without consent.
These duties would apply to social media services, including livestreaming and user-uploaded adult content services. They would require these services to actively reduce the risk of exposure to harmful content on their services; to provide clear and accessible ways to flag harmful content and block users; to put in place special protections for children; to take action to address child sexual exploitation and the non-consensual posting of intimate content, including deepfake sexual images; and to publish transparency reports.
The Bill also proposes changes to the Criminal Code, the Canadian Human Rights Act (CHRA), and An Act respecting the mandatory reporting of internet child pornography by persons who provide an internet service.
The changes to the Criminal Code and CHRA would help to better combat hate speech and hate crimes, provide improved remedies for victims and hold individuals accountable for the hatred they spread. Changes to the Mandatory Reporting Act would support investigations into serious crime related to child pornography.
The Act would establish a Digital Safety Commission of Canada and a Digital Safety Ombudsperson of Canada, supported administratively by a Digital Safety Office. The Commission would oversee and enforce the new regulatory framework and the Ombudsperson would act as a resource and advocate for users and victims.
This Bill is an essential step forward in ensuring the safety and wellbeing of Canadians on social media platforms.
Expected outcomes of the proposed legislation
The Online Harms Act aims to hold online platforms accountable for the harmful content they host. It also aims to instill greater transparency around how this harmful content is managed on the service.
All people in Canada would benefit from:
- Special protections for children and stronger reporting of child pornography;
- Public oversight of and accountability from online services, including better data on how they protect their users;
- Strengthened Criminal Code provisions against hate; and
- Improved safety over time, both online and in communities.
The Bill would give all people in Canada the tools to:
- Flag harmful content and request removal of (1) content that sexually victimizes a child or revictimizes a survivor and (2) intimate images posted without consent;
- Submit complaints and report non-compliance to the Digital Safety Commission;
- Contact the Digital Safety Ombudsperson to receive support and be directed to the right resources; and
- File complaints with the Canadian Human Rights Commission when facing hate speech online.
Next Steps
Once Bill C-63 receives Royal Assent, the Governor in Council would work to bring the Act into force, including establishing the necessary regulations. These regulations would provide additional clarity around the application of the Act and set thresholds for scoping in regulated services and for the Administrative Monetary Penalties regime. Concurrently, the Digital Safety Commission of Canada would be set up. This would include the appointments of members of the Commission and the Ombudsperson by the Governor in Council. Once the Commission is established, it would issue regulations on the administration and enforcement of the Act as well as transparency and accountability obligations for regulated services, measures services could adopt to mitigate risk, amounts for regulatory charges and standards and processes for data sharing with researchers.
Engagement and consultations
Expert advisory group (April to June 2022)
In March 2022, the Government of Canada established an expert advisory group on online safety, mandated to provide the Minister of Canadian Heritage with advice on how to design the legislative and regulatory framework to address harmful content online and how to best incorporate the feedback received during the national consultation held from July to September 2021.
The expert advisory group, composed of twelve individuals, participated in ten weekly workshops on the components of a legislative and regulatory framework for online safety, including an introductory workshop and a concluding summary workshop.
The Government undertook its work with the expert advisory group in an open and transparent manner.
Each advisory group session was supported by a worksheet made available to the group in advance. The goal of these worksheets was to support the discussion and organize feedback and input. The ideas and language shared were intended to represent a starting point for reaction and feedback. Neither the group’s advice nor the preliminary views expressed in the worksheets constitute the final views of the Department of Canadian Heritage or the Government of Canada. These worksheets are published below.
After each meeting of the expert advisory panel, a non-attributed summary of input and the discussion was published here.
Introductory Session
Session One – Subjects of Regulation
Session Two – Objects of Regulation
Session Three – Legislative and Regulatory Obligations
Session Four – Regulatory Powers
Session Five – Risk-Based Approach
- Supplementary Worksheet: Subjects of Regulation
- Supplementary Worksheet: Objects of Regulation
- Supplementary Worksheet: Legislative and Regulatory Obligations
- Session Five Summary
Session Six – Freedom of Expression and Other Rights
Session Seven – Connection to Law Enforcement
Session Eight – Disinformation
Concluding Session
Citizens’ Assembly on Democratic Expression (June 2022)
The Department of Canadian Heritage, through the Digital Citizen Initiative, is providing financial support to the Public Policy Forum's (PPF) Digital Democracy Project, which brings together academics, civil society and policy professionals to support research and policy development on disinformation and online harms.
One component of this multi-year project is an annual Citizens’ Assembly on Democratic Expression that considers the impacts of digital technologies on Canadian society. The Assembly is selected using a civic lottery, a process which employs random selection while ensuring that it broadly represents the diversity of the Canadian population.
This year’s Assembly took place June 15-19 in Ottawa and focused on online safety. It gathered views from a representative group of citizens on the core elements of a successful legislative and regulatory framework for online safety.
For more information, see the report detailing the Assembly’s recommendations to the federal government, and the Canadian public.
Roundtables on Online Safety (July to November 2022)
From July to November 2022, the Minister of Canadian Heritage conducted 19 virtual and in-person roundtables across the country on key elements of a legislative and regulatory framework on online safety. A list of those roundtables is available below:
Regional
- Charlottetown, Prince Edward Island
- Moncton, New Brunswick
- Halifax, Nova Scotia
- St. John’s, Newfoundland and Labrador
- Québec City, Québec
- Montréal, Québec
- Niagara, Ontario
- Windsor, Ontario
- Winnipeg, Manitoba
- Saskatoon, Saskatchewan
- Edmonton, Alberta
- Surrey, British Columbia
- Northern Canada - Hybrid in Whitehorse, Yukon
Virtual
- Antisemitism
- Islamophobia
- Anti-Black Racism
- Anti-Asian Racism
- Women and Gender-based Violence
- Tech Industry
Participants received an information document in advance of each session to prepare for the discussion. This document sought comments on the advice from the Expert Advisory Group on Online Safety, which concluded its meetings on June 10, 2022.
The feedback gathered from participants touched upon several key areas related to online safety. For more information, see a summary of what we heard.
Indigenous Sharing Circle and Interviews (November 2022 to January 2023)
Between November 2022 and January 2023, the Government of Canada, in collaboration with Archipel Research and Consulting Inc., conducted a range of outreach and engagement activities with Indigenous peoples on online safety. This included an Indigenous sharing circle with the participation of the Minister of Canadian Heritage, as well as a number of one-on-one interviews with Indigenous people. The goals were to hear about their experiences with harmful content online and their ideas on how best to design a legislative and regulatory framework for online safety.
A total of 25 participants took part in this process. The participants represented diverse backgrounds and experiences and included First Nations, Inuit, and Métis people from across Canada. Participants either worked with victims and advocacy groups for Indigenous peoples or were themselves Indigenous people who had experienced harm online.
Participants received a participant information document in advance of the session or interview to prepare for the discussion. This document sought comments on the advice from the Expert Advisory Group on Online Safety, which concluded its meetings on June 10, 2022.
A non-attributable report was developed detailing what was heard during the sharing circle and one-on-one interviews.
Online Consultation (2021)
From July 29 to September 25, 2021, the Government published a proposed approach to address harmful content online for consultation and feedback.
See the “What we heard” report for summaries of the discussions that took place.
Two documents were presented for consultation:
- A discussion guide that summarized and outlined an overall approach.
- A technical paper that summarized drafting instructions that could inform legislation.
Related links