Government of Canada introduces legislation to combat harmful content online, including the sexual exploitation of children
News release
OTTAWA, February 26, 2024
The digital world can pose significant risks. Social media can be used to sexually exploit children, promote self-harm among children, incite violence, put people’s safety at risk and foment hate. Online harms have real world impact with tragic, even fatal, consequences.
Today, the Honourable Arif Virani, Minister of Justice and Attorney General of Canada, introduced Bill C-63, the Online Harms Act. The Bill would create stronger online protection for children and better safeguard everyone in Canada from online hate and other types of harmful content. The Bill sets out a new vision for safer and more inclusive online participation. It would hold online platforms, including livestreaming and adult content services, accountable for the design choices they make that lead to the dissemination and amplification of harmful content on their platforms, and ensure that platforms employ mitigation strategies that reduce users’ exposure to harmful content.
For too long, we have tolerated a system where online platforms have offloaded their responsibilities onto parents, expecting them to protect their kids from harms that platforms create or amplify.
The Bill would do this by:
- Creating and implementing a new legislative and regulatory framework through a new Online Harms Act. This framework would mandate online platforms, including livestreaming and user-uploaded adult-content services, to adopt measures that reduce the risk of harm in seven specific categories of harmful content. The Online Harms Act would also require services to remove two categories of content: (1) content that sexually victimizes a child or revictimizes a survivor, and (2) intimate content communicated without consent. Non-compliance could lead to strict penalties;
- Requiring, through the new Online Harms Act, that services provide clear and accessible ways to flag harmful content and block users, implement safety measures tailored to children, and take other measures to reduce exposure to seven categories of harmful content, including content that involves bullying children or promotes self-harm among young people;
- Creating stronger laws to help protect all people in Canada from hatred, on and offline, by creating a definition of “hatred” in the Criminal Code, increasing penalties for existing hate propaganda offences, creating a standalone hate crime offence and creating an additional set of remedies for online hate speech in the Canadian Human Rights Act;
- Enhancing the Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to better protect young people; and
- Establishing a new Digital Safety Commission to oversee and enforce the Online Harms Act’s regulatory framework and a new Digital Safety Ombudsperson to act as a resource and advocate for the public interest with respect to systemic issues related to online safety.
Everyone in Canada should be able to access an online environment where they can express themselves freely, without fearing for their safety or their life. The Government will always uphold Canadians’ constitutional right to freedom of expression, which is essential in a healthy democracy. There is also an urgent need for better safeguards for social media users, particularly children. This is why the new framework is focused on seven types of the most damaging and extremely harmful content online: content that sexually victimizes a child or revictimizes a survivor; intimate content communicated without consent; violent extremist and terrorist content; content that incites violence; content that foments hatred; content used to bully a child; and content that induces a child to harm themselves.
Online platforms, including livestreaming and adult content services, must be transparent and they must be held accountable. The safety of everyone in Canada, especially children—society’s most vulnerable—depends on it.
Quotes
“I am the parent of two young boys. I will do whatever I can to ensure their digital world is as safe as the neighbourhood we live in. Children are vulnerable online. They need to be protected from online sexual exploitation, hate and cyberbullying. Now more than ever, especially given the evolving capabilities of AI, online platforms must take responsibility for addressing harmful content and creating a digital world where everyone can participate safely and freely. This legislation does just that.”
— The Hon. Arif Virani, Minister of Justice and Attorney General of Canada
Quick facts
- The Online Harms Act would set out obligations for online platforms, including livestreaming and adult-content services, like Facebook, Twitch and PornHub. When it comes to these services, there is currently little accountability or transparency in terms of what platforms need to do to help ensure the safety of their users. Under this legislation, services would be required to reduce exposure to seven categories of harmful content and be open and transparent about the steps they are taking to do so. They would also be required to expeditiously remove content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent. Services would be required to be transparent with Canadians about how they are working to protect users, especially children and survivors. All users should have the ability to express themselves freely without the risk of harm, and to better curate their own online experience with accessible ways of flagging harmful material.
- The Online Harms Act would see the creation of a new Digital Safety Commission of Canada to administer the framework and to help foster a culture of online safety in Canada. The new Digital Safety Commission would:
- Enforce legislative and regulatory obligations and hold online services accountable for their responsibilities through auditing for compliance, issuing compliance orders and penalizing services that fail to comply;
- Collect, triage and administer user complaints and reports about services’ obligations under all three duties;
- Enforce the removal of content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent; and
- Set new standards for online safety by providing guidance to services on how to mitigate risk, perform research, work with stakeholders and develop educational resources for the public, including children and parents.
- The Online Harms Act would establish a new Digital Safety Ombudsperson of Canada. The Ombudsperson would act as a point of contact and a resource for users and victims and would advocate for users’ needs and interests on systemic issues regarding online safety. Appointed for a five-year term, the Ombudsperson would:
- Gather information from users on an ongoing basis and issue calls for written submissions to solicit views on specific issues;
- Conduct consultations with users and victims;
- Direct users to proper resources such as law enforcement or help lines; and
- Develop advice, publish public reports and advocate to the Commission, the Government and online platforms, calling attention to frequent, severe or systemic issues from a user perspective.
- The changes are being made following extensive consultations by the Government of Canada since 2021, including public consultations, an Expert Advisory Group on Online Safety, a Citizens’ Assembly on Democratic Expression focused on online safety, and 22 online and virtual roundtables across Canada, as well as consultations held in 2020 by the Minister of Justice, when he was Parliamentary Secretary to the Minister of Justice.
Contacts
For more information (media only), please contact:
Chantalle Aubertin
Deputy Director, Communications
Office of the Minister of Justice and Attorney General of Canada
613-992-6568
Chantalle.aubertin@justice.gc.ca
Media Relations
Canadian Heritage
1-819-994-9101
1-866-569-6155
media@pch.gc.ca
Media Relations
Department of Justice Canada
613-957-4207
media@justice.gc.ca