Scott Hutton to the Standing Senate Committee on Legal and Constitutional Affairs

Speech

March 2, 2022

Scott Hutton, Chief of Consumer, Research and Communications
Canadian Radio-television and Telecommunications Commission (CRTC)

Check against delivery

Thank you, Madam Chair, for inviting us to appear before your Committee. I am joined today by my colleague Peter McCallum, consulting Legal Counsel for the CRTC.

We are aware of your study of Bill S-210, which proposes to restrict young people's online access to sexually explicit material.

The Internet has exponentially increased access to all kinds of content, including sexually explicit material. We acknowledge and share the concerns about the adverse effects and negative social impacts that exposure to pornography can have on youth and adolescents. This is a global issue that, in our view, requires a comprehensive, whole-of-government approach and many different tools.

There is no simple solution to regulate harmful online content. Many jurisdictions are struggling with this issue, and I would note that the Standing Committee on Canadian Heritage recently adopted a motion to conduct a study on the harms caused by online access to sexually explicit material.

Of course, there is no place on the Internet for harmful or illegal content. There are provisions in the Criminal Code to address this type of content and several organizations at the federal level are actively engaged in these files. They include Public Safety Canada, Justice Canada, the RCMP’s National Child Exploitation Crime Centre and the Canadian Centre for Child Protection.

At the moment, Canadians can control their access – and their children’s access – to inappropriate content using filtering software and parental controls.

Bill S-210 would enable designated enforcement authorities to take steps to prevent sexually explicit material from being made available to youth on the Internet in Canada. While we support the aims of the proposed legislation, the CRTC does not currently have such authority.

Canada’s Telecommunications Act does not clearly provide for the regulation of content carried by Internet service providers. Our legislation is built on the foundational principle of net neutrality: the concept that all traffic on the Internet should be given equal treatment by Internet service providers (ISPs). ISPs should not manipulate, discriminate against or give preference to the content that passes through their networks.

The CRTC was one of the first regulators in the world to implement an approach to uphold net neutrality. We have issued three decisions that, combined, form the current regulatory framework for net neutrality in Canada.

Even if the CRTC were given the power to order ISPs to verify the appropriateness of the content passing through their networks, it may not be technically feasible for them to implement an age verification system.

In terms of content, the CRTC’s powers were designed with the traditional broadcasting system in mind. As you are most likely aware, Bill C-11, which is currently being debated in the House, proposes to modernize the Broadcasting Act. If adopted by Parliament, Bill C-11 will empower the CRTC to ensure that online broadcasters contribute to Canadian content and achieve other important public policy objectives. The legislation would give us the three key elements that we are missing to regulate online platforms: the clarity of jurisdiction, the ability to gather data and the necessary enforcement tools.

That being said, let me repeat that harmful and illegal online content is a global problem. Various countries are looking at different strategies to prevent minors from accessing this type of content online.

For instance, Australia is working to implement a mandatory age verification system, but its approach also recognizes the need for greater education, awareness and understanding of respectful and harmful sexual behaviours among youth. The Australian eSafety Commissioner is consulting the public and stakeholders. One of the key insights from its initial consultation is that a one-size-fits-all technological approach would not be effective.

In 2021, the European Council proposed amendments to the draft Digital Services Act to improve provisions related to the use of age verification and parental control tools to mitigate the risk of exposure to harmful content. These provisions would apply to large online platforms and search engines. Debates are expected to begin soon regarding the final text of the Act.

The European Commission has also funded a project to enable service providers to verify the age of their users, which will be piloted later this year by 1,500 children, parents and adults from at least three countries in the European Union.

Finally, the United Kingdom is in the process of implementing new regulations to ensure that video-sharing platforms put measures in place to protect users from harmful content. These regulations require platforms to establish age verification systems, with priority given to those that provide access to pornography.

Clearly, these international efforts are in the early stages, and it remains to be seen how effective they will be at protecting children. This is certainly a challenging area, as research has found that children are increasingly adept at finding workarounds to age verification systems.

To be frank, there is no single organization or single measure that can effectively address this issue. We believe that it is important to learn from our international counterparts, take the time to evaluate the most effective means to prevent minors from being exposed to harmful online content and develop a whole-of-government approach.

We would be pleased to answer your questions to the extent that we are able.

Contacts

Media Relations
819-997-9403

General Inquiries
819-997-0313
Toll-free 1-877-249-CRTC (2782)
TTY 819-994-0423
