Session One: Subjects of Regulation
What is a Worksheet?
Each advisory group session will be supported by a worksheet, like this one, made available to the group in advance. These worksheets are meant to support discussion and to organize the feedback and input received. They will be made public after each session.
Each worksheet will include a set of questions to which group members will be asked to submit written responses. A non-attributed summary of these submissions will be published weekly to help conduct the work in a transparent manner.
The proposed approach in each worksheet represents the Government’s preliminary ideas on a given topic, based on feedback received during the July-September 2021 consultation. It is meant to be a tool to facilitate discussion, and the ideas and language it contains are intended as a starting point for reaction and feedback. The advice received through these consultations will help the Government design an effective and proportionate legislative and regulatory framework for addressing harmful content online. Neither the group’s advice nor the preliminary views expressed in the worksheets constitutes the final position of the Department of Canadian Heritage or the Government of Canada.
Discussion Topic
What types of entities would be regulated under the online harms legislative and regulatory framework?
Objectives
- Assess the viability of proportionate regulation organized by categories. The legislation could establish categories of regulated services, based on the variability of user base, content volume, and risk of harmful content across these services. Each category would have its own set of regulatory obligations that the services would need to fulfill.
- Specifically, evaluate whether there should be a separate category for adult content services. Adult content services pose a heightened risk of child sexual exploitation material and the non-consensual sharing of intimate images or media. These services could be placed in their own category, subject to separate or additional obligations.
- Determine the range of services to exclude from the framework. Excluded services would be those that the framework does not intend to regulate, and for which any confusion about their falling within its scope should be pre-empted. They would include Telecommunications Service Providers, the core infrastructure of the internet (e.g., Domain Name System (DNS) providers) and private communications.
Starting Points
- The objective is to regulate online services that pose the greatest risk of hosting and distributing harmful content. These services include social media platforms, such as Facebook, YouTube, Instagram, Twitter and TikTok. They also include services that may not be widely recognized as ‘social media platforms’ per se, such as Parler and 4chan. Finally, they include services that pose a heightened risk of child sexual exploitation material and the non-consensual sharing of intimate images or media, such as adult content services like Pornhub.
- A graduated scope of application may be appropriate for this framework. The entities that would be scoped in under the legislation may not all pose the same degree of risk when it comes to harmful content online, and accordingly may not all warrant as heavy a regulatory burden. Online platforms have different business models and varying levels of maturity and effectiveness in their programs and strategies for managing harmful content. Given these differences, a baseline of core obligations for all platforms, supplemented by variable obligations aimed at ensuring proportionality – as exemplified in the United Kingdom’s Online Safety Bill – could be effective.
- Online services beyond the scope of what Canadians intuitively and reasonably expect to be regulated would not be captured. Services like websites, fitness apps, and private communications (e.g., email, text messaging, and private messaging apps) would be excluded. Telecommunications Service Providers (TSPs) and the core infrastructure of the internet (e.g., DNS providers, cloud server providers and infrastructure) would also be excluded.
- Some online platforms encompass both private communication features and public-facing features. The private messaging features on these platforms would be excluded from regulation. For example, Facebook has a public-facing news feed but also a private messaging feature, Facebook Messenger. Facebook Messenger would be excluded from the legislative and regulatory regime, while Facebook’s public-facing news feed would be included.
Overview of Proposed Approach
- Retain the definition of Online Communication Services that was originally proposed in the July-September 2021 public consultations. An Online Communication Service (OCS) means an online service that (a) is accessible in Canada and (b) whose primary purpose is to enable communication, over the internet, between persons interprovincially or internationally. This definition would encompass services including, but not limited to:
- online services widely and intuitively understood to be ‘social media’, like Facebook, YouTube, Instagram, or Twitter;
- less widely subscribed online services like 4chan, Rumble, and Parler; and
- online adult content services like Pornhub.
- Exclude services beyond the intent of the legislative and regulatory framework. As outlined in the July-September 2021 consultation, the legislation would explicitly exclude:
- Telecommunications service providers (TSPs), as defined in the Telecommunications Act;
- Services that enable persons to engage only in private communications; and
- Categories of services that are excluded by Governor in Council regulations.
- This last exclusion addresses the concern that the definition of Online Communication Services (OCS) could be understood to capture services that the framework does not intend to regulate. For instance, the framework does not intend to regulate services like Uber, Airbnb, and Peloton. These services would fall outside the definition of OCS, as their primary purpose is not to enable communication between persons per se, but to let users arrange transportation, rent accommodations, or participate in fitness classes. However, if confusion arose as to whether these services were captured by the definition, the Government would have the authority to explicitly exclude them through regulations under this third category.
- Explore whether to establish categories of regulated services, given the variability of user base, content volume, and risk of harmful content across these services. Category 1 services could include social media services with the greatest user base and content volume. Category 2 services could include services that pose a lesser risk, such as some smaller or less popular services. Category 3 services could include adult content services that pose a unique risk for child sexual exploitation material and/or the non-consensual sharing of intimate images. Regulatory obligations could differ across categories. For example, where Category 2 entities could be required to publish an annual Digital Safety Plan, Category 1 entities could be required to file their plans with the Digital Safety Commissioner for review and certification, and/or to include impact assessments for regulated harmful content distributed or hosted on their services. Similarly, Category 3 entities could be held to more stringent regulatory requirements specifically related to child sexual exploitation or the non-consensual sharing of intimate images. These requirements could include, for example, more stringent age and consent verification measures.
Supporting questions for discussion
- Assess the viability of proportionate regulation organized by categories.
- Should there be different types of regulatory obligations on categories of online communication service providers or categories of services based on the size and risk of harm on their platforms?
- Should the relative maturity or development of a service or platform (e.g., a more recent service like TikTok versus a longer-standing service like Facebook) factor into how an entity is scoped into regulation?
- Should platforms’ nature, design attributes, and overall risk of harm be considered when creating categories and imposing obligations, regardless of their size?
- How many graduated categories should there be and what parameters would differentiate each category?
- How should services that include both public and private communication features be scoped in?
- Should adult content services have their own category with additional obligations imposed on these services?
- Determine the range of services to exclude from the framework.
- What parameters should be laid out in legislation or regulation to exclude a service from regulatory obligations?