Session Three: Legislative and Regulatory Obligations

What is a Worksheet?

Each advisory group session will be supported by a worksheet, like this one, made available to the group in advance of each session. The goal of these worksheets is to support the discussion and organize feedback and input received. These worksheets will be made public after each session.

Each worksheet will include a set of questions to which group members will be asked to submit written responses. A non-attributed summary of these submissions will be published weekly to help conduct the work in a transparent manner.

The proposed approach in each worksheet represents the Government’s preliminary ideas on a certain topic, based on feedback received during the July-September 2021 consultation. It is intended as a tool to support discussion: the ideas and language shared represent a starting point for reaction and feedback. The advice received from these consultations will help the Government design an effective and proportionate legislative and regulatory framework for addressing harmful content online. Neither the group’s advice nor the preliminary views expressed in the worksheets constitute the final views of the Department of Canadian Heritage or the Government of Canada.

Discussion Topic

What are the legislative and regulatory obligations that should be imposed on regulated entities to reduce the amount of harmful content online, and manage the risk it poses?


  1. Explore the benefits and downsides of regulatory obligations designed around a duty of care. Participants’ perspectives are sought on a systems-based regulatory framework designed to increase transparency around platform content moderation, introduce mechanisms that hold online services accountable for the harmful content on their platforms, and establish baseline standards and norms around content moderation in Canada.
  2. Determine the appropriate degree and scope of obligations to impose on regulated online services. Participants’ views are sought on how to balance the objective of addressing harmful content online with a reasonable and practical set of obligations for regulated entities. Participants’ views are also sought on how to manage issues related to broader market consequences as a result of the proposed regulation.
  3. Consider the amount of flexibility that is appropriate to attach to regulatory obligations. The legislation could set out broad requirements. Subsequent regulations could then outline how regulated entities are to adhere to these obligations. Such regulations could be informed by conversations between the regulator and the regulated entities to ensure maximum collaboration, and by extension, compliance.

Starting Points

Overview of Proposed Approach

These obligations would, at a minimum, compel regulated entities to:

Supporting questions for discussion

  1. Explore the benefits and downsides of regulatory obligations designed around a ‘duty of care’.
    1. Acknowledging the overall goal of reducing exposure to harmful content online for Canadians, what are some more specific objectives that a regulatory regime of this nature should have?
    2. Are all of these objectives equal? Are some more important to strive for than others?
    3. Would the proposed digital safety plans be sufficient to achieve the objectives outlined above?
    4. What information should transparency and reporting obligations require regulated entities to provide? What data would be most useful to the government, the public, researchers, and academics?
    5. What procedural safeguards and recourse mechanisms would be most appropriate and effective to provide users with? What platform behaviour, if any, should users be able to seek redress for at the Commissioner level? Would it be appropriate for users to be able to complain to the Commissioner about specific content moderation decisions taken by platforms?
  2. Determine the appropriate degree of regulatory burden to impose on regulated platforms.
    1. What is the appropriate degree of regulatory burden to impose on regulated entities?
      1. What is the appropriate regulatory burden to place on “smaller” platforms?
      2. How should the framework identify “smaller” platforms compared to other regulated services?
    2. What specific obligations might be suitable for different types of regulated entities? For example, should age verification mechanisms or mandatory takedown be required of adult content platforms, but not other services?
  3. Consider the amount of flexibility that is appropriate to attach to regulatory obligations.
    1. How much specificity in obligations is appropriate? How should the framework balance obligations that are specific enough to ensure platforms act in a meaningful way against flexibility that avoids constraining responses otherwise suited to achieving the framework’s objectives?
    2. Will effective industry norms and standards inevitably develop over time as a result of these obligations?
