Session Three: Legislative and Regulatory Obligations
What is a Worksheet?
Each advisory group session will be supported by a worksheet, like this one, made available to the group in advance. The goal of these worksheets is to support discussion and to organize the feedback and input received. These worksheets will be made public after each session.
Each worksheet will have a set of questions to which group members will be asked to submit written responses. A non-attributed summary of these submissions will be published weekly to help conduct the work in a transparent manner.
The proposed approach in each worksheet represents the Government’s preliminary ideas on a given topic, based on feedback received during the July-September 2021 consultation. It is meant to be a tool to support discussion. The ideas and language shared are intended as a starting point for reaction and feedback. The advice received from these consultations will help the Government design an effective and proportionate legislative and regulatory framework for addressing harmful content online. Neither the group’s advice nor the preliminary views expressed in the worksheets constitute the final views of the Department of Canadian Heritage or the Government of Canada.
What are the legislative and regulatory obligations that should be imposed on regulated entities to reduce the amount of harmful content online and manage the risk it poses?
- Explore the benefits and downsides of regulatory obligations designed around a duty of care. Participants’ perspectives are sought on a systems-based regulatory framework designed to increase transparency around platform content moderation, introduce mechanisms that hold online services accountable for the harmful content on their platforms, and establish baseline standards and norms around content moderation in Canada.
- Determine the appropriate degree and scope of obligations to impose on regulated online services. Participants’ views are sought on how to balance the objective of addressing harmful content online with a reasonable and practical set of obligations for regulated entities. Participants’ views are also sought on how to manage the broader market consequences that could result from the proposed regulation.
- Consider the amount of flexibility that is appropriate to attach to regulatory obligations. The legislation could set out broad requirements. Subsequent regulations could then outline how regulated entities are to adhere to these obligations. Such regulations could be informed by conversations between the regulator and the regulated entities to ensure maximum collaboration and, by extension, compliance.
- The objective is to reduce the amount of harmful content online and the risk it poses to Canadian users. Based on the responses to the 2021 public consultation, a harmful content regulatory model should not be organized around whether a regulated platform appropriately moderates a given piece of content. Instead, and similar to the proposal brought forward by the United Kingdom in its recent Online Safety Bill, a regulatory model focused on the systems, tools, and approaches that platforms have in place to address harmful content would be preferred.
- Effective, systems-based regulation of online content would pursue three goals:
- Increase transparency and scrutiny around how regulated entities monitor and moderate harmful content;
- Introduce accountability for the systems, plans, and procedures platforms have in place to identify and manage harmful content; and
- Drive baseline standards and norms for the monitoring and moderation of harmful content on these platforms.
- There are two broad benefits associated with a systems-based approach oriented towards these goals. First, it seeks to minimize limitations on freedom of expression, keeping them within reasonable bounds and mitigating them through procedural fairness and safeguards. Second, it seeks to implement a practical, achievable, and administratively effective regulatory regime. These benefits are particularly important in a liberal democratic society like Canada, and for a regulator that must be able to operate effectively.
Overview of Proposed Approach
- Establish a duty of care on regulated entities with a mix of legislative and regulatory measures. Legislation would establish the scope of the duty of care in broad terms and set out some specific obligations that would be imposed on regulated entities. Regulations – issued by the Governor in Council, the Digital Safety Commissioner, or both – would detail the systems and processes that regulated entities would need to adopt in order to fulfill their legislative obligations. The Commissioner would have the ability to tailor the regulatory requirements to different types of regulated entities, depending on prescribed factors such as risk of harm, size, and capacity.
These obligations would, at a minimum, compel regulated entities to:
- Monitor for harmful content and take proportionate steps to mitigate the risks identified. Regulated entities would be compelled to file Digital Safety Plans (DSPs) with the Digital Safety Commissioner, and periodically update these documents. The DSPs would require that regulated entities conduct a risk assessment of the harmful content on their platforms and detail the mitigation measures, systems, and processes they have in place to address such risks. Regulations would specify what regulated entities are to include in their DSPs, and what appropriate mitigation measures might be. For instance, regulations would specify that regulated entities must include details on the volume of harmful content present on their platforms, the exposure their users have had to such content, how much of it has been reported to law enforcement, how their moderation practices reduce the presence and dissemination of such content, and how they measure success.
- Bring accountability and transparency to content moderation practices. Regulated entities would be required to publish information that they do not currently publish, with baseline transparency requirements set out in legislation and further specified in regulation. Legislation would ensure that regulated entities specify how their users are protected from harmful content on their platforms. They would be compelled to outline the broad strokes of their DSPs in publicly available terms of service. Their terms of service would need to be clear, accessible, and consistently applied. Regulated entities would also be compelled to issue periodic public reports when they update their DSPs outlining their risk assessments and corresponding mitigation measures. Regulations would specify what type of information from the DSPs would need to be included in these public-facing documents.
- Promote users’ freedom of expression, equality rights, and privacy rights. Legislation would require that, when crafting their DSPs, regulated entities consider the importance of protecting users’ freedom of expression, equality rights, and privacy rights. Evidence of such consideration would have to be shared by regulated entities as part of their periodic public reports. Category 1 services (major social media platforms) would have the additional obligation to publish in their DSPs an assessment of the impact that their safety policies and procedures have on users’ freedom of expression, equality rights, and privacy rights. These services would be required to state publicly in their DSPs what steps they have taken to safeguard these rights in response to their impact assessment. Further details regarding the types of safeguards expected would be outlined in regulations, but could, for example, include requiring that human moderators take decisions in complex cases where context is important.
- Provide appropriate tools, procedural fairness, and recourse mechanisms for users. Legislation would compel regulated entities to have systems and processes in place that allow users to flag content that they believe is harmful. Requirements would be imposed on regulated entities compelling them to establish robust flagging, notice, and appeal systems for both authors of content and those who flag content. Appeal mechanisms would need to be easy to access, easy to use and transparent. Regulated entities would also be required to appropriately address appeals in a timely manner.
Supporting questions for discussion
- Explore the benefits and downsides of regulatory obligations designed around a ‘duty of care’.
- Acknowledging the overall goal of reducing exposure to harmful content online for Canadians, what are some more specific objectives that a regulatory regime of this nature should have?
- Are all of these objectives equal? Are some more important to strive for than others?
- Would the proposed Digital Safety Plans be sufficient to achieve the objectives identified above?
- What information should transparency and reporting obligations require regulated entities to provide? What data would be most useful to the government, the public, researchers, and academics?
- What procedural safeguards and recourse mechanisms would be most appropriate and effective to provide users with? What platform behaviour, if any, should users be able to seek redress for at the Commissioner level? Would it be appropriate for users to be able to complain to the Commissioner about specific content moderation decisions taken by platforms?
- Determine the appropriate degree of regulatory burden to impose on regulated platforms.
- What is the appropriate degree of regulatory burden to impose on regulated entities?
- What is the appropriate regulatory burden to place on “smaller” platforms?
- How should the framework identify “smaller” platforms compared to other regulated services?
- What specific obligations might be suitable for different types of regulated entities? For example, should age verification mechanisms or mandatory takedown be required of adult content platforms, but not other services?
- Consider the amount of flexibility that is appropriate to attach to regulatory obligations.
- How much specificity in obligations is appropriate? What is the appropriate balance between specificity sufficient to ensure platforms act in a meaningful way and flexibility adequate to avoid constraining responses that might otherwise achieve the framework’s objectives?
- Will effective industry norms and standards inevitably develop over time as a result of these obligations?