Session Six: Freedom of Expression and Other Rights
What is a Worksheet?
Each advisory group session will be supported by a worksheet, like this one, made available to the group in advance. The goal of these worksheets is to support the discussion and organize the feedback and input received. These worksheets will be made public after each session.
Each worksheet will include a set of questions to which group members will be asked to submit written responses. A non-attributed summary of these submissions will be published weekly to help conduct the work in a transparent manner.
The proposed approach in each worksheet represents the Government’s preliminary ideas on a certain topic, based on feedback received during the July-September 2021 consultation. It is meant to be a tool to help discussion. The ideas and language shared are intended to represent a starting point for reaction and feedback. The advice received from these consultations will help the Government design an effective and proportionate legislative and regulatory framework for addressing harmful content online. Neither the group’s advice nor the preliminary views expressed in the worksheets constitutes the final views of the Department of Canadian Heritage or the Government of Canada.
Discussion Topic
How can Government best protect and promote freedom of expression and other rights in its proposed legislative and regulatory framework to address harmful content online?
Objectives
- Obtain feedback on the elements of the proposal that raise concerns with respect to the freedom of expression, equality rights and users’ privacy rights. Elements concerning content identification and moderation obligations, as well as information and data sharing provisions in the previous proposal raised significant concern.
- Determine whether there are groups or communities whose privacy rights, equality rights or freedom of expression would be disproportionately impacted by the regulatory proposal. Marginalized groups, victims of sexually exploitative content, and other vulnerable individuals and communities may be impacted by a regulatory framework in unique and harmful ways.
- Identify possible safeguards and mitigation measures to address concerns around preserving and protecting the freedom of expression, equality rights, and users’ privacy rights. Powers, obligations, and authorities could all be used in legislation to help promote and protect Charter rights. The key will be in establishing the most appropriate and proportionate tools.
- Determine whether there are any effective alternative approaches to regulation that would fulfill the objectives while limiting interference with Charter rights. There are different approaches that can be taken to fulfill the Government’s objective of introducing legislation to address harmful content online. It is important to canvass and explore all alternatives to determine the most proportionate and appropriate framework.
Starting Points
- Protecting the freedom of expression, equality rights, and privacy rights is a fundamental pillar of any legislative or regulatory action. Any framework for harmful content online is likely to attract scrutiny about its implications for these Charter rights. The revised framework must minimize any limitations on freedom of expression, equality rights and/or privacy rights, while considering the concerns and recommendations made through the public consultation over the summer of 2021.
- Freedom of expression is a fundamental freedom under the Canadian Charter of Rights and Freedoms, and any limitations on it must be rational, minimal, and proportionate to the Government’s objective and the benefits of the proposed legislation. Legislation that requires platforms to remove harmful content posted by users could limit the freedom of expression for users whose content is suppressed, and users receiving the content. At the same time, Canadian jurisprudence acknowledges that the freedom of expression is not absolute and is subject to reasonable limits.
- Reflecting this, procedural fairness must be built in as a safeguard for the freedom of expression in the proposed regime. Under the proposed regulatory obligations, regulated entities would be required to introduce tools and procedures for users to seek recourse when their content is moderated – and these tools and procedures would be subject to review by the Digital Safety Commissioner to determine their adequacy and effectiveness.
- The freedom of expression must also be considered along with other Charter rights, such as rights to equality. Restricting harmful content that directly targets and silences marginalized communities, or results in their members self-censoring, promotes and protects both the freedom of expression and equality rights.
- Similarly, the privacy rights of people in Canada must be protected. The regulatory goals of increased transparency and accountability cannot advance without respecting the privacy rights of individuals in Canada. Procedural safeguards should be established to protect individual privacy where disclosure of content, inspections, audits, or other information-seeking and/or disclosing activities are taken.
Overview of Proposed Approach
- Any legislative framework to regulate harmful content online must include measures to mitigate the risk of undue infringements on the rights of freedom of expression, equality, and privacy. This would be part of the ‘duty of care’ imposed on regulated entities and would be an explicit element to include when designing and implementing safety policies and procedures under platforms’ Digital Safety Plans. A more limited segment of regulated services (e.g., Category 1 services) could have the additional obligation to carry out, maintain, and publish proactive impact assessments on the effects that their safety policies and procedures would have on users’ freedom of expression, equality rights, and privacy rights.
- The regime must not be overbroad and unduly limit the freedom of expression. First, the regime would be limited to a subset of harmful content, set out in legislation, for which there is strong evidence that its dissemination causes harm. The Canadian regime would focus at inception on five types of harmful content: child sexual exploitation content, terrorist content, content that incites violence, the non-consensual sharing of intimate images, and hate speech. The nature of this content, and the importance of moderating it to reduce these harms and protect the safety of Canadians, supports the argument that the Canadian approach would not be overbroad. Finally, the scope of regulated entities would be strictly focused on the public communication of harmful material; private messaging services would be excluded.
- Additionally, an emphasis on procedural fairness in decision making and recourse mechanisms would mitigate concerns around unreasonable censorship. The proposed regime includes significant procedural fairness mechanisms, including obligations on platforms to give notice to users, allow representations to be made, and grant the opportunity to request a reconsideration of specific content moderation decisions. These mechanisms and procedures would be reviewed by the Digital Safety Commissioner to ensure they provide effective and proportionate procedural safeguards for Canadian users.
- The framework could also compel regulated entities to apply their systems, tools, and procedures in a manner that promotes equality rights. Regulated services would be required to publicize how they take power imbalances, systemic discrimination, and the historical and ongoing oppression of marginalized groups into account when designing, implementing, and reviewing their content moderation tools. Regulations would provide further detail regarding the specific obligations that services would be required to fulfill.
- Finally, a new legislative and regulatory framework would respect and protect users’ rights to privacy. Legislation should mandate that users’ anonymity be preserved, where appropriate, when flagging content to regulated entities, asking for reconsideration of a content-moderation decision, as well as when issuing a complaint to the Digital Safety Commissioner about platform non-compliance. The privacy of victims is particularly important. Some representations and materials may be particularly sensitive, such as those relating to child sexual exploitation or the non-consensual sharing of intimate images. As such, the Commissioner could have the ability to hold in camera or ex parte hearings where necessary to address complaints of a sensitive nature for victims. Given this context, the nature and extent of the information preserved or shared under the framework between the regulated entities and the Commissioner or Government, including law enforcement agencies, would be designed to protect privacy and would be subject to the law, including the Charter.
Supporting questions for discussion
- Obtain feedback on the elements of the proposal that raise concerns regarding the freedom of expression, equality rights and users’ privacy rights.
  - What are the most concerning elements of the old proposal with regard to Canadians’ Charter rights?
  - What concerns, if any, have been mitigated by the proposed new vision? What appears to strike the right balance in the proposed approach?
  - What are the main concerns with the current thinking as it relates to protecting Charter rights?
- Determine whether there are groups or communities whose privacy rights, equality rights or freedom of expression would be disproportionately impacted by the regulatory proposal.
  - Are the rights of vulnerable groups disproportionately affected by the regime? If so, what safeguards could mitigate or eliminate this unintended consequence?
- Identify possible safeguards and mitigation measures to address concerns around preserving and protecting the freedom of expression, equality rights, and users’ privacy rights.
  - What powers and authorities could be given to the new regulator to ensure Canadians’ rights are protected? What obligations could be imposed on regulated entities to protect Canadians’ rights?
  - What safeguards could be included in legislation (e.g., through definitions, objectives, or procedural safeguards) that could help address concerns?
- Determine whether there are any effective alternative approaches to regulation that would fulfill the objectives while limiting interference with Charter rights.
  - How can the regime best advance a comprehensive approach to confronting harmful content online that may involve the law enforcement and security intelligence communities while still respecting users’ equality, data, and privacy rights?
  - Are there alternative obligations, definitions, or other approaches that would fulfill the objective of supporting a safe and inclusive internet, where people in Canada feel free to express themselves, while minimally impairing the freedom of expression?