Summary of Session Six: Freedom of Expression and Other Rights

The Expert Advisory Group on Online Safety held its sixth session on May 20 from 1:00-4:00 pm EDT, on freedom of expression and other rights. Ten members were present. The Advisory Group was joined by Government representatives from the Departments of Canadian Heritage, Justice, Innovation, Science and Economic Development, Public Safety, Women and Gender Equality, and the Privy Council Office. Representatives from the Royal Canadian Mounted Police were also present.

This summary provides an overview of the sixth session. Per the Terms of Reference for the Advisory Group, these sessions operate under the Chatham House Rule. As such, this summary does not attribute the views expressed to any one group member or organization. It outlines the views expressed during the session; reports areas of agreement, disagreement, and discussion; and organizes the discussion under thematic categories. It should not be considered a verbatim recitation of the discussion.

The discussion question for the workshop was “How can Government best protect and promote freedom of expression and other rights in its proposed legislative and regulatory framework to address harmful content online?”

The objectives for the session were:

  1. Obtain feedback on the elements of the proposal that raise concerns with respect to the freedom of expression, equality rights and users’ privacy rights;
  2. Determine whether there are groups or communities whose privacy rights, equality rights or freedom of expression would be disproportionately impacted by the regulatory proposal;
  3. Identify possible safeguards and mitigation measures to address concerns around preserving and protecting the freedom of expression, equality rights, and users’ privacy rights; and
  4. Determine whether there are any effective alternative approaches to regulation that would fulfill the objectives while limiting interference with Charter rights.

This summary reports on the perspectives raised in relation to these objectives and organizes the discussion points according to issue-specific themes.

Theme A: Charter Rights

Protecting Charter Rights

Many experts emphasized that a successful legislative approach should balance rights and freedoms. Experts stressed the importance of acknowledging that all Charter rights are interrelated, and that no rights have priority over others. They explained that determining how to balance rights will be a crucial, but difficult, task to accomplish. Some experts also emphasized that international human rights should be considered.

Many experts canvassed the most problematic aspects of the Government’s previously proposed approach from 2021 from a Charter rights perspective. They explained that the twenty-four-hour takedown requirement considered by the Government last summer would have created a structural incentive for platforms to over-censor speech by prioritizing the speed of content removal over accuracy. Experts also cited proactive content monitoring obligations as problematic from a freedom of expression and privacy perspective. Many experts explained that insofar as a new system-focused, risk-based approach would not include these elements, it would reduce the risk of Charter rights being undermined. Experts explained how other provisions in the 2021 proposal also raised Charter concerns, namely mandatory reporting of content to law enforcement based on a threshold of suspicion rather than actual knowledge, as well as website blocking. Overall, many experts stated that pivoting away from regulating particular categories of content, towards an approach that requires companies to take a design-oriented and risk-based approach to dealing with harmful content, may be less problematic. However, experts stressed that Charter compliance will largely depend on the regime’s details.

Experts explored multiple ways to safeguard fundamental rights and freedoms. For instance, some experts stressed that it would be necessary to clearly specify in legislation what content is to be regulated, to prevent regulated services from interpreting their mandates too broadly, which could lead to over-censorship. Experts also suggested providing clear, accessible, fair and transparent redress mechanisms for users as another way to safeguard users’ rights. Other experts stressed that a duty to act responsibly should relate not only to harm mitigation, but also to the protection of Charter values. Finally, many experts stated that the protection of all rights should be explicitly stated in the legislative preamble. On this latter point, some experts emphasized that a strong preamble will be necessary, one that references how all Charter rights act together, as opposed to singling out certain rights. Other experts expressed that any preambulatory language must explicitly explain freedom of expression, stating that it includes the right to express distasteful, offensive and unpopular opinions and the right to form one’s opinions free from interference.

Section 35: Indigenous and Treaty Rights

Some experts stressed that when speaking about Charter rights, it is essential to also speak about section 35 of the Constitution on Indigenous and treaty rights. They argued that the duty of care discourse may not align with section 35 rights, and as such, they preferred the duty to act responsibly. Experts clarified that section 35 rights are held by a group rather than by an individual. They explained that often both a specific plaintiff and their community will be listed in a section 35 claim. In contrast, a duty of care, they stated, is owed to a specific person. Experts explained that a duty of care would therefore risk addressing individualized rather than communal harm. They suggested that a duty to act responsibly would be more fitting, as it could be owed to a community, thus aligning more appropriately with section 35.

Freedom of Expression

Some experts emphasized that the law on freedom of expression is underdeveloped in Canada, and around the world, when it comes to questions of content moderation. Experts stated that there are a number of difficulties to address. For instance, they explained that it is very difficult to determine when expression triggers criminal liability, as a determination of intent is necessary. They also emphasized that the law on freedom of expression does not deal well with cumulative or novel harms. For instance, they explained that it is difficult to reconcile the issue of disinformation with the freedom of expression. Some experts stressed that this is why a broad risk-based systemic approach is appropriate, as it is challenging and risky to draw lines in legislation for when exactly content should be moderated.

Some experts questioned where the limits on the freedom of expression should be. They asked whether it made sense to tolerate a racist theory that would inevitably influence some people toward violence. Some experts emphasized that the right to freedom of expression includes the right to offend. Other experts argued that it is acceptable to have a higher standard of responsibility online, compared to offline, as the online environment allows users to hide and not be held accountable for their speech.

Some experts stressed that the scope of regulated content is critical to determining whether the right to freedom of expression is infringed upon. For instance, they explained that more stringent action can be taken against certain types of content, such as material that harms children. They argued that content must be defined and scoped in sufficient detail within the legislation. Experts emphasized that the United Kingdom’s Online Safety Bill is being criticized for its lack of specificity regarding the legislated definitions of harmful and illegal content. Some experts pointed to the definition used in Australia’s Online Safety Act, which includes content that “would be likely to have the effect on the Australian child of seriously threatening, seriously intimidating, seriously harassing or seriously humiliating the Australian child”. Experts argued that such an approach should be the blueprint for in-scope harmful content within a Canadian context as it provides a clear and defensible avenue to regulate harmful content while protecting users’ freedom of expression. Other experts stated that the best way to safeguard the freedom of expression would be through a broad framework that does not define content in a detailed way.

Many experts stressed that hate speech raises complexities regarding the freedom of expression. Some expressed concern that the freedom of expression can be used as a justification for hate speech. Other experts emphasized that people are allowed to be ignorant and offensive, but, when ignorant and offensive remarks spur violent attacks, they take on a different, higher degree of harm. Experts stressed that it is difficult but important to determine when content crosses this line. Other experts argued that words alone do not cause violent attacks and stressed that there are a host of other factors involved that must also be considered. The multi-faceted nature of many violent attacks, they explained, makes it that much more difficult to scope and define regulated content. Some experts introduced another complexity with hate speech, explaining that when hate is not expressed against a protected group, platforms have great difficulty determining what to do with it. For instance, they explained, hate against journalists or health practitioners is difficult for platforms to moderate, as they are not protected groups under the Charter.

Privacy Rights

Some experts cautioned against requiring or even incentivizing proactive or general monitoring by regulated services. They stated that specific monitoring could perhaps be justified in certain circumstances, but even then, case law has demonstrated that such a scheme is very challenging to justify, as it is very difficult to distinguish between general and specific monitoring.

Some experts stressed that privacy reform must be introduced prior to, or at least in tandem with, a regulatory online harms framework. They insisted that such reform would address the overall structure that drives platform behaviour. Experts argued that many of the most significant problems faced in the online sphere – including the amplification of problematic content – can ultimately be traced back to relatively underdeveloped privacy protections in Canada.

Some experts cautioned against obligations to provide user information to law enforcement agencies where no illegal acts are being committed. Experts stressed that certain marginalized groups may be targeted by such measures. They insisted that to protect users’ privacy rights, such obligations would need to be clearly circumscribed regarding the threshold for information sharing, where the information is stored and for how long, and who has access to such data.

Some experts explained that private messaging services should be regulated. They argued that the exclusion of private messaging services would diminish the framework’s ability to address child sexual exploitation content in any meaningful way. Experts stressed that exempting private messaging from the regulatory framework would embolden offenders to maximize the use of public spaces to connect with other offenders or children, then quickly move to private spaces before crossing the line into illegal/harmful activity, knowing they will be protected. They insisted that by exempting private messaging services, the framework would be effectively giving companies permission to provide the infrastructure to share child sexual exploitation material without imposing any obligation on them to prevent such hosting from occurring.

Theme B: Risk-Based Systemic Approach to Regulation

Key Elements

Experts canvassed the core elements of a risk-based systemic approach to online safety regulation. First, many experts emphasized that the regulator should fulfill a public educational role regarding harm prevention. They stated that this role would help Canadians, companies, and organizations understand and feed into a risk-based approach. Second, many experts stressed that a risk-based framework should be based on a duty to act responsibly. They explained that regulated services should have a duty to implement tools and systems that help prevent harm to their users. They emphasized that such a framework would not prevent all harm, and that to remove all harmful content online is not an attainable goal. Third, many experts emphasized that in situations where harm is not successfully prevented, there must be a remedial mechanism both at the platform level and through an independent body. Finally, experts agreed that continuous consultation would be necessary to ensure a sustained dialogue about how services define risk and how risk evolves over time. Experts stressed the need to create a regulatory framework that is sufficiently dynamic and adaptive to changes in technology but also reflective of changing social concerns and interests. On this latter point, they emphasized that ongoing consultation would lead to the development of effective systems and processes to manage risk. Some experts also stated that it is crucial for legislation to define risk in a way that not only includes the views of historically marginalized communities but allows them to continuously engage with and feed into the regime in order to reflect lived experiences of victims.

Many experts stressed that transparency obligations must be structured in a way that allows experts in this field to effectively study online safety. They emphasized that it is crucial for any legislation to ensure that researchers and academics have access to information from regulated services on content moderation, so that they can study the spread and impact of harmful content in Canada. Experts explained that the research could then feed back into the risk-based model, thereby allowing academics and researchers to contribute to the risk management process. They suggested that part of the duty to act responsibly would be to act upon any new research on online safety.

Experts disagreed on what role legislation should play in defining how regulated services should action content. Some experts emphasized that legislation should leave it open to companies to determine what type of actioning is appropriate to manage their own risk. Other experts called for more detailed requirements and standards concerning what type of actioning would meet the threshold for fulfilling their duty to act responsibly.


Experts disagreed on the necessity for, and structure of, an external recourse mechanism for content moderation decisions. Some experts explained that an external recourse mechanism would take too long to get content removed. Experts also cautioned that it would be very difficult to ensure that such a recourse body is Charter compliant, as it would be very challenging to craft legal standards to define what types of content should be removed. Instead, experts stressed that emphasis should be placed on getting regulated services to “get it right from the start” as much as possible by focusing on ensuring they implement effective identification, moderation and recourse mechanisms which are subject to transparency, oversight and accountability. Other experts emphasized that an appeal mechanism for certain narrow circumstances is necessary to ensure that vulnerable groups impacted by harmful content have recourse available to them. For instance, they explained, if there is an allegation of the non-consensual sharing of an intimate image, the image should be removed until proven otherwise. For other types of content, they insisted that the equities for removal look different and removal orders are likely not appropriate. Experts cautioned that a recourse body would not be able to operate at the necessary speed to keep up with the amount of content, which could be problematic. They suggested that any independent body be reserved for less time-sensitive appeals, and that a collaborative relationship between the independent body and the regulated services be fostered to leverage removal for certain content.

Applicability: Tragedy in Buffalo

The experts discussed the recent shooting in Buffalo, analyzing what factors contributed to the tragedy, and thinking about how a Canadian risk-based model could have addressed the situation. Some experts stressed that the attack reveals that the status quo regarding platform content moderation is no longer acceptable. They explained how the perpetrator used, and was influenced by, online communication: they became radicalized through online forums, uploaded their manifesto online, and livestreamed the attack. Experts stated that a risk-based regulatory framework could have introduced a time-delay for the livestreaming of videos to curb the spread of such content. Other experts raised the fact that the perpetrator of the attack included details about their planned attack on a private online page. They explained that the perpetrator’s privacy rights may have been infringed upon had the platform monitored their private page; however, such monitoring could have prevented the attack. Finally, experts explained that the tragedy demonstrates how important cross-platform collaboration is, as perpetrators often post the same content on multiple platforms to maximize its reach.

Experts also discussed the impact of the internet on hateful and violent events. Some experts stressed that the internet has played a part in incentivizing such attacks by allowing perpetrators to publish manifestos and other works that rapidly reach a global audience. They emphasized that such information can now be produced, reproduced, and transmitted infinitely at great speed. However, other experts pointed to historical instances of hate, like the Holocaust, to emphasize that abuse, hate crimes, and ideologically motivated violence were far-reaching even before the internet. Such experts warned against “golden-age nostalgia”, explaining that even in the pre-internet world, communications technology was used to undermine the safety and rights of minorities. They suggested that instead of looking at the speed at which hateful messaging is spread, it would be helpful to examine the bottlenecks to such sharing of information, to assess how a regulatory framework may be able to create friction against the proliferation of such content today.

Public Outreach

Some experts emphasized that if Canadians do not understand the legislation, or the regulatory framework, they will not benefit from it. Experts stressed that a person should be able to understand the framework even if they are not well versed in Canadian law. Experts explained that if the communities that the framework is trying to protect do not understand the framework and how it works, it will not be used to its full potential. To remedy this concern, experts suggested that the Government place an emphasis on education, especially for individuals and communities who will be most impacted by the framework. Some experts suggested that an educational campaign around what the freedom of expression is in Canada would be an important component of such outreach activities. Others stressed that providing education and support to regulated services, especially small platforms that do not yet have sophisticated resources, would help them understand how best to implement their duty to act responsibly.

Theme C: Further Engagement

Stakeholder Engagement

The Department and the co-chairs for the expert group presented a path forward on stakeholder engagement. They proposed to not engage widely with stakeholders due to both time constraints and a fundamental concern from experts about not being equipped or best placed to conduct a fulsome stakeholder engagement process. Instead, experts would meet with officials from a select few international jurisdictions to hear about lessons learned from other similar legislative and regulatory frameworks. These discussions would be meant to help the expert group present the government with advice on what they believe a successful online framework could look like. No further stakeholder engagement would be expected from the panel. Instead, the Government would conduct further substantive and intentional engagement on its own following the conclusion of the group’s work, to verify and test some of the core ideas that were expressed during the expert group workshops.

The Department and the co-chairs also acknowledged that group members may have ideas about how the government should conduct its own engagement. The Department welcomed any views on such engagement, emphasizing that it is open to hearing from experts on this topic and benefiting from their experience on how best to craft meaningful and robust engagement. In particular, some experts provided advice on how best to craft engagement with Indigenous Peoples. The Department welcomed this advice and invited interested experts to meet with the group leading engagement with Indigenous communities at the Departmental level, to learn about their approach and provide input.

Next Steps

The next session of the Expert Advisory Group will take place on Friday, May 27 from 1:00-4:00 p.m. EDT. Experts will discuss the Connection to Law Enforcement at this session.
