Introductory Workshop Summary

Members of the Expert Advisory Group on Online Safety held their introductory workshop on April 8th from 1:00-4:00 pm EDT. All twelve group members were present for this introductory session. The Advisory Group was joined by Government representatives from the Departments of Canadian Heritage, Justice, Innovation, Science and Economic Development, Public Safety, and the Privy Council Office. Representatives from the Royal Canadian Mounted Police and the Canadian Security Intelligence Service were also present. The Group members shared their views about the main challenges in trying to design a new legislative and regulatory framework for online safety.

This summary provides an overview of the first session. Per the Terms of Reference for the Advisory Group, these sessions operate under the Chatham House Rule. As such, this summary does not attribute the views expressed to any one group member or organization. Finally, the summary outlines the views expressed during the session; reports areas of agreement, disagreement, and discussion; and organizes the discussion under thematic categories. It should not be considered a verbatim recitation of the discussion.

There were three stated objectives for the introductory workshop:

  1. Obtain the group members’ observations on the previous proposal and online content regulation in general;
  2. Identify further themes that the group members want to explore in some depth, in addition to what is outlined in the itinerary; and
  3. Introduce and gather feedback on the process for the sessions going forward.

The Advisory Group spent most of its time on the first objective. Accordingly, this summary reports on the perspectives raised in relation to it and organizes the discussion points raised according to issue-specific themes. Feedback received on the second and third objectives is addressed at the end of the summary.

Objective 1: Obtain group members’ observations on the previous proposal and online content regulation in general.

Theme A: The Scope of Regulated Content

Types of harmful content to include

Many members highlighted that it will be challenging to settle on the scope of content to regulate. Some cautioned against casting too wide a net over regulated content, suggesting instead that the framework concentrate on the five types of harmful content enumerated in the Government’s proposal. Other views included that additional harms should be considered, such as cyberbullying and content that harms children by promoting body image issues, and that it is the victim’s perception of harm, not the harm itself, that should matter when identifying content that ought to be removed.

Many experts indicated that harmful content ought to be examined on a spectrum. They conveyed that it would likely be necessary to regulate different types of harmful content in different ways. For instance, some experts stated that a legislative and regulatory framework would benefit from addressing child sexual abuse content differently than hate speech. They explained that hate speech is notoriously difficult to identify, as it is highly contextual and can sometimes be accompanied by counter speech, which can be taken down wrongfully. In contrast, experts claimed that child sexual exploitation content is often less contextual and easier to identify, and likely warrants a stricter response.

To illustrate their points, some experts provided examples of how platforms have inaccurately identified content as harmful. For instance, it was raised that platforms have disproportionately removed irony, satire, and jokes from marginalized communities because they do not have the tools to interpret content from other cultures or in different languages. Other experts highlighted that Muslim communities have suffered significantly from platforms wrongly identifying and taking down content that is not harmful. Some also raised that while child sexual exploitation content may seem to be ‘easier to define and identify’, many social media companies are uncooperative and unresponsive when presented with clear cases of child sexual abuse material.

Lawful and awful content vs. illegal content

The experts shared a number of views, not all in agreement, about whether to distinguish between "illegal content" and content that is lawful but nonetheless harmful (i.e., "awful but lawful" content). A few experts suggested the introduction of two separate regimes – a take-down regime for illegal content and a systems-based approach for legal yet harmful content. Other experts stressed that making a distinction between illegal and legal yet harmful content is problematic for two main reasons: first, there is very little content that is clearly illegal; and second, determining the legality of expression is extremely difficult and contextual. It was stressed that "what" would be regulated would need to be linked to "who" would be regulated. They pointed out that a considerable amount of child sexual exploitation content is hosted on dark-web servers, as opposed to "top of the stack" social media services. They observed that the extent to which this material would be captured by a framework would depend on how it defines and captures regulated services.

Many members stated that "grey-zone" content – speech that is lawful yet harmful, or "awful but lawful" – poses unique challenges. They explained that most harmful content online falls into this category and stressed how damaging its effects often are. However, they asserted that it is also the most challenging content to regulate, as restrictions on such content are difficult to reconcile with the right to freedom of expression. Some experts asserted that a balance would need to be struck between preserving Charter rights and addressing legal yet harmful content. It was also stated that lawful but harmful content cannot legally be banned but could be regulated by means other than take-down measures.

Some members indicated that a future discussion on what the Criminal Code identifies as "illegal content", and on how the law might apply to content regulation, could help the group set the scope for regulated content.

Theme B: A Systems-Based Approach to Regulation

Moving beyond a take-down approach

Most experts supported the notion of moving beyond a "take-down" approach to content regulation (which was the focus of the approach presented by the Government in the July to September 2021 consultation), shifting instead towards incentivizing platforms to manage risk when developing their products. Many members indicated that this move would achieve harm reduction while also upholding fundamental rights. Others emphasized that a systems-based approach would give businesses the freedom to work toward harm reduction in a way that suits their unique business models. It was also observed that it might initially be challenging for a systems-based approach to be sufficiently clear and consistent in setting out what kind of content needs to be regulated – but that creative ambiguity might be a good thing in the longer term. Some explained that it could lead to platforms doing more to comply because they are unsure what the scope of their obligations truly is. In contrast, others cautioned against creating a legislative framework that is imprecise and offers too much leeway to platforms in interpreting their obligations. These experts stated that such an approach may be favourable to industry but would not be in the public’s best interest.

Some experts highlighted the United Kingdom’s Duty of Care model, introduced in its Online Safety Bill, as one to emulate insofar as it shifts the regulatory conversation away from speech towards platform conduct. Relatedly, a number of experts stressed that the focus of the framework should not be whether regulated services make the "correct" content moderation decisions on a case-by-case basis. They claimed that it should instead be on oversight and compliance. A number also indicated that the Government should take a pragmatic view of online safety and recognize that the removal of all harmful content is neither possible nor an appropriate regulatory goal.

Risk-based regulation

Experts stressed that the Canadian framework ought to treat platforms like other companies that offer products to consumers. They explained that this shift would allow the framework to focus on issues that are addressed in other product-based sectors, like product safety, consumer harm, and risk assessment and mitigation. They asserted that, rather than concentrating on speech, the framework would focus on the risks that platforms take when developing products and incentivize them to optimize for minimizing harm rather than for engagement, virality, and speed – their current business model. Experts emphasized that such an approach would incentivize platforms to innovate quickly, as they would be held liable if their products caused harm and that risk of harm had not been mitigated. The question was also raised of whether such an approach would still amount to regulating speech. Some cautioned against doing indirectly what cannot, or should not, be done directly.

A few experts voiced concern over a systems-based approach. They worried that such an approach would enable the Government to delegate responsibility for harmful content to the private sector. They were also concerned that it would not guarantee, or even necessarily promote, harmonization from one platform to the next. Finally, they worried that such a model would impose an obligation of means on platforms as opposed to an obligation of result. In contrast, other experts contended that a successful systems-based regime would impose both an obligation of means and one of result.

Theme C: The Objectives of a Legislative and Regulatory Framework

There was consensus among experts on the need for a legislative and regulatory regime to address harmful content online. Experts provided information about the prevalence of such content and shared accounts of its harmful effects on Canadians, especially vulnerable groups like children. It was expressed that current enforcement mechanisms and the resources allocated to monitoring the online space for illegal content are insufficient, as a great deal of such content remains online. Some emphasized the importance of instituting more effective tools to remove and address such content. Some experts also stressed that social media has allowed harmful content, which has always existed, to spread at an unprecedented rate and reach an unprecedented number of Canadians. They explained that the online environment has changed the scope of risk posed by such content, as its virality and accessibility make it far more harmful. In contrast, other experts expressed that hateful messaging is as old as humanity and cautioned against exaggerating the role that social media companies play in spreading such harm.

Apart from establishing that a framework is needed, most experts also highlighted that the framework adopted would need clear goals. Many expressed that the Government’s previous proposal was not clear in its objectives. These experts stressed that it would be necessary to 1) establish what a new framework would seek to accomplish; 2) clearly communicate its purpose, so that Canadians understand what the Government is trying to do and how it might affect their behaviour online; and 3) measure its effectiveness, so that governments, civil society, academia, industry, and people in Canada can all see how these measures work or may need to be adapted.

Most experts acknowledged that settling on a legislative objective is a difficult task. Many emphasized that the framework’s objective must include the preservation of Canadians’ fundamental rights and freedoms. They stressed that any objective must reconcile the protection of Canadians from harm with the promotion of equality, freedom of expression, and privacy rights. Many also highlighted that a legislative framework to regulate harmful content online will not be able to achieve perfect results. They cautioned against aspiring to perfection, asserting instead that an objective of harm reduction, as opposed to the elimination of harm, should be sought. Finally, two additional objectives were raised in discussions: 1) the reduction of suffering caused by harmful content online; and 2) providing an avenue of redress to victims of such content.

Theme D: Freedom of Expression, Equality and Privacy Rights

Most experts emphasized that any effective framework would need to balance Charter rights. Many members cautioned against prioritizing certain rights over others and emphasized that all rights are subject to limits.

Regarding freedom of expression, some experts explained that not all speech is protected to the same extent. They emphasized that the rights of both victims and those spreading harmful content are important to consider, and that when those rights conflict, justifiable limits exist. Some experts emphasized that freedom of expression and other Charter rights must be given equal weight. A few experts explained that limiting a user’s individual right to share harmful content online may be justifiable if it is done to preserve the collective right of other users to a safe online space.

Many experts emphasized that an effective framework must also consider equality rights and protect marginalized communities. Some stressed that marginalized communities are often silenced and threatened by harmful content online. Many also highlighted that marginalized communities often do not feel comfortable flagging or issuing complaints about harmful content they face online.

Theme E: How to Best Support Victims and their Needs

Many experts spoke about the importance of approaching any legislative framework that addresses harmful content online from a victim’s perspective. They provided examples of how vulnerable groups in Canada, like children, are routinely victimized and targeted online, and stressed how imperative it is that the Government act to ensure online safety for these groups. They explained that existing criminal laws are not adequately protecting Canadians from being victimized by harmful content online, and that hateful people and organizations have found a comfortable place on social media from which to issue their threats.

Many experts also highlighted that the onus must not be placed on the victim to file or engage in a complaint procedure. They stressed that putting the onus on victims to seek recourse or a reasonable content moderation outcome often deepens or exacerbates the process of victimization. Procedural safeguards like redress mechanisms would need to be carefully thought through to ensure victims are adequately protected.

Finally, experts emphasized the importance of hearing from victims prior to the creation of legislation, to ensure that their voices are reflected in a framework aimed at protecting them. A few experts stressed that children do not have agency, and that if we do not proactively give them a voice, they are not going to be heard in the conversation around regulating content that is so harmful to them.

Theme F: The Regulatory Toolkit

Some experts expressed that a systems-based approach would need to be supplemented by other regulatory tools. They shared that even with an effective systems-based framework there would still be a need for a recourse mechanism to be available to victims of harmful content online in exigent cases.

A few experts emphasized that the framework would need to concern itself with prevention, democratic engagement and debate as much as risk-based assessments. They highlighted a need for programmatic supports related to harms facing children, as well as supports for democracy and civic engagement.

Experts also discussed platform compliance. Many stated that enforcement cannot be left to the good graces of industry players. They highlighted that under a systems-based approach where platforms are given the flexibility to determine how they will fulfill their obligations, penalties and oversight would be critically important. Many explained that obligations are only useful insofar as they are backed up by penalties for non-compliance. A few experts stated that it would be necessary to address the question of intermediary liability during future workshops.

Members also discussed metrics for oversight. A few experts said that platforms have resisted acting on even the most flagrant types of harmful content, like child pornography. In contrast, a few other experts stated that while the darker parts of the internet may not comply, large mainstream platforms have a policy commitment to moderate harmful content and are generally responsive to social and political pressure. A few experts also emphasized the importance of equipping a regulator with audit powers, to ensure it has the authority to hold platforms accountable for their risk assessments and to ensure maximum transparency.

Theme G: Lessons Learned from Other Jurisdictions

Many experts highlighted that Canada has a second-mover advantage in regulating harmful content online. They stressed that the empirical evidence emerging from other jurisdictions that have already legislated in this space is a significant asset, as many lessons can be learned from what has gone right, and wrong, under earlier frameworks.

They explained that there are components of other legislative and regulatory regimes that can and should be adopted and modified for the Canadian context. For instance, they noted an emerging global consensus that a systems-based approach has seen the most success.

They also listed elements that they would advise against replicating. For instance, according to these experts, research has demonstrated that content-based regimes are problematic for several reasons, one of the most significant being their chilling effect on speech. They also noted that some legislation in other jurisdictions has set broad goals, such as the elimination of extremism, which cannot be measured using the ethnographic data available. Such an approach raises serious concerns about how to assess the success of a regulatory regime.

Finally, a few experts stressed that a strong lesson learned from other jurisdictions was that consulting with and hearing from minority groups and civil society organizations is a necessary part of any successful regulatory process.

Objective 2: Identify further themes that group members want to explore in some depth, in addition to what is outlined in the itinerary.

A few experts identified additional themes that they would like to discuss in future workshops. The first was how best to communicate a new legislative and regulatory framework to Canadians. They emphasized that the communication of a framework to regulate harmful content will be important, as such a framework has the potential to erode or reinforce the public’s faith in Government and other democratic institutions. The second theme raised was the exploration of tools other than legislation to address harmful content online, such as providing greater resources and funding for victim support.

Two additional concerns were raised regarding the itinerary. First, a few experts emphasized that any legislation introduced would need to be clear and intelligible. They expressed that many Canadians would be making use of this legislation and that it would be crucially important that they be able to interpret and understand it. They also stressed the importance of bringing Canadians along as the Advisory Group generates its advice and as the Government introduces legislation. They explained that words matter and cited examples of Canadians believing that they have First Amendment rights, a Bill of Rights, or limitless free speech rights. They stated that the concept of a duty of care has intuitive communicative power and can, and indeed should, be communicated properly to help Canadians accurately understand what is being proposed. Second, a few experts identified elements they want to learn more about, such as the law and its limits. They explained that they wish to have a better understanding of how established laws play into the issues that the Advisory Group is tasked with exploring.

Objective 3: Introduce and gather feedback on the process for the sessions going forward.

Many experts shared, and committed to sharing, readings and other resources on the topics discussed during the workshop, as well as on additional relevant topics of interest. The expert group also began discussing which stakeholders they would like to meet with as part of their external consultative process. A few experts stated that it would be beneficial to meet community members who would be willing to speak about their experience with harmful content online, so that the Expert Advisory Group can hear how such content is affecting Canadians. They emphasized that the group would be able to learn much from the lived experience of victim groups directly.

Next Steps

The next workshop for the Expert Advisory Group will take place on Thursday April 14th from 1:00-4:00 pm EDT. Experts will discuss the Subjects of Regulation worksheet at this session.
