Summary of Session One: Subjects of Regulation

Members of the Expert Advisory Group on Online Safety held their first session on April 14th from 1:00-4:00 pm EDT. Eleven group members were present for the Subjects of Regulation session. The Advisory Group was joined by Government representatives from the Departments of Canadian Heritage, Justice, Innovation, Science and Economic Development, Public Safety, and the Privy Council Office. Representatives from the Royal Canadian Mounted Police were also present. The Group members shared their views about the types of entities that should be regulated in the legislative and regulatory framework for online safety.

This summary provides an overview of the first session. Per the Terms of Reference for the Advisory Group, these sessions operate under the Chatham House Rule. As such, this summary does not attribute the views expressed to any one group member or organization. It outlines the views expressed during the session; reports areas of agreement, disagreement, and discussion; and organizes the discussion under thematic categories. It should not be considered a verbatim recitation of the discussion.

There were two stated objectives, with one sub-objective, for the Subjects of Regulation workshop:

  1. Assess the viability of proportionate regulation organized by categories:
    1. Specifically, evaluate if there should be a separate category for adult content services; and
  2. Determine the range of services to exclude from the framework.

This summary reports on the perspectives raised in relation to the objectives and organizes the discussion points raised according to issue-specific themes.

Introductory Remarks

Introductory remarks set the stage for the discussion, and touched on the following topics:

The need for a clear, understandable and flexible definition to set the scope for an eventual framework, and how to position this definition in legislation and regulation. One challenge will be to strike a balance between:

  1. a precise definition in legislation that may not stand the test of time as technology evolves; and
  2. a broader definition set out in legislation that takes into account the fast-moving evolution of the internet, but which would be paired with specific terms in regulation to set the scope of the framework.

The need to determine how deeply legislation and regulation should reach into the tech stack. Should legislation only capture social media platforms (i.e., the ‘top layer of the technology stack’) or should it reach further down? Should domain name registries or content processors be included in the scope of the definition? Should top-layer interactive services that are not social media platforms – like video gaming platforms and streamers, or crowdfunding platforms – also be included?

The challenge of managing the ‘grey area’ when it comes to distinguishing private from public communications. How should private communications be treated? Should legislation target all communications services without regard to the public or private nature of the messaging, or should legislation specifically exclude private messaging apps?

The range of lessons to learn from other jurisdictions. Should inspiration be drawn from the United States, Europe, or the United Kingdom, for example?

How prescriptive legislation should be when it comes to setting a ‘duty of care’ or ‘duty to act responsibly’. Should regulated entities be placed in categories with varying levels of regulation and obligation, or would a ‘duty of care’ model, by definition, need to be broadly applicable in scope – and thereby neutralize the purpose of categorization by applying to all regulated entities equally? Should adult content services be set in their own regulatory category?

Theme A: The Scope and Definition of Entities Captured

Scope of Legislation

Experts agreed that the range of entities captured under the legislative framework should be broad. Many shared that a broad scope of application is a natural consequence of a systems-based, ‘duty of care’ model for regulation. Most experts noted that the scope of regulated entities should be broad enough to incorporate all entities that communicate online, and that duty of care obligations should be followed not only by social media platforms but also by other actors within the technology stack, such as intermediary services. A few explained that it would be hard to assert generalized obligations to act responsibly on some platforms (e.g., social media companies) but not on others that operate on the same level of the tech stack (e.g., Airbnb, video gaming platforms). Some experts noted that a broad definition would help address evolving and emerging technologies and future-proof the legislation.

Many experts mentioned that there is justification for looking more widely at including some interactive services like Airbnb and gaming platforms, or content delivery networks like Cloudflare, under the scope of legislation, but a few experts also indicated that it could be difficult to regulate all of the various business models found among platforms. For example, it may be difficult to regulate platforms based on live verbal communications, such as Clubhouse, or gaming platforms. Many experts noted that a broad definition could be the best approach for capturing these different business models. A few added that a broader scope would also help to include entities that are successful in the recruitment of violent extremists. These communities adapt quickly and have been moving to video game services, file sharing sites, and live audio applications like Clubhouse.

A few experts spoke to the concept of intermediary liability. They highlighted that setting out clear rules of the road for intermediary liability could, for example, create a regulatory environment wherein a regulated service would not be liable for the content itself, but could be liable for not doing a good enough job monitoring, moderating and managing harmful content.

A few experts also cautioned against an overly broad scope. They noted that the scope would need to be broad enough to capture entities that are likely to cause harm while also balancing Charter rights and freedoms, including privacy rights. Some pointed to the challenge of regulating entities that facilitate the housing and distribution of harmful content, particularly when it comes to child sexual exploitation, and discussed how this could include the regulation of entities deeper in the tech stack (e.g., server operators) that are not as intuitively public-facing as top-of-the-stack applications like social media platforms.

Definition of Online Communication Services and Alternative Suggestions

The experts reviewed wording for a prospective definition, using the concept of an ‘online communications service’ as set out in the worksheet. They also explored alternative ways to define regulated entities. They did not agree on an approach and expressed a range of views on the matter.

A principal driver of the discussion around the definition of regulated entities was the definition of an 'Interactive Computer Service' (ICS) as set out in section 230 of the United States' Communications Decency Act, which also figures in chapter 19, section 17, part 2 of the CUSMA trade agreement. It was suggested that Canada’s definition of regulated entities should mirror that agreed to under the CUSMA trade agreement, whereby any service through which a user can post content, without editorial review, would be captured. It was said that this definition would include entities such as Airbnb, Facebook/Meta, Messenger, the metaverse, and Discord. It was also said that this definition is starting to be used more internationally as it becomes more prevalent in various trade agreements with the U.S. and other countries.

A few experts were of the view that the broad definition of an ICS as "a system or service that provides or enables electronic access by multiple users to a computer service" could appropriately scope in the entities to be regulated, and could also help address evolving and emerging technologies to future-proof the legislation.

Other experts did not agree with this approach, as they did not want to import an American definition or frame. They noted that they did not want to be tied to the U.S. definition, as the jurisprudence associated with it is ‘loaded’ and they would not want to impose the legacy and conventions of that framework on Canadian legislation. It was also said that a broad definition like that of an ICS would go too deep below the 'top stack' of the internet – and regulate entities that the government would not mean to regulate, like DNS providers or the transmission of data by telecommunications service providers.

Other experts suggested that the definition of regulated entities should align more with European models. They underlined that there are more commonalities in jurisprudence between Canada and Europe, for example, and that Canada’s approach to defamation under common law is similar to the approach under Europe’s e-Commerce Directive. They mentioned that moving too quickly to a definition, such as the definition of an ICS, would limit the ability to fully examine lessons from other jurisdictions and would be a missed opportunity. Other experts stressed the importance of a made-in-Canada definition being in line with the CUSMA trade agreement, as well as other trade agreements that Canada has entered into.

Theme B: Duty of Care/Duty to Act Responsibly

Systems-Based Approach

Most experts noted that an ex-ante, ‘duty of care’ approach that puts the responsibility and obligations on platforms would be the best approach when it comes to dealing with platforms with varying business models. It was said that a systems-based approach is particularly useful because platforms are also the source of innovation in devising solutions, in particular, to reduce harm and to address some of the more ‘lawful but awful’ types of content. In this view, regulations should be about creating incentives for platforms to address harmful content on their services, while also pairing these incentives with accountability mechanisms.

A few experts mentioned that they do not trust larger platforms to do what they should be doing under a ‘duty of care’ model. They strongly advised against self-regulation or self-monitoring altogether as, in their view, it simply doesn't work. They emphasized that the tools that large companies have in place already to address harmful content are falling short when it comes to protecting minority communities and vulnerable communities.

Duty of Care vs. Duty to Act Responsibly

The Expert Advisory Group explored the terminology and jurisprudence of a ‘duty of care’ model, as outlined in the worksheets and in the United Kingdom’s Online Safety Bill, as opposed to the concept of a ‘duty to act responsibly’. It was noted that ‘duty of care’ is a very old concept in law, particularly in tort law. Some members observed that, over the last two years, the Canadian Commission on Democratic Expression has developed the idea of a ‘duty to act responsibly’ as distinct from a duty of care, and asked for some clarification on the concept.

It was explained that a ‘duty to act responsibly’ is meant to have some resonance with the idea of a ‘duty of care’, while being set out as a distinct legal concept in statute. Instead of drawing on the legacy implications associated with a ‘duty of care’, the ‘duty to act responsibly’ could be a distinct statutory duty to, for example, engage in risk assessments or risk minimization.

Many experts agreed that the terminology of a ‘duty to act responsibly’ would provide some breathing room from the baggage and jurisprudence of the term ‘duty of care’, which has been litigated for hundreds of years. A few others added that there could be unpredictability and uncertainty associated with the term ‘duty of care’, insofar as it could be unclear to those who have a duty just what that duty entails, which could lead to negative consequences. This uncertainty could be mitigated with a ‘duty to act responsibly’, which could clearly set out obligations around what the ‘duty’ entails.

A few experts noted that it is also important to sketch out what is meant by a ‘duty of care’ or a ‘duty to act responsibly’ and what an approach built around this concept would look like in terms of end goals. In particular, experts indicated that an eventual framework would need to be crystal clear in setting out the duties, establishing an acceptable baseline or standard of care, and defining metrics to indicate when a regulated entity is falling short of that standard of care.

Theme C: Categorization

Members discussed the idea of categorizing regulated entities, such that the obligations for a regulated entity could vary according to its size, volume of content, or risk of harm. The experts shared a range of mixed views on this topic.

A few experts cautioned against the use of categories. Some mentioned that categories might be useful when it comes to adult content services or content such as child sexual exploitation content, because these are comparatively clear-cut – but that it would be difficult to draw clear categories around other types of service and/or content. Another expert stressed that multiple regulatory categories could lead to debates about which entities belong to which categories, especially if certain entities fall into the grey area between categories, and could undermine the efficiency of the regime as a whole.

Others shared that the focus should be on risk, first of all, instead of arranging services by category or type. They shared that a focus on risk assessment would draw out natural fault-lines and categories of risk, whereby categories of low-risk, medium-risk and high-risk would surface naturally. They mentioned that, in this model, the onus would be on the platforms to demonstrate their risk level. They explained that there could be more stringent obligations on entities with the highest risk of harm, and a lower level of obligation on entities with the lowest risk of harm.

Theme D: Risks and Risk Assessment

The discussion of regulatory categorization led to a deeper discussion of risk and risk assessment.

Many experts mentioned that regulated entities should have a ‘duty to act responsibly’, which includes determining the degree of risk on their platforms. They stressed the importance of a clear, broad and accessible definition of the risks that would be managed under a prospective 'duty of care' or ‘duty to act responsibly’ model. The group discussed a four-part framework for how to think about definitions of risks and how obligations should be considered for regulated entities. This discussion was organized around the concepts of risk, rights, duty, and liability. It was explained that a focus on risk assessment also needs to take account of the rights of those who use the service; articulate duties for the services to live up to; and define liability for services that fail to live up to their duties. Here, too, a number of experts agreed with a frame of this kind in principle but underlined the need for details and specificity. A few experts noted that these concepts are multidimensional and intersect with one another, and that they may need to be examined in a grid-like fashion.

Some experts cautioned that the framework would need to be quite specific about what constitutes risky content. They asked: Is it a risk for the user, the person posting the content, or the platform? And does the seriousness of the harm of the risky content justify the nature of the rules and regulations that could be put in place? Other experts strongly indicated that technology companies should not be in the driver’s seat in setting out what constitutes risk on their services.

A few experts also raised the importance of thinking about implementation and how these concepts would play out in practice. They stressed the need to develop an approach that does not exceed the practical limits of the government’s ability to deliver and execute, or the public’s ability to follow along and understand.

A few experts noted that caution should also be taken when implementing a risk-based approach, as regulations or implementation actions could have unintended consequences for, or be biased against, marginalized and diverse communities. They noted the possibility of voices being disenfranchised and silenced if there were significant use of automated systems or over-surveillance.

Theme E: Private vs. Public Communications

Experts discussed whether legislation should differentiate between private and public communications, and how this differentiation could work in practice. They had mixed views about this notion and provided various explanations to support their conclusions.

There was consensus among experts that there is a significant grey area between private and public communications, and that it would be difficult to distinguish between the two for regulatory purposes. Experts spoke about, for example, the size of Facebook groups or group chats in Facebook Messenger. They questioned how they would distinguish a private group from a public group and what factors, such as the size of the group, should be considered when making the distinction.

Many experts supported the notion that private communications should be included under the scope of the legislative framework. Some experts highlighted that a large share of harmful content, such as terrorist content or child pornography, is often shared in private communications rather than on public forums – and that excluding these types of communications would leave a lot of harmful content on the table. Some experts noted that a systems-based framework with risk-based assessments could address harmful content in private communications, whereby the platform has a duty to act responsibly to create a safer environment in all aspects of its platform. As an example, platforms could put tools in place, such as reporting mechanisms, tools to accept or deny a message before seeing it, or tools that mitigate the risk before it emerges. They noted that, in this way, regulations would not need to impose a proactive obligation on platforms to monitor private communications to mitigate harms. Others mentioned that the downside to tools such as these is that the onus is placed on those who are experiencing the problems. It was noted that there will need to be a balance between the right to privacy and the right to redress.

On the other hand, some experts indicated that there need to be operational definitions of which functions – public, private, encrypted, or hybrid – should be within the regime, and which should be out of scope. Others noted that private messaging applications should be excluded, as many platforms already monitor private spaces for hateful content; for example, when a QAnon video is sent through a private message on Twitter, the recipient receives a ‘harmful content’ warning. They suggested that private messaging could instead be captured by a broader duty of care approach, which could have regulated entities ensure a duty of care over all content on their platforms in general.

Theme F: Adult Content

Members discussed whether adult content and adult content services should be regulated or treated in a different manner than other content and services.

Many experts suggested that child sexual exploitation content should be treated the same as other types of content and should not constitute its own category. Experts mentioned that there is room for this type of content to be addressed in a risk-based framework. They indicated that treating it differently may create a slippery slope – such that, in practice, regulated services would push for other categories of harmful content to be treated in unique fashion, too.

A few experts mentioned that, although they do not think child sexual exploitation content should be treated separately, there could be different obligations or mechanisms in place for more immediate action against it and other extremely prejudicial content. For example, mechanisms like artificial intelligence work well with certain types of content, such as child sexual exploitation content, using hashes and extensive databases of known material to enable quick action. Some experts noted that mechanisms that work well for certain types of content may not work as well for others, and there could be other ways to address those types of content.
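For illustration only, the hash-matching mechanism referenced above can be sketched as a simple lookup of a content fingerprint against a database of fingerprints of known material. This is a minimal, hypothetical example, not a description of any platform's or vendor's actual system: production tools generally rely on perceptual hashes (which remain stable under re-encoding or resizing) and large, vetted databases maintained by trusted organizations, rather than the exact cryptographic hash and hard-coded set used here.

```python
import hashlib

# Hypothetical set of fingerprints of known, previously identified material.
# In practice this would be a large, vetted database, not a hard-coded constant.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def fingerprint(content: bytes) -> str:
    """Return an exact cryptographic fingerprint of the content.

    Real deployments generally use perceptual hashing, which tolerates minor
    alterations; SHA-256 is used here only to keep the sketch self-contained.
    """
    return hashlib.sha256(content).hexdigest()


def matches_known_material(content: bytes) -> bool:
    """Check an upload against the database of known hashes."""
    return fingerprint(content) in KNOWN_HASHES


# Example: screen an upload before it is published.
if matches_known_material(b"example upload bytes"):
    print("Match found: escalate for immediate action and human review.")
else:
    print("No match: continue with the normal moderation process.")
```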

Next Steps

The next workshop for the Expert Advisory Group will take place on Thursday, April 21st from 1:00-4:00 pm EDT. Experts will discuss the Objects of Regulation worksheet at this session.
