Summary of Session Three: Legislative and Regulatory Obligations

The Expert Advisory Group on Online Safety held its third session, on Legislative and Regulatory Obligations, on April 29 from 1:00 to 4:00 p.m. EDT. All members were present. The Advisory Group was joined by Government representatives from the Departments of Canadian Heritage, Justice, Innovation, Science and Economic Development, Public Safety, and the Privy Council Office. Representatives from the Royal Canadian Mounted Police were also present.

This summary provides an overview of the third session. Per the Terms of Reference for the Advisory Group, the sessions operate under the Chatham House Rule. As such, this summary does not attribute the views expressed to any one group member or organization. It outlines the views expressed during the session; reports areas of agreement, disagreement, and discussion; and organizes the discussion under thematic categories. It should not be considered a verbatim recitation of the discussion.

The topic for the session was “What are the legislative and regulatory obligations that should be imposed on regulated entities to reduce the amount of harmful content online, and manage the risk it poses?”

The worksheet for the session included three objectives:

  1. Explore the benefits and downsides of regulatory obligations designed around a duty of care;
  2. Determine the appropriate degree and scope of obligations to impose on regulated online services; and
  3. Consider the amount of flexibility that is appropriate to attach to regulatory obligations.

This summary reports on the perspectives raised in relation to these objectives and organizes the discussion points according to issue-specific themes.

Theme A: Duties Imposed on Regulated Services

Specificity of Obligations

The Expert Group disagreed on how specific and prescriptive legislative and regulatory obligations ought to be.

Some members argued that the goals of an ex-ante framework are best achieved through the indirect means of legislating a broad duty to act responsibly. They explained that this duty would obligate regulated entities to identify risks posed by their service and to appropriately mitigate them. They highlighted that it would not impose any specific obligations governing content policies or content moderation procedures. These experts emphasized that ambiguity regarding what regulated services must do to meet their obligations would drive compliance, as services would err on the side of caution. They suggested that instead of complying minimally, services would take additional measures to ensure they adhered to their obligations. Proponents of this approach also stressed that some degree of flexibility would be needed when setting out regulatory obligations in the digital sphere, given the rapidly evolving nature of the internet. They explained that if the legislation were too detailed, it would quickly become outdated. Other experts emphasized that it would be necessary to establish clear expectations for services’ behaviour. They explained that if regulated services are given the flexibility to determine what “acting responsibly” means, they would inevitably underestimate their risk, thereby allowing them to adopt less onerous risk-mitigation measures.

Other members were of a different view, and argued for more detail and specificity. Some experts suggested that greater specificity regarding obligations could be developed through binding, or non-binding, codes of conduct. These codes of conduct, they explained, could be drafted by multi-stakeholder groups involving regulated services, civil society, victim advocacy groups, and the regulator.

In a similar vein, some experts stressed that tech companies are not averse to more guidance, and that it is common for such services to invite government to clarify obligations when responding to questions about their practices.

Transparency Obligations

Many experts agreed that it would be necessary to impose robust transparency obligations on regulated services regarding their content moderation practices. They explained that this information should be public and should contain sufficient data to allow the Digital Safety Commissioner to assess regulatory compliance. Other experts stated that transparency obligations should be coupled with safeguards to help protect services’ commercial information and other sensitive data.

Some experts emphasized that regulated services should also have obligations for data disclosure to journalists, independent researchers, and academics who wish to conduct their own research into platform content moderation. They indicated that this stream of disclosure could involve some form of certification for those allowed to access the data. Other experts stressed that civil society organizations and victim advocacy groups, who are in the trenches of content removal efforts, also require access to such data.

Experts detailed the type of information that should be included in transparency obligations, including, but not limited to: quantitative and qualitative data concerning platform users; risk assessments; mitigation measures; volume and type of illegal and harmful content; content moderation procedures and outcomes; content moderator profiles; suspected illegal content transmitted to law enforcement; algorithms and recommender systems; complaints and appeals; response time; and protocols, incident reports, and outcomes related to the protection of children.

Tailored Obligations

The Expert Group discussed different ways to tailor the regulatory obligations. Some approaches could distinguish between different types of services, while others could distinguish between types of content.

Regarding regulated services, many experts stated that both risk and capacity must be considered when imposing obligations. Some suggested a matrix approach whereby the obligations imposed on a service depend on the risk posed by its activities, coupled with its capacity to mitigate that risk. They explained that the European Digital Services Act incorrectly assumes that the biggest platforms will pose the most risk, ignoring important factors like the type of content a platform hosts or its business model. It was explained that a service should be held to a higher regulatory standard if it hosts adult content, considering the risk posed by such content - even if the service itself would be considered relatively small in terms of its user base and/or revenue streams. Experts also explained that the framework should expect more from a bigger platform than a smaller one for a given level of risk.

Several experts advocated for the needs of smaller or start-up services. They argued that tailoring regulatory obligations to the size or capacity of a service would ensure that it has the means to comply with its obligations, and would protect smaller or start-up platforms from being overwhelmed. They explained that if regulatory obligations were applied to all services, regardless of size or capacity and without considering proportionality, the regulatory framework could box out smaller players and consolidate power in the hands of the biggest platforms. Other experts cautioned against imposing a lighter regulatory burden on smaller services. They insisted that smaller platforms are ripe for exploitation by threat actors, making them some of the riskiest services.

Many experts stated that it would be reasonable to interpret the duty to act responsibly in a heightened fashion for platforms that target children or that host adult content. Others stated that heightened obligations should be imposed on a content-based spectrum, with little leeway being granted to platforms regarding their duties when faced with child sexual exploitation content, the non-consensual sharing of intimate images, and the live streaming of an attack.

Human Rights

Some experts reiterated that there are real constitutional risks that stem from regulating in this space, even under an ex-ante risk-based approach. They emphasized that user rights will depend on the type of content being managed. For instance, they explained, the freedom of expression implications for illegal content may not be the same as for legal but harmful content. For that reason, these experts advocated for a two-tiered regulatory approach recognizing this fundamental distinction, whereby “harder” obligations would be reserved for illegal content, while “softer” obligations would be introduced for lawful but harmful content.

Concerning “harder” tools, experts explained that there should be a complaints-based notice and takedown obligation for child sexual exploitation content and the non-consensual sharing of intimate images, as the nature of this content supports its removal with minimal barriers to victims. Regarding “softer” tools, some experts suggested that user controls might be an appropriate tool to deal with lawful yet awful content. These experts explained that tools that empower users in this way would address harmful content while still upholding fundamental rights.

Some experts emphasized that it is easy for a risk-based approach to give minimal attention to user rights. They stated that if regulated services are not told how to comply with their duty to act responsibly, the systems they put in place might be rudimentary and result in blunt over-regulation of the space in the name of “compliance”. These experts stressed that obligations must clearly compel regulated entities to assess the risks their services pose to user rights, mitigate any harmful effects, and be transparent about both risks and mitigation measures.

Some experts stressed that there is a risk that a systems-based approach could indirectly promote a system of general monitoring. They explained that each legislative provision must be scrutinized to ensure no general monitoring obligation exists, as such obligations have a negative impact on freedom of expression, equality rights and privacy rights - especially for equity-deserving communities.

Finally, some experts suggested implementing age-verification mechanisms for users. Others stated that mandating age verification to access certain content online would be practically impossible to implement without disproportionate effects on the freedom of expression.

Data and Privacy

Some experts emphasized that when talking about user rights, it is difficult to divorce content moderation issues from other issues like data, privacy, and competition. They questioned whether regulated entities should also be compelled to assess the risk posed by their services with regard to their information and data management. It was suggested that the Government turn back to the Digital Charter as a foundation to build on.

Theme B: The Framework's Objectives

Some experts questioned whether the objective of reducing harmful content online was an appropriate one for the framework overall.

They emphasized that it would be difficult to measure the reduction in exposure to harmful content, as there is no baseline from which to proceed. Some experts stressed the importance of transparency obligations to ensure that exposure to harmful content could reasonably be measured quantitatively in the future. Others suggested alternative objectives around compelling platforms to act responsibly and manage their risk, ensuring platform transparency, holding services accountable, and establishing basic standards for content moderation in Canada.

Some experts emphasized that there is a tendency when regulating in this space to weigh the objective of removing harmful content against respect for the fundamental rights of users. In actuality, they explained, the regulatory framework would contribute above all to the respect and promotion of users' rights. Experts stressed that reducing exposure to harmful content is a goal co-equal with advancing the fundamental rights of users, such that users would be able to participate in online discourse more freely. These experts explained that the framework would not be trying to reduce the negative impacts on these rights, but rather would be striving to protect them. They suggested that reducing harmful content online and respecting user rights be co-objectives, afforded the same degree of importance. Experts noted that this approach was adopted by the European Union’s Digital Services Act, which states that one of its main objectives is to guarantee user rights.

Theme C: User Recourse and Other Victim Support

Independent Recourse Body for Content Moderation Decisions

The Expert Group disagreed over whether a legislative and regulatory framework should provide for an independent, external recourse body for content moderation decisions.

A number of experts emphasized that victims need the ability to seek recourse for content moderation decisions independent of the regulated services. They explained that external recourse is necessary because, even with a risk-based ex-ante approach, regulated services may over-remove content, a phenomenon that has disproportionately affected equity-seeking groups. They indicated that these groups do not have access to recourse mechanisms. They stressed that many do not feel comfortable going to law enforcement, and even when they do, the content usually remains online. They also insisted that asking users to go through the courts is not realistic, as doing so is time-consuming, expensive, and often forces victims to re-live their trauma. Finally, proponents of the independent recourse model underlined that, without such an adjudicative body, victims would continue to be taken advantage of. They pointed out that there are private services that exploit victims by promising to get harmful content removed. Experts stressed that there is a void that needs to be filled by a trustworthy body able to support victims who are seeking recourse.

Other experts explained that such a system would be essentially unworkable. They cited concerns around freedom of expression, government censorship, and the practicality of handling an inevitably large volume of complaints. Instead, they proposed a recourse body more compatible with an ex-ante model, whereby users could issue complaints regarding systemic problems with the manner in which a regulated service operates (e.g., a systemic failure to act against a specific type of content, or evidence that a service is engaging in discriminatory practices against a certain group), as opposed to specific content moderation decisions.

Broadly speaking, however, most experts seemed to agree on the need for an independent adjudicatory body to address exceptional cases as a measure of last resort. Experts largely converged on the need for a progressive escalation process for recourse that begins at the platform level with the duty to act responsibly. They explained that if the duty to act responsibly is well implemented, many content moderation disputes would be resolved either through ex-ante obligations that prevent harmful material from being shared in the first place or through efficient and effective adjudication at the platform level. Experts explained that errors would inevitably occur under the duty to act responsibly, and it is when those errors occur that an ex-post recourse council would come into play. Many experts also agreed that the recourse body’s mandate could be limited in terms of the content it addresses, proposing that it apply only to a subset of illegal content, such as child sexual exploitation content and the non-consensual sharing of intimate images.

Finally, some experts questioned whether such a recourse body should solely take action on content, or whether it should also serve an additional dispute resolution function.

Characteristics of an Effective Independent Recourse Body

Many experts stated that safeguards would be necessary to manage the high volume of complaints to the independent recourse body. Some suggested that legislation set out conditions that would need to be met before a complaint could be issued, such as going through a service’s internal recourse process first. Others stressed that there would need to be concrete steps to weed out frivolous complaints. Finally, some experts emphasized that independent recourse should only be afforded for disputes around the moderation of illegal content. On this latter point, some stated that circumscribing external recourse tools in this way would not only help address the volume problem, but would also rightly acknowledge that, as private entities, companies have the right to choose whether to host legal but harmful content.

Some experts emphasized that it would be necessary to establish what a recourse council would be expected to do. They asked: Is it to assess whether the regulated services are implementing their own terms of service correctly? Or is it to determine whether content is illegal? They explained that the latter task is notoriously difficult, even for courts, and would necessitate a judicial or administrative body with the authority to make such determinations. Others questioned whether the body would have the power to order that content be both removed and reinstated.

Many experts stressed that an independent recourse body would only be effective if it were accessible and user-friendly. They argued that many victims faced with harmful content are worried about their privacy and security. They explained that the complaint process could not be onerous, and that a recourse body would only be successful if users were not intimidated by the system. They highlighted that users would need to be able to represent themselves, convey what they have experienced, and make representations more informally than under the strict evidentiary rules of a court of law. At the same time, they stated that the recourse body must signal that the matter is being taken seriously and should not be so informal as to undermine the gravity of the harmful content being disputed.

Several experts stated that the recourse body would need to be independent from the regulator. They emphasized that the Government cannot be seen to be making rules about what can or cannot be posted on online services. They also insisted on separating the investigative and enforcement functions of the Digital Safety Commissioner from the adjudicative functions of a recourse body.

Finally, some experts emphasized that to be effective, an independent recourse body would need to be sufficiently resourced, issue timely decisions, and be composed of subject matter experts who know the field of content moderation.

Independent Ombudsperson

Many experts emphasized that an independent ombudsperson should also be set up to assess platform responsibility on a system level. Some experts proposed this ombudsperson as an alternative to the recourse body, while others stressed that both bodies would be necessary - the ombudsperson would provide recourse for systemic issues regarding platform behaviour, whereas the recourse body could address individual cases of content moderation.

Recourse Embedded within the Regulated Services

Experts largely converged around the idea of requiring regulated services to have two types of internal recourse mechanisms: 1) appeal processes for content moderation decisions; and 2) an ombudsperson to help support victims who need guidance concerning problematic content or platform behaviour.

Many experts agreed that, as part of their duty to act responsibly, regulated services should be required to have appeal mechanisms for their users that are effective, efficient, and user-friendly. They stressed that some social media platforms currently have no recourse mechanisms, which is highly problematic. Experts emphasized that regulations should prescribe the criteria for such internal appeal mechanisms, including the right to human review, user-friendliness, and timeliness.

Many experts also emphasized that an ombudsperson could support victims by teaching them how to flag harmful content or guiding them through the steps to issue a complaint. Some experts advocated for an ombudsperson that can act as a voice for children.

Support for Children

Some experts stressed the importance of compelling regulated entities to provide special support for children, including easy reporting options and terms that make sense to them (i.e., understandable language). They also argued that reports from children should be automatically prioritized for review, and that consideration should be given to how to facilitate reporting from parents and others on a child’s behalf.

Theme D: Enforcement Mechanisms

Many experts questioned what would happen to regulated services if they fell below the standard of care required of them. They stressed that services should not face civil liability for non-compliance, but rather should be obligated to pay a fine. Some argued that users should not be allowed to sue companies for a failure to meet their duty to act responsibly. Other experts mentioned liability for senior officers as an effective enforcement mechanism. Overall, most experts emphasized that the Digital Safety Commissioner should have the ability to intervene in a flexible and proportionate manner depending on the scale and severity of the situation.

Some experts emphasized that public pressure is an effective incentive for compliant behaviour. They suggested that regulated services be obligated to include a statement on their platforms indicating to users that the service is subject to a duty to act responsibly, and that users have redress mechanisms available to them if they believe the service is not complying with this duty. Experts emphasized that such a reminder would likely put public pressure on services to be compliant and would have the added benefit of fostering an environment where users feel empowered in their capacity to respond to harmful content and comfortable making use of the redress mechanisms available to them.

Several experts stressed that an effective way to foster the development of norms and standards would be to encourage partnership and collaboration between regulated entities and the Digital Safety Commissioner, through which parties could discuss best practices, challenges, and innovative solutions. They emphasized that the key to regulating in this area is to think of how best to create "races to the top" whereby regulation incites services to take their responsibilities ever more seriously over time.

Finally, some experts questioned whether the Digital Safety Commissioner would be adequately staffed to properly enforce compliance. They questioned whether the regulator would be able to recruit staff with the requisite technical competence. Experts explained that such details matter because even the most thorough regulatory framework could be undermined by a lack of enforcement.

Theme E: External Consultation

Experts agreed to devote some time during the next workshop to setting expectations and a plan regarding additional external consultation on a legislative and regulatory framework to address harmful content online.

Next Steps

The next session of the Expert Advisory Group will take place on Friday, May 6 from 1:00-4:00 p.m. EDT. Experts will discuss the Regulatory Powers worksheet at this session.
