Supplementary Worksheet: Legislative and Regulatory Obligations

What is a supplementary worksheet?

The Expert Advisory Group on Online Safety launched by the Minister of Canadian Heritage on March 30, 2022, meets each week to discuss key elements of the legislative and regulatory framework for online safety. As discussions take place, additional questions emerge.

This supplementary worksheet contains follow-up questions to collect more detailed information to inform the design of an effective and proportionate legislative and regulatory framework. Neither the group’s advice nor the preliminary views expressed in the worksheets constitute the final views of the Department of Canadian Heritage or the Government of Canada.

Objective

Specify what duties regulated services ought to have, and how prescriptive those duties should be.

Follow-up questions

  1. What is a duty to act responsibly?
    1. Broadly, what does it entail? Specifically, what does it include?
    2. How will the Digital Safety Commissioner know if a regulated service is fulfilling its duty to act responsibly?
    3. Can a service still comply with its duty to act responsibly if it is not removing egregious, or “illegal”, content? Put differently, are removal obligations necessarily part of a duty to act responsibly?
  2. How should a regulatory framework:
    1. Institute special obligations to protect children?
    2. Institute special obligations to protect user rights (e.g., free expression, privacy, equality)?
    3. Require services to appoint an internal ombudsperson to help support victims?
  3. What would be the goal of an external, independent recourse body?
    1. To assess whether the regulated services are implementing their terms of service correctly?
    2. To determine if content should be removed?
    3. To determine if content is criminal (i.e., is it an offence under the Criminal Code)?
    4. To determine if content is unlawful under this regulatory regime?
  4. Should a recourse body be granted the power to order a service to remove content?
    1. How would the recourse body be different from the Digital Recourse Council proposed by the Government in 2021?
  5. How do you understand a ‘general monitoring scheme’ when it comes to online services?
    1. Would it be problematic if, as part of fulfilling its duty to act responsibly, a service’s digital safety plan outlined a general monitoring scheme to determine whether its structural identification and mitigation measures were working?

Summary of the Expert Advisory Group Discussion

Specificity of Obligations

The Expert Group disagreed on how specific and prescriptive legislative and regulatory obligations ought to be. Some members argued that the goals of an ex-ante framework are best achieved through the indirect means of legislating a broad duty to act responsibly. They explained that this duty would obligate regulated entities to identify the risks posed by their service and to mitigate them appropriately, and highlighted that it would not impose any specific obligations governing content policies or content moderation procedures. These experts emphasized that some ambiguity regarding what regulated services must do to meet their obligations would drive compliance, as services would err on the side of caution: instead of complying minimally, services would take additional measures to ensure they adhered to their obligations. Proponents of this approach also stressed that some degree of flexibility is needed when setting out regulatory obligations in the digital sphere, given the rapidly evolving nature of the internet; if the legislation were too detailed, it would quickly become outdated.

Other members were of a different view and argued for more detail and specificity. They emphasized that it would be necessary to establish clear expectations for services’ behaviour, explaining that if regulated services were given the flexibility to determine what “acting responsibly” means, they would inevitably underestimate their risk, allowing them to adopt less onerous risk-mitigation measures. Some experts suggested that greater specificity regarding obligations could be developed through binding or non-binding codes of conduct. These codes of conduct, they explained, could be drafted by multi-stakeholder groups involving regulated services, civil society, victim advocacy groups, and the regulator.

Tailored Obligations

The Expert Group discussed different ways to tailor the regulatory obligations. Some approaches could distinguish between different types of services, while others could distinguish between types of content.

Regarding regulated services, many experts stated that both risk and capacity must be considered when imposing obligations. Some suggested a matrix approach whereby the obligations imposed on a service would depend on the risk posed by its activities, coupled with its capacity to mitigate that risk. They explained that the European Digital Services Act incorrectly assumes that the biggest platforms will pose the most risk, ignoring important factors like the type of content a platform hosts or its business model. It was explained that a service should be held to a higher regulatory standard if it hosts adult content, considering the risk posed by such content, even if the service itself would be considered relatively small in terms of its user base and/or revenue streams. Experts also explained that, for a given level of risk, the framework should expect more from a bigger platform than from a smaller one.

Many experts stated that it would be reasonable to interpret the duty to act responsibly in a heightened fashion for platforms that target children or that host adult content. Others stated that heightened obligations should be imposed on a content-based spectrum, with little leeway granted to platforms regarding their duties when faced with child sexual exploitation content, the non-consensual sharing of intimate images, and the live streaming of an attack.

General Monitoring Scheme

Some experts stressed that there is a risk that a systems-based approach could indirectly promote a system of general monitoring. They explained that each legislative provision must be scrutinized to ensure no general monitoring obligation exists, as such obligations have a negative impact on freedom of expression, equality rights and privacy rights - especially for equity-deserving communities.

Independent Recourse Body for Content Moderation Decisions

The Expert Group disagreed over whether a legislative and regulatory framework should provide for an independent, external recourse body for content moderation decisions.

A number of experts emphasized that victims need the ability to seek recourse for content moderation decisions independent of the regulated services. They explained that external recourse is necessary because, even with a risk-based ex-ante approach, regulated services may over-remove content, a phenomenon that has disproportionately affected equity-seeking groups. They indicated that these groups do not have access to recourse mechanisms. They stressed that many do not feel comfortable going to law enforcement, and that even when they do, the content usually remains online. They also insisted that asking users to go through the courts is not realistic, as doing so is time-consuming, expensive, and often forces victims to re-live their trauma. Finally, proponents of the independent recourse model underlined that, without such an adjudicative body, victims would continue to be taken advantage of. They pointed out that there are private services that exploit victims by promising to get the harmful content removed. Experts stressed that there is a void that needs to be filled by a trustworthy body able to support victims who are seeking recourse.

Other experts explained that such a system would be essentially unworkable. They cited concerns around freedom of expression, government censorship, and the practicality of handling an inevitably large volume of complaints. Instead, they proposed a recourse body more compatible with an ex-ante model, whereby users could issue complaints regarding systemic problems with the manner in which a regulated service operates (e.g., a systemic failure to act against a specific type of content, or evidence that a service is engaging in discriminatory practices against a certain group), as opposed to specific content-moderation decisions.

Broadly speaking, however, most experts seemed to agree on the need for an independent adjudicatory body to address exceptional cases as a measure of last resort. Experts largely converged on the need for a progressive escalation process for recourse that begins at the platform level with the duty to act responsibly. They explained that if the duty to act responsibly is well implemented, many content moderation disputes would be resolved either through ex-ante obligations that prevent harmful material from being shared in the first place or through efficient and effective adjudication at the platform level. Experts explained that errors would inevitably occur under the duty to act responsibly, and it is when those errors occur that an ex-post recourse council would come into play.

Independent Ombudsperson

Many experts emphasized that an independent ombudsperson should also be set up to assess platform responsibility on a system level. Some experts proposed this ombudsperson as an alternative to the recourse body, while others stressed that both bodies would be necessary - the ombudsperson would provide recourse for systemic issues regarding platform behaviour, whereas the recourse body could address individual cases of content moderation.

Recourse Embedded within the Regulated Services

Experts largely converged around the idea of requiring regulated services to have two types of internal recourse mechanisms: 1) appeal processes for content moderation decisions; and 2) an ombudsperson to help support victims who need guidance concerning problematic content or platform behaviour.

Many experts agreed that, as part of their duty to act responsibly, regulated services should be required to have appeal mechanisms for their users that are effective, efficient, and user-friendly. They stressed that some social media platforms currently have no recourse mechanisms, which is highly problematic. Experts emphasized that regulations should prescribe the criteria for such internal appeal mechanisms, including the right to human review, user-friendliness, and timeliness.

Many experts also emphasized that an ombudsperson could support victims by teaching them how to flag harmful content or guiding them through the steps to issue a complaint. Some experts advocated for an ombudsperson that can act as a voice for children.
