Session Seven: Connection to Law Enforcement
What is a Worksheet?
Each advisory group session will be supported by a worksheet, like this one, made available to the group in advance of each session. The goal of these worksheets is to support the discussion and organize feedback and input received. These worksheets will be made public after each session.
Each worksheet will have a set of questions to which group members will be asked to submit written responses. A non-attributed summary of these submissions will be published weekly to help conduct the work in a transparent manner.
The proposed approach in each worksheet represents the Government’s preliminary ideas on a certain topic, based on feedback received during the July-September 2021 consultation. It is meant to be a tool to help discussion. The ideas and language shared are intended to represent a starting point for reaction and feedback. The advice received from these consultations will help the Government design an effective and proportionate legislative and regulatory framework for addressing harmful content online. Neither the group’s advice nor the preliminary views expressed in the worksheets constitute the final views of the Department of Canadian Heritage or the Government of Canada.
Discussion Topic
What are the relevant issues to consider and appropriate connections to make between a new systems-based regulatory regime and the work of Canadian law enforcement and national security agencies?
Objectives
- Determine what role, if any, regulated entities should play under the proposed framework in:
- Notifying law enforcement and/or national security agencies of specific types of content that the platforms choose to remove; and
- Preserving data related to certain types of content, recognizing that any existing legal obligations would continue to take precedence.
When content that is illegal, reasonably suspected of being linked to criminality, or of national security concern is removed before law enforcement and the Canadian Security Intelligence Service (CSIS) can be made aware of it, these agencies may be blinded to potential threat actors and unable to prevent acts of violence or criminality emanating from the online space. Notification and preservation obligations could help address this concern.
- Identify and discuss potential consequences of imposing notification or preservation requirements on regulated entities with respect to specific types of harmful content, including the issue of legal thresholds (e.g., reasonable grounds to believe content is criminal, likelihood of causing harm, imminent risk of serious harm, etc.). Stakeholders are concerned about users’ privacy rights, the ability of platforms to make decisions on the legality of content, the feasibility of platforms preserving vast amounts of data, and the disproportionate impact such obligations could have on certain marginalized groups. A proportionate response will need to address law enforcement and national security agencies’ concerns related to the removal of certain content, while also considering the abovementioned unintended consequences.
Note: Mandatory reporting and preservation elements of this proposal do not intend to replace or supersede any existing law enforcement reporting or preservation requirements related to child pornography offences under An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service (Mandatory Reporting Act or MRA), as those would remain status quo in accordance with existing legislation.
Starting Points
- The regulatory framework should be constructed in a way that reduces users’ exposure to harmful content online but does not unduly hinder efforts to address harms emanating from the online space. The former objective can be addressed through a systems-based regulatory approach which focuses on content moderation. The latter must be fulfilled through law enforcement and public safety responses, facilitated through timely access to information about threat actors as well as criminal and national security threat activity. The framework envisioned in this exercise presents an opportunity to develop tailored solutions to the public safety aspects of online platform regulation, keeping in mind the need to protect Charter rights and freedoms.
- The design of the framework should be mindful of any negative effects that a systems-based content moderation regime could have on the work of law enforcement and national security agencies and mitigate such an impact. There is a risk that a regulatory regime focused on the timely removal of harmful content could result in the loss of evidence required for law enforcement investigative purposes. The creation of a regulatory framework that compels platforms to have systems and processes in place to moderate harmful content involves consideration of the impact of such new regulations on law enforcement and national security agencies.
- The reporting of allegedly illegal activity by private citizens or groups to authorities has historically disproportionately targeted certain marginalized groups in Canada. Many respondents to the July-September 2021 consultation emphasized that content from marginalized communities already receives a disproportionate amount of flagging compared to similar content from other communities. Coupled with the 24-hour takedown obligation, some argued that the previous proposal could have resulted in regulated entities excessively forwarding content associated with these groups to law enforcement or CSIS for investigation.
- The temporal window between illegal content being posted and then quickly being removed is a key issue affecting law enforcement and CSIS. The absence of a 24-hour content moderation requirement in a new systems-based approach may, overall, raise fewer public safety risks than the previous proposal. However, significant concerns remain around ensuring that public safety and national security agencies have access to data for investigative purposes.
- Finally, data deletion, coupled with the absence of data retention laws and industry standard practices, continues to be a significant problem affecting law enforcement and CSIS’ ability to investigate criminal offences and national security threats linked to content that has been removed. This longstanding problem warrants consideration of whether specific data preservation requirements are desirable from a public policy perspective.
Overview of Proposed Approach
- Broad mandatory notification requirements to law enforcement and national security agencies are likely not necessary under a systems-based regulatory approach, recognizing that any existing legal reporting and data retention obligations would continue to take precedence. Given the absence of 24-hour removal requirements, Charter concerns around privacy, equality rights, and the freedom of expression, and practical considerations around the potential inundation of reports to law enforcement and CSIS, the risks posed by reporting or notification obligations are likely to outweigh the benefits under this proposed framework.
- However, some basic reporting and preservation obligations would be appropriate for certain types of content. Not all types of harmful content are equal. Consideration should be given to whether the most egregious forms of content should be subject to special notification obligations when they fall within removal parameters. An obligation could be placed on platforms compelling them to notify law enforcement and national security agencies of a particular piece of content in instances where a) they have identified the content as falling within certain narrow categories (e.g., terrorist content), b) they have chosen to remove the content, and c) they have grounds to satisfy a particular legal threshold for notification. With respect to part c) the threshold for notification could be (1) reasonable grounds to suspect that the content in question poses an imminent risk of harm to any person or to property; (2) reasonable grounds to suspect that the content in question poses a likelihood of harm to any person or property; or (3) an alternative threshold. In cases falling within the notification parameters, platforms would also be compelled to preserve data related to the content, including user identifying information. The preservation and notification obligations are seen as necessary complements. Having received a notification of the publicly available information, agencies could then obtain a judicial authorization to acquire the preserved information. The starting point for discussing data preservation is to track the notification requirements, but data preservation beyond this limited scope should be considered (e.g., broader types of criminal content).
- Voluntary obligations could be placed on platforms to address a real-world ongoing or recently concluded terrorist attack and exceptionally violent content online (e.g., the livestreaming of a shooting). As in the previous proposal, an Incident Response Protocol (IRP) could be established, which would empower the Digital Safety Commissioner to issue a voluntary request for regulated entities to immediately remove content associated with an exceptional, emergent, ongoing (or recently concluded) real-world terrorist attack. This protocol would help Canada meet its commitments under the Christchurch Call to Eliminate Terrorist & Violent Extremist Content Online.
Supporting questions for discussion
- Determine what role, if any, platforms should play under the proposed framework in a) notifying law enforcement and national security agencies of specific types of harmful content that the platforms choose to remove, and b) preserving data related to certain types of content, recognizing that any existing legal obligations would continue to take precedence.
- Should mandatory notification and preservation requirements be included in the regulatory framework?
- If so, what types of content should be subject to mandatory notification and preservation and what should the thresholds be?
- Should platforms be asked to make a determination of legality (i.e. report content that they reasonably believe is illegal or evidence of certain offences), likelihood of resulting harm, or imminent risk of harm?
- Should regulatory guidance be issued to help platforms assess content against such thresholds?
- Are there other obligations that could be imposed on platforms to complement the new framework and mitigate any unintended consequences for law enforcement and national security agencies in fulfilling their duties?
- Identify and discuss potential consequences of imposing notification or preservation requirements on platforms with respect to specific types of harmful content, including the issue of legal thresholds (e.g., reasonable grounds to believe content is criminal, likelihood of causing harm, imminent risk of serious harm, etc.), recognizing that any existing legal reporting and data retention obligations would continue to take precedence.
- Are there safeguards that could be imposed to help craft a regime that addresses concerns regarding effects on law enforcement while still respecting users’ fundamental rights and freedoms?
- Would the rights of marginalized groups be disproportionately affected by notification, reporting or preservation obligations? If so, are there safeguards that could mitigate this unintended consequence?
- Do platforms have the capacity and resources necessary to fulfill notification, reporting and preservation obligations?
- Does that capacity change depending on the type of regulated platform?
- Should obligations only be placed on certain platforms (i.e., services with the greatest user base and content volume)?
Mandatory Reporting Act
Discussion Topic
What changes are necessary to An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service (the MRA)?
Objectives
Determine how the Government of Canada should amend the MRA to better protect children while respecting the privacy of Canadians and their right to freedom of expression. The MRA, which came into force in 2011 and specifically addresses Internet-facilitated child pornography offences, is not equipped to deal with the evolution of how child pornography is produced and disseminated via current online platforms.
Previous Proposal
The public consultation outlined several amendments to the MRA to enhance law enforcement’s ability to protect children from online child sexual exploitation. While many respondents did not address the MRA amendments, those who did expressed support for the proposed changes, which include:
- Centralized mandatory reporting of online child pornography offences through the RCMP’s National Child Exploitation Crime Centre (NCECC).
- Enhanced transparency by requiring an annual report to the Ministers of Public Safety and Emergency Preparedness and Justice from the NCECC.
- Clarification that the MRA applies to all types of Internet services including social media platforms and other application-based services.
- Designation of an entity in regulations for the purpose of collecting information to determine the application of the Mandatory Reporting Act.
- Extension of the preservation period for information related to online child pornography offences from the current 21 days to 12 months.
For Discussion: Basic Subscriber Information (BSI)/Transmission Data
The government also sought stakeholder input on two options that would require Internet Service Providers (ISPs) to include certain additional information in their mandatory reports, but only when they have reasonable grounds to believe that a child pornography offence (e.g., distributing child pornography) has already been committed. These options would require platforms to report either transmission data alone (i.e., Internet protocol (IP) address, date, time, type, origin, and destination of the material), or transmission data together with basic subscriber information (BSI) (i.e., the customer’s name, address, phone number, and billing information associated with the IP address).
Law enforcement requires specific information when it receives reports of content in which an online child pornography offence has been committed so that it may initiate investigations, identify offenders, remove victims from continued sexual exploitation, and prevent additional children from being victimized. Under existing law, when content is reported to them, they pursue this information by seeking a court order.
Currently, reports under the MRA are not required to include (1) transmission data or (2) transmission data and BSI. The purpose of including such information would be to expedite the police response in cases where an online child pornography offence has been committed.
- Option 1: Transmission Data: If reporting organizations were required to include transmission data in their mandatory reports to law enforcement when a child pornography offence has been committed, police would have the information they need to identify the applicable person who provides an Internet service and the correct jurisdiction. While the mandatory report would identify the applicable Internet service provider, police would still require a production order to then obtain BSI, the identifying information, from persons who provide an Internet service in order to advance an investigation.
- Option 2: Transmission Data and BSI: Alternatively, reporting organizations could be required to include any held BSI in addition to any transmission data in their mandatory reports to law enforcement. The quicker the information can be provided, the quicker an investigation can begin to identify offenders and remove potential victims from continued sexual exploitation. In this case, police would not require a production order to obtain this information from persons who provide an Internet service. Eliminating this step in the investigative process would allow law enforcement to act faster and save time and resources.
This is important given the volume of reports the NCECC receives each year; in fiscal year 2020/21, the NCECC received 52,306 complaints, reports and requests for assistance from Canadian and international sources, representing a 510% increase compared to 2013/14. This change would enable the NCECC to more efficiently refer matters to the police of jurisdiction for action. Receiving the BSI from the outset would also remove a significant burden on the local police who receive referrals from the NCECC. Presently, police agencies of jurisdiction must often pursue judicial authorization to obtain BSI for investigations reported under the MRA, which imposes significant delays and a strain on already limited resources. Of note, while this change would remove certain steps in the process, all subsequent investigative steps, such as search warrants, would continue with judicial oversight, as per usual.
Supporting questions for discussion
- How can the MRA be amended to expedite the law enforcement response while respecting Charter and privacy rights? What safeguards are needed?
- Are there alternative options to address these challenges that reduce the impact on the rights of Canadians while still protecting the rights of the victims?