Summary of Session Seven: Connection to Law Enforcement
The Expert Advisory Group on Online Safety held its seventh session, on the connection to law enforcement, on May 27 from 1:00 to 4:00 p.m. EDT. Eleven members were present. The Advisory Group was joined by Government representatives from the Departments of Canadian Heritage, Justice, Innovation, Science and Economic Development, Public Safety, Women and Gender Equality, and the Privy Council Office. Representatives from the Royal Canadian Mounted Police were also present.
This summary provides an overview of the seventh session. Per the Terms of Reference for the Advisory Group, these sessions operate under the Chatham House Rule. As such, this summary does not attribute the views expressed to any one group member or organization. It outlines the views expressed during the session; reports areas of agreement, disagreement, and discussion; and organizes the discussion under thematic categories. It should not be considered a verbatim recitation of the discussion.
The discussion question for the workshop was “What are the relevant issues to consider and appropriate connections to make between a new systems-based regulatory regime and the work of Canadian law enforcement and national security agencies?”
The objectives for the session were:
- Determine what role, if any, regulated entities should play under the proposed framework in:
  - Notifying law enforcement and/or national security agencies of specific types of content that the platforms choose to remove; and
  - Preserving data related to certain types of content, recognizing that any existing legal obligations would continue to take precedence.
- Identify and discuss potential consequences of imposing notification or preservation requirements on regulated entities with respect to specific types of harmful content, including the issue of legal thresholds (e.g., reasonable grounds to believe content is criminal, likelihood of causing harm, imminent risk of serious harm, etc.).
This summary reports on the perspectives raised in relation to these objectives and organizes the discussion points according to issue-specific themes.
Theme A: Mandatory reporting and data preservation obligations
Many experts agreed that mandatory reporting and data preservation should be required only under very specific circumstances and should be narrowly scoped to avoid over-policing, especially of marginalized communities, and infringements on privacy rights. Some experts advocated for enshrining such obligations in law, based on the risk of imminent or serious harm.
Other experts cautioned against extending mandatory reporting and data preservation requirements to content that is not obviously illegal since it is very difficult for platforms to determine the illegality of content. Experts explained that such requirements could raise serious human rights and Charter concerns and, therefore, must be used as a last resort, in exceptional circumstances, when there are no other reasonable alternatives available to achieve the policy objectives.
Mandatory reporting or notification obligations
Most experts agreed that mandatory reporting requirements for child sexual abuse material (CSAM) should be dealt with separately under a strengthened Mandatory Reporting Act, considering the volume of CSAM shared online, the egregious nature of such content, and the complexity of investigations. Some experts stated that the Mandatory Reporting Act should be amended to extend the data preservation period from 21 days to 12 months. A few experts supported authorizing police access to transmission data and Basic Subscriber Information (BSI) to ensure timely investigations, recognizing that safeguards, such as obtaining a court order, should also be built in.
Some experts explored whether other types of content should be subject to mandatory reporting. Some proposed a reporting threshold based on an “imminent” or “serious” risk of violence towards a specific individual or group (e.g., a user communicating their intention to conduct a mass shooting). A few experts cautioned that “imminence” and “severity” are subjective. Some experts cautioned against including any reporting obligations in legislation, citing concerns about Charter rights.
A few experts argued that reporting and preservation obligations are of limited use, as law enforcement cannot act on all content reported to them. They stressed that police often have access to a wide range of online data, but that it is challenging to filter through the volume of information and cross-reference it with offline information.
A few experts flagged that there are risks to instituting mandatory reporting requirements absent safe harbour provisions that provide protection from liability. Some experts suggested that instead of imposing mandatory reporting obligations, regulated entities should be invited to make good faith, voluntary disclosures to law enforcement.
Data preservation obligations
Some experts were wary of mandating explicit preservation requirements because they could open the door to a wider range of content collection and sharing. They emphasized the need for a highly limited scope and stressed that safeguards would need to be implemented to avoid scope creep and the over-collection of data by platforms. Experts also stressed that content is seldom truly deleted, explaining that copies of such content are often saved elsewhere online.
A few experts suggested that instead of imposing obligations for data preservation, the framework could leverage online services’ review and appeal processes to preserve content. They explained that a review and appeal process at the platform level would require platforms to retain content, and could thus give law enforcement agencies time to access the information required for their investigations. A few experts noted a risk with this approach: not all content useful to law enforcement would be subject to a platform’s review. Alternatively, some experts suggested that preservation obligations could be framed similarly to the record-keeping obligations imposed on corporate entities for taxation purposes.
Applicability of existing laws in the online space
A few experts emphasized that the Criminal Code applies to online spaces, but that a mechanism is necessary to give effect to its rules. They explained that there is a need for stronger tools to ensure compliance with criminal laws in Canada and to act quickly in instances where speed is required.
Some experts proposed the creation of a cyber-judge to determine the legality of content, explaining that platforms do not have the legitimacy to make such calls. The cyber-judge’s decision on whether content is legal would be temporary, experts explained, and could be reviewed more extensively by a judicial body. Other experts stressed that reporting content is not the same as determining its legality.
Risks associated with legal obligations
Risk of systemic bias
Many experts asserted that mandatory reporting and data preservation obligations implemented by platforms risk perpetuating systemic biases, especially in instances where content moderation is automated. A few experts voiced a preference for human moderation but recognized that human moderators are not immune to biases either. Some experts argued that a balance must be struck between mitigating the risk of bias associated with mandatory reporting and protecting these same communities from being targeted by offenders, which reporting content to law enforcement agencies could help prevent. A few experts suggested that future legislation include both the objective of reducing harmful content and that of guaranteeing users’ fundamental rights, including equity rights. Experts explained that including such objectives could influence the way platforms fulfill their obligations. Other experts suggested that legislation explicitly compel platforms to reduce systemic bias through risk mitigation measures.
Many experts also stressed that law enforcement agencies hold biases, exemplified by how they apply the law. A few experts stated that many police forces do not have anti-hate units. Many experts called for equity and inclusion training for law enforcement agencies and indicated that a cultural change is necessary. Some experts suggested that law enforcement agencies coordinate with and seek support from other organizations in addressing their biases. A few experts also suggested that the full bill, particularly the law enforcement portions, be subject to an equity assessment to protect marginalized communities’ rights.
Risk to privacy
Many experts mentioned that broad legal requirements for reporting and data preservation, as well as corresponding automated detection mechanisms, could pose a risk to users’ privacy. Many experts cited the lack of trust some communities have towards law enforcement agencies, explaining that this trust deficit could further reinforce the perception of such obligations infringing on users’ privacy rights.
Many experts stated that preservation requirements should be narrow in scope to avoid incentivizing a general system of monitoring. Some suggested that the regulatory framework set out explicit criteria for when information could be transferred, complemented by clear and narrow provisions on retention and storage.
Theme B: Aligning a risk-based approach with the needs of law enforcement
Many experts suggested that platforms’ assessments of their level of risk (based on factors like size, volume of activity, and type of content) could modulate their expected level of cooperation with law enforcement agencies. They explained that platforms could identify their individual risk and explain to law enforcement what measures they plan to put in place to minimize it. Experts stressed that this approach could help focus attention on services that pose a significant risk to Canadians and prevent the over-regulation of services that present a lower risk.
They also mentioned that such an approach could enable the assessment of the systemic biases that mandatory reporting and data preservation pose for marginalized communities. According to them, releasing platforms’ risk assessments to the public could inform users about how platforms meet their obligations. Many experts agreed that regulated entities should include how they plan to address systemic biases in their Digital Safety Plans.
A few experts suggested that, similar to an Amber Alert, platforms could have a mechanism in place to warn authorities as quickly as possible when very harmful content is shared. Such an approach, experts explained, would be a proportional response in line with Charter rights and freedoms.
A framework based on the risk of harm
A few experts suggested that reporting and preservation obligations be limited to certain types of content. According to them, some types of content, because of their illegal nature or their risk to life and safety, should be reported to law enforcement and quickly removed. They explained that such content should also be preserved. Experts stressed that “grey-zone” content would not be subject to such obligations.
Other experts highlighted the importance of communication and collaboration amongst platforms in instances of imminent violence. Some experts suggested a coordinating role for the regulator: engaging and collaborating on a regular basis with stakeholders, industry, federal departments, and law enforcement agencies to implement the legislative and regulatory scheme and adapt it over time.
Development of codes of practice
For content that does not meet the threshold for mandatory reporting, most experts agreed that codes of practice could be developed by the regulator in partnership with industry and stakeholders. Such codes could include guidance on when and how to report to law enforcement. Experts suggested that different codes of practice could be instituted for different types of content and platforms. Many experts stressed that an inclusive engagement process would be necessary when developing such codes. Other experts emphasized the proactive role companies could play in sharing good practices among themselves. Finally, a few experts emphasized the importance of having a regulator in place to enforce codes of practice.
Importance of transparency and oversight
Many experts stressed the importance of oversight and auditing of both regulated entities and law enforcement agencies by the regulator to ensure obligations are implemented properly and within the limits of the law. Experts emphasized that regulated services must be transparent about the requests they receive from law enforcement, the number of cases they report to law enforcement, and the type of content that is reported and preserved. They also suggested that the same transparency reporting be applied to law enforcement agencies, including how they acted upon the information shared with them. A few experts mentioned that platforms are already transparent, in their terms of service, about how they manage content and about the requests made to them by law enforcement.
Many experts stressed that access to data is key for researchers and civil society to understand potential systemic issues. However, experts also acknowledged that providing access to such data while preserving users’ privacy rights will be challenging. Some experts suggested giving vetted researchers access to an anonymized database of such data.
Concerns with integrating law enforcement requirements in a regulatory framework
Many experts pointed to the low level of trust certain communities have in policing agencies as an additional consideration for how the proposal might be received by Canadians, including by civil society and advocacy groups. They raised the concern that, if not done thoughtfully, these obligations could have spill-over effects on the rest of the framework.
Next Steps
The next session of the Expert Advisory Group will take place on Friday, June 3 from 1:00-4:00 p.m. EDT. Experts will discuss Disinformation at this session.