Online Safety Sharing Circle – Participant Information Document
On this page
- Issue
- Context – 2021 Consultation
- Expert Advisory Group
- Summary of Feedback Received from the Expert Advisory Group
- Former Bill C-36 amending the Canadian Human Rights Act and Criminal Code
- Your Participation in the Roundtable
- Proposed Questions for Participants
Issue
An overwhelming majority of adults in Canada (94%) have an account on at least one social media platform. Social media helps connect people around the world. Among other things, it provides a space for people in Canada to participate in their communities and in our democracy, and for activists and civil society organizations to organize and amplify the voices of underrepresented and equity-deserving communities.
However, a growing body of evidence shows that the benefits associated with the use of social media platforms and other online services also come with significant harms for many users. Such platforms can be used to threaten and intimidate Canadians and to promote views that target communities, put people’s safety at risk, and undermine Canada’s social cohesion or democracy. Research also shows that despite certain efforts, platforms are not doing enough to limit the spread of these harms on their own services.
Now, more than ever, online services must be held responsible for the content they host. A Nanos Research poll commissioned by The Globe and Mail found that 55% of Canadians generally support greater government regulation of the Internet. Canadians are concerned about safety in the online world, and 62% believe that online hate should be regulated.
Context – 2021 Consultation
The Government of Canada held a national online consultation from July to September 2021 on what a legislative and regulatory framework for online harms could look like. The framework focused on what regulated services (such as social media platforms) could be obligated to take down and how they could be required to do it, including specific timeframes for removal.
Feedback received during the 2021 online consultation showed that a majority of respondents supported a national legislative and regulatory framework to confront harmful content online. However, respondents identified several concerns relating to freedom of expression, privacy rights, the potential impact on certain marginalized groups, and compliance with the Canadian Charter of Rights and Freedoms generally.
Respondents raised concerns about certain kinds of interventions, including content moderation obligations, a possible 24-hour removal provision, and the obligation for platforms to proactively monitor their services for harmful content.
On February 3, 2022, the Government published the What We Heard Report, which provides an overview of the feedback submitted through the consultation.
Expert Advisory Group
The Government acknowledged the concerns expressed through the 2021 consultation and announced the formation of an expert advisory group on March 30, 2022. The group was mandated to provide advice on how best to design the legislative and regulatory framework to address harmful content online while incorporating the feedback received during the national consultation.
The advisory group was made up of 12 experts from across the country. They brought diverse backgrounds and experience on issues relating to platform governance and content regulation, online harms, civil liberties, tech regulation, and national security.
They participated in 10 workshop sessions to discuss core elements of a framework to ensure that platforms are more transparent and accountable for the content that they host. The expert advisory group also met with representatives from the United Kingdom, the European Union, and Australia to hear how similar models are operating in other jurisdictions. The expert advisory group concluded its sessions on June 10, 2022. Worksheets developed for discussion and summaries of the expert advisory group’s sessions are published on Canadian Heritage’s website.
The expert advisory group was not tasked with writing the legislation itself. It provided advice and feedback on elements of an effective legislative and regulatory framework to ensure a more transparent and accountable online environment that supports participation. This advice and feedback could feed into the development of legislation.
The Roundtables and further discussions with stakeholders offer an opportunity to build upon the advice received from the expert advisory group, and this input will also be considered as part of the development of online safety policy.
Summary of Feedback Received from the Expert Advisory Group
- Risk-Based Approach:
- The experts advised that a risk-based approach to regulation could be most appropriate. It would not remedy all incidents of harmful content online; however, it could improve online services’ overall practices in addressing risk. The experts emphasized that content like child pornography, or child sexual exploitation content more generally, may require its own solution.
- This approach focuses on the systems and tools that online services can put in place, rather than on content takedowns and rigid moderation obligations
- Regulated entities could have a “duty to act responsibly” when it comes to harmful content online. This means that they could be required to take reasonable steps to introduce tools, protocols, and procedures to mitigate foreseeable harms arising from the operation and design of their services
- A risk-based approach to regulation, anchored in a duty to act responsibly, could obligate regulated services to fulfill three key steps. The first step could have regulated services identify and assess the risks posed by their service. In the second step, services could mitigate these risks. Third, services could report on their identification and mitigation tools to ensure accountability. Information gathered through this last step could then feed into the services’ next round of risk assessment. This cycle could promote continuous improvement of services’ systems and tools aimed at reducing the risk they pose to users
- Regulated Services:
- Experts agreed that the regulatory framework should apply to a broad range of services operating online, including those lower in the ‘tech stack’, such as app stores, web hosting services, and content delivery networks. Experts disagreed on whether, and how, to include private communications in a regulatory framework. Some experts emphasized that the duty to act responsibly should apply to all services operating online, but that regulated services could be categorized with specific obligations based on the risk of harm and the capacity of the service.
- Content to Regulate:
- Experts agreed that legislation should go beyond the five types of harm previously proposed by the Government in 2021 (content related to child sexual exploitation, terrorist content, content that incites violence, hate speech, and the non-consensual sharing of intimate images). They stressed that some forms of harm are perspective-driven or extremely difficult to define and identify. They mentioned that child sexual abuse material and the non-consensual sharing of intimate images are easier to define and identify and could be treated differently.
- Obligations:
- A risk-based approach could place obligations on regulated services around transparency and accountability. Experts mentioned that obligations placed on services should be flexible so that the legislation can stand the test of time. They stressed that obligations should not undermine human rights, but instead protect them. Experts agreed on the need for an independent source of support for users. Its role could include receiving complaints from users on services’ adherence to their duty to act responsibly and collecting data on pressing issues for users and victims; this data would help inform the development of codes of practice.
- Finally, experts disagreed on whether a content-removal decision-making body could be workable in a risk-based approach. Some experts explained that victims need a venue to express their concerns. They explained that victims often feel that the platforms do not listen to their complaints, that they are still being harmed, and that the content they flag is not being addressed. Other experts disagreed and emphasized that victim support can be achieved without creating a content removal authority. They stressed that creating an independent body to make takedown decisions could be a massive undertaking, akin to creating an entirely new quasi-judicial system, and would raise major constitutional issues relating to both federalism and the Charter.
- Regulatory Powers of the Commissioner:
- Experts stressed that the regulator should be equipped with robust audit and enforcement powers, including the ability to levy administrative monetary penalties (AMPs) and issue compliance orders. Experts also mentioned that codes of practice are essential and ought to be co-developed with industry and other stakeholders. Such codes, or guidelines, co-developed by the regulator and stakeholders could support the legislative and regulatory framework by outlining how regulated services could fulfill their obligations.
- Charter Rights:
- Experts stressed that a regulatory framework should put equal emphasis on managing risk and protecting human rights. They emphasized the importance of balancing users’ ability to post content freely with the need to ensure that users feel safe on these services. They insisted that any obligations imposed on regulated services, and any tools made available to users, should consider how best to protect marginalized communities from unintended consequences.
- Connection to Law Enforcement:
- Experts emphasized that some marginalized communities have a low level of trust in law enforcement. Experts disagreed on whether mandatory reporting of content to law enforcement, or mandatory preservation of such content, should form part of the online safety framework. Concerns were raised that requirements to report content to law enforcement, or to preserve it for criminal investigations, could lead to over-policing of these communities. Other experts emphasized that law enforcement must be given the tools necessary to conduct its investigations successfully.
- Disinformation:
- Experts emphasized that disinformation is harmful, but that it is challenging to scope and define in a way that protects freedom of expression. They also expressed caution against defining disinformation in legislation for several reasons, including that doing so could put the Government in a position to distinguish between what is true and what is false, which it simply should not do. Instead of defining disinformation in legislation, some experts suggested that legislation could focus on the harmful effects of disinformation or on certain behaviours associated with it, such as coordinated manipulation using bots and bot networks.
Summaries of all Expert Advisory Group meetings are published online on the Canadian Heritage website.
Former Bill C-36 amending the Canadian Human Rights Act and Criminal Code
In June 2021, the Minister of Justice and Attorney General of Canada introduced a bill to combat hate speech and hate crimes. The bill proposed to amend the Canadian Human Rights Act and the Criminal Code and to make related amendments to the Youth Criminal Justice Act. The bill did not proceed through the previous Parliament before the election.
The amendments proposed in this bill would complement the online safety measures being discussed at this Roundtable by providing remedies for victims and holding individuals accountable for the harm caused by the hatred they spread.
A description of the bill can be found online. The full text of the bill can be found on the Parliament of Canada website.
Your Participation in the Roundtable
As a participant at this Roundtable, you would have the opportunity to discuss core elements of a risk-based legislative and regulatory framework with other participants at the table.
You would also have the opportunity to contribute towards the Government’s policy development by sharing your thoughts, concerns and perspectives about online safety. Your knowledge and perspectives are vital as the Government of Canada begins to develop policy to ensure a more transparent and accountable online environment.
Proposed Questions for Participants
- What made you interested in participating in this sharing circle or interview?
- Are you concerned about harmful content online? What types of harmful content are you concerned about?
- On which platforms do you think this harmful content is most prevalent? Are there platforms or online services that you believe pose the greatest risk of harm to Indigenous Peoples?
- Should platforms be treated like any other product, with potential risks identified and mitigated?
- Should a new online safety regulator have the power to order that specific pieces of content be removed from platforms? If so, what types of content should this power apply to?
- Should online services be required to report content that they believe is likely evidence of a criminal offence to law enforcement agencies?
- Would the proposal in the former Bill C-36 for a new peace bond help to prevent hate propaganda offences and hate crimes?
- Would the proposal in the former Bill C-36 for a hate-speech complaint process in the Canadian Human Rights Act be useful as an additional and complementary measure for combatting hate speech online?
- What outcomes would you like to see come from these discussions?
- Is there anything else you would like to say that hasn’t been addressed so far?