What We Heard: 2022 Roundtables on Online Safety

Executive Summary

Between July and November 2022, the Minister of Canadian Heritage and his representatives conducted 19 roundtable discussions on key elements of a legislative and regulatory framework on online safety: 13 regional roundtables and 6 thematic roundtables on Antisemitism, Islamophobia, Anti-Black Racism, Anti-Asian Racism, Gender-Based Violence, and Big Tech. Participants were also called upon to provide their views on the advice from the Expert Advisory Group on Online Safety, which concluded its meetings on June 10. The feedback gathered from participants touched upon several key areas related to online safety, which are outlined below.

Role of Government in Promoting Online Safety

There was broad consensus on the need for government action to address online safety. Participants recognized that harms mostly occur on social media platforms, where Canadians are the most connected. Most participants acknowledged that future legislation would need strong enforcement measures to change how online services operate and to hold them accountable for the harms occurring on their platforms. Participants expressed a desire for an online safety regulator that is inclusive of marginalized communities and accessible to the public. They also preferred an arm’s length regulator with no risk of government influence, similar to what was advised by the Expert Advisory Group. Ensuring the regulator is properly resourced was a key priority, as many participants viewed this as vital to enabling the regulator to ensure compliance and to provide Canadians with timely, high-quality service.

Responsibility of Online Platforms

Participants believed online platforms have a responsibility to ensure the safety of users online, given their role in enabling the creation, sharing, and promotion of content. Participants flagged how quickly a user can be flooded with unwanted content due to platform algorithms. Participants expressed skepticism over the ability of social media platforms to self-regulate content on their platforms, with many citing platforms’ business models, driven by site traffic and views, as a hindrance to self-regulation. Participants also expressed concern over the lack of user verification processes.

Participants expressed frustration over the lack of transparency in how platforms moderate their content. Participants reported being uncertain about how platforms handle harmful content, as there is little information about content moderation decisions or about how content reported by users is addressed. This left many participants feeling helpless and unsure of where, and to whom, to turn for recourse. Many participants mentioned that they often never hear back from platforms when they flag content or submit complaints, only to see the flagged content remain on the platform, unmoderated. Participants were supportive of the risk-based approach proposed by the Expert Advisory Group, as many believed it could increase transparency and bring clarity to how platforms handle harmful content. Some participants did question whether the approach would be sufficient to increase transparency or whether additional action would be required.

Greater Support for Victims of Online Harm

Participants shared experiences of how online harms impact victims. Many participants described the real-world consequences of harmful content, including physical harm, reputational harm, and consequences for victims’ mental health. Participants noted the disproportionate impact of online harm on marginalized communities.

Participants expressed concern over the inadequacy of existing support for victims of online harm, with most noting a lack of resources within their local communities. Participants expressed a desire for resources designed specifically for victims of online harm, such as stronger reporting systems for harmful content and an ombudsperson who could advocate on behalf of victims and handle anonymous complaints. Participants noted that if an ombudsperson were created, it would need to be well resourced and culturally aware enough to handle culturally sensitive material. The desire for greater victim support aligns with many experts’ conclusion that an ombudsperson should be established for victims to access.

Protecting Children and Youth

There was wide consensus among participants on the importance of protecting children and youth from online harm. Participants cited the vulnerability of youth to disinformation, online luring, and the inappropriate sharing of intimate images as three areas of concern. The impact of online harm on youth’s mental health was a key point of concern, with many participants sharing first-hand accounts of youth in distress due to harmful content.

Participants noted the challenges parents face in protecting youth. They raised how easy it is for youth to create social media accounts and engage online without their parents’ knowledge. Many participants suggested platforms have an obligation to increase online safety by adopting measures such as greater parental controls or age-appropriate design features. The views expressed by participants align with experts’ recommendation that platforms should have a special duty to protect children due to their vulnerability to the risks of the online space.

Increasing Education on Online Safety

There was wide consensus among participants on the need for greater education on online safety and digital literacy. Participants noted a current lack of awareness and knowledge of how Canadians can protect themselves online. Participants suggested launching educational initiatives targeting specific vulnerable groups, such as youth and immigrant communities. Participants noted the importance of making online safety resources and information easily accessible to the public to increase awareness of how to stay safe online. The views expressed by participants align with the Expert Advisory Group’s advice to incorporate public education as a fundamental component of any legislative framework on online safety.

Role of Law Enforcement in Addressing Online Safety

Participants acknowledged the limitations law enforcement currently faces in addressing cases of online harm, including resource constraints and limits on gathering data from platforms. Some participants shared reservations about giving further authority to law enforcement in future online safety legislation, citing concerns such as the risk of discrimination against marginalized communities, law enforcement’s track record in handling cases of online hate, harassment, and other online harms, and the way social media data may be used by law enforcement. Some participants also expressed strong reservations about including law enforcement in future legislation due to the lack of cultural awareness law enforcement has shown towards certain segments of the population. Many participants described frustration at reporting cases of online harm to law enforcement and seeing no action taken.

Introduction

On February 3, 2022, the Department of Canadian Heritage released a What We Heard report entitled “The Government’s proposed approach to address harmful content online,” summarizing responses received during the summer and fall 2021 online public consultations. While Canadians expressed a desire for the Government of Canada to take action to hold platforms accountable for the content they host, they also identified a number of overarching concerns relating to freedom of expression, privacy rights, and the impact of the proposal on certain marginalized groups.

In March 2022, the Minister of Canadian Heritage established an Expert Advisory Group on Online Safety to advise the government on how to incorporate the feedback received during the 2021 national consultation and how best to design a legislative and regulatory framework to address harmful content online.

Throughout its 10 workshops, the Expert Advisory Group outlined multiple considerations for designing a new legislative and regulatory framework for online safety. With the Group’s work completed, the Minister of Canadian Heritage sought to engage people in Canada on its findings.

Beginning in July 2022, a series of in-person and virtual roundtables were organized across Canada with participants from groups representing victims, religious and equity-deserving communities, women, 2SLGBTQI+ communities, youth, and Indigenous peoples. The goal of the roundtables was to listen to and understand the perspectives of victims and tech platforms on the findings of the expert group and on elements of the suggested risk-based approach.

Each roundtable began with the Minister, or his representative, providing an overview of the work Canadian Heritage has conducted to date on online safety. Following the overview, participants were able to share their feedback. A representative from Canadian Heritage was present to facilitate the discussion and answer questions raised during the roundtable session.

In preparation for the roundtable session, participants were given an information document containing an overview of the work Canadian Heritage had completed to date on online safety. Participants were given the questions below to prepare for their roundtable.

  1. Why did you decide to join us today in this roundtable?
  2. What do you hope to get out of this roundtable?
  3. Have you ever experienced harmful content online? What types of harmful content are you most concerned about?
  4. On which platforms do you think this harmful content is most prevalent?
  5. What types of platforms or online services do you believe pose the greatest risk of harm to Canadians and should be regulated?
  6. Should platforms be treated like any other product, by identifying potential risks and mitigating them?
  7. Should a new online safety regulator have the power to order that specific pieces of content be removed from platforms? If so, what types of content should this power apply to?
  8. Should online services be required to report content that they believe is likely evidence of a criminal offence to law enforcement agencies?

The following summary outlines the views expressed during each roundtable session; reports areas of agreement, disagreement, and discussion; and organizes the discussion under thematic categories. It should not be considered a verbatim recitation of the discussion.

Regional Roundtables

Charlottetown, Prince Edward Island

The first regional roundtable occurred on July 8 in Charlottetown, Prince Edward Island. Participants discussed an array of issues related to online safety including the responsibilities of social media platforms in addressing online harm, the toll online harm has on victims, and the role future legislation could have in promoting online safety.

Most participants were frustrated by inconsistencies in how social media platforms handle reported content. Participants described instances where reported content was not removed, or where more serious harmful content remained while other content was taken down, leading to uncertainty over the enforcement standards social media platforms follow. A few participants noted that online posters are adapting their content to avoid being flagged by social media moderation systems.

Participants were also concerned about the length of time it takes for reported content to be taken down by platforms. They worried that delays in addressing reported content can allow it to reach a large group of users before removal. A few participants mentioned that harmful content could be downloaded or screenshotted before being taken down.

Some participants referred to social media platforms’ business model as a potential hindrance to effectively addressing harmful online content. Participants were skeptical of the willingness of social media platforms to self-regulate content on their platforms due to the site traffic and revenue such content can generate.

Participants expressed unease over the level of personal information social media platforms disclose. According to them, the disclosure of personal information, such as location data and contact information, can lead to people being directly targeted by strangers online. Participants reported instances of community members receiving direct threats and flagged that online harm can occur across different online platforms such as online review sites through targeted negative reviews.

Addressing the inappropriate sharing of intimate images was a key priority for many participants. Many noted the mental health toll the sharing of such images takes on victims, particularly youth. Concerns were expressed regarding the lack of recourse, including legal recourse, available to victims to combat the issue.

Some participants expressed a desire for social media platforms to adopt greater verification processes for users. However, a few participants raised concerns over the potential resource barriers these systems could create for smaller organizations, such as nonprofit organizations.

A few participants believed online safety legislation should be forward-thinking to include future tech platforms, and should contain clear definitions of the responsibilities and obligations of online platforms in addressing online harm. Participants noted there is currently uncertainty regarding the legal responsibilities of online platforms in addressing online harm.

There was support amongst participants for creating an online safety regulator. A few participants suggested the regulator should be inclusive of marginalized communities and that an advisory board be created to represent their perspectives. A few participants noted that the regulator should have trauma-trained staff, as opposed to relying fully on automated systems, when working with victims.

Participants emphasized the importance of creating strong enforcement measures to accompany any proposed regulations. Participants believed the effectiveness of any proposed legislation depends on the enforcement mechanisms associated with it. It was noted that law enforcement lacks resources and capabilities to effectively address instances of online harm.

There was widespread agreement over the importance of increasing public education on digital literacy and online safety. A few participants highlighted the importance of proactively educating youth on consent culture and treating others with respect.

Moncton, New Brunswick

The second regional roundtable also took place on July 8 in Moncton, New Brunswick. The discussion touched upon the dangers of disinformation, the need for better content reporting systems, and the importance of educating the public on online safety.

Participants expressed concern over the business model of social media platforms that utilize algorithms to generate content views and direct users to specific content. Participants highlighted how quickly a user can be flooded with unwanted content due to algorithms. A few participants noted how algorithms could result in youth being exposed to sensitive or mature material.

Participants were concerned about how quickly disinformation can be created and shared with a wide range of people. Participants noted how long it takes to produce and fact-check content compared to how quickly disinformation can be created and shared. Most participants expressed concern over how difficult it can be to distinguish between what is factual and what is false. A few stakeholders mentioned the importance of making fact-checking tools more accessible to Canadians.

Participants described an urgent need for better reporting systems for online content. A few participants recounted instances where they reported online content and no action was taken. A few participants described the limited options available to users for reporting content shared over a live feed or through private messages on a platform. Participants expressed a desire for platforms to offer more accessible reporting and content flagging systems.

The vulnerability of immigrant communities was highlighted by many participants due to the lack of technological and social media knowledge new Canadians may have. Participants noted the difficulties new Canadians experience in distinguishing factual information from disinformation.

Many stakeholders expressed concern over the vulnerability of youth online. Participants noted the wide range of content youth can access online and the difficulties youth experience in distinguishing between factual and false information. Participants discussed the challenges parents and youth face in staying safe online. Limited parental controls, lack of awareness on platform safety features and a lack of safeguards on sensitive content were cited as challenges facing parents and youth.

Participants found consensus on the need to increase education on digital literacy and online safety. Youth and immigrant communities were identified as two groups toward which to tailor educational efforts, through educational campaigns, school curriculum reform, and changes to resettlement programs.

Some participants expressed concern over the lack of user verification requirements on platforms and the high degree of anonymity online users have. A few stakeholders described instances of people creating fake accounts to gain access to private group pages. Youth online luring was a particular concern for participants. Video games were identified as a type of platform to examine due to the high degree of anonymity online gamers have, which can lead to adults gaming and interacting with youth with few safeguards in place.

Many participants expressed a desire for stronger legislation, regulations, and systems to combat online harm. Participants acknowledged the importance of future online safety legislation striking a balance between protecting freedom of speech and protecting Canadians from online hate.

St. John’s, Newfoundland and Labrador

The third regional roundtable occurred on July 12 in St. John’s, Newfoundland and Labrador. Participants discussed the role of social media platforms, increasing awareness of online safety resources, and the importance of offering support for victims of online harm. Participants stressed that citizens, the government, and the tech industry are all collectively responsible for ensuring online safety.

Participants emphasized the responsibility online platforms have in addressing online harms. A few participants expressed concerns regarding the business model of social media platforms that relies on site traffic and content exposure, which was seen as contributing to the spread of online harm. In particular, a few participants pointed to the ability for platform algorithms to create echo-chambers where users are only exposed to a specific set of content.

Participants also raised concerns with existing moderation and content reporting systems. A few participants shared stories of reported content being left untouched by social media platforms, and of content being taken down after being misidentified as inappropriate. Most participants believed content reporting systems should be easily accessible to users.

Most participants expressed a desire for platforms to introduce more accountability measures on users. A few participants discussed the issue of online anonymity on platforms and how it allows some users to avoid being held accountable for the content they post, which can embolden users to post comments they otherwise would not.

Participants did not agree on whether private messaging should be scoped in. A few participants were wary of private messaging moderation, while others recognized that private messages can be used to spread harm. Participants raised the issues of privacy and the perception of state surveillance should private messaging be scoped in.

There was wide consensus amongst participants around the importance of educating the public on how to stay safe online and the impact harmful content can have on victims. A few participants raised the importance of educating parents so they have the tools to protect their children online. It was suggested that guidance be developed to assist Canadians in identifying and addressing different forms of online harm, such as microaggressions.

Most participants mentioned the importance of providing support for victims of online harm. A few participants suggested the creation of communities of support and care to help people cope with online harm. Providing support to youth was a particular focus for participants due to the mental health impact online harm can have.

A few participants highlighted a lack of trust in law enforcement as a deterrent to reporting harmful content to law enforcement agencies. It was noted that inaction in addressing reported instances of hate further fueled these sentiments.

Participants expressed a desire for online safety policies and regulations to be forceful and adaptable to changes in the tech industry. Participants emphasized the need for strong enforcement measures to ensure compliance by online platforms.

Quebec City, Quebec

The next roundtable took place in Quebec City on July 15th. Participants discussed the positive and negative impacts social media has on society. Participants specifically mentioned the mental health toll social media has on youth.

Participants noted that social media platforms provide people with a space to connect with others and to express themselves. However, participants noted that social media platforms can also spread negative messages to a wide audience. Participants identified social media algorithms as a contributing factor to online harm, as they can reinforce negative or misinformed perspectives by subjecting users to targeted content based on their search history.

A few participants pointed out how the anonymous nature of social media gives users more freedom to spread online harm such as bullying, death threats and online hate. A few participants noted that this can cause greater strain on the mental health of youth and could contribute to a feeling of loneliness, which, if unchecked, could lead to self-harm.

There was wide acknowledgment of the importance of greater education on online safety. Participants suggested establishing awareness campaigns and changing school curricula to inform Canadians of the resources and tools available for staying safe online. A few participants noted that educational campaigns should aim to teach youth what they can do if they are victims of online harm.

A few participants identified ensuring accessibility to resources and information about online safety as an important step. It was noted that information should be communicated in a manner that people of all backgrounds and education levels can digest.

Participants raised that equity-deserving communities, especially Muslims and members of the 2SLGBTQI+ communities, are being targeted online. There were concerns that platforms are not doing enough to protect young queer users and Muslim Canadians, and that their algorithms amplify negative emotions and content that spreads discrimination towards them. Participants called for more tools and resources for these communities.

Montreal, Quebec

The second Quebec regional roundtable took place in Montreal on July 21st. Discussion centered on the importance of digital literacy education, the mental health impacts of online harm and increasing transparency on social media platforms.

Increasing education on online safety was widely supported by participants. Participants suggested focusing education initiatives on youth due to their online vulnerability. Some participants mentioned the need for federal-provincial collaboration to establish digital literacy education in school curricula. Participants also suggested that a national digital literacy campaign be created with the support of social media platforms.

A few participants expressed a desire to examine the role of the video game industry in facilitating online harm. It was noted that online harms are occurring on video game platforms, particularly against youth. It was also noted that video games are currently being used by far-right extremists to recruit or radicalize.

Helping victims of online harm was a focus for many participants. Many noted that the onus is currently on victims to report cases of online harm. Some participants noted this is a significant challenge, as victims are not always comfortable coming forward to report content and online behaviour. Participants believed that providing resources and accessible reporting mechanisms should be part of new online safety legislation.

Some participants expressed interest in addressing defamatory content due to the consequences it can have on victims and the time and resources needed for victims to address it. Participants noted that women are particularly targeted by defamation online through reputational and personal attacks that negatively impact their mental health.

Participants discussed the role advertisers have on social media platforms. A few participants indicated that advertisers should be included in a new online safety framework to add pressure on platforms to comply with reporting obligations. Other participants contested this view, noting that advertisers do not have the same obligations or control over content posted on a social media platform.

Participants expressed a desire for online safety legislation to require greater transparency from social media platforms on their content moderation processes. Participants were interested in ensuring greater transparency over the use of artificial intelligence (AI) by social media platforms to moderate content. A few participants suggested that legislation should take inspiration from the Santa Clara Principles on Transparency and Accountability in Content Moderation.

Participants discussed the importance of platforms having authentication and verification systems in place. Participants noted the importance of verification systems for platforms hosting sensitive content, such as pornography sites, due to the risk of children accessing these platforms. Participants also suggested that deepfake technology be addressed in online safety legislation.

Participants had a thorough discussion on setting definitions for different categories of online content. Participants indicated a preference for broad category definitions to ensure new content that may emerge is captured. Regarding hate, many participants indicated a preference for a clear definition of what constitutes hateful content. This definition should apply to all online users, as opposed to being focused on certain segments of users.

Participants stressed the importance of ensuring the independence and impartiality of an online safety regulator. It was noted that these elements are vital in ensuring the neutrality of the regulator.

Edmonton, Alberta

The roundtables then shifted to the Prairies, beginning with a roundtable in Edmonton, Alberta on July 25th. Participants discussed the role of regulations in holding social media platforms accountable, the challenges facing youth, and potential solutions for addressing harmful content.

Participants discussed the role online anonymity has in contributing to the spread of online harm. It was noted that being anonymous online can embolden people to post content and take actions they otherwise would not if their identity were known. Requiring user authentication was identified as a potential solution to this issue.

Participants discussed potential solutions to addressing harmful content on social media platforms. Some participants suggested adopting automatic content warnings, introducing an opt-in/opt-out approach to content viewing and shifting from automated moderation to human moderation as solutions to combating harmful content. Algorithms were identified by participants as a key contributor to the spread of online harm.

Participants discussed the challenges youth experience in identifying and reporting online hate due to the inaccessibility of content flagging systems and the subtle forms harmful content can take. Educating youth and providing them with resource supports were identified as important measures to prepare them for instances when they encounter online harm.

There was agreement on the importance of increasing digital media literacy education to ensure people have the tools and knowledge necessary to identify online harm. School curriculum reform was suggested as a potential option to directly reach youth.

Participants discussed the mental health impact and the polarizing effect online harm has on people. Participants highlighted the importance of providing victim supports, and peer-to-peer support for youth, to address the mental health impact of online harm.

Participants discussed the role regulations can have in holding social media platforms accountable. A few participants mentioned the need to include both large and smaller platforms, such as 4chan, in a regulatory framework. Participants discussed the importance of regulations being balanced to ensure they are compatible with Charter rights and freedoms. Some participants expressed reservations over the effectiveness of introducing regulations to address online harms due to the length of time it would take to create and enforce them. Participants were also concerned about the potential for platforms to avoid enforcement.

Saskatoon, Saskatchewan

The second roundtable in the Prairies took place on July 26th in Saskatoon. Topics of discussion included the importance of holding platforms accountable for their actions, the need for greater digital literacy education, and the importance of protecting youth and children.

Participants agreed on the need for greater public education on digital literacy. Education was viewed as a key step in preventing the creation and spread of harmful content. Participants discussed different educational methods, including an online digital campaign and a bottom-up, grassroots approach.

There was wide agreement on the need to protect children and youth in online safety legislation. Many noted the vulnerability of youth to online predators and the risk of youth being subject to echo chambers on particular topics.

Participants expressed a desire to hold platforms accountable for managing content posted on their platforms. Many participants noted concerns over platforms prioritizing profits rather than monitoring content and educating users on digital literacy. There was wide agreement that platforms should increase content moderation efforts and commit to basic operating standards. Many participants noted the need for regulations to include enforcement mechanisms to ensure that platforms take steps to reduce online harm.

A few participants raised concerns about verification systems for online users. Although such systems may address concerns regarding user anonymity, participants noted they may negatively impact marginalized people.

Indigenous participants expressed concern that platforms do not understand or respect their cultural specificity. Indigenous participants mentioned that they often feel like neither law enforcement nor platforms understand the discrimination they face online, and that neither is sophisticated enough to address online hate, especially hate directed towards Indigenous Peoples.

Niagara, Ontario

The roundtables shifted back to Ontario on August 12th with a roundtable in Niagara, Ontario. Participants discussed the importance of digital media literacy, making content reporting information accessible and providing more resources to victims of online harm.

Participants mentioned the importance of increasing the accessibility of online safety resources and platform content reporting systems for users. Many participants described uncertainty regarding where, and to whom, to report cases of online harm such as harassment and catfishing. It was suggested that platforms provide tools that use plain language and include easy-to-understand instructions for reporting content.

Protecting youth was a focus for many participants. Participants expressed support for greater educational resources on the dangers of social media and how youth can navigate social media safely. A few participants mentioned the need for greater collaboration with universities and schools in establishing online codes of conduct for students.

Several participants raised concerns over the responsiveness of platforms in addressing online harm. Participants expressed a desire for additional information on how the duty to act responsibly would affect how platforms handle online harm. Participants were supportive of establishing a regulatory entity to monitor online platforms.

Providing support to victims of online harm was a focus for participants. Many participants described challenges in going to law enforcement with cases of online harm, including how law enforcement classifies such cases and its limitations in providing recourse. Participants were supportive of offering mental health resources to victims of online harm.

Surrey, British Columbia

The roundtables moved to the West Coast with a session held in Surrey, British Columbia on September 2nd. Participants discussed the need to increase digital media literacy, the importance of protecting vulnerable populations such as youth and seniors, the reach online harm can have, and key considerations for drafting online safety legislation. Participants emphasized that the status quo is no longer acceptable.

The distinction between misinformation and disinformation was discussed by participants. Participants viewed misinformation as a step before disinformation and questioned if legislation can capture both issues. A few participants discussed whether online safety legislation should include misinformation and disinformation due to the challenges of tackling both. Some participants expressed unease at the notion that the government should be the entity deciding what material constitutes misinformation and disinformation.

Protecting youth online was a key priority for participants due to the long-term impact online harm can have on a victim. A few participants noted that it can be difficult for youth to determine what content is truthful or fake, and that online predators can communicate with youth from around the world across different platforms. Some participants expressed a desire for separate legislation to be tabled that specifically focuses on protecting youth and children from harm.

Participants agreed that increased digital media literacy education is needed to prepare people for navigating the internet. Youth and seniors were identified as two groups on which to focus educational campaigns due to their online vulnerability. A few participants suggested integrating digital literacy into school curricula, while others believed educational resources should be developed by government and civil society.

A few participants discussed how individuals can leverage large social media platforms to spread online harm. Participants expressed concern at how quickly individuals can use platforms to spread harmful content. Participants noted that platforms should hold individuals with a large base of followers to higher content moderation standards than other users, due to the engagement those individuals generate.

Participants noted that proposed online safety legislation should establish a set of universal definitions for harmful content and not focus on specific platforms due to the likelihood of new platforms emerging over time. It was noted by some participants that the legislation should consider the different types of online platforms ranging from larger mainstream platforms to smaller ones that tend to exist in deeper spaces on the internet. A few participants suggested the legislation should focus on tackling less complex issues to ensure the expediency of passing the legislation.

Respecting freedom of expression was a key priority for some participants. It was noted that anonymity and online communities can be therapeutic to people escaping harm or troubles occurring within their own personal lives.

Winnipeg, Manitoba

The next roundtable was hosted in Winnipeg on September 29th. Participants discussed the challenges faced by immigrants online, the importance of protecting youth and a desire for greater platform accountability.

Participants described the challenges faced by immigrants and new Canadians when using online platforms. Many mentioned that language is a key barrier to using online platforms. Additionally, participants noted that new Canadians are vulnerable to issues such as hateful comments and online scams seeking personal information. Particular concern was expressed for immigrant youth, as they are often unaware of online rules of engagement and culture.

Participants discussed the importance of protecting children and youth online. Participants were concerned that youth lack the tools and knowledge to navigate the internet safely. It was suggested that this could be addressed through digital literacy campaigns and changes to school curricula to include digital literacy.

Participants expressed displeasure with the current practices social media platforms use to protect youth. Many participants expressed a desire for social media platforms to take greater responsibility and concrete steps to protect youth, such as developing apps with built-in safety mechanisms.

Participants discussed the role law enforcement could have within online safety legislation. Many participants noted the difficulties law enforcement has in obtaining data from platforms in cases of child sexual exploitation. Rather than having a strong law enforcement component in the legislation, participants expressed a desire for law enforcement to have strong authority only in extreme instances, such as the broadcasting of live attacks.

Participants described their experiences with social media content reporting systems as slow, unresponsive, and uncooperative. Many participants noted the need for platform moderation systems to be more active in addressing content that promotes hate and violence.

There was general support for a risk-based approach, although participants raised several questions regarding how it could be implemented. A few participants noted concerns with the type of language that could be used to define certain harms. Participants were concerned that terms such as “extremism” or “radicalization” carry negative connotations towards racialized communities. A few participants expressed a desire for online safety legislation to adopt a greater focus on preventing online harm as opposed to addressing online harm.

Participants noted that women are a key target of online harm. The non-consensual sharing of intimate images and online blackmailing were highlighted in particular due to the impact they can have on the well-being of victims.

The international element of online harm was discussed by several participants. Many noted how child pornography and cyber blackmailing can originate from outside of Canada. Participants expressed frustration over the lack of recourse and tools available to victims in such instances and mentioned the need for a collaborative international effort to address online safety.

Participants agreed that social media platforms should be held responsible for addressing the spread of harmful content on their platforms. Many participants noted that social media platforms should be required to adopt measures such as putting disclaimer warnings on content or making stronger content moderation decisions. A few participants noted this was preferable to the Government taking a role in content moderation.

Windsor, Ontario

The roundtables returned to Ontario on October 24th with a session in Windsor. Participants discussed the importance of protecting vulnerable groups such as youth and immigrants, the role of law enforcement, and the user experience of reporting content to platforms.

Protecting children and youth was a key priority for participants due to how early and how often youth use online platforms. The mental health toll and the risk of sexual exploitation were identified as key dangers children and youth face online. A few participants mentioned the difficulties parents have in trying to protect their children from online harms due to the fast-paced rise of new platforms and trends.

Increasing digital literacy awareness was identified as a key priority to pursue to help protect vulnerable groups such as youth, seniors, and immigrants. A few participants suggested that partnerships could be developed with departments like Immigration, Refugees and Citizenship Canada (IRCC) to integrate digital literacy into training programs for new Canadians. Participants also discussed tailoring digital literacy materials to parents to help them teach their children how to be safe online.

Many participants indicated the need to adopt a broad regulatory scope flexible enough to include new platforms as the social media landscape evolves. Participants noted that regulating some platforms but not all carries the risk of harmful content moving to smaller, unregulated platforms.

Participants described challenges in communicating with platforms and obtaining information regarding content moderation decisions. There was wide support for requiring social media platforms to be more transparent in communicating their content moderation processes. Participants were supportive of having an ombudsperson to go to when they have an issue with a social media platform.

Participants acknowledged the difficulties law enforcement has in obtaining information from social media platforms. However, participants were mindful of the risks of discrimination and systemic racism that could arise with enhanced law enforcement powers.

Halifax, Nova Scotia

The roundtables headed back to the East Coast with a roundtable in Halifax on October 27th. Among the topics participants discussed were the connection between online and physical harm, the role of a digital safety commissioner, the need for human moderation, and the importance of digital literacy.

The risk of online harm translating into physical harm was flagged by several participants who encountered instances of in-person harm that can be attributed to misconceptions and ideas shared online. The mental health toll of online harm was also discussed by participants with some describing instances of youth being afraid to engage in the physical world due to the level of hate they see online.

Participants were supportive of establishing a regulator to address online harms and engage with social media platforms. Participants noted that a key challenge in combating disinformation and online harm is how quickly content can spread before being taken down. Concerns were expressed over the responsiveness of the regulator. Several participants were wary of bureaucratic delays, which could impede the effectiveness of the regulator.

Many participants were concerned about how reliant social media platforms are on AI moderation tools. Participants indicated a preference for greater human oversight and decision making in how social media platforms moderate content. A few participants suggested online safety legislation could require platforms to have compliance departments to handle content moderation requests.

Participants identified the need for greater digital literacy awareness and were supportive of enhancing resources for education on digital literacy to ensure people have the knowledge and tools to stay safe online. It was noted that people should know how to report content and who to go to for help. It was suggested that a proactive approach consisting of public education, public service announcements and services can help with digital safety awareness.

Participants were supportive of legislation adopting a risk-based approach and discussed elements they would like to see covered. One element raised by a few participants is the importance of capturing harmful content that is more difficult to detect through AI systems. Another area discussed was addressing instances where banned users create new accounts to continue the spread of online hate.

Participants discussed the role of law enforcement in addressing online harms. There was acknowledgement that public trust in law enforcement is currently low, and that law enforcement must step up to regain it. Some participants mentioned that law enforcement should be part of the solution to addressing online harms by being given more tools and training in cyber-related crimes. A few participants suggested that, rather than cyber investigations being handled by a single officer, a committee approach would better minimize the risk of personal biases impacting the actions of law enforcement.

Northern Canada

The regional roundtables concluded on November 9th with a hybrid roundtable held in the North consisting of participants from across Canada’s three territories. The in-person portion of the roundtable was in Whitehorse, Yukon, while the virtual portion was held over MS Teams for participants from outside Whitehorse. Participants discussed their experiences reporting content to social media platforms, the need for greater digital literacy knowledge and concerns regarding providing more authority to law enforcement.

Participants described issues they encountered regarding how social media platforms moderated content. Participants found that the onus was often on users to report and manage content in online spaces, such as group chats and private message boards. Participants experienced difficulties communicating directly with social media platforms regarding content moderation decisions. It was noted that a regulator could help address this issue.

Many participants expressed general support for a risk-based approach and a digital safety regulator. However, a few participants raised concerns as to whether administrative and bureaucratic barriers could impact the effectiveness of the regulator, resulting in delays in handling user complaints.

Participants discussed their experiences with seeing and receiving harmful online content. It was found that marginalized communities, such as visible minorities and the 2SLGBTQI+ community, often receive harmful comments. A few participants flagged that comment sections of news articles are an area where harmful content is posted.

Participants discussed the importance of providing digital literacy education to youth and members of remote communities. Youth were identified as a particular group on which to focus educational resources due to their vulnerability online. Many participants suggested consulting directly with youth to learn from their firsthand experience navigating online platforms. It was also noted that further guidance and educational materials should be developed to assist teachers and organizations in teaching Canadians how to stay safe online.

Many participants expressed clear and strong concerns about providing law enforcement with greater resources and authority in handling online harms. They raised the lack of cultural awareness among many law enforcement officers posted in the North. Participants described their difficulties when approaching law enforcement and suggested creating a body with the right tools, training, and knowledge of the online safety framework rather than relying on law enforcement.

Thematic Roundtables

Antisemitism

The first thematic roundtable was held on August 16th on the topic of Antisemitism. Participants representing organizations from the Jewish community across Canada were invited to discuss their experiences with online harms and antisemitism. Participants discussed the risk of online harm translating into physical harm, the need for educational institutions to take on a greater role in combating online harm, and skepticism over the ability of social media platforms to self-regulate online harm.

Participants emphasized the importance of education and educational institutions in combating online harm. Participants discussed the role that digital literacy education and cultural education can have in addressing the spread of disinformation. A few participants expressed a desire for educational institutions such as universities to make further efforts to address instances of online harm within the student community. A few participants described instances of post-secondary institutions not acting on antisemitic incidents because they occurred online as opposed to on campus. It was noted that further federal-provincial collaboration is needed to increase digital literacy education.

Participants discussed the role algorithms have in contributing to the spread of disinformation and the radicalization of individuals. The opaque nature of algorithms and how they expose readers to conspiracy theories concerned many participants. They indicated that AI-suggested content could radicalize readers without their knowledge.

Participants questioned the ability of social media platforms to self-regulate harmful content on their platforms. Several participants pointed to the role of platform algorithms in spreading harmful material, platforms’ past track record in addressing harmful content, and the traffic harmful content can generate.

Many participants noted that harmful content should be viewed holistically, rather than as separate pieces of content, to understand how it contributes to an unsafe online environment. Participants discussed how online harm aggregates, which can leave people unwilling to express their views online or contribute to online platforms. They also discussed their experiences seeing antisemitic content online, and a few participants remarked on the different forms in which it is posted and shared, including videos and subtle references within posts.

The risk of online harm manifesting in real-world consequences and physical harm for victims was discussed by several participants. One risk identified was the potential for online users to locate and confront victims in person based on information posted on social media platforms. A few participants shared instances of online harm turning into in-person confrontations on university campuses.

The role anonymity has in emboldening the spread of online harm was flagged by many participants. It was suggested that lessening the anonymity people have online would help bring social media spaces towards a more collegial town-square environment. A few participants suggested user verification systems could be a solution to decreasing anonymity.

Participants discussed elements they would like to see included in online safety legislation. Participants believed the legislation should be broad enough to include all platforms, as opposed to having a narrow scope. Many participants were supportive of a regulator having the ability to hold social media platforms accountable for their actions. It was suggested that the regulator should be an arm’s-length body with the necessary resources to fulfil its duties.

Anti-Black Racism

The second thematic roundtable was hosted on September 28th on the topic of Anti-Black Racism. Participants discussed the impact of disinformation, the need for greater transparency from social media platforms, the inaccessible nature of current social media content reporting systems, and the need for continuous consultations in shaping the legislation.

Participants discussed the rising amount of Anti-Black content posted in online spaces. A few participants noted that they have seen an increase in negative comments and disinformation compared to previous years. Participants discussed the different platforms on which, and the different forms in which, Anti-Black content is shared online. It was noted that marginalized youth should be at the forefront of online safety discussions due to the volume of content they consume online.

Some participants noted that online safety legislation should look beyond hate speech to include content that may not qualify as hate speech but could still cause mental trauma and stress for victims, such as harassment or microaggressions.

The spread of disinformation was a key concern for many participants, as it can lead to the spread of hate speech. Participants noted that many people do not have the time or ability to fact-check the information they see on social media, which can allow disinformation to evolve into hate speech as it spreads unchallenged. Participants also noted the sophistication of disinformation, which can make it difficult for people to determine whether a piece of content comes from a reliable source. Many participants suggested that disinformation should be a key focus of online safety legislation.

Many participants were frustrated at the content moderation systems social media platforms currently use. Several participants mentioned how content reporting systems are slow to respond to requests and rely on AI systems rather than human interactions, which can result in important contextual information being overlooked by AI systems. Participants also raised concerns that current reporting systems are not accessible for marginalized communities who may not have the knowledge base or ability to navigate the reporting systems social media platforms have.

Participants discussed the need for social media platforms to be more transparent in their operations, including how personal data is collected and used, as well as their content moderation decisions. Participants favoured requiring social media platforms to provide more information on their moderation decisions, creating a feedback loop in which users are informed of the factors that contributed to a content moderation decision. A few participants suggested requiring platforms to flag or include messages on content that is under review to notify users that the content is being examined.

Participants were supportive of establishing a digital safety ombudsperson to hold social media platforms accountable and to serve as a venue for victims to report online harms. It was suggested the ombudsperson could take in victim complaints and work with the corresponding platform or governmental body to resolve them. Some participants expressed concern over the ombudsperson’s ability to process and respond to user complaints in a timely manner. To be effective, participants believed the body would need enough resources to keep pace with the complaints it receives. A few participants also noted the importance of the ombudsperson being trained in cultural nuances to understand the cultural contexts behind content reported to them.

Participants expressed concern at expanding the authority of law enforcement in online safety legislation. A few participants noted that victims may be less inclined to report online harm due to fear of having personal data entered into police databases or the risk of undue use of police force. Having an ombudsperson as the entity that engages law enforcement was seen as a stronger alternative.

Participants agreed that continuous consultations should be held with marginalized communities and grassroots organizations to ensure future online safety legislation and the work of the ombudsperson are responsive to user experiences.

Islamophobia

On September 29th, the third thematic roundtable took place on the topic of Islamophobia. Participants from organizations representing the Muslim community were invited to participate. Topics discussed included the need for government action on online hate, the connection between online and in-person harm, the role of anonymity in emboldening harmful content and the need for creating a regulatory entity.

Participants agreed that action to address online hate was urgent. Participants discussed seeing an uptick within their communities in people seeking recourse and help in dealing with hate. It was noted that Muslim women are more likely to experience online harm due to their intersectional identities, which make them more identifiable.

Participants noted the connection between online harm and in-person, physical harm. Several participants noted that a hostile online environment is silencing voices, as people are fearful of being targeted. Participants shared stories of people using personal information found online to target others both virtually and in person.

A few participants discussed the role of online anonymity in emboldening users to post harmful content, as they face few consequences for doing so. One proposed solution was for platforms to introduce user authentication, requiring users to verify their identity before being able to post.

Participants were supportive of establishing a regulator with the power to impose penalties on social media platforms, educate the public, and recommend action to government. It was mentioned that penalties would need to be substantial enough to drive change from social media platforms. A few participants noted that the regulator’s staff should receive cultural sensitivity training to ensure they can effectively address content where cultural understanding is needed. Participants also highlighted the need for the regulator’s powers to be clearly defined and publicly known.

Many participants were supportive of having trusted flaggers assist social media platforms and a potential regulator. Some participants questioned who would qualify as a trusted flagger and what kind of action a commissioner could take on their advice.

A few participants were apprehensive about the inclusion of terrorist content in the legislation and urged caution in how that category is defined. There were fears that a definition of terrorist content could be applied in ways that discriminate against marginalized communities.

Anti-Asian Racism

The next thematic roundtable was held on October 13th on the topic of Anti-Asian Racism. Participants representing organizations from the Asian Canadian community were invited to share their experiences and their views on the upcoming online safety legislation. Issues discussed included the connection between online and physical harm, the need for more victim support, the scope of future legislation, and the importance of continued dialogue with civil society as online safety legislation is developed.

Participants discussed the sharp rise in anti-Asian hate since the start of the pandemic, which has affected Asian Canadians in ways ranging from harms to physical and mental health to socio-economic consequences. Participants noted that the mental health of young Asian Canadians has been particularly affected, with anti-Asian hate contributing to some feeling ashamed of their identity.

The connection between online hate and physical harm was discussed by participants, with many describing instances where online harm affected victims’ lives. Participants also described the multifaceted nature of harm, which can be physical, mental, emotional, psychological, and social.

Many participants expressed frustration at the lack of resources and recourse available for reporting anti-Asian hate. Several participants shared their hesitation to contact law enforcement about online hate, citing past police inaction. As a result, victims of online hate often do not know where to turn.

Participants expressed a desire for online safety legislation to place a greater focus on supporting victims. There was strong interest in ensuring the legislation incorporates victims’ perspectives when establishing the legislative obligations of social media platforms and when creating reporting tools for victims.

Several participants indicated the need for a two-tiered government approach to online hate: first, tackling the sources of online hate and disinformation in the darker corners of the internet, and second, addressing the rapid dissemination of that content on more mainstream platforms. It was mentioned that hateful rhetoric incubates on smaller sites such as 4chan and Kiwi Farms before spreading to the wider public on platforms such as Facebook and Instagram.

A few participants identified news outlets as contributing to the spread of anti-Asian racism through their coverage of certain stories and through their online comment sections. The comment sections of online news articles were flagged as a space where hateful content can spread and, participants argued, should be covered by future legislation.

Many participants noted the need for a comprehensive approach to addressing online harm that includes strong enforcement measures to ensure compliance. Participants stressed that future legislation should be forward-looking so that it reflects changes in the online space as it evolves. Participants were supportive of establishing an ombudsperson for victims and noted that the office must be accessible to all Canadians.

Participants urged the government to move forward with online safety legislation and expressed a desire to remain engaged. They voiced strong support for legislation aiming to create a safer online space and hold social media platforms accountable, and noted that civil society can provide input and feedback as the legislation is developed and implemented. There was also a strong desire for further updates from Canadian Heritage on the development of the legislation.

Women and Gender-Based Violence

The next thematic roundtable, held on November 3rd, focused on women and gender-based violence. Representatives from women’s organizations and individual citizens were invited to discuss their experiences with online gender-based violence. Participants discussed the challenges victims of online harms face, the sources of online harm, and the track record of social media platforms in moderating harmful content. Some participants also shared their personal experiences with online hate and how it affected their daily lives.

Participants raised concerns about the level of vitriol in online spaces, which can lead to women and gender-diverse people being silenced in the public sphere for fear of being targeted by online hate. Participants discussed the role of anonymity in emboldening people to promote views and behave in ways they otherwise would not in person.

Participants discussed the challenges victims face in reporting cases of online harm, including inaction and a lack of understanding by law enforcement, slow response times by social media platforms, a lack of available resources, and the risk of experiencing further trauma. Participants recounted having posts promoting events or sharing information met with online abuse, death threats, and harmful comments. Some shared that they required protective services as a result of online harassment; one described being physically at risk and needing a police escort off campus after giving a presentation that was live-streamed on Facebook. Some raised that even when they reported harmful comments and the platform removed the content, the posters simply moved to another platform. Participants described frustration at the lack of options available to them to handle such incidents.

Many participants highlighted the need to regulate smaller platforms, as online hate tends to originate there before making its way to larger mainstream platforms. Participants discussed the role of social media in giving individuals with misogynist views a platform to spread their rhetoric, which risks radicalizing and emboldening parts of the population.

Several participants expressed frustration at the time platforms take to respond to reported content. It was noted that this delay allows hateful content to reach a large audience before being taken down. Some participants expressed skepticism that a risk-based approach could improve platform response times.

Participants emphasized the need for preventative measures against online harm rather than waiting to respond to harm after it occurs. Measures mentioned included digital literacy campaigns, increased education on how people should behave online, and funding for academic and civil society groups to conduct research and develop educational resources.

Participants shared reservations about allowing social media platforms to self-regulate their content, given their track record in responding to reported content. The continued use of AI and algorithms also drew skepticism because of the way these systems are used to spread harmful content to users.

Participants were supportive of establishing an ombudsperson to engage with social media platforms on behalf of Canadians. It was noted that ensuring the ombudsperson is accessible to Canadians is vital to its effectiveness in holding social media platforms accountable for content moderation decisions.

Participants expressed concern at the prospect of granting further authority to law enforcement in the online safety legislation. Participants had serious hesitations about the involvement of law enforcement and intelligence agencies given the absence of trust between equity-deserving communities and law enforcement. They found law enforcement unresponsive to their concerns or insufficiently active in addressing cases of online harm. Some shared that, in their experience, going to the police resulted in more harm, not protection. If law enforcement is included in the framework, some participants expressed a desire to see more coordination between law enforcement and the judiciary in addressing cases of online harm.

Participants were generally supportive of a risk-based approach to promoting online safety and proposed a few additions that could further strengthen an online safety framework, including establishing clear standards and guidance for how platforms should conduct their risk assessments and embedding a trauma-informed, intersectional, rights-based approach in the framework.

Tech Industry

The last virtual roundtable was hosted on November 17th with representatives from the tech industry. Participants discussed the measures platforms currently take to promote online safety, the need for legislation to reflect the diversity of platforms, and the need for greater clarity on certain elements of the government’s approach to online safety.

Participants discussed the various measures their respective platforms have put in place to protect users, including content moderation processes, age-specific content restrictions, age-appropriate design features, and specialized platforms for children. A few participants noted the importance of educating users on how platforms operate and were supportive of further educational campaigns and resources to inform users about how the online space works.

Participants stressed the importance of online safety legislation accounting for the differences between platforms. Participants noted many factors to consider when evaluating these differences, including the control a platform has over its content, the purpose of the platform, and its user base. Participants were supportive of a risk-based approach that is flexible and reflects the diversity within the industry.

Many participants sought greater clarity on the definitions and obligations they would face under a potential legislative framework in order to implement and meet those obligations. Participants were concerned that overly broad definitions would create challenges when operationalized in their content moderation processes. Participants expressed a desire for flexibility in how they can meet their obligations so the framework remains adaptable to different types of platforms. Several participants flagged the importance of including intermediary liability protection in a legislative framework.

Participants sought clarity on the role and responsibilities of a digital safety regulator. Participants noted the regulator should have the resources needed to handle its workload and a strong understanding of the tech industry. Participants were concerned that a regulator may not have the capacity to handle a high volume of user complaints. Several participants expressed reservations about being required to reinstate content they had removed for violating their platforms’ terms and conditions.

Indigenous Communities

On November 24, Archipel, a third-party organization, held a sharing circle in collaboration with Canadian Heritage with Indigenous victims and advocacy groups on the key elements of an online safety framework.

Next Steps

The feedback gathered during the roundtables provided valuable information and insights on what Canadians and key stakeholders hope to see included in future legislation. Canadian Heritage is drawing on the information collected during the roundtables in developing policy and legislation. Pursuant to his mandate letter, the Minister of Canadian Heritage continues to work with the Minister of Justice and Attorney General of Canada to table legislation, as soon as possible, that aims to protect children, communities, equity-deserving groups, and all Canadians online.

Annex

Organizations that participated in the roundtables on online safety from July to November 2022

Regional Roundtables

Charlottetown, Prince Edward Island
  1. Black, Indigenous, People of Colour United for Strength, Home Relationship
  2. Native Council of Prince Edward Island
  3. Women’s Network PEI
  4. Charlottetown Chamber of Commerce
  5. Adventure Group
  6. P.E.I.'s Public School Branch
  7. Charlottetown Police
  8. PEI Transgender Network
  9. Muslim Society of P.E.I.
Moncton, New Brunswick
  1. Moncton Muslim Association
  2. La Société de l’Acadie du Nouveau-Brunswick
  3. Tiferes Israël Synagogue
  4. New Brunswick Multicultural Council
  5. Fédération des jeunes francophones du Nouveau-Brunswick
  6. YWCA
  7. Indo-Canada Association of the Greater Moncton Area
  8. Pays de la Sagouine
St. John’s, Newfoundland and Labrador
  1. The Pathways Foundation
  2. First Light: St. John’s Friendship Centre
  3. Lifewise NL
  4. NL Sexual Assault Crisis and Prevention Centre
  5. Muslim Association of Newfoundland and Labrador
  6. THRIVE Community Youth Network
  7. YWCA
Quebec City, Québec
  1. Programme d’encadrement clinique et d’hébergement
  2. Centre culturel islamique de Québec
  3. SOS suicide jeunesse
  4. Children First Canada
  5. R.I.R.E.
  6. GRIS-Québec
Montreal, Québec
  1. Holocaust education & genocide prevention foundation
  2. Les AMIS/FRIENDS
  3. Canadian Muslim Forum
  4. Montreal Institute for Genocide and Human Rights Studies, Concordia University
  5. Canadian Council of Muslim Women- Montreal Chapter
  6. Jewish Community Council of Montreal
  7. Service à la famille chinoise du grand Montréal
  8. La ligue des Noirs du Québec
  9. Fondation Marie-Vincent
  10. Fondation Jasmin Roy
  11. Les 3 sex
  12. Centre for Israel & Jewish Affairs (CIJA)
  13. Centre de recherche-action sur les relations raciales
Edmonton, Alberta
  1. Mexican Society of Edmonton
  2. LEAF Edmonton
  3. Ribbon Rouge Foundation
  4. Support our Students
  5. REACH Edmonton
  6. Young Canadians’ Parliament
  7. National Council of Canadian Muslims
  8. Islamic Family and Social Services Association
  9. Stride Advocacy
  10. Organization for Prevention for Violence
  11. Don’t Click Youth Group
Saskatoon, Saskatchewan
  1. La Fédération des francophones de Saskatoon
  2. National Council of Canadian Muslims
  3. Concentus Citizenship Education Foundation
  4. Out Saskatoon
  5. Truly Alive Youth and Family Foundation
  6. Saskatoon Indian and Métis Friendship Centre
  7. Saskatchewan Human Rights Commission
Niagara, Ontario
  1. TOES Niagara
  2. Welland Heritage Council and Multicultural Centre
  3. YWCA Niagara
  4. Niagara Region Anti Racism Association
  5. Gillian’s Place
  6. Brock University - Human Rights and Equity Office
  7. Pathstone Mental Health
  8. Future Black Female
  9. Niagara Sexual Assault Centre
Surrey, British Columbia
  1. Amanda Todd Legacy Society
  2. OpenMedia
  3. Organized Crime Agency of BC
  4. Constituency Youth Council
  5. Digital Democracies Institute at Simon Fraser University
  6. University of British Columbia Library (emeritus)
  7. Surrey Schools
  8. KidsPlay Foundation
Winnipeg, Manitoba
  1. N.E.E.D.S Inc
  2. Bilal Community and Family Centre Inc
  3. Magnet Strategy Group Inc.
  4. University of Manitoba
  5. Winnipeg RISE
  6. Teen Stop Jeunesse Inc
  7. General Counsel at Canadian Centre for Child Protection
  8. Ethnocultural Council of Manitoba
  9. University of Manitoba Students' Union
  10. Islamic Social Services Association
  11. University of Winnipeg
  12. Youth Parliament of Manitoba
  13. African Communities of Manitoba Inc
Windsor, Ontario
  1. Big Brothers Big Sisters Windsor Essex
  2. India Canada Association Windsor
  3. Épelle-Moi Canada
  4. Essex County Chinese Canadian Association
  5. New Beginnings Essex County
  6. Hiatus House
  7. The Safety Village
  8. The Windsor Youth Centre
  9. Windsor Essex Child/Youth Advocacy Centre
  10. Windsor Women Working With Immigrant Women
Halifax, Nova Scotia
  1. Atlantic Region Association of Immigrant Serving Agencies
  2. Sabeel Muslim Youth and Community Center
  3. Saint Mary University’s Muslim Society
  4. Atlantic Jewish Council
  5. Hate Crime Unit, Halifax Regional Police
  6. Black Cultural Centre for Nova Scotia
Northern Canada
  1. Canadian Mental Health Association, Yukon Division
  2. Yukon Status of Women Council
  3. Pinnguaq Association
  4. Nunavut Black History Society
  5. Aqqiumavvik Society
  6. Donald Suluk Library
  7. Arctic Afro Cultural Association
  8. Black Advocacy Coalition
  9. City of Iqaluit
  10. Northern Mosaic Network

Thematic Roundtables

Antisemitism
  1. Centre for Israel and Jewish Affairs
  2. Zelikovitz Centre for Jewish Studies
  3. Canadian Anti-Hate Network
  4. B'nai Brith Canada
  5. Atlantic Jewish Council
  6. Areto Lab
  7. Jewish Federation of Ottawa
  8. Laad Canada
  9. Hillel Ontario
  10. JSpaceCanada
  11. Friends of Simon Wiesenthal Center for Holocaust Studies
Anti-Black Racism
  1. African Canadian Civic Engagement Council
  2. Black Opportunity Fund
  3. Black Diplomats Academy Fellow
  4. Black Cultural Centre for Nova Scotia
Islamophobia
  1. Centre Culturel Islamique de Québec
  2. Coalition of Muslim Women of Kitchener-Waterloo
  3. National Council of Canadian Muslims
  4. Muslim Advisory Council of Canada
  5. Canadian Council of Muslim Women
  6. African Civic Engagement Council
  7. Islamic Society of Kingston
  8. Canadian Muslim Forum
  9. Canadian Arab Institute
  10. National Black Coalition of Canada Society/Somali Canadian Cultural Society Edmonton
Anti-Asian Racism
  1. Chinese Canadian National Council for Social Justice
  2. Centre Sino-Québec de la Rive-Sud
  3. ACCT Foundation
  4. SUPRE Incorporated
  5. Canada Committee 100 Society
  6. Hong Fook Mental Health Association
  7. Chinese Canadian Heritage and Future Foundation
  8. Canadian Race Relations Foundation
Women and Gender-Based Violence
  1. Ending Violence Association
  2. Canadian Women’s Foundation
  3. Women’s Shelters Canada
  4. YWCA
  5. LEAF
  6. Canadian Alliance for Sex Work Law Reform
  7. Enchanté Network-Le Réseau Enchanté
  8. The Canadian Centre for Gender & Sexual Diversity
  9. Justice Trans
  10. Council of Agencies Serving South Asians
  11. Canadian Council of Muslim Women
  12. Luke’s Place
  13. Ottawa U
  14. BC Society Of Transition Houses
  15. Women’s Health in Women’s Hands
  16. White Ribbon
Tech Industry
  1. TikTok
  2. Google Canada
  3. YouTube Canada
  4. Microsoft
  5. Wattpad/Rubicon Strategy
  6. Tucows
  7. TechNation
  8. Competitive Network Operators Canada
  9. Entertainment Software Association of Canada
  10. Wikimedia Foundation
  11. Twitch
  12. Meta
  13. MindGeek
