Summary Report of the Meeting of the National Security Transparency Advisory Group – March 10, 2023
Held via Videoconference
NS-TAG Members Present:
- Lesley Soper (co-chair)
- Jeffrey Roy
- Stéphane Leman-Langlois
- Mary Francoli (co-chair)
- Lorelei Williams
- Rizwan Mohammad
- Amira Elghawaby
- Daniel Jean
- Chantal Bernier
- John Ariyo
- Jillian Stirk
Meeting Theme: “The Use of Digital Tools in the Protection of National Security”
Invited Guests and Speakers:
- Wendy Cukier – Founder and Academic Director of the Diversity Institute, Academic Director of the Woman Entrepreneurship Knowledge Hub, and Research Lead of the Future Skills Centre
- Sarah Shoker – Research Scientist at OpenAI
National Security Community Members Present (as observers):
Canada Border Services Agency (CBSA), Communications Security Establishment (CSE), Financial Transactions and Reports Analysis Centre of Canada (FINTRAC), Global Affairs Canada (GAC), Public Safety Canada (PS), Royal Canadian Mounted Police (RCMP), Transport Canada (TC), Canadian Security Intelligence Service (CSIS), Treasury Board of Canada Secretariat (TBS), Canadian Armed Forces (CAF), Department of National Defence (DND).
Agenda:
- Opening of the Meeting and Roll Call
- Guest Speaker Session: “The Use of Digital Tools in the Protection of National Security”
- Private NS-TAG Discussion
The twenty-fourth virtual NS-TAG meeting took place on March 10, 2023, on the theme of “The Use of Digital Tools in the Protection of National Security”. The session explored measures the national security community can take to build its knowledge of, and ability to use, digital tools during a period of rapid technological change. The guest speakers also discussed how GBA+ methodology can minimize the adverse impacts that new technologies could have on diverse communities.
Key Takeaways of the Discussion Session:
- The discussion began with the guest speakers addressing how gaps in diversity within the national security community affect organizational development. NS institutions are struggling to fully understand and mitigate the challenges posed by the pace of technological change, a difficulty that is particularly evident in hiring and recruitment processes.
- Guest speakers argued that closing diversity gaps requires a concrete long-term goal. This would involve building trust with marginalized communities and re-evaluating bilingualism requirements in the recruitment of employees from equity-deserving communities.
- NS-TAG members highlighted the occupational segregation that exists within bureaucratic systems, particularly at senior levels. This issue can be addressed by changing hiring initiatives and ensuring that the public service reflects Canadian society.
- NS-TAG members also raised issues of religious diversity within organizations, specifically how Muslim Canadians have been adversely affected by technology and its expanded use. Some members noted that forms of Islamophobia are not uncommon in conversations about national security.
- The guest speakers suggested two approaches to tackle this issue:
- Adopting a trauma-informed approach when working with communities with a history of intergenerational discrimination; and,
- Ensuring programs are designed with an inclusive approach in every facet of the organization's structure, including the language used in job descriptions, the treatment of candidates in interviews, and the interpretation of behaviors.
- The discussion transitioned to the risks associated with AI tools in national security agencies. The guest speakers noted that AI integrated into language-learning apps and tools tends to reinforce bias, and that factors such as geographical location, community size, and language dialect pose direct challenges to AI performance.
- Trust in and reliance on AI tools, as well as mis-, dis-, and malinformation, were also flagged as areas of concern. In the absence of an international code of conduct governing the use of AI tools, AI can complicate the work of government officials seeking to mitigate threats to social cohesion.
- Asked what methods international jurisdictions can use to monitor algorithms and guard against AI bias, the guest speakers suggested algorithm auditing and external review bodies. Through these methods, external auditors can access AI systems to track their inputs and outputs, helping to identify gaps or biases in the system so that corrective action can be taken.
- The discussion concluded with NS-TAG members reiterating the importance of accountability, both in closing diversity gaps in bureaucratic employment processes and in the use of AI tools. They emphasized the need to establish accountability measures not only by improving internal review systems, but also through expanded consultation with external stakeholders.