Remarks by Rumina Velshi at the 2021+1 ICRP Symposium



Good morning.

Before beginning my remarks, I would like to acknowledge that we are gathered today on the shared, unceded and ancestral territories of the Musqueam, Squamish and Tsleil-Waututh Nations.

I want to start today’s proceedings and my remarks with some questions. What is safe? How do we decide what actions are safe enough? When does a reasonable risk veer into an unreasonable one? Who makes this decision?

In the case of the use of nuclear energy and substances in Canada, these questions fall to the Canadian Nuclear Safety Commission. The CNSC is Canada’s nuclear regulator and is led by the Commission, supported by more than 800 scientific, technical, and professional staff. We regulate the full lifecycle of nuclear facilities and activities in Canada.

The Commission is an independent quasi-judicial administrative tribunal, structured to minimize inappropriate influence from Government.

The Commission makes decisions on licensing as well as on significant regulatory framework issues. This includes performing a challenge function with respect to licensees, proponents, and our own staff, in order to ensure the protection of people and the environment from the potential harms of nuclear materials and substances.

Our job as a Commission is to prevent unreasonable risks – to people, to the environment, and to Canada’s national security. Risk management is our business.

I want to speak to you today about how our unstated assumptions – our values and ethics, our personal risk tolerances, and our subject matter expertise – influence the decisions we make and the actions we take. More and more, those in expert roles are being called on to share the rationale for their thinking and their recommendations. Public trust in expertise is under assault. It’s vital that we all make concerted efforts to meet what the moment demands.

I will offer my thoughts on how international organizations like the ICRP and regulators like the CNSC can be more transparent with their thinking, and can work to build trust in their recommendations, decisions, and actions.

I will also speak to my views on key focuses for the revision to the ICRP’s System of Radiological Protection, and the opportunities it offers to combine it with local considerations such as the integration of Indigenous Knowledge to bring the best recommendations to life in supervision and regulation. More and more, we must be mindful that taking a more holistic approach in decision-making – above and beyond science alone – is what is expected in meeting the needs of those we serve.

What is ‘safe’?

So let me begin by returning to my original question. What is safe?

Safe is a relative term. It is subjective and depends on our individual perspective of risk, meaning that people will often have different estimates of what is safe when confronted with the same risk. There is no such thing as absolute safety.

Consider a vehicle travelling at a speed of 100 kilometres per hour. On a back road, that is excessive. But on a highway? That’s normal. It may in fact be more dangerous to travel at a lower speed in that environment.

Vaccines are not administered without stringent testing, and, as with any medical intervention, they carry risks. But weighed against the spectre of COVID-19, the vaccines are safe. I will return to COVID-19 later in my talk, but again – the central point is that absolute safety never exists. ‘Safe’ is a relative term.

Our own enabling legislation – the Nuclear Safety and Control Act – does not define ‘safe’. So how do we judge what we deem to be ‘safe’?

What is risk?

We make a risk assessment. We weigh outcomes against one another, based on the best available information and the context in which we operate.

The traditional formula for risk is “Probability times Impact equals Risk”: the likelihood of an event occurring, multiplied by the consequences posed by that event.
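To make that concrete, here is a minimal sketch in Python; the probabilities and impact values are hypothetical, chosen purely for illustration:

```python
# A minimal sketch of "Risk = Probability x Impact".
# All numbers below are hypothetical, for illustration only.

def risk(probability: float, impact: float) -> float:
    """Risk as the likelihood of an event times its consequences."""
    return probability * impact

# A rare event with severe consequences...
rare_severe = risk(probability=0.001, impact=1000.0)
# ...can carry the same calculated risk as a frequent, minor one.
frequent_minor = risk(probability=0.5, impact=2.0)

print(rare_severe, frequent_minor)
```

The calculation itself is trivial; the hard part, as these remarks go on to argue, is judging whether the resulting number represents a reasonable risk.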

But after we do the calculation, how do we decide what constitutes a reasonable versus an unreasonable risk?

This is where values & ethics enter the discussion. The calculus of risk appetite – the amount of risk we are willing to bear – is ultimately a product of our values, our ethics, and our personal circumstances.

How values and ethics inform our actions

The CNSC mandate is to prevent ‘unreasonable risk’. Implicit in that mandate is that Commission Members weigh competing information and views filtered through their values & ethics.

And this is critical – we are not automatons, crunching numbers and setting simple thresholds. Each Commission Member brings to the table a unique experience and perspective.

And we have wrestled with these issues over time. In regulatory decision-making, science is but one aspect of several to consider.

One example that comes to mind is from 2011.

Bruce Power, a licensee which operates one of the largest nuclear generating stations in the world, sought permission to transport 16 decommissioned steam generators to Sweden for recycling.

The Commission would not have had to render a decision at all had the steam generators fit in approved transport packaging, but, each being the size of a school bus, they required a special arrangement.

Each steam generator weighed 100 tonnes. But the radioactive substances measured less than 4 grams by weight and carried less radioactivity than an historic nuclear-powered pacemaker.

All doses – whether to members of the public driving or walking past the steam generators in transit, to the driver of the vehicle, or to the ship’s crew for the duration of the trip from Canada to Sweden – were estimated to be well below regulatory limits and, in the case of the public, negligible.

All openings within the steam generators were welded shut, such that the vessel served as its own containment and shielding.

The technical case was entirely sound. The Commission decided that the transit could be made safely.

But this was not before a Commission hearing that included 77 intervenors with a variety of interests – many of whom fundamentally believed that the transit was unsafe, that the process for approval was out of order, and that approving this one shipment might lead to many more.

CNSC staff were called before municipal councils and before Parliamentary Committees to explain the Commission’s decision.

Concerns abounded about drinking water contamination, the potential for road or marine accidents, and approaches for nuclear waste minimization.

Ultimately, Bruce Power decided not to proceed, despite having received Commission approval. Their decision was driven by public concerns – not the science.

Concerns about radioactivity are understandable. It is perhaps one of the most feared hazards there is – precisely, I would suggest, because people do not commonly understand it. It is intangible.

I come to this conclusion because these steam generators were far from the only dangerous goods transported by water. In 2009, 481,000 tonnes of gasoline were transported on the St. Lawrence Seaway and the Great Lakes. And not just gasoline.

Almost 21,000 tonnes of sulphuric acid. 3,200 tonnes of fertilizer. 14,000 tonnes of biofuels. And a further 638,000 tonnes of road fuels and petroleum oils.

All with much greater possibilities of disaster. But with minimal instances of emergency events.

What is it about this particular shipment – 1,600 tonnes of steel, of which only 64 grams, approximately 4 millionths of 1 percent, was radioactive – that led to such concern?
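As a quick back-of-the-envelope check of the fraction just quoted, using the figures stated above:

```python
# Check: 64 grams of radioactive material within 1,600 tonnes of steel.
GRAMS_PER_TONNE = 1_000_000

total_steel_g = 1_600 * GRAMS_PER_TONNE  # 1.6 billion grams
radioactive_g = 64

fraction = radioactive_g / total_steel_g  # roughly 4e-8 of the mass
percent = fraction * 100                  # roughly 4e-6 percent,
                                          # i.e. about 4 millionths of 1 percent
```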

I believe that for too long, we have ignored the anxieties and perceptions that so often come with nuclear. We have simply pushed them to the side as ‘unscientific’.

And when I started drafting these remarks, it is this case I returned to frequently to think about how values & ethics drive us.

In my view, one of the values driving some of these concerns ultimately related to anti-nuclear thought. But above and beyond that, you can tease out some of the values driving concern: environmental protection; clean drinking water; a belief that those who create waste should manage it entirely themselves; safety; security.

And so, in sharing this example, I want to emphasize the importance of being clear on the rationale for our decisions and thoughts – expressed in terms of values & ethics – as a key method for fostering public trust and bringing people with us in our decision-making. This means going beyond “the science” and into the realm of values and the factors behind how people make decisions every day. It is very rarely a simple mathematical risk calculation.

We should seek to demonstrate how our decisions have espoused the values of those concerned. Perhaps the Commission’s decision on the steam generators could have better demonstrated that the Commission shared many of the values driving the concerns – for example, the importance of clean drinking water – and that the scientific information underpinning the decision was filtered through those values.

More and more, I have been thinking about the need to better document what we recommend as experts – and to do a better job of explaining why we recommend it. Again – through the lens of values and ethics.

And the reason I have been thinking about this more and more is due to a mounting threat in society: that of disinformation and misinformation.

The threat of disinformation and misinformation

Misinformation has of course been brought to the fore by recent events such as the COVID-19 pandemic and the ongoing Russia-Ukraine conflict.

During the early months of the COVID-19 pandemic, 90% of Canadians used online sources to find information about COVID-19. Of these, a striking 96% of users said they saw COVID-19 information they believed to be misleading, and 40% of respondents admitted to believing information that later turned out to be false.

As the pandemic progressed, misinformation became a more distinct threat to population health in the form of vaccine misinformation. The scientific consensus is clear that vaccination is the best way to fight back against COVID-19.

Misinformation gained a greater foothold in the population as recommendations regarding vaccines changed over time. Vaccines were at first viewed as a means of preventing infection; over time, it became clear that while they might not prevent infection, they remained the best option for preventing severe outcomes.

In Canada, the story of the AstraZeneca vaccine further complicated matters. As stories of vaccine-induced blood clots circulated, guidance shifted to providing the AZ vaccine to only those at a reduced risk for these outcomes. Eventually, AZ administration was largely discontinued in Canada.

Unfortunately, some took this as evidence that the vaccines were inadequately tested, and vaccine hesitancy grew – spurred on by disinformation and misinformation.

What was missing here was clear: an acknowledgement of uncertainty. Scientists were working at warp speed to develop treatments and vaccines. Real-world evidence was informing public policy and public health interventions in real-time. And it paid dividends.

I am reminded of Dr. Michael Ryan, Executive Director of the World Health Organization’s Health Emergencies Programme saying in the early days of the pandemic that:

“If you need to be right before you move, you will never win ... Speed trumps perfection.”

This is what I mean when I say that we must be clear on the values driving recommendations and decisions. In the situation we were facing with COVID, an urgent response was the priority.

Implicit in expressing our values is also being open and honest about uncertainty.

A key focus of the CNSC in the past few years has been examining how to build and maintain trust in a frazzled information environment. What some have termed an ‘infodemic’.

What we have come back to is the importance of being clear where we are making a value judgment.

It means being clear that we understand that ‘reasonable’ and ‘unreasonable’ risks are value judgments, based on expert opinion, yes, but also our varied experiences and our sensitivity to the context.

At the CNSC, we are refocusing our attention on long-term engagement with key stakeholders and expanding our outreach in communities impacted by the decisions the Commission makes.

And building trust – by sharing what has informed our decisions – is at the core of that work.

Sharing the basis of our decisions – and communicating the science that has informed them – allows people to understand the underlying thinking. It teaches them about the considerations we make.

It equips them to think twice when they potentially encounter disinformation and misinformation.

And it sets the bar higher for those who would spread disinformation and misinformation, pressing them to outline what has informed the claims they are making.

This is a key aspect of how we can better equip the population and fight misinformation and disinformation.

And I am speaking today to issue the same challenge to all of you.

Explain your thinking.

Share your uncertainties.

I am deeply encouraged by the ICRP’s commitment to encouraging elaboration and documentation of how recommendations have been informed by ethical and value judgments. This is precisely the direction we need to go in.

Tell people how you’ve come to your decision, recommendation, or action.

Bring them along with you.

Disinformation and misinformation are challenges we must all confront as scientists, as regulators, and as practitioners. We cannot afford to ignore it. Where we stay silent, others will fill the void.

Optimisation and questioning our past assumptions


In thinking about how to address disinformation and misinformation, particularly through sharing our thinking, I believe that part of our work involves some introspection.

Inherent to sharing our thinking, our assumptions, and our values is ensuring that we are making the right call. It acts as a check on our instincts – typically, to follow past practice.

And so, in preparing these remarks I began to think about the three fundamental principles of the System of Radiological Protection: justification, optimisation, and dose limitation.

I want to speak now about optimisation, where I think we need to question our past assumptions.

As you know, the process for optimisation of protection expects that the likelihood of incurring exposures, the number of people exposed, and the magnitude of their individual doses should be kept as low as reasonably achievable, taking into account economic and societal factors.

Part of questioning our past assumptions involves asking uncomfortable questions. Like: does ALARA continue to be the right guidepost?

I am encouraged by the inclusion of this question in the Publication launching the review of the System of Radiological Protection.

The right answer is likely not a wholesale abandoning of this principle – but rather, embracing nuance. It is important that we consider a wide variety of factors when determining whether an activity is optimized, particularly in healthcare settings but also more generally.

From the regulators’ perspective, I acknowledge that we haven’t always done an ideal job in applying a graded approach. Part of this difficulty lies in what ALARA seems to demand: a dose as low as reasonably achievable. Implicit in this is an understanding that the acceptable dose is contingent on the purpose and effects. But this is tough to tease out. The revision to the System of Radiological Protection could well benefit from clearer guidance on what we mean when we say ‘reasonably achievable’. Regulators need to do some harder thinking as well on the scope of factors when considering the reasonableness of a dose.

I am encouraged by the workplan of Task Group 114, which will include reviewing the approaches used in other fields of toxicology. I propose to you that other relevant fields could include the regulation of pesticides – a field where similar considerations of ‘the appropriate dose’ matter a great deal – as well as pharmaceuticals regulation.

I also think that this is an appropriate time to ask whether being so stringent in our regulatory approaches has contributed to fears of radiation.

While uncertainty remains with regard to the risk of stochastic effects from radiation exposure, we have learned a great deal over time and continue to do so.

As scientists, practitioners, and policymakers, we need to be sensitive to the authority of our statements.

The revision to the System of Radiological Protection is an ideal time to consider whether our stern approach has fostered the fear we see embodied in recent documentaries like Meltdown: Three Mile Island.

Three Mile Island is an excellent example of minimal risk plus poor communication resulting in significant safety concerns and fears in the general public.

Fear of radiation and its supposed far-reaching effects should prompt some introspection on our approach and the way we speak of radiation.

As we move forward, it is also important for the ICRP and member-state regulators to consider the role of Traditional – or Indigenous – Knowledge in decision-making. Recommendations from organizations like the ICRP are of course based on science. But in tandem with the assumptions underpinning that science, incorporating Traditional Knowledge is necessary if regulatory decision-makers are to build trust and ensure safety that is sensitive to the needs of diverse populations.

For the CNSC, building relationships and trust with Indigenous Peoples is a priority as Canada continues on a path of reconciliation.

The CNSC has published an Indigenous Knowledge Policy Framework, which outlines how Indigenous Knowledge should be considered in our evaluations of information and decision-making processes.

And we are signing Terms of Reference for Long-Term Engagement with Indigenous Nations and communities with whom we are seeking to build relationships.

It is critical that we engage with the people affected by our decisions – to build trust, to ensure we reflect the best available information, and to position ourselves to better understand concerns and misgivings.


In closing, let me elaborate on the key messages which I want to leave you with.

First, it is more critical than ever that guidance be clear about the ethics, values, societal norms and assumptions underpinning recommendations and decisions. Clarity in science communication and how decisions are made will help to build trust in our decisions as professionals and as regulators.

There can be no uncertainty that in every decision we make, the benefit must outweigh the risk.

Second, clarity of communication – including our too often unstated assumptions – is critical to combatting disinformation and misinformation. As a professional community, we cannot shy away from this discussion. Action is needed across the spectrum, and it starts with us explaining our thinking and our actions. Clearly and completely.

Finally, it is time for some introspection on the System of Radiological Protection, particularly regarding optimisation. We need to spend some time thinking about whether we have contributed to the fear that people feel about radiation.

I close with a call to action. For too long, scientists and policymakers have relied on phrases like ‘based on the best available evidence’ and have failed to consistently acknowledge uncertainty. Where uncertainty exists, we often rely on our values & ethics to drive the decisions we take.

As professionals, we owe it to ourselves and to the people we serve to more clearly elaborate on how and why we make the decisions and recommendations we make. Moving ahead, we should consider it our shared professional obligation.

Thank you for your time this morning and I welcome your questions and comments.
