Guidelines for Canadian Drinking Water Quality: Guideline Technical Document – Radiological Parameters

7.0 Treatment technology

Several technologies exist to remove radiological contaminants from drinking water to levels considered to be protective of human health. Tritium is an exception: because it is incorporated into the water molecule itself, it cannot be removed by treatment, which emphasizes the need to prevent contamination of the source water.

Because removal of the different radionuclides is influenced by several factors, such as the chemical characteristics of a specific water and the nature of the contaminants, any potential treatment process should be tested on the actual water to be treated before a treatment technology is selected. Handling and disposal of the waste produced by treatment also need to be carefully addressed when removing radionuclides from drinking water.

7.1 Municipal scale

Most radionuclides can be effectively treated in municipal-scale treatment facilities. Generally, the technologies recognized for the removal of radionuclides at municipal-scale treatment plants are ion exchange, reverse osmosis and lime softening (U.S. EPA, 2000b). Removal efficiency is influenced by the specific chemical characteristics of the water. Reported removal efficiencies for reverse osmosis ranged from 70% to 99% (Annanmäki, 2000; Health Canada, 2004). Removal by ion exchange also depends on the specific exchange medium used, with efficiencies reported as high as 95% (U.S. EPA, 2000a). Reported efficiencies for lime softening in the removal of radium ranged from 80% to 95% (U.S. EPA, 2000a). For artificial radionuclides such as tritium, the strategy should be to prevent contamination of the source water.
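To make these efficiency figures concrete, the following is a minimal sketch (in Python, not from the guideline) of how a reported removal efficiency translates into a residual activity concentration; the influent value used is a hypothetical example.

```python
# Minimal sketch: residual activity concentration after treatment,
# given a reported removal efficiency. The influent value is hypothetical.

def effluent_activity(influent_bq_per_l: float, removal_efficiency: float) -> float:
    """Activity concentration remaining after treatment (Bq/L).

    removal_efficiency is a fraction, e.g. 0.95 for 95% removal.
    """
    return influent_bq_per_l * (1.0 - removal_efficiency)

# Example: water at a hypothetical 2.0 Bq/L treated by ion exchange
# at the 95% removal efficiency reported above:
print(effluent_activity(2.0, 0.95))  # -> 0.1 Bq/L remaining
```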

Technologies recognized specifically for the removal of both 226Ra and 228Ra, in addition to ion exchange, reverse osmosis and lime softening, are greensand filtration, precipitation with barium sulphate, electrodialysis/electrodialysis reversal and hydrous manganese oxide filtration (U.S. EPA, 2000b). The ion exchange process to remove radium uses a cation exchange medium regenerated with a salt solution, prepared with common salt (sodium chloride) or, alternatively, with a potassium salt (Annanmäki, 2000).

In addition, technologies recognized specifically for the removal of uranium are activated alumina (reported removal efficiency up to 99%) and enhanced coagulation and filtration (U.S. EPA, 2000b). The ion exchange process to remove uranium requires an exchange medium regenerated with a strong base solution; a sodium hydroxide solution is often preferred (Annanmäki, 2000).

The U.S. EPA lists high-performance aeration as the best available technology for the removal of radon from groundwater supplies. High-performance aeration methods, which include packed tower aeration and multistage bubble aeration, can achieve up to 99.9% removal. However, these methods may release large amounts of radon to the air.

Adsorption on granular activated carbon (GAC), with or without ion exchange, can also achieve high radon removal efficiencies, but it is less efficient than aeration and requires large amounts of GAC, making it less suitable for large systems. GAC and point-of-entry GAC devices may be appropriate for very small systems under some circumstances (U.S. EPA, 1999). Two potential concerns associated with this technology are the elevated gamma radiation fields that develop close to the column and the difficulty of disposing of the treatment residuals.

Wastewater and solid waste produced by drinking water treatment need to be evaluated for their radiation levels and disposed of in accordance with the applicable regulations.

7.2 Residential scale

Municipal treatment of drinking water is designed to reduce contaminants to levels at or below their guideline values. As a result, the use of residential-scale treatment devices on municipally treated water is generally not necessary and is primarily a matter of individual choice. In cases where an individual household obtains its drinking water from a private well, residential drinking water treatment devices may be an option for removing radionuclides from the water.

Affordable residential treatment devices are available that can remove some radionuclides from drinking water to levels that comply with the applicable guidelines. Periodic testing by an accredited laboratory should be conducted on both the water entering the treatment device and the treated water it produces to verify that the device is effective. Devices can lose removal capacity with use and over time and need to be maintained and/or replaced. Consumers should verify the expected longevity of the components in their treatment device as per the manufacturer's recommendations.

The most common types of treatment devices available for the removal of radionuclides from drinking water in a residential setting are ion exchange and reverse osmosis systems (Health Canada, 2004). Efficiencies for point-of-use and point-of-entry treatment technologies are generally similar to those for municipal-scale treatment. Water softeners are an ion exchange technology that can remove radionuclides (note that individuals on sodium-restricted diets should consult their physician before drinking artificially softened water). As with municipal-scale ion exchange, the regenerating solution required varies with the contaminant: for example, uranium removal requires a strong base regenerating solution, whereas radium removal uses a salt regenerating solution. Water softeners regenerated with a common salt (sodium chloride) solution can add a significant amount of sodium to the water, a factor that needs to be considered when selecting the most appropriate treatment process.

There is currently no certified treatment process for the removal of 210Pb. However, since this isotope behaves chemically like stable lead, treatment processes that remove stable lead should also remove the radioactive isotope. The treatment systems certified for lead removal are adsorption (e.g., carbon/charcoal), reverse osmosis, and distillation (NSF International, 2005a). Similarly, although there are currently no certified products for the reduction of uranium, reverse osmosis, distillation, or ion exchange resins should be able to remove uranium from drinking water.

A number of residential treatment devices are available to remove radon from drinking water to concentrations below 300 pCi/L (11 Bq/L). Filtration systems may be installed at the faucet (point-of-use) or at the location where water enters the home (point-of-entry). Point-of-entry systems are preferred for radon because they provide treated water for bathing and laundry as well as for cooking and drinking. In the case where certified point-of-entry treatment devices are not available for purchase, systems can be designed and constructed from certified materials.

Health Canada does not recommend specific brands of treatment devices, but it strongly recommends that consumers look for a mark or label indicating that the device has been certified by an accredited certification body as meeting the appropriate NSF International (NSF)/American National Standards Institute (ANSI) standard. These standards are designed to safeguard drinking water by helping to ensure the material safety and performance of products that come into contact with it. Certification organizations provide assurance that a product or service conforms to the applicable standards and must be accredited by the Standards Council of Canada (SCC). In Canada, several organizations have been accredited by the SCC to certify drinking water devices and materials as meeting NSF/ANSI standards.

An up-to-date list of accredited certification organizations can be obtained from the Standards Council of Canada.

Drinking water treatment devices certified to remove 226Ra from untreated water (such as from a private well) use ion exchange and reverse osmosis. Consumers should use only certified treatment devices; these are certified to reduce an average influent (challenge) concentration of 25 pCi/L (925 mBq/L) to a finished effluent concentration of 5 pCi/L (185 mBq/L) or less (NSF International, 2005b).

The treatment devices certified to remove radon are generally based on activated carbon adsorption technology. Certification for the removal of radon requires that the device be capable of reducing an influent (challenge) concentration of 4000 pCi/L (148 Bq/L) to a final (effluent) concentration of less than 300 pCi/L (11 Bq/L) (NSF International, 2005b).
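The certification levels above are quoted in both pCi/L and Bq/L. The conversion between the two units is fixed (1 Ci = 3.7 × 10¹⁰ Bq exactly, so 1 pCi = 0.037 Bq); the short sketch below reproduces the quoted figures.

```python
# Unit conversion between the picocurie and becquerel activity units used above.
# 1 pCi = 0.037 Bq exactly (since 1 Ci = 3.7e10 Bq by definition).

PCI_TO_BQ = 0.037

def pci_per_l_to_bq_per_l(pci_per_l: float) -> float:
    return pci_per_l * PCI_TO_BQ

# The certification figures quoted above:
print(pci_per_l_to_bq_per_l(25))    # 226Ra challenge: 0.925 Bq/L (925 mBq/L)
print(pci_per_l_to_bq_per_l(5))     # 226Ra effluent limit: 0.185 Bq/L (185 mBq/L)
print(pci_per_l_to_bq_per_l(4000))  # radon challenge: 148 Bq/L
print(pci_per_l_to_bq_per_l(300))   # radon effluent limit: 11.1 Bq/L (~11 Bq/L)
```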

In treating drinking water containing naturally occurring radionuclides, the liquid and solid wastes from point-of-use/point-of-entry treatment may generally be disposed of in sewer or septic systems (liquids) and municipal landfills (solids); however, consumers should consult the appropriate authority before doing so.

8.0 Health effects

8.1 Dose concepts and units

Radiological protection requires the establishment of a link between exposure to radiation and biological outcomes. This link is provided by the absorbed dose or amount of energy imparted by ionizing radiation to a unit mass of tissue. The International System of Units (SI) unit for absorbed dose is the gray (Gy), where 1 gray is equal to 1 joule of energy absorbed per kilogram of tissue.

The extent of biological damage depends not only on the absorbed dose but also on the type or quality of the radiation. A given dose of alpha radiation, because of its higher ionization density, will produce much more damage than the same dose of X-rays or gamma rays. To put all ionizing radiations on an equal footing in terms of biological harm, the ICRP (1996) has introduced a set of radiation weighting factors as follows:

  • 1 for beta rays, gamma rays, and X-rays;
  • 10 for neutrons; and
  • 20 for alpha particles.

An absorbed dose in grays multiplied by a radiation weighting factor gives an equivalent dose in sieverts (Sv). The sievert is a very large unit: normal background radiation exposure in Canada is about 2-3 mSv/year, and a single chest X-ray would give an exposure of about 0.01 mSv. For simplicity, equivalent dose will be referred to as dose throughout the rest of this document.
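As a simple illustration of this weighting (a sketch using the factors listed above, not part of the guideline itself):

```python
# Equivalent dose (Sv) = absorbed dose (Gy) x radiation weighting factor,
# using the ICRP (1996) weighting factors listed above.

RADIATION_WEIGHTING_FACTOR = {
    "beta": 1,     # also applies to gamma rays and X-rays
    "gamma": 1,
    "x-ray": 1,
    "neutron": 10,
    "alpha": 20,
}

def equivalent_dose_sv(absorbed_dose_gy: float, radiation: str) -> float:
    return absorbed_dose_gy * RADIATION_WEIGHTING_FACTOR[radiation]

# The same absorbed dose counts 20 times more heavily for alpha particles:
print(equivalent_dose_sv(1e-3, "gamma"))  # 0.001 Sv (1 mSv)
print(equivalent_dose_sv(1e-3, "alpha"))  # 0.02 Sv (20 mSv)
```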

The radiation dose resulting from the ingestion of a radionuclide through food or drinking water depends on a number of factors, including:

  • the amount of activity taken into the body;
  • the energy and the weighting factor of the radiation;
  • the percentage uptake by the gastrointestinal tract;
  • the distribution of the radionuclide to the various organs of the body; and
  • the period of time during which the radionuclide remains in the body.

Depending on the chemical and biological properties of the radionuclide, it may persist in the body for times varying from days to years. Internal exposures are therefore measured in terms of the integrated or committed dose. Standard periods of integration are 50 years for the adult population and 70 years for a lifetime exposure. The ICRP has incorporated all of these factors into a set of dose coefficients for all the common radionuclides, which give the committed dose from the ingestion or inhalation of one unit of radioactivity.
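As an illustration of how such dose coefficients are applied, the sketch below multiplies an ingested activity by a coefficient to obtain the committed dose. The intake assumption (2 L/day) and the coefficient value (approximately the ICRP adult ingestion coefficient for 226Ra) are illustrative assumptions, not values taken from this section.

```python
# Committed effective dose from ingestion: activity taken in (Bq) multiplied
# by the radionuclide's ingestion dose coefficient (Sv/Bq).

ANNUAL_WATER_INTAKE_L = 730  # assumed adult intake: 2 L/day x 365 days

def committed_dose_sv(concentration_bq_per_l: float,
                      dose_coefficient_sv_per_bq: float,
                      intake_l: float = ANNUAL_WATER_INTAKE_L) -> float:
    return concentration_bq_per_l * intake_l * dose_coefficient_sv_per_bq

# Example with an assumed coefficient of 2.8e-7 Sv/Bq (roughly the ICRP
# adult ingestion value for 226Ra) at an assumed 0.5 Bq/L in water:
print(committed_dose_sv(0.5, 2.8e-7))  # ~1.0e-4 Sv/year, i.e. ~0.1 mSv/year
```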

8.2 Health effects from radiation exposure

Radioactivity refers to the particles that are emitted from nuclei as a result of nuclear instability. The most common types of radiation are alpha, beta, and gamma radiation. Alpha particles consist of two protons and two neutrons bound together into a particle identical to a helium nucleus; they are emitted by radioactive nuclei such as uranium or radium through a process known as alpha decay. Beta particles are high-energy, high-speed electrons or positrons emitted by certain radioactive nuclei, such as 40K; their production is called beta decay. Alpha and beta emitters differ in the magnitude of their biological effects. Alpha particles interact very strongly with human tissue, transferring their energy over a very short distance; beta particles interact less strongly, which allows them to travel farther through tissue before their energy is transferred. The difference between alpha and beta particle effects therefore lies in the concentration of tissue damage: alpha particles may damage many molecules over a short distance, whereas beta particles may damage molecules spread out over a greater distance. The extent of damage depends on the energy of the individual alpha or beta particles.

Gamma radiation is a form of electromagnetic radiation emitted during subatomic particle interactions such as radioactive decay. It has the highest frequency and energy, and the shortest wavelength, in the electromagnetic spectrum. Because of its high energy content, gamma radiation is able to cause serious damage when absorbed by living cells.

When ionizing radiation passes through matter, neutral atoms and molecules acquire an electric charge as a result of interactions in which small amounts of energy are transferred to the atoms of the material. If the material is body tissue, this can result in alterations of sensitive biological structures.

Radiation causes damage to human tissue or any other material through the ionization of atoms. Ionizing radiation absorbed by human tissue has enough energy to remove electrons from the atoms that make up molecules of the tissue. In very simple terms, when an electron shared by atoms forming a molecular bond is dislodged, the bond is broken, and the molecule falls apart. This process may occur by a direct "hit" to these atoms, or it may result indirectly by free radical formation due to irradiation of adjacent molecules. The most sensitive structure in the cell is the DNA molecule, which carries the genetic blueprint for the cell and, indeed, for the whole organism. If radiation damage to the DNA is not repaired, the cells may fail to survive or reproduce. If insufficient cells survive, then loss of tissue or organ function may occur. Alternatively, the damage may be incompletely or improperly repaired so that the cells continue to divide, but become transformed or cancerous.

There are two broad classes of radiation effects on the human body. The first class comprises deterministic effects, which do not occur until the dose reaches a certain threshold level. Above this level, the effect will definitely occur, and the severity of harm increases with dose. Deterministic effects include nausea, vomiting, diarrhoea, hair loss, haemorrhage, immune function loss, nervous system damage, and death. The threshold for these effects in humans is about 500 mSv delivered over a short period of time (hours to days). Fortunately, such doses are extremely rare and do not arise from environmental exposures, such as ingestion of radionuclides in drinking water.

The second class of effects is termed stochastic effects, which means that the likelihood of occurrence increases with the amount of radiation received. These effects may occur at doses well below the threshold for deterministic effects. The main stochastic effects in humans are cancer in the exposed individual and possible genetic effects in the offspring. The types of cancer most frequently associated with radiation exposure are leukaemia and solid tumours of the lung, breast, thyroid, bone, digestive organs, and skin. The latency period between exposure and recognition of a cancer can range from 5 years to several decades (ICRP, 1991). No conclusive evidence exists for hereditary radiation effects in humans, although experimental studies on plants and animals suggest that such effects occur.

Radiation-induced cancers are indistinguishable from those that occur from other causes. The correlation between radiation and cancer induction can be shown only in large populations of irradiated individuals as an increase of cancers over the background incidence. The main sources of epidemiological information on radiation risks and effects have come from studies of individuals or groups who have had relatively high exposures, such as:

  • atomic bomb survivors at Hiroshima and Nagasaki;
  • patients who received high radiation doses for diagnostic or therapeutic purposes; and
  • occupationally exposed workers, including uranium miners and radium dial painters.

Since it is impossible to establish with any certainty the shape of the dose-response relationship for stochastic effects, particularly at low doses and low dose rates, it is usually assumed that the frequency of their occurrence is linear with dose and without threshold (the linear no-threshold hypothesis). The absence of a threshold implies that there is no dose, however small, that may be considered absolutely safe. This assumption is simple, and there is considerable evidence that it is conservative (i.e., it is more likely to overestimate than to underestimate the risk). In 1996, the ICRP published a revised set of dose coefficients for individuals, based on more recent measurements and more accurate metabolic models for the uptake and retention of radionuclides by the various organs of the body. The revised dose coefficients used to derive the MACs in this technical document assume the linear no-threshold hypothesis.

The ICRP (1991) has determined the lifetime probability of fatal cancer induction following a single low-dose, low-dose-rate exposure to be 5% per sievert. If allowance is made for hereditary risks and for non-fatal cancers weighted for severity, the lifetime risk rises to 7.3% per sievert. The CNSC has set a dose limit of 1 mSv/year for members of the public (for artificial sources, excluding natural background radiation and medical treatments). The linear no-threshold hypothesis implies that exposure to 1 mSv/year of radiation would give a lifetime excess cancer risk of 7.3 × 10⁻⁵ per year of exposure. Radionuclides in drinking water are assessed against a reference dose level that is one-tenth of this dose limit. Even if the linear no-threshold hypothesis is valid, any health effects produced at this low level of exposure would be lost in the statistical background of spontaneous occurrences.
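The arithmetic behind the 7.3 × 10⁻⁵ figure, written out as a sketch under the linear no-threshold assumption:

```python
# Lifetime excess risk = dose x nominal risk coefficient, under the linear
# no-threshold hypothesis described above.

RISK_PER_SV = 0.073  # ICRP (1991): fatal + severity-weighted non-fatal + hereditary

def lifetime_excess_risk(dose_sv_per_year: float) -> float:
    return dose_sv_per_year * RISK_PER_SV

# Public dose limit of 1 mSv/year:
print(lifetime_excess_risk(1e-3))  # 7.3e-05 per year of exposure
# Reference dose level for drinking water (one-tenth of the limit):
print(lifetime_excess_risk(1e-4))  # 7.3e-06 per year of exposure
```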

8.2.1 Radon

No experimental or epidemiological studies have linked ingested radon with health impacts in humans, and it has generally been concluded from experimental animal studies that the risk from ingestion is insignificant compared with the risk from inhalation. The U.S. National Research Council (U.S. NRC, 1999a) estimates the risk of death from stomach cancer following lifetime exposure to 1 Bq/L of radon by ingestion to be 2.0 × 10⁻¹², based on calculations with risk projection models for specific cancer sites. Radon consumed in water appears to rapidly enter the bloodstream from the stomach (Crawford-Brown, 1989), perfusing all the cells of the body (Gosink et al., 1990). As it is lipid soluble (IRC, 1928; von Dobeln and Lindell, 1965), it does not distribute evenly throughout the body (Hursh et al., 1965). Clearance of radon from the bloodstream is relatively rapid, with a half-time on the order of minutes (Underwood and Diaz, 1941; Lindell, 1968).

Most inhaled radon is exhaled again, remaining in the lungs for only a short period of time. However, the radon daughter 218Po is very reactive and is electrostatically attracted to tiny particulates in air; these particulates are inhaled and deposited in the lung, where the radon daughters decay sequentially, releasing damaging alpha and beta particles. It is therefore the radon progeny, not radon itself, that actually damage the bronchial epithelium, because only the progeny remain in the lungs long enough to decay significantly.

Radon is classified as a human carcinogen (IARC, 1988). This classification is based on the strong evidence of lung cancers in underground miners exposed to high levels of radon (Lubin et al., 1994; U.S. NRC, 1999b). A combined analysis of 11 cohorts of 65 000 underground miners conducted by Lubin et al. (1994), with an update by the U.S. NRC (1999b), provides a thorough assessment of lung cancer risks associated with radon. These cohort studies show that about 40% of the more than 2700 lung cancer deaths that occurred among the 65 000 miners were due to radon (Lubin et al., 1995). Extrapolating the miner cohort data to lower household radon levels results in an odds ratio of 1.12 (95% confidence interval = 1.02-1.25) per 100 Bq/m3.

More recent studies on residential exposure to lower levels of radon in indoor air provide further evidence of an association between lung cancer and radon exposure. Darby et al. (2005) published a combined analysis of 13 European studies involving 7148 cases of lung cancer and 14 208 matched controls. Their results show an odds ratio of 1.08 (95% confidence interval = 1.03-1.16) for every 100 Bq/m3 of radon in air. If the radon measurements are statistically adjusted for random measurement uncertainties, the odds ratio increases to 1.16 (95% confidence interval = 1.05-1.31) per 100 Bq/m3 of radon.

In a report by Krewski et al. (2005), seven North American studies involving 3663 lung cancer cases and 4966 matched controls were analysed; the results also showed a significant association between household radon and lung cancer, with an odds ratio of 1.11 (95% confidence interval = 1.00-1.28) per 100 Bq/m3 of radon. If the data are restricted to subjects who had resided in only one or two houses during the 5- to 30-year period before recruitment and who had at least 20 years of alpha-track monitoring data, the odds ratio increases to 1.18 (95% confidence interval = 1.02-1.43) for every 100 Bq/m3 of radon.

The combined analyses by Darby et al. (2005) and Krewski et al. (2005), along with the downward extrapolation from the miner studies, indicate an excess relative risk of about 10% for every 100 Bq/m3 of radon in air.
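Under that linear assumption, the excess relative risk scales proportionally with concentration; a minimal sketch (not part of the source document):

```python
# Scaling the ~10% excess relative risk per 100 Bq/m3 cited above,
# assuming linearity in radon concentration.

ERR_PER_100_BQ_M3 = 0.10  # combined estimate from Darby (2005) and Krewski (2005)

def relative_risk(radon_bq_per_m3: float) -> float:
    return 1.0 + ERR_PER_100_BQ_M3 * (radon_bq_per_m3 / 100.0)

print(relative_risk(100))  # 1.10, i.e. a 10% increase in lung cancer risk
print(relative_risk(200))  # 1.20 under the linear assumption
```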
