Enteric Protozoa: Giardia and Cryptosporidium


Organization: Health Canada

Type: Guideline

Published: 2019-04-12

Health Canada, Ottawa, Ontario
April, 2019

Part I. Overview and Application

1.0 Guideline

Where treatment is required for enteric protozoa, the guideline for Giardia and Cryptosporidium in drinking water is a health-based treatment goal of a minimum 3 log removal and/or inactivation of cysts and oocysts. Depending on the source water quality, a greater log removal and/or inactivation may be required. Treatment technologies and source water protection measures known to reduce the risk of waterborne illness should be implemented and maintained if source water is subject to faecal contamination or if Giardia or Cryptosporidium have been responsible for past waterborne outbreaks.

2.0 Executive summary

Protozoa are a diverse group of microorganisms. Most are free-living organisms that can reside in fresh water and pose no risk to human health. Some enteric protozoa are pathogenic and have been associated with drinking water related outbreaks. The main protozoa of concern in Canada are Giardia and Cryptosporidium. They may be found in water following direct or indirect contamination by the faeces of humans or other animals. Person-to-person transmission is a major route of exposure for both Giardia and Cryptosporidium.

Health Canada recently completed its review of the health risks associated with enteric protozoa in drinking water. This guideline technical document reviews and assesses identified health risks associated with enteric protozoa in drinking water. It evaluates new studies and approaches and takes into consideration the methodological limitations for the detection of protozoa in drinking water. Based on this review, the drinking water guideline is a health-based treatment goal of a minimum 3 log reduction of enteric protozoa.

2.1 Health effects

The health effects associated with exposure to Giardia and Cryptosporidium (oo)cysts, like those of other pathogens, depend upon features of the host, pathogen and environment. The immune status of the host, virulence of the strain, infectivity and viability of the cyst or oocyst, and the degree of exposure are all key determinants of infection and illness. Infection with Giardia or Cryptosporidium can result in both acute and chronic health effects.

Theoretically, a single cyst of Giardia would be sufficient to cause infection. However, studies to date have demonstrated infection only at doses of more than a single cyst, and the number of cysts needed depends on the virulence of the particular strain. Typically, Giardia is non-invasive and results in asymptomatic infections. Symptomatic giardiasis can result in nausea, diarrhea (usually sudden and explosive), anorexia, an uneasiness in the upper intestine, malaise and occasionally low-grade fever or chills. The acute phase of the infection commonly resolves spontaneously, and organisms generally disappear from the faeces. Some patients (e.g., children) suffer recurring bouts of the disease, which may persist for months or years.

As is the case for Giardia and other pathogens, a single organism of Cryptosporidium can potentially cause infection, although studies to date have demonstrated infection only at doses of more than one organism. Individuals infected with Cryptosporidium are more likely to develop symptomatic illness than those infected with Giardia. Symptoms include watery diarrhea, cramping, nausea, vomiting (particularly in children), low-grade fever, anorexia and dehydration. The duration of infection depends on the condition of the immune system. Immunocompetent individuals usually carry the infection for a maximum of 30 days. In immunocompromised individuals, infection can be life-threatening and can persist throughout the immunosuppression period.

2.2 Exposure

Giardia cysts and Cryptosporidium oocysts can survive in the environment for extended periods of time, depending on the characteristics of the water. They have been shown to withstand a variety of environmental stresses, including freezing and exposure to seawater. (Oo)cysts are commonly found in Canadian surface waters. The sudden and rapid influx of these microorganisms into surface waters, for which available treatment may not be adequate, is likely responsible for the increased risk of exposure through drinking water. (Oo)cysts have also been detected at low concentrations in groundwater supplies; contamination usually results from the proximity of the well to contamination sources, inadequate filtration through some geological formations, or improper design and/or maintenance of the well.

Giardia and Cryptosporidium are common causes of waterborne disease outbreaks and have been linked to both inadequately treated surface water and untreated well water. Giardia is the most commonly reported intestinal protozoan in Canada, North America and the world.

2.3 Analysis and treatment

A risk management approach, such as the source-to-tap approach or a water safety plan approach, is the best method to reduce enteric protozoa and other waterborne pathogens in drinking water. This type of approach requires a system assessment to characterize the source water, describe the treatment barriers that are in place, identify the conditions that can result in contamination, and implement the control measures needed to mitigate risks. One aspect of source water characterization is routine and targeted monitoring for Giardia and Cryptosporidium. Monitoring of source water for protozoa can be targeted by using information about sources of faecal contamination, together with historical data on rainfall, snowmelt, river flow and turbidity, to help identify the conditions that are likely to lead to peak concentrations of (oo)cysts. A validated method that allows for the simultaneous detection of these protozoa is available. Where monitoring for Giardia and Cryptosporidium is not feasible (e.g., small community water supplies), identifying the conditions that can result in contamination can provide guidance on necessary risk management measures to implement (e.g., source water protection, adequate treatment, operational monitoring, standard operating procedures and contingency plans).

Once the source has been characterized, pathogen reduction targets should be established and effective treatment barriers should be in place to achieve safe levels in the treated drinking water. In general, all water supplies derived from surface water sources or groundwater under the direct influence of surface waters (GUDI) should include adequate filtration (or equivalent technologies) and disinfection. The combination of physical removal (e.g., filtration) and inactivation barriers (e.g., ultraviolet light disinfection) is the most effective way to reduce protozoa in drinking water, because of their resistance to commonly used chlorine-based disinfectants.

The absence of indicator bacteria (e.g., Escherichia coli, total coliforms) does not necessarily indicate the absence of enteric protozoa. The application and control of a source-to-tap or water safety plan approach, including process and compliance monitoring (e.g., turbidity, disinfection conditions, E. coli), is important to verify that the water has been adequately treated and is therefore of an acceptable microbiological quality. In the case of untreated groundwater, testing for indicator bacteria is useful in assessing the potential for faecal contamination, which may include enteric protozoa.

2.4 Quantitative microbial risk assessment

Quantitative microbial risk assessment (QMRA) is a tool that uses source water quality data, treatment barrier information and pathogen-specific characteristics to estimate the burden of disease associated with exposure to pathogenic microorganisms in drinking water. QMRA is generally used for two purposes. It can be used to set pathogen reduction targets during the development of drinking water quality guidelines, such as is done in this document. It can also be used to prioritize risks on a site-specific basis as part of a source-to-tap or water safety plan approach.
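To make the calculation chain concrete, the short sketch below strings together the main QMRA steps (exposure estimate, dose-response, annual risk) in a few lines of Python. It is a minimal illustration only: the exponential dose-response form and all parameter values (source concentration, treatment log reduction, consumption volume and the dose-response parameter r) are placeholder assumptions, not values prescribed by this guideline; a site-specific assessment would substitute measured data and the reference models discussed in Part II.

```python
import math

# Minimal QMRA sketch. All values below are illustrative assumptions,
# not values prescribed by this guideline.
oocysts_per_litre = 0.1     # assumed mean source water concentration
log_reduction = 3.0         # assumed overall treatment performance (3 log)
litres_per_day = 1.0        # assumed volume of unboiled tap water consumed daily
r = 0.09                    # assumed exponential dose-response parameter

# Mean dose ingested per day after treatment
dose = oocysts_per_litre * 10 ** (-log_reduction) * litres_per_day

# Exponential dose-response model: probability of infection from one day's exposure
p_daily = 1 - math.exp(-r * dose)

# Annual probability of at least one infection (365 independent daily exposures)
p_annual = 1 - (1 - p_daily) ** 365

print(f"daily dose: {dose:.2e} (oo)cysts")
print(f"daily infection risk: {p_daily:.2e}")
print(f"annual infection risk: {p_annual:.2e}")
```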

Reference protozoan pathogens are specific enteric protozoa whose characteristics make them good representatives of all similar pathogenic protozoa. It is assumed that controlling the reference protozoan would ensure control of all other similar protozoa of concern. Giardia lamblia and Cryptosporidium parvum have been selected as the reference protozoa for this risk assessment because of their high prevalence rates, potential to cause widespread disease, resistance to chlorine disinfection and the availability of dose-response models.

2.5 International considerations

Drinking water guidelines, standards and/or guidance from other national and international organizations may vary due to the age of the assessments as well as differing policies and approaches.

Various organizations have established guidelines or standards for enteric protozoa in drinking water. The U.S. EPA generally requires drinking water systems to achieve a minimum 3 log removal or inactivation of Giardia and a minimum 2 log removal or inactivation of Cryptosporidium. More treatment may be required for Cryptosporidium depending on the results of Cryptosporidium monitoring in the water source. The World Health Organization recommends establishing QMRA-based performance targets for the reduction of enteric protozoa. Neither the European Union nor the Australian National Health and Medical Research Council has established a guideline value or standard for enteric protozoa in drinking water.

3.0 Application of the guideline

Note: Specific guidance related to the implementation of drinking water guidelines should be obtained from the appropriate drinking water authority in the affected jurisdiction.

Exposure to Giardia and Cryptosporidium should be reduced by implementing a risk management approach to drinking water systems, such as the source-to-tap or a water safety plan approach. These approaches require a system assessment that involves: characterizing the water source; describing the treatment barriers that prevent or reduce contamination; highlighting the conditions that can result in contamination; and identifying control measures to mitigate those risks through the treatment and distribution systems to the consumer.

3.1 Source water assessments

Source water assessments should be part of routine system assessments. They should include: the identification of potential sources of human and animal faecal contamination in the watershed/aquifer; potential pathways and/or events (low to high risk) by which protozoa can make their way into the source water and affect water quality; and conditions likely to lead to peak concentrations. Ideally, they should also include routine monitoring for Giardia and Cryptosporidium in order to establish a baseline, followed by long-term targeted monitoring to identify peak concentrations during specific events (e.g., rainfall, snowmelt, low flow conditions for rivers receiving discharges from wastewater treatment plants). Where monitoring for Giardia and Cryptosporidium is not feasible (e.g., small community water supplies), other approaches, such as implementing a source-to-tap or water safety plan approach, can provide guidance on identifying and implementing necessary risk management measures (e.g., source water protection, adequate treatment, operational monitoring, standard operating procedures and contingency plans).

Where monitoring is possible, sampling sites and frequencies can be targeted by using information about sources of faecal contamination, together with historical data on rainfall, snowmelt, river flow and turbidity. Source water assessments should also consider the "worst-case" scenario for that source water. It is important to understand the potential faecal inputs to the system to determine "worst-case" scenarios, as they will be site-specific. For example, there may be a short period of poor source water quality following a storm. This short-term degradation in water quality may in fact embody most of the risk in a drinking water system. Collecting and analyzing source water samples for Giardia and Cryptosporidium can provide important information for determining the level of treatment and mitigation (risk management) measures that should be in place to reduce the concentration of (oo)cysts to an acceptable level.
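As a simple numerical illustration of how monitoring data can feed into treatment decisions, the sketch below computes the log reduction needed to bring observed source concentrations down to a treated-water target concentration. The monitoring values and the target concentration are hypothetical placeholders; in practice, the target would be derived from a health-based risk target (e.g., through QMRA) and the source data from site-specific routine and event-based monitoring.

```python
import math

# Hypothetical source monitoring results (cysts or oocysts per 100 L); placeholders only.
source_concentrations = {
    "median": 12.0,
    "upper percentile": 80.0,
    "post-storm peak": 900.0,
}

# Hypothetical treated-water target concentration (per 100 L),
# assumed to be derived from a health-based risk target.
target = 1e-4

for label, conc in source_concentrations.items():
    required_log_reduction = math.log10(conc / target)
    print(f"{label:>16}: {conc:>7.1f}/100 L -> requires about "
          f"{required_log_reduction:.1f} log reduction")
```

Note how the short-term, post-storm peak drives the required level of treatment in this example, which is why "worst-case" conditions should be captured in the assessment.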

Subsurface sources should be evaluated to determine if the supply is vulnerable to contamination by enteric protozoa (i.e., GUDI) and other enteric pathogens. These assessments should include, at a minimum, a hydrogeological assessment, an evaluation of well integrity and a survey of activities and physical features in the area. Chlorophyll-containing algae are considered unequivocal evidence of surface water. Jurisdictions that require microscopic particulate analysis (MPA) as part of their GUDI assessment should ensure chlorophyll-containing algae are a principal component of any MPA test. Subsurface sources determined to be GUDI should achieve a minimum 3 log removal and/or inactivation of enteric protozoa. Subsurface sources that have been assessed as not vulnerable to contamination by enteric protozoa, if properly classified, should not have protozoa present. However, all groundwater sources will have a degree of vulnerability and should be periodically reassessed. It is important that subsurface sources be properly classified as numerous outbreaks have been linked to the consumption of untreated well water contaminated with enteric protozoa and/or other enteric pathogens.

3.2 Appropriate treatment barriers

As most surface waters and GUDI supplies are subject to faecal contamination, treatment technologies should be in place to achieve a minimum 3 log (99.9%) removal and/or inactivation of Giardia and Cryptosporidium. In many surface water sources, a greater log reduction may be necessary.

Log reductions can be achieved through physical removal processes, such as filtration, and/or inactivation processes, such as UV light disinfection. Generally, minimum treatment of supplies derived from surface water or GUDI sources should include adequate filtration (or technologies providing an equivalent log reduction credit) and disinfection. The appropriate type and level of treatment should take into account potential fluctuations in water quality, including short-term water quality degradation, and variability in treatment performance. Pilot testing or other optimization processes may be useful for determining treatment variability.
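The way individual barrier credits combine toward the overall goal is simple arithmetic: log reduction credits add, and the total can be expressed as an equivalent percent reduction. The sketch below illustrates this with hypothetical credit values; actual credits would depend on the technologies in place and the requirements of the responsible drinking water authority.

```python
# Hypothetical log reduction credits for individual treatment barriers (placeholders only).
barrier_credits = {
    "filtration (physical removal)": 3.0,
    "UV disinfection (inactivation)": 3.0,
}

total_log_reduction = sum(barrier_credits.values())
percent_reduction = (1 - 10 ** (-total_log_reduction)) * 100

print(f"total log reduction: {total_log_reduction:.1f} log "
      f"({percent_reduction:.4f}% reduction)")
print("meets minimum 3 log treatment goal:", total_log_reduction >= 3.0)
```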

Private well owners (i.e., semi-public supply or individual household) should assess the vulnerability of their well to faecal contamination to determine if the water should be treated. Well owners should have an understanding of the well construction, the type of aquifer material surrounding the well and the location of the well in relation to sources of faecal contamination (e.g., septic systems, sanitary sewers, animal waste). General guidance on well construction, maintenance, protection and testing is typically available from provincial/territorial jurisdictions. If a private well owner is not able to determine if their well is vulnerable to faecal contamination, the responsible drinking water authority in the affected jurisdiction should be contacted to identify possible treatment options.

3.3 Appropriate maintenance and operation of distribution systems

Contamination of distribution systems with enteric protozoa has been responsible for waterborne illness. As a result, maintaining the physical/hydraulic integrity of the distribution system and minimizing negative- or low-pressure events are key components of a source-to-tap or water safety plan approach. Distribution system water quality should be regularly monitored (e.g., microbial indicators, disinfectant residual, turbidity, pH), operations/maintenance programs should be in place (e.g., water main cleaning, cross-connection control, asset management) and strict hygiene should be practiced during all water main construction (e.g., repair, maintenance, new installation) to ensure that drinking water is transported to the consumer with minimum loss of quality.

Part II. Science and Technical Considerations

4.0 Description

Protozoa are a diverse group of eukaryotic, typically unicellular, microorganisms. The majority of protozoa are free-living organisms that can reside in fresh water and pose no risk to human health. However, some protozoa are pathogenic to humans. These protozoa fall into two functional groups: enteric protozoa and free-living protozoa. Human infections caused by free-living protozoa (e.g., Naegleria, Acanthamoeba spp.) are generally the result of contact during recreational bathing (or domestic uses of water other than drinking); as such, this group of protozoa is addressed in the Guidelines for Canadian Recreational Water Quality (Health Canada, 2012a). Enteric protozoa, on the other hand, have been associated with several drinking water-related outbreaks, and drinking water serves as a significant route of transmission for these organisms. A discussion of enteric protozoa is therefore presented here.

Enteric protozoa are common parasites in the gut of humans and other mammals. They, like enteric bacteria and viruses, can be found in water following direct or indirect contamination by the faeces of humans and other animals. These microorganisms can be transmitted via drinking water and have been associated with several waterborne outbreaks in North America and elsewhere (Schuster et al., 2005; Karanis et al., 2007; Baldursson and Karanis, 2011; Efstratiou et al., 2017). The ability of this group of microorganisms to produce (oo)cysts that are extremely resistant to environmental stresses and commonly used chlorine-based disinfectants has facilitated their ability to spread and cause illness.

The enteric protozoa that are most often associated with waterborne disease in Canada are Giardia and Cryptosporidium. These protozoa are commonly found in surface waters: some strains are highly pathogenic, can survive for long periods of time in the environment and are highly resistant to chlorine-based disinfection. Thus, they are the focus of the following discussion. A brief description of other enteric protozoa of human health concern (i.e., Toxoplasma gondii, Cyclospora cayetanensis, Entamoeba histolytica, and Blastocystis hominis) is provided in Appendix A. It is important to note that throughout this document, the common names for enteric protozoa are used for clarity. Proper scientific nomenclature is used only when necessary to accurately present scientific findings.

4.1 Giardia

Giardia is a flagellated protozoan parasite (Phylum Metamonada, Subphylum Trichozoa, Superclass Eopharyngia, Class Trepomonadea, Subclass Diplozoa, Order Giardiida, Family Giardiidae) (Cavalier-Smith, 2003; Plutzer et al., 2010). It was first identified in human stool by Antonie van Leeuwenhoek in 1681 (Boreham et al., 1990). However, it was not recognized as a human pathogen until the 1960s, after community outbreaks and its identification in travellers (Craun, 1986; Farthing, 1992). Illness associated with this parasite is known as giardiasis.

4.1.1 Life cycle

Giardia inhabits the small intestines of humans and other animals. The trophozoite, or feeding stage, lives mainly in the duodenum but is often found in the jejunum and ileum of the small intestine. Trophozoites (9–21 µm long, 5–15 µm wide and 2–4 µm thick) have a pear-shaped body with a broadly rounded anterior end, two nuclei, two slender median rods, eight flagella in four pairs, a pair of darkly staining median bodies and a large ventral sucking disc (cytostome). Trophozoites are normally attached to the surface of the intestinal villi, where they are believed to feed primarily upon mucosal secretions. After detachment, the binucleate trophozoites form cysts (encyst) and divide within the original cyst, so that four nuclei become visible. Cysts are ovoid, 8–18 µm long by 5–15 µm wide (U.S. EPA, 2012), with two or four nuclei and visible remnants of organelles. Environmentally stable cysts are passed out in the faeces, often in large numbers. A complete life cycle description can be found elsewhere (Adam, 2001; Carranza and Lujan, 2010).

4.1.2 Species

The taxonomy of the genus Giardia continuously changes as data on the isolation and identification of new species and genotypes, strain phylogeny and host specificity become available. The current taxonomy of the genus Giardia is based on the species definition proposed by Filice (1952), who defined three species: G. duodenalis (syn. G. intestinalis, G. lamblia), G. muris and G. agilis, based on the shape of the median body, an organelle composed of microtubules that is most easily observed in the trophozoite. Other species have subsequently been described on the basis of cyst morphology and molecular analysis. Currently, six Giardia species are recognized (Table 1), although recent work suggests Giardia duodenalis (syn. G. intestinalis, G. lamblia) assemblages A and B may be distinct species and should be renamed (Prystajecky et al., 2015). Three synonyms (G. lamblia, G. intestinalis and G. duodenalis) have been and continue to be used interchangeably in the literature to describe the Giardia isolates from humans, although this species is capable of infecting a wide range of mammals. This species will be referred to as G. lamblia in this document. Molecular characterization of this species has demonstrated the existence of genetically distinct assemblages: assemblages A and B infect humans and other mammals, whereas the remaining assemblages (C, D, E, F and G) have not yet been isolated from humans and appear to have restricted host ranges (and likely represent different species or groupings) (Adam, 2001; Thompson, 2004; Thompson and Monis, 2004; Xiao et al., 2004; Smith et al., 2007; Plutzer et al., 2010). Due to the genetic diversity within assemblages A and B, these groupings have also been further characterized into sub-assemblages (Cacciò and Ryan, 2008; Plutzer et al., 2010).

Table 1. Giardia species
Species (assemblage) Major host(s)
G. agilis Amphibians
G. ardeae Birds
G. lamblia, syn. G. intestinalis, syn. G. duodenalis:
(A) Humans, livestock, other mammals
(B) Humans
(C) Dogs
(D) Dogs
(E) Cattle, other hoofed livestock
(F) Cats
(G) Rats
G. microti Muskrats, voles
G. muris Rodents
G. psittaci Birds

In addition to genetic dissimilarities, the variants of G. lamblia also exhibit phenotypic differences, including differential growth rates and drug sensitivities (Homan and Mank, 2001; Read et al., 2002). The genetic differences have been exploited as a means of distinguishing human-infective Giardia from other strains or species (Amar et al., 2002; Cacciò et al., 2002, 2010; Read et al., 2004); however, the applicability of these methods to analysis of Giardia within water has been limited (see section 6.6). Thus, at present, it is necessary to consider that any Giardia cysts found in water are potentially infectious to humans.

4.2 Cryptosporidium

Cryptosporidium is a protozoan parasite (Phylum Apicomplexa, Class Gregarinomorphea, Subclass Cryptogregaria; Ryan et al., 2016). The genus Cryptosporidium is currently the sole member of the newly described Cryptogregaria subclass. The illness caused by this parasite is known as cryptosporidiosis. It was first recognized as a potential human pathogen in 1976 in a previously healthy three-year-old child (Nime et al., 1976). A second case of cryptosporidiosis occurred two months later in an individual who was immunosuppressed as a result of drug therapy (Meisel et al., 1976). The disease became best known in immunosuppressed individuals exhibiting the symptoms now referred to as acquired immunodeficiency syndrome, or AIDS (Hunter and Nichols, 2002).

4.2.1 Life cycle

The recognition of Cryptosporidium as a human pathogen led to increased research into the life cycle of the parasite and an investigation of the possible routes of transmission. Cryptosporidium has a multi-stage life cycle. The entire life cycle takes place in a single host and evolves in six major stages, including both sexual and asexual stages: 1) excystation, where sporozoites are released from an excysting oocyst; 2) schizogony (syn. merogony), where asexual reproduction takes place; 3) gametogony, the stage at which gametes are formed; 4) fertilization of the macrogametocyte by a microgamete to form a zygote; 5) oocyst wall formation; and 6) sporogony, where sporozoites form within the oocyst (Current, 1986). A complete life cycle description and diagram can be found elsewhere (Smith and Rose, 1990; Hijjawi et al., 2004; Fayer and Xiao, 2008). Syzygy, a sexual reproduction process that involves association of the pre-gametes end to end or laterally prior to the formation of gametes, was described in two species of Cryptosporidium, C. parvum and C. andersoni, providing new information regarding Cryptosporidium's biology (life cycle) and transmission (Hijjawi et al., 2002; Rosales et al., 2005).

As a waterborne pathogen, the most important stage in Cryptosporidium's life cycle is the round, thick-walled, environmentally stable oocyst, 4–6 µm in diameter. The nuclei of sporozoites can be stained with fluorogenic dyes such as 4′,6-diamidino-2-phenylindole (DAPI). Upon ingestion by humans, the parasite completes its life cycle in the digestive tract. Ingestion initiates excystation of the oocyst and releases four sporozoites, which adhere to and invade the enterocytes of the gastrointestinal tract (Spano et al., 1998; Pollok et al., 2003). The resulting intracellular parasitic vacuole contains a feeding organelle along with the parasite, which is protected by an outer membrane. The outer membrane is derived from the host cell. The sporozoite undergoes asexual reproduction (schizogony), releasing merozoites that spread the infection to neighbouring cells. Sexual multiplication (gametogony) then takes place, producing either microgametes ("male") or macrogametes ("female"). Microgametes are then released to fertilize macrogametes and form zygotes. A small proportion (20%) of zygotes fail to develop a cell wall and are termed "thin-walled" oocysts. These forms rupture after the development of the sporozoites, but prior to faecal passage, thus maintaining the infection within the host. The majority of the zygotes develop a thick, environmentally resistant cell wall and four sporozoites to become mature oocysts, which are then passed in the faeces.

4.2.2 Species

Our understanding of the taxonomy of the genus Cryptosporidium is continually being updated. Cryptosporidium was first described by Tyzzer (1907), when he isolated the organism, which he named Cryptosporidium muris, from the gastric glands of mice. Tyzzer (1912) found a second isolate, which he named C. parvum, in the intestine of the same species of mice. This species has since been renamed C. tyzzeri (Ryan et al., 2014). At present, 29 valid species of Cryptosporidium have been recognized (Table 2) (Ryan et al., 2014; Zahedi et al., 2016). The main species of Cryptosporidium associated with illness in humans are C. hominis and C. parvum. They account for more than 90% of human cryptosporidiosis cases (Bouzid et al., 2013). The majority of remaining human cases are caused by C. meleagridis and C. cuniculus. A minority of cases have been attributed to C. ubiquitum, C. canis, C. felis, and C. viatorum. Other species have been found in rare instances. Although many species of Cryptosporidium have not yet been found to cause illness in humans, it should not be inferred that they are incapable of doing so, only that to date, they have not been implicated in sporadic cases or outbreaks of cryptosporidiosis. These findings have important implications for communities whose source water may be contaminated by faeces from animal sources (see Table 2). The epidemiological significance of these species and genotypes is still unclear, but findings suggest that certain genotypes are adapted to humans and transmitted (directly or indirectly) from person to person. Thus, at present, all Cryptosporidium oocysts found in water are usually considered potentially infectious to humans, although genotyping information may be used to further inform risk management decisions.

Table 2. Cryptosporidium species
Species (genotype) Major host Human health concern (a)
C. andersoni Cattle +
C. baileyi Poultry -
C. bovis Cattle +
C. canis Dogs ++
C. cuniculus Rabbits ++
C. erinacei Hedgehogs and horses +
C. fayeri Marsupials +
C. felis Cats ++
C. fragile Toads -
C. galli Finches, chickens -
C. hominis (genotype H, I or 1) Humans, monkeys +++
C. huwi Fish -
C. macropodum Marsupials -
C. meleagridis Turkeys, humans ++
C. molnari Fish -
C. muris Rodents +
C. parvum (genotype C, II or 2) Cattle, other ruminants, humans +++
C. rubeyi Squirrel -
C. ryanae Cattle -
C. scophthalmi Turbot -
C. scrofarum Pigs +
C. serpentis Reptiles -
C. suis Pigs +
C. tyzzeri Rodents +
C. ubiquitum Ruminants, rodents, primates ++
C. varanii Lizards -
C. viatorum Humans ++
C. wrairi Guinea pigs -
C. xiaoi Sheep, goats +
Table 2 footnotes:
(a) Human health concern is based solely on the frequency of detection of the species in human cryptosporidiosis cases; the designation may change as new cases of cryptosporidiosis are identified.
  • +++ Most frequently associated with human illness
  • ++ Has caused human illness, but infrequently
  • + Has caused human illness, but only a few very rare cases (very low risk)
  • - Has never been isolated from humans

In addition to the 29 species of Cryptosporidium that have been identified, over 40 genotypes of Cryptosporidium, for which a strain designation has not been made, have also been proposed among various animal groups, including rodents, marsupials, reptiles, fish, wild birds and primates (Fayer, 2004; Xiao et al., 2004; Feng et al., 2007; Smith et al., 2007; Fayer et al., 2008; Xiao and Fayer, 2008; Ryan et al., 2014). Research suggests that these genotypes vary with respect to their development, drug sensitivity and disease presentation (Chalmers et al., 2002; Xiao and Lal, 2002; Thompson and Monis, 2004; Xiao et al., 2004).

5.0 Sources and exposure

5.1 Giardia

5.1.1 Sources

Human and other animal faeces are major sources of Giardia. Giardiasis has been shown to be endemic in humans and in over 40 other species of animals, with prevalence rates ranging from 1% to 5% in humans, 10% to 100% in cattle, and 1% to 20% in pigs (Olson et al., 2004; Pond et al., 2004; Thompson, 2004; Thompson and Monis, 2004). Giardia cysts are excreted in large numbers in the faeces of infected humans and other animals (both symptomatic and asymptomatic). Infected cattle, for example, have been shown to excrete up to one million (10^6) cysts per gram of faeces (O'Handley et al., 1999; Ralston et al., 2003; O'Handley and Olson, 2006). Other mammals, such as beavers, dogs, cats, muskrats and horses, have also been shown to shed human-infective species of Giardia in their faeces (Davies and Hibler, 1979; Hewlett et al., 1982; Erlandsen and Bemrick, 1988; Erlandsen et al., 1988; Traub et al., 2004, 2005; Eligio-García et al., 2005). Giardia can also be found in bear, bird and other animal faeces, but it is unclear whether these strains are pathogenic to humans (refer to section 5.1.3). Cysts are easily disseminated in the environment and are transmissible via the faecal–oral route. This includes transmission through faecally contaminated water (directly, or indirectly through food products), as well as direct contact with infected humans or animals (Karanis et al., 2007; Plutzer et al., 2010).

Giardia cysts are commonly found in sewage and surface waters and occasionally in groundwater sources and treated water.

Surface water

Table 3 highlights a selection of studies that have investigated the occurrence of Giardia in surface waters in Canada. Typically, Giardia concentrations in surface waters range from 2 to 200 cysts/100 L (0.02 to 2 cysts/L). Concentrations as high as 8,700 cysts/100 L (87 cysts/L) have been reported and were associated with record spring runoff, highlighting the importance of event-based sampling (Gammie et al., 2000). Recent studies in Canada have also investigated the species of Giardia present in surface waters. G. lamblia assemblages A and B were the most common variants detected (Edge et al., 2013; Prystajecky et al., 2014). This has also been found internationally (Cacciò and Ryan, 2008; Alexander et al., 2014; Adamska, 2015).

Table 3. Occurrence of Giardia in surface waters in Canada (a)
Province Site/watershed Frequency of positive samples Unit of measure Giardia concentration (cysts/100 L) (b) Reference
National survey Various 245/1,173 Maximum 230 Wallis et al., 1996
Alberta Not available 1/1 Single sample 494 LeChevallier et al., 1991a
Alberta North Saskatchewan River, Edmonton N/A Annual geometric mean 8–193 Gammie et al., 2000
Maximum 2,500 (c)
Alberta North Saskatchewan River, Edmonton N/A Annual geometric mean 98 EPCOR, 2005
Maximum 8,700
British Columbia Black Mountain Irrigation District 24/27 Geometric mean 60.4 Ong et al., 1996
Vernon Irrigation District 68/70 26
Black Mountain Irrigation District 24/27 Range 4.6–1,880
Vernon Irrigation District 68/70 2–114
British Columbia Seymour 12/49 Average (d) 3.2 Metro Vancouver, 2009
Capilano 24/49 6.3
Coquitlam 13/49 3.8
Seymour   Maximum 8.0
Capilano 20.0
Coquitlam 12.0
British Columbia Salmon River 38/49 Median 32 Prystajecky et al., 2014
Coghlan Creek 59/65 107
Salmon River   Maximum 730
Coghlan Creek 3,800
Nova Scotia Collins Park 1/26 Maximum 130 Nova Scotia Environment, 2013
East Hants 2/12 10
Stewiacke 3/12 140
Stellarton 4/12 200
Tatamagouche 0/12 <10
Bridgewater 0/12 <10
Middle Musquodoboit 4/25 1,067
Ontario Grand River 14/14 Median 71 Van Dyke et al., 2006
Maximum 486
Ontario Grand River watershed 101/104 Median 80 Van Dyke et al., 2012
Maximum 5,401
Ontario Ottawa River N/A Average 16.8 Douglas, 2009
Ontario Lake Ontario   Maximum   Edge et al., 2013
Water Treatment Plant Intakes
WTP1 17/46 70
WTP2 4/35 12
WTP3 6/43 18
Humber River 32/41 540
Credit River 19/35 90
Quebec ROS Water Treatment Plant, Thousand Islands River, Montreal 4/4 Geometric mean 1,376 Payment and Franco, 1993
STE Water Treatment Plant, Thousand Islands River, Montreal 8/8 336
REP Water Treatment Plant, Assomption River, Montreal 4/5 7.23
Quebec Saint Lawrence River N/A Geometric mean 200 Payment et al., 2000
Quebec 15 surface water sites impacted by urban and agricultural runoff 191/194 Medians 22–423 MDDELCC, 2016
Maximums 70–2278
Table 3 footnotes:
(a) The studies were selected to show the range of concentrations, not as a comprehensive list of all studies in Canada. The sampling and analysis methods employed in these studies varied; as a result, it may not be appropriate to compare cyst concentrations. The viability and infectivity of cysts were rarely assessed; little information is therefore available regarding the potential risk to human health associated with the presence of Giardia in these samples.
(b) Units were standardized to cysts/100 L.
(c) Associated with heavy spring runoff.
(d) Average results are from positive filters only.

The typical range for Giardia concentrations in Canadian surface waters is at the lower end of the range described in an international review (Dechesne and Soyeux, 2007). Dechesne and Soyeux (2007) found that Giardia concentrations in surface waters across North America and Europe ranged from 0.02 to 100 cysts/L, with the highest levels reported in the Netherlands. Source water quality monitoring data (surface and GUDI sources) were also gathered for nine European water sources (in France, Germany, the Netherlands, Sweden and the United Kingdom) and for one Australian source. Overall, Giardia was frequently detected at relatively low concentrations, and levels ranged from 0.01 to 40 cysts/L. An earlier survey by Medema et al. (2003) revealed that concentrations of cysts in raw and treated domestic wastewater (i.e., secondary effluent) typically ranged from 5,000 to 50,000 cysts/L and from 50 to 500 cysts/L, respectively.

Groundwater

There is a limited amount of information on groundwater contamination with Giardia in Canada and elsewhere. The available studies discuss various types of sources that emanate from the subsurface. Most of these studies use the term "groundwater", although it is clear from the descriptions provided by the authors that the water sources would be classified as GUDI or surface water (e.g., infiltration wells, springs). However, for this document, the terminology used by the authors has been maintained.

A review and analysis of enteric pathogens in groundwaters in the United States and Canada (1990–2013), conducted by Hynds et al. (2014), identified 102 studies, of which only 10 investigated the presence of Giardia. Giardia was found in 3 of the 10 studies; none of the positive sites were in Canada. Three of the 10 studies were conducted in Canada: two in Prince Edward Island and one in British Columbia. The PEI studies included a total of 40 well water samples from dairy and beef farms (Budu-Amoako et al., 2012a, 2012b); none of the samples tested positive for Giardia. The BC study also did not find any Giardia in well water samples (Isaac-Renton et al., 1999).

Other published studies have reported the presence of cysts in groundwaters (Hancock et al., 1998; Gaut et al., 2008; Khaldi et al., 2011; Gallas-Lindemann et al., 2013; Sinreich, 2014; Pitkänen et al., 2015). As noted above, many of the sources described in these studies would be classified as GUDI or surface water. Hancock et al. (1998) found that 6% (12/199) of the sites tested were positive for Giardia. Of the positive sites, 83% (10/12) were springs, infiltration galleries and horizontal wells, while the remaining positive sites (2/12) were vertical wells. The same study also reported that contamination was detected intermittently; many sites required multiple samples before Giardia was detected. However, sites that were negative for Giardia in the first sample were not always resampled, which may have led to an underestimation of contamination prevalence. Many of the other studies reported in the literature have focused on vulnerable areas, such as karst aquifers (Khaldi et al., 2011; Sinreich, 2014) or areas where human and animal faecal contamination are more likely to impact the groundwater (i.e., shallow groundwaters, or close proximity to surface water sources or agricultural or human sewage sources; Gaut et al., 2008; Pitkänen et al., 2015). These studies have reported Giardia prevalence rates ranging from none detected in any of the wells (Gaut et al., 2008) to 20% of the wells testing positive for Giardia (Pitkänen et al., 2015). Khaldi et al. (2011) also found that Giardia-positive samples increased from 0% under non-pumping conditions to 11% under continuous pumping conditions. Hynds et al. (2014) reported that well design and integrity have a significant impact on the likelihood of detecting enteric protozoa.

The above studies highlight the importance of assessing the vulnerability of subsurface sources to contamination by enteric protozoa to ensure they are properly classified. Subsurface sources that have been assessed as not vulnerable to contamination by enteric protozoa, if properly classified, should not have protozoa present. However, all groundwater sources will have a degree of vulnerability and should be periodically reassessed.

Treated water

Treated water in Canada is rarely tested for the presence of Giardia. When testing has been conducted, cysts are typically not present or are present in very low numbers (Payment and Franco, 1993; Ong et al., 1996; Wallis et al., 1996, 1998; EPCOR, 2005; Douglas, 2009), with some exceptions. In 1997, a heavy spring runoff event in Edmonton, Alberta resulted in the presence of 34 cysts/1,000 L in treated water (Gammie et al., 2000). Cysts have also been detected in treated water derived from unfiltered surface water supplies (Payment and Franco, 1993; Wallis et al., 1996).

5.1.2 Survival

Giardia cysts can survive in the environment from weeks to months (or possibly longer), depending on a number of factors, including the characteristics specific to the strain and of the water, such as temperature. The effect of temperature on survival rates of Giardia has been well studied. In general, as the temperature increases, the survival time decreases. For example, Bingham et al. (1979) observed that Giardia cysts can survive up to 77 days in tap water at 8°C, compared with 4 days at 37°C. DeRegnier et al. (1989) reported a similar effect in river and lake water. This temperature effect is, in part, responsible for peak Giardia cyst prevalence reported in winter months (Isaac-Renton et al., 1996; Ong et al., 1996; Van Dyke et al., 2012). Other factors such as exposure to ultraviolet (UV) light (McGuigan et al., 2006; Heaselgrave and Kilvington, 2011) or predation (Revetta et al., 2005) can also shorten the survival time of Giardia.

The viability of Giardia cysts found in water does not seem to be high. Cysts found in surface waters often have permeable cell walls, as shown by propidium iodide (PI) staining (Wallis et al., 1995), indicating they are likely non-viable. Wallis et al. (1996) found that only approximately 25% of the drinking water samples that were positive for Giardia contained viable cysts using PI staining. Observations by LeChevallier et al. (1991b) also suggest that most of the cysts present in water are non-viable; 40 of 46 cysts isolated from drinking water exhibited "non-viable-type" morphologies (i.e., distorted or shrunken cytoplasm). More recent work in British Columbia also found that the vast majority of cysts detected from routine monitoring of two drinking water reservoirs displayed no internal structures using DAPI staining and differential interference contrast (DIC) microscopy, suggesting they are aged or damaged and, therefore, unlikely to be viable (Metro Vancouver, 2013). Studies have frequently revealed the presence of empty cysts ("ghosts"), particularly in sewage.

5.1.3 Exposure

Person-to-person transmission is by far the most common route of transmission of Giardia (Pond et al., 2004; Thompson, 2004). Persons become infected via the faecal–oral route, either directly (i.e., contact with faeces from a contaminated person, such as children in daycare facilities) or indirectly (i.e., ingestion of contaminated drinking water, recreational water and, to a lesser extent, food). Since the most important source of human infectious Giardia originates from infected people, source waters impacted by human sewage are an important potential route of exposure to Giardia.

Animals may also play an important role in the (zoonotic) transmission of Giardia, although it is not clear to what extent. Cattle have been found to harbour human-infective (assemblage A) Giardia, as have dogs and cats. Assemblage A Giardia genotypes have also been detected in wildlife, including beavers and deer (Plutzer et al., 2010). Although there is some evidence to support the zoonotic transmission of Giardia, most of this evidence is circumstantial or compromised by inadequate controls. Thus, it is not clear how frequently zoonotic transmission occurs or under what circumstances. Overall, these data suggest that, in most cases, animals are not the original source of human-infective Giardia. However, in some circumstances, it is possible that they could amplify zoonotic genotypes present in other sources (e.g., contaminated water). In cattle, for example, the livestock Giardia genotype (assemblage E) predominates (Lalancette et al., 2012); however, cattle are susceptible to infection with human-infective (zoonotic) genotypes of Giardia. Given that livestock, such as calves, infected with Giardia commonly shed between 10^5 and 10^6 cysts per gram of faeces, they could play an important role in the transmission of Giardia. Other livestock, such as sheep, are also susceptible to human-infective genotypes. It is possible that livestock could acquire zoonotic genotypes of Giardia from their handlers or from contaminated water sources, although there is also evidence to the contrary. In a study investigating the use of reclaimed wastewater for pasture irrigation, the human-infective genotypes present in the wastewater could not be isolated from the faeces of the grazing animals (Di Giovanni et al., 2006).

The role that wildlife plays in the zoonotic transmission of Giardia is also unclear. Although wildlife, including beavers, can become infected with human-source G. lamblia (Davies and Hibler, 1979; Hewlett et al., 1982; Erlandsen and Bemrick, 1988; Erlandsen et al., 1988; Traub et al., 2004, 2005; Eligio-García et al., 2005) and have been associated with waterborne outbreaks of giardiasis (Kirner et al., 1978; Lopez et al., 1980; Lippy, 1981; Isaac-Renton et al., 1993), the epidemiological and molecular data do not support zoonotic transmission via wildlife as a significant risk for human infections (Hoque et al., 2003; Stuart et al., 2003; Berrilli et al., 2004; Thompson, 2004; Hunter and Thompson, 2005; Ryan et al., 2005a). The data do, however, suggest that wildlife acquire human-infective genotypes of Giardia from sources contaminated by human sewage. As population pressures increase and as more human-related activity occurs in watersheds, the potential for faecal contamination of source waters becomes greater, and the possibility of contamination with human sewage must always be considered. Erlandsen and Bemrick (1988) concluded that Giardia cysts in water may be derived from multiple sources and that epidemiological studies that focus on beavers may be missing important sources of cyst contamination. Some waterborne outbreaks have been traced back to human sewage contamination (Wallis et al., 1998). Ongerth et al. (1995) showed that there is a statistically significant relationship between increased human use of water for domestic and recreational purposes and the prevalence of Giardia in animals and surface water. It is known that beaver and muskrat can be infected with human-source Giardia (Erlandsen et al., 1988), and these animals are frequently exposed to raw or partially treated sewage in Canada. The application of genotyping procedures has provided further proof of this linkage. Thus, it is likely that wildlife and other animals act as a reservoir of human-infective Giardia from sewage-contaminated water and, in turn, amplify concentrations of Giardia cysts in water. If infected animals live upstream from or in close proximity to drinking water treatment plant intakes, they could play an important role in the waterborne transmission of Giardia. Accordingly, watershed management to control both human and animal faecal inputs is important for disease prevention.

As is the case for livestock and wildlife animals, it is unclear what role domestic animals play in the zoonotic transmission of Giardia. Although dogs and cats are susceptible to infection with zoonotic genotypes of Giardia, few studies have provided direct evidence of transmission between them and humans (Eligio-García et al., 2005; Shukla et al., 2006; Thompson et al., 2008).

5.2 Cryptosporidium

5.2.1 Sources

Humans and other animals, especially cattle, are important reservoirs for Cryptosporidium. Human cryptosporidiosis has been reported in more than 90 countries on six continents (Fayer et al., 2000; Dillingham et al., 2002). Reported prevalence rates of human cryptosporidiosis range from 1% to 20%, with higher rates reported in developing countries (Caprioli et al., 1989; Zu et al., 1992; Mølbak et al., 1993; Nimri and Batchoun, 1994; Dillingham et al., 2002; Cacciò and Pozio, 2006). Livestock, especially cattle, are a significant source of C. parvum (Pond et al., 2004). In a survey of Canadian farm animals, Cryptosporidium was detected in faecal samples from cattle (20%), sheep (24%), hogs (11%) and horses (17%) (Olson et al., 1997). Overall, prevalence rates in cattle range from 1% to 100% and in pigs from 1% to 10% (Pond et al., 2004). Oocysts were more prevalent in calves than in adult animals; conversely, they were more prevalent in mature pigs and horses than in young animals. Infected calves can excrete up to 10^7 oocysts per gram of faeces (Smith and Rose, 1990) and represent an important source of Cryptosporidium in surface waters (refer to section 5.2.2). Wild ungulates (hoofed animals) and rodents are not a significant source of human-infectious Cryptosporidium (Roach et al., 1993; Ong et al., 1996).

Oocysts are easily disseminated in the environment and are transmissible via the faecal–oral route. Major pathways of transmission for Cryptosporidium include person-to-person, contaminated drinking water, recreational water, food and contact with animals, especially livestock. A more detailed discussion of zoonotic transmission is provided in section 5.2.3.

Cryptosporidium oocysts are commonly found in sewage and surface waters and occasionally in groundwater sources and treated water.

Surface water

Table 4 highlights a selection of studies that have investigated the occurrence of Cryptosporidium in surface waters in Canada.

Table 4. Occurrence of Cryptosporidium in surface waters in Canada (a)
Province Site/watershed Frequency of positive samples Unit of measure Cryptosporidium concentration (oocysts/100 L) (b) Reference
National survey Various 55/1,173 Maximum (for most samples) 0.5 Wallis et al., 1996
Alberta Not available 1/1 Single sample 34 LeChevallier et al., 1991a
Alberta North Saskatchewan River, Edmonton N/A Annual geometric mean 6–83 Gammie et al., 2000
Maximum 10,300 (c)
Alberta North Saskatchewan River, Edmonton N/A Annual geometric mean 9 EPCOR, 2005
Maximum 69
British Columbia Black Mountain Irrigation District 14/27 Geometric mean 3.5 Ong et al., 1996
Vernon Irrigation District 5/19 9.2
Black Mountain Irrigation District 14/27 Range 1.7–44.3
Vernon Irrigation District 5/19 4.8–51.4
British Columbia Seymour 0/49 Average (d) 0.0 Metro Vancouver, 2009
Capilano 5/49 2.4
Coquitlam 1/49 2.0
Seymour   Maximum 0.0
Capilano 4.0
Coquitlam 2.0
British Columbia Salmon River 36/49 Median 11 Prystajecky et al., 2014
Coghlan Creek 36/65 333
Salmon River   Maximum 126
Coghlan Creek 20,600
Nova Scotia Collins Park 1/26 Maximum 130 Nova Scotia Environment, 2013
East Hants 0/12 <10
Stewiacke 0/12 <10
Stellarton 0/12 <10
Tatamagouche 0/12 <10
Bridgewater 0/12 <10
Middle Musquodoboit 0/25 <10
Ontario Grand River N/A Maximum 2,075 Welker et al., 1994
Ontario Grand River 33/98 Average 6.9 LeChevallier et al., 2003
Maximum 100
Ontario Grand River 13/14 Median 15 Van Dyke et al., 2006
Maximum 186
Ontario Grand River watershed 92/104 Median 12 Van Dyke et al., 2012
Maximum 900
Ontario Ottawa River N/A /53 Average 6.2 Douglas, 2009
Ontario Lake Ontario   Maximum   Edge et al., 2013
Water Treatment Plant Intakes
WTP1 5/46 40
WTP2 5/35 3
WTP3 3/43 1
Humber River 18/41 120
Credit River 21/35 56
Ontario South Nation (multiple sites) 317/674 Mean 3.3–170 Ruecker et al., 2012
Quebec ROS Water Treatment Plant, Thousand Islands River, Montreal   Geometric mean 742 Payment and Franco, 1993
STE Water Treatment Plant, Thousand Islands River, Montreal <2
REP Water Treatment Plant, Assomption River, Montreal <2
Quebec Saint Lawrence River   Geometric mean 14 Payment et al., 2000
Quebec 15 surface water sites impacted by urban and agricultural runoff 99/194 Medians 2–31 MDDELCC, 2016
Maximums 7–150
Table 4 footnotes:
(a) The sampling and analysis methods employed in these studies varied; as a result, it may not be appropriate to compare oocyst concentrations. The viability and infectivity of oocysts were rarely assessed; little information is therefore available regarding the potential risk to human health associated with the presence of Cryptosporidium in these samples.
(b) Units were standardized to oocysts/100 L. However, the text cites concentrations/units as they were reported in the literature.
(c) Associated with heavy spring runoff.
(d) Average results are from positive filters only.

Typically, Cryptosporidium concentrations in Canadian surface waters range from 1 to 100 oocysts/100 L (0.01 to 1 oocyst/L), although high concentrations have been reported. Concentrations as high as 10,300 oocysts/100 L (103 oocysts/L) were associated with a record spring runoff (Gammie et al., 2000), and as high as 20,600 oocysts/100 L (206 oocysts/L) during a two-year biweekly monitoring program (Prystajecky et al., 2014). These results highlight the importance of both event-based sampling and routine monitoring to characterize a source water. Analysis of data collected in the United States showed that median oocyst densities ranged from 0.005/L to 0.5/L (Ongerth, 2013a).

Recent studies have also investigated the species of Cryptosporidium present in source waters. Two studies in Ontario watersheds reported the frequency of detection of C. parvum and C. hominis, the species most commonly associated with human impacts, at less than 2% of positive samples (Ruecker et al., 2012; Edge et al., 2013). In contrast, a study in British Columbia in a mixed urban-rural watershed reported that around 30% of the species detected were potentially human infective types (Prystajecky et al., 2014). Other studies in Canada report human infective genotypes somewhere between these two levels (Pintar et al., 2012; Van Dyke et al., 2012). These findings are not unique. In a recent research project that genotyped 220 slides previously confirmed to be positive by the immunofluorescence assay (IFA), 10% of the slides contained human genotypes (Di Giovanni et al., 2014).

An international review of source water quality data (surface and GUDI sources) demonstrated that concentrations of Cryptosporidium in source waters across North America and Europe ranged from 0.006 to 250 oocysts/L (Dechesne and Soyeux, 2007). Although this range is large, a closer look at nine European sites and one Australian site showed that, overall, Cryptosporidium was frequently detected at relatively low concentrations, and levels ranged from 0.05 to 4.6 oocysts/L. In an earlier survey of wastewater effluent, Medema et al. (2003) reported concentrations of oocysts in raw and treated domestic wastewater (i.e., secondary effluent) ranging from 1,000 to 10,000 oocysts/L and from 10 to 1,000 oocysts/L, respectively.

Groundwater

There is a limited amount of information on groundwater contamination with Cryptosporidium in Canada and elsewhere. The available studies discuss various types of sources that emanate from the subsurface. Most of these studies use the term "groundwater", although it is clear from the descriptions provided by the authors that the water sources would be classified as GUDI or surface water (e.g., infiltration wells, springs). However, for this document, the terminology used by the authors has been maintained.

Hynds et al. (2014) reviewed groundwater studies in Canada and the United States between 1990 and 2013. Nine studies looked for Cryptosporidium; of these, only three were conducted in Canada. Two of the studies were conducted in Prince Edward Island (Budu-Amoako et al., 2012a, 2012b) and one was conducted in BC (Isaac-Renton et al., 1999). The BC study did not find any Cryptosporidium in the community well during the study period. The PEI studies tested 40 well water samples on beef and dairy farms and found that 4 of the 40 samples tested positive for Cryptosporidium, at concentrations ranging from 0.1 to 7.2 oocysts/L (based on 100 L samples). To confirm these findings, the positive wells were retested, and both Cryptosporidium and bacterial indicators were found.

Other published studies have reported the presence of oocysts in groundwaters (Welker et al., 1994; Hancock et al., 1998; Gaut et al., 2008; Khaldi et al., 2011; Füchslin et al., 2012; Gallas-Lindemann et al., 2013; Sinreich, 2014). As noted above, many of the sources described in these studies would be classified as GUDI or surface water. Welker et al. (1994) reported the presence of Cryptosporidium in low concentrations in induced infiltration wells located adjacent to the Grand River. Hancock et al. (1998) found that 11% (21/199) of the sites tested were positive for Cryptosporidium. Of the positive sites, 67% (14/21) were composed of springs, infiltration galleries and horizontal wells, and 33% (7/21) were from vertical wells. The same study also reported that contamination was detected intermittently; many sites required multiple samples before Cryptosporidium was detected. However, sites that were negative for Cryptosporidium in the first sample were not always resampled. This may have led to an underestimation of contamination prevalence. Many of the other studies reported in the literature have focused on vulnerable areas, such as karst aquifers (Khaldi et al., 2011; Sinreich, 2014) or areas where human and animal faecal contamination are more likely to impact the groundwater (i.e., shallow groundwaters, or close proximity to surface water sources or agricultural or human sewage sources; Gaut et al., 2008; Füchslin et al., 2012; Gallas-Lindemann et al., 2013). These studies have reported Cryptosporidium prevalence rates ranging from 8% to 15% of tested samples (Gaut et al., 2008; Gallas-Lindemann et al., 2013; Sinreich, 2014) to 100% of samples (from 3 wells) testing positive for Cryptosporidium in a small study in Switzerland (Füchslin et al., 2012). Khaldi et al. (2011) also found that Cryptosporidium-positive samples increased from 44% under non-pumping conditions to 100% under pumping conditions. Hynds et al. (2014) reported that well design and integrity have been shown to have a significant impact on the likelihood of detecting enteric protozoa.

The above studies highlight the importance of assessing the vulnerability of subsurface sources to contamination by enteric protozoa so that the sources are properly classified. Subsurface sources that have been properly classified as not vulnerable to contamination by enteric protozoa should not contain protozoa. However, all groundwater sources have some degree of vulnerability and should be reassessed periodically.

Treated water

The presence of Cryptosporidium in treated water in Canada is rarely assessed. When testing has been conducted, oocysts are typically not present or are present in very low numbers (Payment and Franco, 1993; Welker et al., 1994; Ong et al., 1996; Wallis et al., 1996; EPCOR, 2005; Douglas, 2009), with some exceptions (Gammie et al., 2000). Oocysts have been detected in treated water derived from unfiltered surface water supplies (Wallis et al., 1996) and after extreme contamination events. For example, in 1997, a heavy spring runoff event in Edmonton, Alberta, resulted in the presence of 80 oocysts/1,000 L in treated water (Gammie et al., 2000). During the cryptosporidiosis outbreak in Kitchener-Waterloo, Welker et al. (1994) reported that two of 17 filtered water samples contained 0.16 and 0.43 oocysts/1,000 L.

Treated waters have been extensively monitored in other countries. Daily monitoring of finished water in the United Kingdom (between 1999 and 2008) showed that the prevalence of positive samples was as high as 8% but, through improvements to the drinking water systems, this dropped to approximately 1% (Rochelle et al., 2012). The method used for daily monitoring did not provide information on the viability or infectivity of the oocysts. However, genotyping was conducted on a subset of positive samples and it was found that the species of Cryptosporidium most often detected were C. ubiquitum (12.5%), C. parvum (4.2%) and C. andersoni (4.0%) (Nichols et al., 2010).

5.2.2 Survival

Cryptosporidium oocysts have been shown to survive in cold waters (4°C) in the laboratory for up to 18 months (AWWA, 1988). In warmer waters (15°C), Cryptosporidium parvum has been shown to remain viable and infectious for up to seven months (Jenkins et al., 2003). In general, oocyst survival time decreases as temperature increases (Pokorny et al., 2002; King et al., 2005; Li et al., 2010). Robertson et al. (1992) reported that C. parvum oocysts could withstand a variety of environmental stresses, including freezing (viability greatly reduced) and exposure to seawater; however, Cryptosporidium oocysts are susceptible to desiccation. In a laboratory study of desiccation, it was shown that within two hours, only 3% of oocysts were still viable, and by six hours, all oocysts were dead (Robertson et al., 1992).

Despite the common assumption that the majority of oocysts in water are viable, Smith et al. (1993) found that oocyst viability in surface waters is often very low. A study by LeChevallier et al. (2003) reported that 37% of oocysts detected in natural waters were infectious. Additionally, a study by Swaffer et al. (2014) reported that only 3% of the Cryptosporidium detected was infectious. In fact, research has shown that even in freshly shed oocysts, only 5–22% of the oocysts were infectious (Rochelle et al., 2001, 2012; Sifuentes and Di Giovanni, 2007). Although the level of infectivity reported to date is low and is dependent on the infectivity method employed, recent advancements in the cell-culture methodology used for determining Cryptosporidium infectivity have shown significant increases in the level of infectious oocysts (43–74% infectious) (King et al., 2011; Rochelle et al., 2015).

Low oocyst viability has also been reported in filtered water. A survey by LeChevallier et al. (1991b) found that, in filtered waters, 21 of 23 oocysts had "non-viable-type" morphology (i.e., absence of sporozoites and distorted or shrunken cytoplasm). In a more recent study of 14 drinking water treatment plants, no infectious oocysts were recovered in the approximately 350,000 L of treated drinking water that was filtered (Rochelle et al., 2012).

5.2.3 Exposure

Direct contact with livestock and indirect contact through faecally contaminated waters are major pathways for transmission of Cryptosporidium (Fayer et al., 2000; Robertson et al., 2002; Stantic-Pavlinic et al., 2003; Roy et al., 2004; Hunter and Thompson, 2005). Cattle are a significant source of C. parvum in surface waters. For example, a weekly examination of creek samples upstream and downstream of a cattle ranch in the BC interior during a 10-month period revealed that the downstream location had significantly higher levels of Cryptosporidium oocysts (geometric mean 13.3 oocysts/100 L, range 1.4–300 oocysts/100 L) compared with the upstream location (geometric mean 5.6/100 L, range 0.5–34.4 oocysts/100 L) (Ong et al., 1996). A pronounced spike was observed in downstream samples following calving in late February. During a confirmed waterborne outbreak of cryptosporidiosis in British Columbia, oocysts were detected in 70% of the cattle faecal specimens collected in the watershed close to the reservoir intake (Ong et al., 1997). Humans can also be a significant source of Cryptosporidium in surface waters. A study in Australia showed that surface waters that allowed recreational activities had significantly more Cryptosporidium than those with no recreational activities (Loganthan et al., 2012).

Waterfowl can also act as a source of Cryptosporidium. Graczyk et al. (1998) demonstrated that Cryptosporidium oocysts retain infectivity in mice following passage through ducks. However, histological examination of the avian respiratory and digestive systems at seven days post-inoculation revealed that the protozoa were unable to infect birds. In an earlier study (Graczyk et al., 1996), the authors found that faeces from migratory Canada geese collected from seven of nine sites on Chesapeake Bay contained Cryptosporidium oocysts. Oocysts from three of the sites were infectious to mice. Based on these and other studies (Graczyk et al., 2008; Quah et al., 2011), it appears that waterfowl carry infectious Cryptosporidium oocysts from their habitat to other locations, including drinking water supplies.

5.3 Waterborne illness

Giardia and Cryptosporidium are the most commonly reported intestinal protozoa in North America and the world (Farthing, 1989; Adam, 1991). Exposure to enteric protozoa through water can result in both an endemic rate of illness in the population and waterborne disease outbreaks. As noted in section 4.0, certain assemblages of Giardia and certain species of Cryptosporidium are more commonly associated with human illness. Giardia lamblia assemblages A and B are responsible for all cases of human giardiasis. For cryptosporidiosis, Cryptosporidium parvum and C. hominis are the major species associated with illness, although C. hominis appears to be more prevalent in North and South America, Australia and Africa, whereas C. parvum is responsible for more infections in Europe (McLauchlin et al., 2000; Guyot et al., 2001; Lowery et al., 2001b; Yagita et al., 2001; Ryan et al., 2003; Learmonth et al., 2004).

5.3.1 Endemic illness

The estimated burden of endemic acute gastrointestinal illness (AGI) annually in Canada, from all sources (i.e., food, water, animals, person-to-person), is 20.5 million cases (0.63 cases/person-year; Thomas et al., 2013). Approximately 1.7% (334,966) of these cases, or 0.015 cases/person-year, are estimated to be associated with the consumption of tap water from municipal systems that serve >1,000 people in Canada (Murphy et al., 2016a). Over 84% of the total Canadian population (approximately 29 million people) rely on these systems: 73% of the population (approximately 25 million) are supplied by a surface water source, 1% (0.4 million) by a groundwater source under the direct influence of surface water (GUDI), and the remaining 10% (3.3 million) by a groundwater source (Statistics Canada, 2013a, 2013b). Murphy et al. (2016a) estimated that systems relying on surface water sources treated only with chlorine or chlorine dioxide, GUDI sources with no or minimal treatment, or groundwater sources with no treatment accounted for the majority of the burden of AGI (0.047 cases/person-year). In contrast, an estimated 0.007 cases/person-year were associated with systems relying on lightly impacted source waters with multiple treatment barriers in place. The authors also estimated that over 35% of the 334,966 AGI cases were attributable to the distribution system.

In Canada, just over 3,800 confirmed cases of giardiasis were reported in 2012, which is a significant decline from the 9,543 cases that were reported in 1989. In fact, the total reported cases have decreased steadily since the early 1990s. The incidence rates have similarly declined over this period (from 34.98 to 11.12 cases per 100,000 persons; PHAC, 2015). On the other hand, since cryptosporidiosis became a reportable disease in 2000, the number of reported cases in Canada has been relatively constant, ranging from a low of 588 cases (in 2002) to a high of 875 cases (in 2007) (PHAC, 2015). The exception was in 2001, when the number of cases reported was more than double (1,763 cases) due to a waterborne outbreak in North Battleford, Saskatchewan. For the Canadian population, this corresponds to an incidence rate between 1.82 and 2.69 cases per 100,000 persons per year, with the exception of 2001, when the incidence rate was 7.46 cases per 100,000 persons (PHAC, 2015). The reported incidence rates are considered to be only a fraction of the illnesses that are occurring in the population due to under-reporting and under-diagnosis, especially with respect to mild illnesses such as gastrointestinal upset. It is estimated that the number of illnesses acquired domestically from Giardia and Cryptosporidium is approximately 40 and 48 times greater, respectively, than the values reported nationally (Thomas et al., 2013). However, even though Giardia and Cryptosporidium are commonly reported intestinal protozoa, they still only accounted for approximately 16% and 4%, respectively, of all foodborne and waterborne illnesses in 2013 (PHAC, 2015).
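
For readers unfamiliar with the rate conversions used above, the following minimal sketch shows how a reported annual case count translates into cases per 100,000 persons and cases per person-year. The case count and population used are hypothetical round numbers chosen for illustration only, not surveillance values.

```python
def incidence_per_100k(cases: int, population: int) -> float:
    """Annual reported cases expressed per 100,000 persons."""
    return cases / population * 100_000

def cases_per_person_year(cases: int, population: int) -> float:
    """Annual reported cases expressed per person-year."""
    return cases / population

# Hypothetical illustration: ~3,800 cases in a population of ~34 million.
cases, population = 3_800, 34_000_000
print(round(incidence_per_100k(cases, population), 1))      # ~11.2 cases per 100,000
print(round(cases_per_person_year(cases, population), 5))   # ~0.00011 cases/person-year
```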

Similar to illness rates in Canada, giardiasis rates in the United States have been declining, and in 2012, the average rate of giardiasis was 5.8 cases per 100,000 people (Painter et al., 2015a). Although the rate of giardiasis reported in the United States is lower than in Canada, this difference can at least partially be explained by differences in disease surveillance. In the United States, giardiasis is not a reportable illness in all states, whereas it is a nationally notifiable disease in Canada. The incidence rate of cryptosporidiosis in the United States in 2012-2013 was 2.6 to 3 cases per 100,000 people (from all sources, including drinking water) (Painter et al., 2015b). This is similar to the incidence rate in Canada.

5.3.2 Outbreaks

Giardia and Cryptosporidium are common causes of waterborne infectious disease outbreaks in Canada and elsewhere (Fayer, 2004; Hrudey and Hrudey, 2004; Joachim, 2004; Smith et al., 2006). Between 1974 and 2001, Giardia and Cryptosporidium were the most and the third most commonly reported causative agents, respectively, associated with infectious disease outbreaks related to drinking water in Canada (Schuster et al., 2005). Giardia was linked to 51 of the 138 outbreaks for which causative agents were identified, and Cryptosporidium was linked to 12 of the 138 outbreaks. The majority of Giardia (38/51; 75%) and Cryptosporidium (11/12; 92%) outbreaks were associated with public drinking water systems; a selection of these outbreaks can be found in Appendix B. From 2002 to 2016, only one outbreak of giardiasis associated with a drinking water source was reported in Canada (PHAC, 2009; Efstratiou et al., 2017; Moreira and Bondelind, 2017). No outbreaks of cryptosporidiosis related to drinking water were reported over the same period.

In the United States, drinking water related outbreaks have been reported for both Giardia and Cryptosporidium (Craun, 1979; Lin, 1985; Moore et al., 1993; Jakubowski, 1994; CDC, 2004; U.S. EPA, 2006a; Craun et al., 2010). Giardia was the most frequently identified etiological agent associated with waterborne outbreaks in the United States between 1971 and 2006, accounting for 16% of outbreaks (126/780); Cryptosporidium accounted for 2% (15/780). These outbreaks were associated with 28,127 cases of giardiasis and 421,301 cases of cryptosporidiosis (Craun et al., 2010). Most of the cryptosporidiosis cases (403,000) were associated with the Milwaukee outbreak in 1993 (U.S. EPA, 2006a).

In a worldwide review of waterborne protozoan outbreaks, G. lamblia and Cryptosporidium accounted for 40.6% and 50.6%, respectively, of the 325 outbreaks reported between 1954 and 2003 from all water sources, including recreational water (Karanis et al., 2007). The largest reported Giardia drinking water related outbreak occurred in 2004, in Norway, with an estimated 2,500 cases (Robertson et al., 2006; Baldursson and Karanis, 2011). Updates to this review were published in 2011 and 2017, capturing 199 protozoan outbreaks between 2004 and 2010 (Baldursson and Karanis, 2011) and a further 381 protozoan outbreaks between 2011 and 2016 (Efstratiou et al., 2017). In these two periods, Giardia accounted for 35.2% and 37% of the outbreaks, and Cryptosporidium for 60.3% and 63%, respectively, again from all water sources, including recreational water.

Several authors have investigated whether there are commonalities in the causes of the drinking water outbreaks related to enteric protozoa. For the outbreaks identified in Canada, contamination of source waters from human sewage and inadequate treatment (e.g., poor or no filtration, relying solely on chlorination) appear to have been major contributing factors (Schuster et al., 2005). An analysis by Risebro et al. (2007) of outbreaks occurring in the European Union (1990–2005) found that most outbreaks had more than one contributing factor. Similar to the findings of Schuster et al. (2005), contamination of the source waters with sewage or livestock faecal waste (usually following rainfall events) and treatment failures (related to problems with filtration) frequently occurred in enteric protozoa outbreaks. Risebro et al. (2007) also noted that long-term treatment deficiencies, stemming from a poor understanding of, or a failure to act on, previous test results, led to drinking water outbreaks. Although less common, distribution system issues were also reported to have been responsible for outbreaks, mainly related to cross-connection control problems (Risebro et al., 2007; Moreira and Bondelind, 2017).

A recent review, focusing on outbreaks occurring between 2000 and 2014 in North America and Europe, reported very similar problems still occurring (Moreira and Bondelind, 2017). The outbreaks caused by enteric protozoa were the result of source water contamination by sewage and animal faeces, usually following heavy rains, and of treatment barriers that were ineffective for the source water quality. Some of the source waters were described as untreated groundwater supplies; however, it is unknown whether any would be classified as GUDI. Wallender et al. (2014) reported that of 248 outbreaks in the United States between 1971 and 2008 involving untreated groundwater, 14 (5.6%) were due to G. intestinalis, two (0.8%) were due to C. parvum and G. intestinalis, and five (2.0%) were due to multiple etiologies, some of which involved G. intestinalis. The same study also noted that 70% of these 248 outbreaks were related to semi-public and private drinking water supplies using untreated well water. Further information on outbreaks of enteric protozoa from both Canada and worldwide can be found in Appendix B.

5.4 Impact of environmental conditions

The concentrations of Giardia and Cryptosporidium in a watershed are influenced by numerous environmental conditions and processes, many of which are not well characterized and vary between watersheds. However, there are some consistent findings that seem to be applicable to a variety of source waters. Lal et al. (2013) provides a good review of global environmental changes—including land-use patterns, climate, and social and demographic determinants—that may impact giardiasis and cryptosporidiosis transmission.

Numerous studies have found rainfall to have a large impact on the microbiological quality of water sources, although the precise rainfall conditions that lead to increased pathogen loads vary. Outbreaks tend to be classified as occurring in either a surface water or a groundwater supply; however, it is unknown whether the groundwater category includes all subsurface supplies or only properly classified groundwater systems. Curriero et al. (2001) evaluated the relationship between rainfall and waterborne disease in the U.S. and found that outbreaks were preceded by rainfall events above the 90th percentile of the monthly accumulated rainfall. Outbreaks due to surface water contamination were most significant for extreme precipitation during the month of the outbreak, whereas groundwater outbreaks showed the highest significance for extreme precipitation two months prior to the outbreak. In Canada, Thomas et al. (2006) reported that rainfall events above the 93rd percentile of the rolling 5-day average accumulated rainfall increased the risk of an outbreak by a factor of 2.3. A study of drinking water related outbreaks in England used historical information on outbreaks (bacterial, viral and protozoan pathogens) and rainfall patterns; the authors determined that the risk of an outbreak was associated with two situations: low rainfall levels over the preceding three weeks and excessive rainfall in the week prior to the outbreak (Nichols et al., 2009). Drayna et al. (2010) reported a significant association between rainfall and pediatric emergency department visits for gastrointestinal related illnesses for a community served by treated surface water; visits increased by 11% four days after rainfall. Uejio et al. (2014) found that extreme precipitation was associated with an increase in childhood AGI for untreated municipal groundwater. Wallender et al. (2014) reported that heavy rainfall or flooding was a contributing factor in 36 of 248 outbreaks involving untreated groundwater supplies.
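
The percentile-based rainfall thresholds described above (e.g., the rolling 5-day accumulated rainfall used by Thomas et al., 2006) can be reproduced with a short time-series calculation. The sketch below is illustrative only: the rainfall series is hypothetical, and the 93rd percentile threshold is simply the one cited from that study.

```python
import pandas as pd

# Hypothetical daily rainfall series (mm); in practice this would be gauge or station data.
rain = pd.Series(
    [0, 2, 15, 0, 0, 40, 5, 0, 1, 0, 60, 30, 0, 0, 0, 3, 0, 22, 0, 0],
    index=pd.date_range("2015-05-01", periods=20, freq="D"),
)

# Rolling 5-day accumulated rainfall, the metric used by Thomas et al. (2006).
accumulated_5d = rain.rolling(window=5).sum()

# Flag days on which the 5-day accumulation exceeds its 93rd percentile.
threshold = accumulated_5d.quantile(0.93)
extreme_days = accumulated_5d[accumulated_5d > threshold]

print(f"93rd percentile of the 5-day accumulation: {threshold:.1f} mm")
print(extreme_days)
```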

In Belgium, a study showed that Giardia concentration peaks coincided with both rainfall and snow melt events (Burnet et al., 2014). The same study also noted that the concentration of cysts could fluctuate from below the detection limit to peak concentrations within three days, making it important to understand site-specific triggers for increases in surface water concentrations. In the study by Burnet et al. (2014), successive rainfall events occurred in the three days between sample collections, increasing river flows and carrying contaminated run-off from upstream locations. Recently, a meta-analysis was published that investigated the effects of weather events on Giardia and Cryptosporidium in fresh surface waters globally. The researchers found an association between extreme weather events and the odds of detecting these organisms (Young et al., 2015). The same analysis also looked for a suitable, easy-to-monitor surrogate for Giardia and Cryptosporidium, but was unable to find a correlation with other variables, such as water temperature, turbidity or flow rate.

Changes in the Canadian climate may also play a role in Giardia and Cryptosporidium concentration variability. In addition to potentially affecting rainfall and flooding, both of which have been shown to influence Giardia and Cryptosporidium occurrence, climate change could alter the summer thermoclines in Lake Ontario, according to researchers studying the lake. As a result, it is hypothesized that the river water discharges feeding the lake could be pushed deeper into the lake, closer to the drinking water treatment plant intakes (Edge et al., 2013). During the study, the river water discharges contained the highest numbers of faecal pathogens and could therefore result in a greater pathogen load at the drinking water treatment plant intakes.

Reckhow et al. (2007) surveyed 249 surface water utilities serving a population of 50,000 or more to determine if storm events were monitored for water quality, flow or system response. A "no" response was provided by 60.6%, 66.2% and 82.4% of the water utilities, respectively, for these criteria.

5.5 Relationship to indicator organisms

For the purposes of this document, the term indicator refers to a microorganism whose presence in water indicates the presence of contamination. A surrogate refers to an organism, particle or substance that is used to assess the fate of a pathogen in a natural environment (e.g., pathogen transport) or through treatment (e.g., drinking water disinfection).

The indicator organisms routinely monitored in Canada as part of a source-to-tap or water safety plan approach for verifying water quality are E. coli and total coliforms. The presence of E. coli in water indicates recent faecal contamination and, thus, the strong potential for a health risk, regardless of whether specific pathogens such as enteric protozoa are observed. However, its absence does not necessarily indicate that enteric protozoa are also absent. Total coliforms are not faecal specific and, therefore, cannot be used to indicate faecal contamination (or the potential presence of enteric pathogens). Instead, total coliforms are used to indicate general water quality issues. Further information on the role of E. coli and total coliforms in water quality management can be found in the guideline technical documents on E. coli and total coliforms (Health Canada, 2012b, 2012c).

5.5.1 Surface water sources

Several studies have investigated the relationship between indicator organisms and the presence or absence of enteric protozoa in surface water sources. In general, studies have reported little (Medema et al., 1997; Atherholt et al., 1998; Payment et al., 2000; Van Dyke et al., 2012) or no (Rose et al., 1988, 1991; Chauret et al., 1995; Stevens et al., 2001; Hörman et al., 2004; Dorner et al., 2007; Sunderland et al., 2007; Edge et al., 2013; NSE, 2013; Prystajecky et al., 2014) correlation between protozoa and faecal indicators, including E. coli. In the cases where a correlation has been reported, it is with Giardia and at very high indicator levels. Payment et al. (2000) also published data for the St. Lawrence River indicating that for this watershed, at higher levels of indicators, the probability of finding Giardia, Cryptosporidium or enteric viruses was very high. A review of 40 years of published data on indicator–pathogen correlations found that neither Cryptosporidium (odds ratio 0.41, 95% confidence interval 0.25–0.69) nor Giardia (odds ratio 0.65, 95% confidence interval 0.36–1.15) is likely to be correlated with faecal indicator organisms (Wu et al., 2011). This overall lack of correlation is likely due to a variety of factors, including differential survival rates in the environment, sampling location and methodological differences related to the analysis of water (Payment and Pintar, 2006). Watershed characteristics, including sources and levels of faecal contamination, and geochemical factors, may also influence the correlation between faecal indicators and protozoa, leading to site-specific differences (Chauret et al., 1995). Lalancette et al. (2014) examined E. coli and Cryptosporidium data from several studies to determine whether these water systems would be classified as needing additional treatment for Cryptosporidium, using the United States Environmental Protection Agency Long-Term 2 Enhanced Surface Water Treatment Rule (EPA LT 2) regulation for small systems (U.S. EPA, 2006a), which is based on E. coli concentrations. The authors found that E. coli was useful for predicting Cryptosporidium risk when the watershed was impacted by sewage outfalls in close proximity. However, if the sewage outfalls were farther away, or if the main source of faecal material in the waters was from agricultural or forested areas, the E. coli concentrations would underestimate the risk from Cryptosporidium (Lalancette et al., 2014). This work assumes that all Cryptosporidium detected have the potential to cause adverse human health effects. Although this may not be the case, this assumption is currently applied in most assessments of Cryptosporidium risks.

These observations have raised significant questions regarding the appropriateness of using E. coli as an indicator of protozoan contamination in surface waters, and have highlighted the need for protozoa monitoring of surface waters to gain a better understanding of public health risk. Indicator organism monitoring, such as monitoring for E. coli, can be included as part of a risk management approach, especially for systems where protozoa monitoring is not feasible.

5.5.2 Groundwater sources

A few studies have reported the presence of enteric protozoa in groundwater sources (see sections 5.1.1 and 5.2.1). In conjunction with protozoa monitoring, some studies also investigated the presence of various faecal indicator organisms. Based on the analysis of Hynds et al. (2014), the only faecal indicator that showed a positive correlation with the presence of enteric protozoa was intestinal enterococci. However, the authors still concluded that faecal indicator organisms were of limited efficacy for determining protozoa presence. As an alternative approach to monitoring directly for protozoa, a recent study in New Mexico investigated the link between the presence of bacterial indicators in individual homeowners' wells and the homeowner's immunoglobulin G (IgG) response for Cryptosporidium (Tollestrup et al., 2014). Similar to the studies analysed by Hynds et al. (2014), no link was found with the presence of indicators in the wells, although the study did link the IgG response to the presence of an on-site septic system. The study was limited in that the wells were only tested once, so it is possible that contamination was missed (Tollestrup et al., 2014).

Overall, based on the studies that have been conducted, the presence or absence of faecal indicator organisms in a groundwater source is not a good predictor of contamination with enteric protozoa. However, testing for indicator bacteria is useful in assessing the potential for faecal contamination, which may include enteric protozoa or other enteric pathogens of concern.

5.5.3 Treated drinking water

In general, monitoring for indicator organisms in treated drinking water is intended as a verification of treatment efficacy. The commonly used faecal indicator organisms (e.g., E. coli, total coliforms, enterococci) are not correlated with the presence of Giardia or Cryptosporidium in source waters and, therefore, cannot be used to indicate their presence in treated drinking waters. Nor can faecal indicator organisms be used to determine treatment efficacy with respect to enteric protozoa: enteric protozoa are removed by physical processes at rates that differ from those of indicator bacteria, and they are more resistant to many of the disinfectants commonly used in the drinking water industry. As evidence of this, Giardia and Cryptosporidium (oo)cysts have been detected in filtered, treated drinking water meeting existing regulatory standards and have been linked to waterborne disease outbreaks (LeChevallier et al., 1991b; Welker et al., 1994; Craun et al., 1997; Marshall et al., 1997; Rose et al., 1997; Nwachuku et al., 2002; Aboytes et al., 2004).

Despite these limitations, bacterial indicators can be used in conjunction with treatment performance (surrogate) data (see section 7.0) to provide information on the adequacy of drinking water treatment. In particular, the presence of E. coli in water leaving a treatment plant signifies that treatment has been inadequate, and there is an increased risk that pathogens, including enteric protozoa, may be present in treated drinking water.

6.0 Analytical methods

The most widely recognized and used method for the detection of Giardia and Cryptosporidium in water is the U.S. Environmental Protection Agency's (EPA) Method 1623/1623.1, as this method allows for the simultaneous detection of these protozoa and has been validated for use in water (U.S. EPA, 2005a, 2006a, 2012). Other standard methods have also been published (ISO, 2006; APHA et al., 2012). It is important to note that Method 1623/1623.1 does not provide information on the viability or infectivity of the organisms. Other methods for the detection of Giardia and Cryptosporidium in water exist. However, most have demonstrated lower recoveries and increased variance compared with EPA Method 1623 (Quintero-Betancourt et al., 2002). Most methods used for the detection of Giardia and Cryptosporidium in water, including EPA Method 1623/1623.1, consist of four steps: 1) sample collection, 2) sample filtration and elution, 3) sample concentration and separation (purification) and 4) (oo)cyst identification. These steps are described in the following sections. Some emerging detection methods are also discussed, as are methods for assessing (oo)cyst viability and infectivity and for determining if (oo)cysts are human-infectious species.

6.1 Sample collection

Proper sample collection is important for obtaining accurate estimates of enteric protozoa concentrations. Therefore, this step should involve clear communication with the laboratory conducting the analysis to ensure the proper sampling procedure is followed. Water samples can be collected as bulk samples or filtered in the field and then shipped on ice to a laboratory for processing within the timeframe prescribed by the analytical method. The volume of water collected depends on the expected level of (oo)cysts in the water (i.e., it is site specific); the lower the expected density of (oo)cysts, the greater the sample volume needed. In most cases, between 10 and 1,000 L of water are collected. In the case of raw water, samples are typically collected near and at the depth of the drinking water intake point, or from a raw water tap in the treatment plant, in an effort to obtain a representative sample of the source water.
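
As a rough illustration of how the expected (oo)cyst density drives the choice of sample volume, the sketch below estimates the volume needed to recover, on average, a target number of (oo)cysts at a given method recovery efficiency. The densities, recovery and target count are assumptions chosen for illustration; they are not method requirements.

```python
def required_volume_L(expected_density_per_L: float,
                      recovery_efficiency: float,
                      target_count: float = 1.0) -> float:
    """Volume (L) needed so that, on average, `target_count` (oo)cysts are
    recovered, given an expected density and a method recovery efficiency."""
    return target_count / (expected_density_per_L * recovery_efficiency)

# Illustration: expecting 0.01 oocysts/L with 40% recovery, aiming to recover ~1 oocyst.
print(required_volume_L(expected_density_per_L=0.01, recovery_efficiency=0.40))   # 250.0 L

# A ten-fold lower expected density implies a ten-fold larger sample volume.
print(required_volume_L(expected_density_per_L=0.001, recovery_efficiency=0.40))  # 2500.0 L
```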

6.2 Sample filtration and elution

In Canada, (oo)cysts are generally present in small numbers even in faecally contaminated water; therefore, bulk water samples must be filtered to concentrate the pathogens to a detectable level. Typically, water is pumped through a filter, and (oo)cysts, along with extraneous particulate materials, are retained on the filter. This can be achieved using a variety of filter types, including wound filters, membrane filters, hollow fibre filters and compressed foam filters. These filters vary in terms of the volume of water that they can process, their filtration rates, their practicality, their compatibility with subsequent processing steps, their cost and their retention ability. These differences account for the wide range of recovery efficiencies reported in the literature (Sartory et al., 1998; DiGiorgio et al., 2002; Quintero-Betancourt et al., 2003; Ferguson et al., 2004). A number of filters have been validated by EPA Method 1623/1623.1 (U.S. EPA, 2005a, 2012). Once filtration is complete, entrapped (oo)cysts on the filter are released through the addition of eluting solutions, producing a filter eluate.

6.3 Sample concentration and separation

The most common approach for sample concentration and separation is through centrifugation and immunomagnetic separation (IMS). First, to concentrate the sample, the filter eluate is centrifuged, resulting in the formation of a pellet. This pellet is resuspended in a small volume of buffer to produce a concentrate. To separate the (oo)cysts from the other contaminants in the sample, IMS is used. The concentrate is mixed with (oo)cyst-specific monoclonal antibodies attached to magnetized beads, also referred to as immunomagnetic beads. These beads will selectively bind to (oo)cysts. A magnetic field is then applied, which separates the (oo)cyst–bead complexes from extraneous materials. These materials are removed, the (oo)cyst–bead complex is dissociated and the beads are extracted, resulting in a concentrated suspension of (oo)cysts. Several studies have assessed the recovery potential of the IMS step alone. Fricker and Clancy (1998) reported that (oo)cysts added to low-turbidity waters can be recovered with efficiencies above 90%. In comparison, mean oocyst and cyst recoveries for turbid waters ranged from 55.9% to 83.1% and from 61.1% to 89.6%, respectively (McCuin et al., 2001). Others have reported similar recoveries (Moss and Arrowood, 2001; Rimhanen-Finne et al., 2001, 2002; Sturbaum et al., 2002; Ward et al., 2002; Chesnot and Schwartzbrod, 2004; Greinert et al., 2004; Hu et al., 2004; Ochiai et al., 2005; Ryan et al., 2005b). Changes in pH (from the optimum of 7) may inhibit IMS (Kuhn et al., 2002). Although IMS aids in reducing false positives by reducing the level of debris on slide preparations for microscopic analysis, it is a relatively expensive procedure. However, it is the only sample separation method included as part of EPA Method 1623/1623.1.

6.4 (Oo)cyst identification

Once samples have been concentrated and (oo)cysts have been separated from extraneous materials, a number of detection techniques can be applied. The most commonly used detection approach is the immunofluorescence assay (IFA). Alternative detection methods, such as flow cytometry and various molecular approaches, are increasingly being used within the research community; however, they have not been included as part of standard methods to date.

6.4.1 Immunofluorescence assay

Following sample concentration and separation, the (oo)cyst suspension is transferred to a well slide for staining and visualization. To stain the samples, fluorescently labelled antibodies directed at specific antigens on the (oo)cyst surface are applied to the slide and allowed to incubate. Direct immunofluorescence microscopy is then used to locate fluorescing bodies, which are potential (oo)cysts. This process, referred to as an IFA, requires specialized equipment and a high level of technical skill. It can be highly sensitive. However, because some autofluorescent algae are very close in size and staining characteristics to (oo)cysts, and the fluorescently labelled antibodies may cross-react with other organisms of similar shape and size, final identification of (oo)cysts requires additional staining and microscopy using a 4′,6-diamidino-2-phenylindole (DAPI) stain. Because DAPI binds to deoxyribonucleic acid (DNA), it will highlight (oo)cyst nuclei and facilitate their identification. Differential interference contrast (DIC) microscopy is also used to examine the internal and external morphological characteristics of the presumptive (oo)cysts for atypical structures. Both DAPI and DIC are included as part of standard methods for (oo)cyst identification (ISO, 2006; APHA et al., 2012; U.S. EPA, 2012).

6.4.2 Flow cytometry

Flow cytometry can be used as an alternative technique for detecting (oo)cysts following concentration. Flow cytometry allows the sorting, enumeration and examination of microscopic particles suspended in fluid, based on their light-scattering and fluorescence characteristics. Fluorescence-activated cell sorting (FACS) is the flow cytometric technique that is used to enumerate and separate Giardia and Cryptosporidium from background particles. Typically, immunofluorescent antibodies are introduced into the (oo)cyst suspension, and the suspension is passed through a beam of light (within the flow cytometer). As particles pass through the beam of light, their fluorescence is measured, and they are then sorted into two or more vials.

FACS has proven to be highly sensitive and specific (Vesey et al., 1997; Bennett et al., 1999; Reynolds et al., 1999; Delaunay et al., 2000; Lindquist et al., 2001; Kato and Bowman, 2002; Lepesteur et al., 2003; Hsu et al., 2005; Keserue et al., 2011). This approach has the advantage of being rapid, allowing for high throughput. However, flow cytometers are expensive, and their operation requires significant user training. In addition, like IFA, this procedure can be adversely influenced by the presence of autofluorescent algae and antibody cross-reactivity with other organisms and particles. Many FACS methods also require confirmation of (oo)cysts by microscopy; however, methods are being developed that may not require a confirmation step (Ferrari and Veal, 2003; Keserue et al., 2011).

6.4.3 Molecular methods

A number of molecular approaches have also been used in the detection of Giardia and Cryptosporidium (oo)cysts. A brief description of some of these methods is provided below. It is important to note that molecular methods have not yet been included in standard methods for the detection of Giardia and Cryptosporidium in water.

6.4.3.1 Polymerase chain reaction

Polymerase chain reaction (PCR) is the most commonly used molecular method for detection of (oo)cysts. This method involves lysing (oo)cysts to release DNA, introducing primers targeted at specific Giardia or Cryptosporidium coding regions, and amplifying those regions. The coding region targets can be genus specific (e.g., 18S ribosomal ribonucleic acid [rRNA]) or specific for one or a limited number of species (e.g., hsp70, gp60). A positive PCR signal is determined using agarose gel electrophoresis, ethidium bromide staining and visual examination of the gel under UV light. PCR can be highly sensitive (i.e., capable of detecting a single (oo)cyst per reaction tube) and specific (Deng et al., 1997, 2000; Bukhari et al., 1998; Di Giovanni et al., 1999; Kostrzynska et al., 1999; Rochelle et al., 1999; Hallier-Soulier and Guillot, 2000; Hsu and Huang, 2001; McCuin et al., 2001; Moss and Arrowood, 2001; Rimhanen-Finne et al., 2001, 2002; Sturbaum et al., 2002; Ward et al., 2002), and is also amenable to automation.

The traditional PCR method described above only provides information on whether the DNA target is present or absent in the sample being tested. However, it can be combined with other techniques, such as restriction fragment length polymorphism (RFLP) or DNA sequencing, to discriminate between species and genotypes of Giardia and Cryptosporidium (Morgan et al., 1997; Widmer, 1998; Lowery et al., 2000, 2001a,b; Xiao et al., 2001; Ruecker et al., 2012; Prystajecky et al., 2014), or techniques such as cell culture (Jenkins et al., 2003; Keegan et al., 2003; LeChevallier et al., 2003) to determine (oo)cyst viability and infectivity (see section 6.6).

Variations on traditional PCR have been developed and used for pathogen detection, the most common being quantitative PCR (qPCR, also referred to as real-time PCR). qPCR is a modified PCR that uses either fluorescently labelled oligonucleotide probes or dyes that fluoresce when bound to double-stranded DNA. As the target region within (oo)cysts is amplified, the emitted fluorescence is measured in real time, thereby allowing quantification of the PCR products. This method has several advantages over traditional PCR, including eliminating post-PCR analysis, increased throughput, decreased likelihood of contamination (i.e., closed vessel system), and the ability to quantify (oo)cysts using a standard curve (MacDonald et al., 2002; Fontaine and Guillot, 2003; Bertrand et al., 2004), although in practice, quantification of oocysts is not always possible. Research has shown that qPCR was unable to distinguish between the signals for 1, 2, 5 and 10 oocysts (Staggs et al., 2013). In general, a 10-fold increase in oocyst density is needed to detect differences in concentration (Di Giovanni and LeChevallier, 2005; Staggs et al., 2013). A qPCR approach has other unique advantages, including its ability to differentiate between species of Giardia and Cryptosporidium (using melting curve analysis) (Limor et al., 2002; Ramirez and Sreevatsan, 2006) and the simultaneous detection of different microorganisms (i.e., multiplexing) (Guy et al., 2003). The assay has also proven useful in the identification and enumeration of (oo)cysts, and has been used in combination with propidium monoazide (PMA) dye to determine (oo)cyst viability (Brescia et al., 2009; Alonso et al., 2014).
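
To illustrate the standard-curve quantification step described above, the following sketch fits a line to quantification cycle (Cq) values measured for known amounts of target and uses that line to estimate the amount in an unknown sample. The Cq values, slope and unknown sample are hypothetical and not drawn from any of the cited studies.

```python
import numpy as np

# Hypothetical standard curve: known log10 target amounts vs. measured Cq values.
log10_amount = np.array([0, 1, 2, 3, 4])        # e.g., 1 to 10,000 oocysts or gene copies
cq = np.array([38.1, 34.8, 31.4, 28.0, 24.7])   # assumed Cq measurements

# Linear fit: Cq = slope * log10(amount) + intercept
slope, intercept = np.polyfit(log10_amount, cq, 1)
efficiency = 10 ** (-1 / slope) - 1              # estimated amplification efficiency

# Estimate the amount in an unknown sample from its measured Cq.
cq_unknown = 30.2
estimated_amount = 10 ** ((cq_unknown - intercept) / slope)

print(f"slope = {slope:.2f}, amplification efficiency ~ {efficiency:.0%}")
print(f"estimated amount in unknown ~ {estimated_amount:.0f}")
```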

Digital PCR is a new technique that provides some of the same advantages as qPCR (e.g., real-time results, multiplexing) and overcomes some of its disadvantages, such as the requirement to generate a standard curve for quantification (Gutiérrez-Aguirre et al., 2015; Cao et al., 2015). This technique has been used in clinical settings; however, it has only recently been applied to environmental samples. Similar to qPCR, it requires a relatively expensive analyser.

Reverse transcriptase PCR (RT-PCR, or RT-qPCR in its quantitative form) targets RNA as opposed to DNA and can be used to discriminate between viable and non-viable (oo)cysts. Further information can be found in section 6.6.3.

Although there are numerous advantages to PCR assays, they also have several drawbacks. PCR inhibition by divalent cations and humic and fulvic acids is a significant problem (Sluter et al., 1997). In an effort to remove these inhibitors, samples must go through several purification steps. In addition to inhibition, inefficient (oo)cyst lysis is often an issue. Despite these problems, many PCR assays have been developed for detection of waterborne (oo)cysts (Stinear et al., 1996; Kaucner and Stinear, 1998; Griffin et al., 1999; Lowery et al., 2000; Gobet and Toze, 2001; Karasudani et al., 2001; Ong et al., 2002; Sturbaum et al., 2002; Ward et al., 2002).

6.4.3.2 Other molecular methods

Other molecular methods that have been used for the detection of (oo)cysts include fluorescence in situ hybridization (FISH), microarrays, and loop-mediated isothermal amplification. These methods are not used routinely but have been used with varying levels of success in research laboratories. Further work on validating and standardizing these methods would be needed for their use outside of a research laboratory.

6.5 Recovery efficiencies

An integral part of the Giardia and Cryptosporidium detection process involves determining recovery efficiencies. As mentioned previously, there can be significant losses of (oo)cysts during the concentration and separation processes. In addition, the characteristics of the water (e.g., presence of suspended solids, algae) can have a significant impact on recovery efficiency. To measure recovery efficiency, a known number of (oo)cysts are introduced into the water sample (i.e., seeding) before the sample is analysed. Ideally, the recovery efficiency should be determined for each sample; however, because this is expensive, recovery efficiency data are usually collected for a subset of samples. For example, EPA Method 1623.1 requires a recovery efficiency sample to be analysed for (oo)cysts each week, or every 20 samples if more than 20 samples are analysed weekly (U.S. EPA, 2012). With the introduction of commercial preparations containing a certified number of (oo)cysts, including recovery efficiency measurements has become more cost-effective and routine.

EPA Method 1623, the most commonly used method, has been reported to have an average Cryptosporidium recovery of 40% and an average Giardia recovery of 69% (Jaidi et al., 2009), although recoveries vary between studies (U.S. EPA, 2001; Quintero-Betancourt et al., 2003). Several studies have also evaluated the recovery efficiencies achieved using EPA Method 1623 with different types of filters (McCuin and Clancy, 2003; Ferguson et al., 2004; Hu et al., 2004; Wohlsen et al., 2004; Karim et al., 2010). Recoveries varied significantly and correlated with variations in raw water quality. As recovery efficiencies are less than 100%, the actual concentration of (oo)cysts in a water sample is almost always higher than the measured density. Ongerth (2013b) showed that correcting measured (oo)cyst densities for recovery changed the resulting concentration estimates by a factor of 2 to 10, depending on the recovery efficiency. It has also been shown that the method used to incorporate recovery data can bias the concentration estimate (Schmidt et al., 2013). Thus, recovery efficiencies should be reported, and they can be used to better approximate the concentration of (oo)cysts. If they are included in the (oo)cyst concentration estimate, the approach used for incorporating the recovery efficiency needs to be recorded.
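
The effect of recovery correction can be seen with a simple calculation. In the minimal sketch below, a measured count is converted to a concentration and then corrected for recovery; the count, volume and recovery values are assumptions chosen only to show why the corrected estimate can be 2 to 10 times the uncorrected one.

```python
def recovery_corrected_conc(count: float, volume_L: float, recovery: float) -> float:
    """Estimated source concentration ((oo)cysts/L) from a measured count,
    the volume of water filtered, and the method recovery efficiency."""
    return count / (volume_L * recovery)

# Hypothetical sample: 10 oocysts detected in 100 L of water.
count, volume_L = 10, 100
uncorrected = count / volume_L  # 0.10 oocysts/L

for recovery in (0.50, 0.40, 0.10):
    corrected = recovery_corrected_conc(count, volume_L, recovery)
    print(f"recovery {recovery:.0%}: {corrected:.2f} oocysts/L "
          f"({corrected / uncorrected:.1f}x the uncorrected estimate)")
```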

6.6 Assessing viability and infectivity

A major drawback of existing methods routinely used for the detection of Giardia and Cryptosporidium is that they provide very limited information on the viability or human infectivity of (oo)cysts, which is essential in determining their public health significance.

The viability of (oo)cysts can be assessed using in vitro methods such as excystation, fluorogenic dye inclusion/exclusion (i.e., staining), RT-PCR and FISH. Although these methods may provide information on viability, they do not provide information on whether the (oo)cysts are species or types capable of causing infections in humans.

To assess infectivity, in vitro methods such as RFLP or DNA sequencing can be used to determine whether samples contain potentially human-infective genotypes. However, these methods do not provide any information on viability.

Cell culture, on the other hand, is an in vitro method that can assess viability and infectivity concurrently. Animal infectivity assays are in vivo methods that also evaluate both viability and infectivity. Unfortunately, both cell culture and animal infectivity methods are costly because of the need to maintain cell lines, animals and qualified staff; as a result, they are not typically applied to the routine assessment of (oo)cysts.

A brief discussion of available methods for assessing viability or infectivity is provided in the following sections.

6.6.1 Excystation

Viability (but not infectivity) can be estimated by subjecting (oo)cysts to conditions similar to those in the gut, in an effort to stimulate excystation (i.e., release of trophozoites/sporozoites). Excystation "cocktails" and conditions vary considerably and may result in conflicting observations. If (oo)cysts are capable of excystation, they are considered viable. Giardia can be excysted using acid and enzymes, such as trypsin, and grown in TYI-S-33 medium (Diamond et al., 1978; Rice and Schaefer, 1981), but the excystation rate for Giardia is often low. Cryptosporidium parvum oocysts can also be excysted (Black et al., 1996; Hijjawi, 2010); however, excystation methods have been shown to be relatively poor indicators of Cryptosporidium oocyst viability. Neumann et al. (2000b) observed that non-excysted oocysts recovered after commonly used excystation procedures are still infectious to neonatal mice.

6.6.2 Dyes

Various staining methods have been developed to assess (oo)cyst viability, based on the inclusion or exclusion of two fluorogenic dyes: DAPI and propidium iodide (PI) (Robertson et al., 1998; Freire-Santos et al., 2000; Neumann et al., 2000b; Gold et al., 2001; Iturriaga et al., 2001). These dyes are used to determine (oo)cyst wall permeability. Three classes of (oo)cysts can be identified: 1) viable (inclusion of DAPI, exclusion of PI), 2) non-viable (inclusion of both DAPI and PI) and 3) quiescent or dormant (exclusion of both DAPI and PI, but potentially viable). In general, DAPI and PI give good correlation with in vitro excystation (Campbell et al., 1992). Neumann et al. (2000a) demonstrated a strong correlation between DAPI/PI staining intensity and animal infectivity of freshly isolated C. parvum oocysts. These stains have also been successfully used in conjunction with fluorescently labelled antibodies (used in FACS) to determine the viability and infectivity of (oo)cysts in water samples, because their fluorescence spectra do not overlap with those of the antibodies (Belosevic et al., 1997; Bukhari et al., 2000; Neumann et al., 2000b). Although positive correlations have been reported, dye inclusion/exclusion, like excystation procedures, has also been shown to overestimate the viability and potential infectivity of (oo)cysts and therefore should be interpreted with caution (Black et al., 1996; Jenkins et al., 1997).
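
The three DAPI/PI classes described above amount to a simple decision rule based on dye uptake. The function below is a minimal sketch of that rule for a single (oo)cyst; it is illustrative only and does not, of course, replace the microscopy itself.

```python
def classify_by_dapi_pi(dapi_included: bool, pi_included: bool) -> str:
    """Assign an (oo)cyst to one of the DAPI/PI viability classes described above."""
    if dapi_included and not pi_included:
        return "viable"
    if dapi_included and pi_included:
        return "non-viable"
    if not dapi_included and not pi_included:
        return "quiescent/dormant (potentially viable)"
    # PI uptake without DAPI is not one of the three described classes.
    return "atypical staining; confirm by microscopy"

print(classify_by_dapi_pi(dapi_included=True, pi_included=False))  # viable
```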

PMA is a vital dye that is used in conjunction with molecular methods, such as qPCR, to determine the viability of (oo)cysts. PMA enters damaged (oo)cysts, where it intercalates into the DNA and forms a stable complex that prevents PCR amplification. This property enables differentiation of live and dead organisms (Brescia et al., 2009; Alonso et al., 2014). The use of PMA for determining (oo)cyst viability is a relatively new approach, and further research on its advantages and limitations is needed.

6.6.3 Reverse transcriptase–polymerase chain reaction

RT-PCR can be applied to detect viable (oo)cysts in water concentrates (Kaucner and Stinear, 1998). RT-PCR amplifies a messenger ribonucleic acid (mRNA) target molecule. As only viable organisms can produce mRNA, this experimental method may prove useful in assessing (oo)cyst viability. For example, when compared with the IFA DAPI/PI method, the frequency of detection of viable Giardia increased from 24% with IFA to 69% with RT-PCR. An advantage of this approach is that it can be combined with IMS, allowing for simultaneous detection and viability testing (Hallier-Soulier and Guillot, 2000, 2003). It can also be quantitative, although quantitation can be difficult as the number of copies of the mRNA target can vary. RT-PCR, like other PCR-based methods, is highly susceptible to environmental inhibition and suffers from inefficient extraction of nucleic acids from (oo)cysts.

6.6.4 Fluorescence in situ hybridization

FISH has shown modest success in differentiating between living and dead (oo)cysts (Davies et al., 2005; Lemos et al., 2005; Taguchi et al., 2006); however, false positives are common (Smith et al., 2004). As 18S rRNA is present in high copy numbers in viable (oo)cysts but in low numbers in non-viable (oo)cysts, it is a useful target for assessing viability. Like DAPI/PI staining, FISH is unable to assess (oo)cyst infectivity.

6.6.5 Genotyping methods

Genotyping methods, such as RFLP analysis and DNA sequencing, have been used extensively in recent years to determine the species of Giardia and Cryptosporidium present in positive water samples, providing insight into whether they are human-infective genotypes. Genotyping methods can be applied directly to the slides from an IFA. The DNA from the positive wells is extracted, and a (nested) PCR or qPCR is used to amplify a specific DNA target that allows the organism to be identified to the species (and sometimes strain) level. Several DNA targets have been used for amplification. Usually, the small subunit rRNA gene is used for genotyping since it is present in multiple copies and provides both conserved and unique segments that can be used for identification. However, other targets have also been used. For Giardia, genotyping has been done using the 5.8S gene along with the two flanking internal transcribed spacer regions (Cacciò et al., 2010) and the β-giardin gene (Cacciò et al., 2002). For Cryptosporidium, the 18S rRNA gene is often the target for species identification (Ryan et al., 2003; Loganthan et al., 2012), but other genes, such as hsp70, are used for determining strains. The PCR product is used for RFLP analysis or for DNA sequencing to determine the genotypes present (Ruecker et al., 2012; Xiao et al., 2013). Sometimes both methods are used, especially if there are multiple species in the same sample (Ruecker et al., 2012). If the DNA amplification is carried out using qPCR, then melt curve analysis can also be used to aid in the identification of species.

The feasibility of using genotyping methods during routine (oo)cyst analysis needs to be further explored. Recently, a large multi-laboratory study was conducted in the U.S. to determine whether genotyping could be applied successfully to previously collected regulatory samples by laboratories outside of a research setting. It was found that with training, but no previous experience in genotyping, labs were able to successfully genotype 65% of the slides, many of which contained a low number of oocysts per slide (Di Giovanni et al., 2014). This suggests that, with further investigation and validation, it may become possible to include genotyping as part of a standard method.

6.6.6 Animal infectivity assays

The most direct method for assessing (oo)cyst viability and infectivity is to inoculate a susceptible animal and monitor for (oo)cyst shedding and any histological evidence of disease development. Giardia and Cryptosporidium are used to infect experimental animals such as the gerbil (for Giardia) (Belosevic et al., 1983) and the neonatal CD-1 mouse (for Cryptosporidium) (Finch et al., 1993). This approach has shown moderate success (Delaunay et al., 2000; Korich et al., 2000; Matsue et al., 2001; Noordeen et al., 2002; Okhuysen et al., 2002; Rochelle et al., 2002), but it is not practical, as most analytical laboratories do not maintain animal colonies, and animal infectivity assays are expensive to perform. This approach is also not sensitive enough for environmental monitoring (i.e., high median infective dose [ID50]). These assays are typically reserved for research purposes, such as assessing disinfection effectiveness, rather than for routine assessment of (oo)cyst viability/infectivity.

6.6.7 Cell culture infectivity assays

Cell culture infectivity assays, as mentioned earlier, are in vitro methods that can assess viability and infectivity concurrently. These methods have been developed for Cryptosporidium, but are not used for determining Giardia infectivity. In vitro cell culture assays for Cryptosporidium involve exposing oocysts to excystation stimuli followed by their inoculation onto monolayers of a cultured mammalian cell line, such as human ileocaecal adenocarcinoma (HCT-8) cells, which supports the parasite's growth and development. After a 24- to 72-hour incubation, the cell monolayer is examined for the presence of Cryptosporidium reproductive stages using either an indirect IFA (Slifko et al., 1997) or PCR (Rochelle et al., 1997; Di Giovanni et al., 1999). A standardized cell culture infectivity assay has recently been published (APHA et al., 2012).

This approach has been used to estimate the infectivity of oocysts in water (Di Giovanni et al., 1999; Hijjawi et al., 2001; Weir et al., 2001; Rochelle et al., 2002; Johnson et al., 2005; Schets et al., 2005; Coulliette et al., 2006) and has been shown to yield results comparable to those of the mouse infectivity model (Hijjawi et al., 2001; Rochelle et al., 2002; Slifko et al., 2002). In other comparison studies, average percent viabilities were comparable for cell culture, excystation and DAPI/PI assays (Slifko et al., 1997).

There are several advantages to the cell culture assay, including its high sensitivity (i.e., detection of a single viable oocyst), applicability to the analysis of raw and treated water samples, ease of performance and rapid turnaround time for results. Another advantage of this approach is that C. parvum and C. hominis can be maintained in vitro for up to 25 days, facilitating viability and immunotherapy studies (Hijjawi, 2010). In addition, cell culture can be combined with other methods, including PCR and IFA, to more accurately assess viability/infectivity. Cell culture PCR (CC-PCR) has proven useful in assessing watershed contamination and in estimating risk (Joachim et al., 2003; LeChevallier et al., 2003; Masago et al., 2004). Combining cell culture with IFA (CC-IFA) may have an advantage over CC-PCR/CC-RT-PCR in that fewer false positives have been reported (Johnson et al., 2012). Although cell culture infectivity assays have several advantages, they also have a number of disadvantages, including the need to maintain a cell line and poor reproducibility among similar samples for quantitative assessments. Moreover, it is currently unknown whether all species of Cryptosporidium can be detected with existing cell culture methods; however, these methods have been shown to detect C. parvum, C. hominis and C. meleagridis, which are the main human-infective genotypes (Rochelle et al., 2012).

7.0 Treatment technology

The primary goal of treatment is to reduce the presence of disease-causing organisms and associated health risks to an acceptable or safe level. This can be achieved through one or more treatment barriers involving physical removal and/or inactivation. To optimize performance for removal and/or inactivation of microbial pathogens, the relative importance of each barrier should be understood. Some water systems have multiple redundant barriers, such that failure of one barrier still provides adequate treatment. In other cases, all barriers must be working well to provide the required level of treatment. For example, Cryptosporidium is extremely resistant to commonly used chlorine-based disinfectants (Korich et al., 1990; U.S. EPA, 1991; Finch et al., 1994, 1997). As a result, many treatment systems rely solely on physical removal processes (i.e., filtration) to remove oocysts. For these systems, failure of the filtration process, or of required pre-treatment processes (i.e., coagulation/flocculation for chemically assisted filtration), could lead to a waterborne outbreak.

The source-to-tap approach, including source water protection, is a universally accepted approach to reduce protozoa and other waterborne pathogens in drinking water (O'Connor, 2002; CCME, 2004). Operator training is also required to ensure the effective operation of treatment barriers at all times (Smeets et al., 2009).

7.1 Municipal scale

The use of a combination of physical removal and inactivation technologies is the most effective way to reduce (oo)cysts in drinking water. It is essential that physical removal and/or inactivation targets be achieved before drinking water reaches the first consumer in the distribution system. Physical removal barriers, such as filtration technology, are assigned a “log removal” credit towards reducing (oo)cyst levels when they achieve specified individual filter effluent turbidity limits as discussed in section 7.1.2. Inactivation barriers include primary disinfection processes. “Log inactivation” credits are calculated using the disinfection concepts described in section 7.1.3. The “log removal” and “log inactivation” credits are summed to calculate the overall (oo)cyst “log reduction” for the treatment process being assessed. Secondary disinfection is used to maintain a residual in the distribution system to protect against microbial regrowth and serve as a sentinel for water quality changes. No log inactivation credits are awarded for secondary disinfection processes.
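As a simple illustration of how these credits combine, the following sketch (in Python) sums the physical removal and primary disinfection credits for a notional treatment train. The credit values and variable names are hypothetical and illustrative only; actual credits are assigned as described in sections 7.1.2 and 7.1.3 and by the responsible drinking water authority.

```python
# Minimal sketch: summing physical "log removal" and disinfection "log inactivation"
# credits to obtain the overall "log reduction" for a hypothetical treatment train.
# Credit values shown here are illustrative only.

def overall_log_reduction(removal_credits, inactivation_credits):
    """Overall log reduction is the sum of all removal and inactivation credits."""
    return sum(removal_credits) + sum(inactivation_credits)

# Hypothetical plant: conventional filtration (3 log removal credit)
# plus UV primary disinfection (assumed 3 log inactivation credit).
removal = [3.0]        # physical removal barriers
inactivation = [3.0]   # primary disinfection barriers (secondary disinfection earns no credit)

print(overall_log_reduction(removal, inactivation))  # 6.0 log overall reduction
```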

7.1.1 Level of treatment necessary

To determine the necessary level of treatment, the source water should be adequately characterized. Source water characterization generally includes a number of steps, including delineating the boundary of the source water area, identifying threats and risks to the source water and measuring its microbiological, physical, chemical and radiological quality (WHO, 2011, 2012). Monitoring of seasonal changes is also important to ensure that water utilities consistently produce high quality treated water for the full range of raw water conditions (Valade et al., 2009; Huck and Sozański, 2011).

For surface water sources, the required level of treatment should ideally be determined by measuring Giardia and Cryptosporidium concentrations during normal conditions and following spring runoff or storm events. Sampling results should take into account recovery efficiencies for the analytical method and pathogen viability in order to obtain the most accurate assessment of infectious pathogens present in the source water (Emelko et al., 2008, 2010a; Schmidt and Emelko, 2010; Schmidt et al., 2010, 2013). Where source water sampling and analysis for Giardia and Cryptosporidium are not feasible (e.g., small community water supplies), the source water characterization—including water quality parameters that can provide information on the risk and/or level of faecal contamination—can be used to establish the necessary level of treatment. Further guidance on characterizing risks in small systems can be found elsewhere (WHO, 2012). Safety factors or additional treatment reductions may be applied to ensure the production of microbiologically safe drinking water, particularly if source water quality is highly variable.
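As an illustration only, the following sketch shows a simplified point-estimate adjustment of a monitoring result for analytical recovery. The sample values and function name are hypothetical, and the studies cited above describe more rigorous probabilistic approaches for handling recovery and counting uncertainty.

```python
# Simplified point-estimate adjustment of a source water monitoring result for
# analytical recovery. Illustrative sketch only; the cited studies recommend more
# rigorous probabilistic methods for handling recovery and counting error.

def adjusted_concentration(oocysts_counted, volume_analyzed_L, recovery_fraction):
    """Return an estimated concentration (per 100 L) corrected for method recovery."""
    if not 0 < recovery_fraction <= 1:
        raise ValueError("recovery must be a fraction between 0 and 1")
    per_litre = oocysts_counted / (volume_analyzed_L * recovery_fraction)
    return per_litre * 100  # express per 100 L

# Hypothetical sample: 4 oocysts counted in a 50 L sample with 40% recovery.
print(round(adjusted_concentration(4, 50, 0.40), 1))  # 20.0 oocysts/100 L
```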

Subsurface sources should be evaluated to determine whether the supply is vulnerable to contamination by enteric protozoa (i.e., GUDI) and other enteric pathogens. These assessments should include, at a minimum, a hydrogeological assessment, an evaluation of well integrity and a survey of activities and physical features in the area. Chlorophyll-containing algae are considered unequivocal evidence of surface water. Jurisdictions that require MPA as part of their GUDI assessment should ensure algae are a principal component of any MPA tests (U.S. EPA, 2016). Subsurface sources determined to be GUDI should achieve a minimum 3 log removal and/or inactivation of enteric protozoa. Sources that have been assessed as not vulnerable to contamination by enteric protozoa should meet the recommendations set out in the guideline technical document on enteric viruses (Health Canada, 2011a). It is important that subsurface sources be properly classified as numerous outbreaks have been linked to the consumption of untreated groundwater contaminated with enteric protozoa and/or other enteric pathogens (see sections 5.3 and 5.4)(Wallender et al., 2014; U.S. EPA, 2016).

As most surface waters and GUDI supplies are subject to faecal contamination, treatment technologies should be in place to achieve a minimum 3 log (99.9%) removal and/or inactivation of Giardia and Cryptosporidium. With this level of treatment, a mean source water concentration of 21 cysts/100 L can be reduced to 2.1 × 10−2 cysts/100 L, which meets the population health target of 10−6 disability-adjusted life year (DALY)/person per year, as outlined in section 9.0. Similarly, a mean source water concentration of 13 oocysts/100 L can be reduced to 1.3 × 10−2 oocysts/100 L. As many surface waters in Canada have higher (oo)cyst concentrations (see sections 5.1.1 and 5.2.1), additional removal/inactivation may be needed to meet treatment goals.
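The arithmetic behind these figures can be illustrated as follows; the sketch simply applies the stated log reduction to the stated mean source water concentrations.

```python
# Worked arithmetic from the paragraph above: a 3 log (99.9%) reduction applied to
# mean source water concentrations of 21 cysts/100 L and 13 oocysts/100 L.
import math

def apply_log_reduction(source_conc_per_100L, log_reduction):
    """Concentration remaining after the stated log reduction."""
    return source_conc_per_100L * 10 ** (-log_reduction)

print(round(apply_log_reduction(21, 3), 3))  # 0.021 cysts/100 L  (2.1 x 10^-2)
print(round(apply_log_reduction(13, 3), 3))  # 0.013 oocysts/100 L (1.3 x 10^-2)

# Conversely, the log reduction needed to reach a given target concentration:
def required_log_reduction(source_conc, target_conc):
    return math.log10(source_conc / target_conc)

print(round(required_log_reduction(100, 0.021), 1))  # 3.7: higher source levels need more than 3 log
```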

The identification of the most appropriate treatment process requires site-specific evaluation and should be made after suitable analysis and/or pilot testing. More variable weather patterns associated with climate change will place increased importance on proper process selection (Huck and Coffey, 2004).

7.1.2 Physical removal

Physical removal of (oo)cysts can be achieved by a variety of technologies, including chemically assisted, slow sand, diatomaceous earth and membrane filtration or an alternative proven filtration technology. Riverbank filtration is a passive filtration approach to remove microorganisms that may also be recognized by the appropriate drinking water authority. For chemically assisted, slow sand and diatomaceous earth filtration, there are sufficient full-scale or pilot-scale studies using (oo)cysts to document log removals achieved by these processes. For membrane filtration, published studies report log removals using either the target organism or an appropriate surrogate (U.S. EPA, 2005b). For riverbank filtration, log removals can be calculated using Cryptosporidium or a surrogate for Cryptosporidium, along with related hydrogeological and water quality parameters required to approximate the groundwater dilution ratio and time lag from the surface water body (e.g., temperature, conductivity, dissolved organic carbon, chloride, bromide, total dissolved solids, hydrogen, oxygen, uranium and other isotopes; U.S. EPA, 2010). As (oo)cysts are seldom detected in supplies using riverbank filtration, published studies use surrogates to report log removals. However, there is no universally accepted surrogate or indicator organism, suite of organisms or physical parameter (e.g., turbidity) that can be used to assess the fate and transport of microbial contaminants in subsurface environments (Schijven et al., 2002; Pedley et al., 2006; Emelko et al., 2010b; Tufenkji and Emelko, 2011; Bradford et al., 2016; Headd and Bradford, 2016). The U.S. EPA (2010) recommends monitoring for at least three or four surrogate organisms using paired surface water and groundwater samples. The following are suggested as possible surrogates to calculate the log removal efficiency of riverbank filtration: aerobic spores (e.g., Bacillus subtilis), anaerobic spores (e.g., Clostridium perfringens), total coliforms, E. coli, enterococci bacteria, bacteriophages, coliphages or diatoms. Other surrogates noted in U.S. EPA (2010) include turbidity, particle counts and MPA although log removal efficiencies cannot be calculated using these results.

7.1.2.1 Chemically assisted filtration

The goal of coagulation is to destabilize (i.e., neutralize the charge of) colloidal particles (including pathogens) so that they effectively aggregate during flocculation and are subsequently removed by clarification (sedimentation or dissolved air flotation). Solids contact units including ballasted sand flocculation processes combine coagulation, flocculation and clarification in one process. Granular media filtration acts as a polishing step for further removal of small colloidal particles not removed during clarification. For effective removal during filtration, colloidal particles must be destabilized during coagulation; it is for this reason that granular media filtration is recognized as a physico-chemical treatment process and is commonly referred to as chemically assisted filtration. Direct filtration plants do not include a clarification step, and inline filtration plants do not include either flocculation or clarification (AWWA, 2011a; MWH, 2012).

Considerable research has been conducted to establish (oo)cyst log removals for chemically assisted filtration. Mean log removals reported in full-scale studies range from >1.54 to >5.2 (median of means = >3.02) for Giardia and from >1.2 to >4.6 (median of means = >2.34) for Cryptosporidium (LeChevallier et al., 1991c; LeChevallier and Norton, 1992; Payment and Franco, 1993; Nieminski and Ongerth, 1995; States et al., 1997; Gammie et al., 2000). Because (oo)cysts are rarely detected in treated water, the calculated log removals are largely a function of the source water (oo)cyst concentration. Analytical detection limits and recovery efficiencies also influence log removal calculations.
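The following sketch illustrates, with hypothetical values, how an observed log removal is calculated from paired raw and treated water concentrations and why non-detects in treated water yield the "greater than" (lower-bound) values reported above and in Table 5.

```python
# Illustrative calculation of an observed log removal from paired raw and treated
# water concentrations. When the treated water result is a non-detect, only a lower
# bound can be reported (hence the ">" values quoted above); recovery efficiency and
# detection limits both influence the result. All values below are hypothetical.
import math

def observed_log_removal(raw_per_100L, treated_per_100L, treated_detection_limit):
    if treated_per_100L > 0:
        return f"{math.log10(raw_per_100L / treated_per_100L):.2f}"
    # Non-detect in treated water: report a lower bound based on the detection limit.
    return f">{math.log10(raw_per_100L / treated_detection_limit):.2f}"

print(observed_log_removal(150, 0.5, 0.1))  # detect in treated water -> point estimate (2.48)
print(observed_log_removal(150, 0.0, 0.1))  # non-detect -> lower bound (>3.18)
```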

Numerous pilot-scale challenge studies have also been conducted where microorganisms are spiked into raw water at concentrations large enough to be detected in treated water. Table 5 summarizes published pilot-scale results for Giardia (range = 1.0 to 5.5; median of means = 3.4) and Cryptosporidium (range = 0.73 to >6.6; median of means = 3.4). While most pilot-scale studies have evaluated the removal of Cryptosporidium, Giardia removal should be achieved concurrently at full scale. However, because Giardia may be present at higher concentrations in the source water, it may drive the log removal treatment goal for protozoa (Tfaily et al., 2015).

Pilot-scale log removal calculations are also impacted by analytical issues related to detection limits and recovery efficiencies, as well as how non-detects are handled (Emelko et al., 2005; Zhou et al., 2015). Experimental factors that influence results include the (oo)cyst seeding location and concentration (Hendricks et al., 2005; Assavasilavasukul et al., 2008a, 2008b; Campbell et al., 2014). For example, the minimum log removals reported by Assavasilavasukul et al. (2008a, 2008b) were attributed to the low spiking concentration.

From an operational perspective, coagulation chemistry is reported to be the most important variable (Ongerth et al., 1989; Ongerth and Pecoraro, 1995; Patania et al., 1995; Edzwald and Kelley, 1998; Coffey et al., 1999; Emelko et al., 1999; Dugan et al., 2001; Harrington et al., 2001; Huck et al., 2001, 2002; Dai and Hozalski, 2002; Lytle et al., 2002; Emelko, 2003; Betancourt and Rose, 2004; Hendricks et al., 2005; O’Melia, 2006; Hijnen and Medema, 2007a; Brown and Emelko, 2009). Patania et al. (1995) and Hendricks et al. (2005) reported log removals without chemical pre-treatment in the range of 0.0 to 0.84 and 0.0 to 1.10 for Giardia and Cryptosporidium, respectively; Huck et al. (2001) reported oocyst log removals of 0 to 0.86 without pre-treatment. Several studies reported that oocyst removal by filtration can significantly deteriorate under sub-optimal coagulant conditions (e.g., treatment effectiveness decreased by 2.0 to 3.4 log compared with optimal conditions) (Ongerth and Pecoraro, 1995; Patania et al., 1995; Dugan et al., 2001; Huck et al., 2001).

Table 5. Range of log removals from published pilot-scale filtration studies

Study | Giardia log removal (min / max / mean) | Cryptosporidium log removal (min / max / mean)
Nieminski and Ongerth, 1995 (Plant 1) | 2.20 / 3.90 / 3.40 | 1.94 / 3.98 / 2.98
Nieminski and Ongerth, 1995 (Plant 2) | 2.90 / 4.00 / 3.30 | 1.31 / 3.78 / 2.97
Ongerth and Pecoraro, 1995 | 2.82 / 3.70 / 3.33 | 2.52 / 3.40 / 2.90
Patania et al., 1995 | 3.4 / 5.1 / 4.2 | 2.7 / 5.9 / 4.2
Edzwald and Kelley, 1998 | Not tested | 4.2 / 5 / *
Swertfeger et al., 1999 | 2.5 / 4.7 / 3.7 | 1.6 / 4.2 / 3.1
Edzwald et al., 2000 | 4.3(1) / 5.5(1) / * | 5.3(1) / 5.5(1) / *
Dugan et al., 2001 | Not tested | 4.5 / 5.4 / >5.0
Huck et al., 2001, 2002 (Plant 1) | Not tested | 4.7 / 5.8 / 5.5
Huck et al., 2001, 2002 (Plant 2) | 2.9 / 4.9 / 4.4 | 2.0 / 5.0 / 3.0
States et al., 2002 | Not tested | 2.3 / >6.6 / >5.8
Emelko et al., 2003a | Not tested | 4.7 / 5.8 / 5.6
Emelko, 2003 | Not tested | 4.7 / 5.7 / 5.1
Emelko and Huck, 2004 | Not tested | 4.2 / 5.7 / *
Hendricks et al., 2005 (Plant 1) | 3(1) / 3.5(1) / 3.2 | 3(1) / 3.6(1) / 3.4
Hendricks et al., 2005 (Plant 2) | 2.9(1) / 3.9(1) / 3.24 | 2.5(1) / 4.2(1) / 3.38
Assavasilavasukul et al., 2008a, 2008b | 1.0 / 5.4 / * | 0.73 / 5.2 / *
Brown and Emelko, 2009 | Not tested | 3.8(1) / 4.7(1) / *

Table 5 footnotes:
* Not given
(1) Graphical interpretation

Media designs (sand, anthracite/sand, GAC/sand) were found to be equally effective (Hall et al., 1995; Patania et al., 1995; Swertfeger et al., 1999; Dugan et al., 2001; Hendricks et al., 2005), although Hijnen et al. (2010) found that fresh GAC polishing filters removed 2.7 log of C. parvum versus 1.2 log for loaded GAC; the average removal for G. lamblia was 2.1 log for both fresh and loaded GAC. Filtration rate did not significantly influence results (Patania et al., 1995; Edzwald and Kelley, 1998; Harrington et al., 2001; Huck et al., 2001).

Patania et al. (1995) and Hendricks et al. (2005) reported that conventional filtration was more effective than direct/inline filtration. Several bench- and pilot-scale studies have demonstrated that for water sources containing algae, natural colour or natural organic matter (NOM), dissolved air flotation can achieve 2 to 3 log removals of Giardia and Cryptosporidium compared to 1 to 2 log removals by sedimentation. In these types of sources, low density flocs tend to be formed, and these are more amenable to flotation than to sedimentation (Plummer et al., 1995; Edzwald and Kelley, 1998; Edzwald et al., 1999; Edzwald et al., 2000; Harrington et al., 2001; Edzwald et al., 2003; Edzwald, 2010; Gregory and Edzwald, 2011).

Numerous studies reported that a filter effluent of 0.1 NTU or less is required to maximize (oo)cyst reduction (Logsdon et al., 1981; Logsdon et al., 1985; Nieminski and Ongerth, 1995; Patania et al., 1995; Harrington et al., 2001; Huck et al., 2001; Emelko et al., 2003a; Emelko et al., 2005; Hendricks et al., 2005; Campbell et al., 2014). However, filter effluent turbidity of <0.1 NTU is not an indication of the absence of (oo)cysts in treated water (Gammie et al., 2000; Aboytes et al., 2004).

The filter-ripening and end-of-filter-run periods are also identified as vulnerable periods (Logsdon et al., 1981; Logsdon et al., 1985; Ongerth et al., 1989; Patania et al., 1995; Huck et al., 2001; Amburgey et al., 2003; Emelko et al., 2003a; Emelko and Huck, 2004; Amburgey et al., 2005; Soucie and Sheen, 2007). Thus, filters must be carefully controlled, monitored and backwashed such that no periods of insufficient removal occur during the filter cycle. Filter-to-waste and the extended sub-fluidization wash technique are options to reduce risk during the filter-ripening period (Amburgey et al., 2003, 2005; Soucie and Sheen, 2007). It is recommended that filter backwash water not be recirculated through the treatment plant without additional treatment.

In summary, well-operated chemically assisted filtration plants optimized for particle and NOM removal should be capable of achieving 3 log removal for (oo)cysts (Hall et al., 1995; Patania et al., 1995; Huck et al., 2002; Betancourt and Rose, 2004; Emelko et al., 2005; Hijnen and Medema, 2007a). For this to occur, it is critical that the preceding coagulation and flocculation steps be optimized. Jar tests should be conducted to optimize the pre-treatment process (Hall et al., 1995; U.S. EPA, 2004; AWWA, 2011b). Monitoring the net surface charge of particles following coagulation may also be helpful where source water quality is highly variable (Conio et al., 2002; Newcombe and Dixon, 2006; AWWA, 2011b; Kundert, 2014; McVicar et al., 2015; Sharp, 2015).

In addition to filter effluent turbidity, other parameters such as dissolved coagulant metal-ion concentration in the filtered water and the stability of clarified water turbidity are indicators of treatment problems and increased (oo)cyst risk (Hall et al., 1995; Hiltz, 2015). Particle counts have also been reported to be a sensitive measure of treatment performance, particularly when filter effluent turbidity is approximately 0.1 NTU or less (Jacangelo et al., 1991; Coffey et al., 1999; Emelko et al., 1999; Huck et al., 2001; Huck et al., 2002).

7.1.2.2 Slow sand and diatomaceous earth filtration

Slow sand filtration generally consists of untreated water flowing by gravity at a slow rate through a bed of submerged porous sand. During operation, biological growth occurs within the sand bed and gravel support. In addition, bacteria and other materials in the source water accumulate on the surface to form a “schmutzdecke”. The biological growth within the filter and schmutzdecke both contribute to the effectiveness of slow sand filtration. Depending on the source water quality, it may take weeks or months for this biological growth to develop. Pilot testing is recommended to ensure slow sand filtration will successfully treat a source water (Bellamy et al., 1985a, 1985b; Logsdon et al., 2002).

Numerous pilot-scale challenge studies have been conducted to establish (oo)cyst log removals for slow sand filtration. Bellamy et al. (1985a) reported a log removal of 2.07 for Giardia with a new sand filter that did not have a mature microbiological population. As the microbiological population within the sand bed and gravel support matured, log removals increased to 3.72 to 4.22. Pyper (1985) reported 3.70 log removal for Giardia under warm temperature conditions; however, log removals decreased to between 2.19 and 3.05 when the temperature was below 7ºC. Log removals were further reduced to 1.20 at cold temperatures (0.5ºC) when sewage was added to simulate a sewer break or other Giardia contamination event. Schuler et al. (1991) reported 2.77 log removal of Giardia for a new filter during winter and >4 log in warm weather as the filter matured; a log removal of ≥4 log for Cryptosporidium was reported for the duration of the study (March 1988 to January 1989). Timms et al. (1995) reported 4.52 log removal of Cryptosporidium with a 50 cm bed depth and observed no adverse effects when the filtration rate was increased from 0.3 to 0.4 m/h. DeLoyde et al. (2006) reported average Cryptosporidium log removals of 2.0 to >5.2 and 1.5 to 3.7 when operated at 0.4 and 0.8 m/h, respectively. Average Giardia log removals ranged from >2.4 to >4.9 and >2.5 to >4.8 when operated at 0.4 m/h and 0.8 m/h, respectively. Cryptosporidium log removals for a mature filter operated at 0.3 m/h and a temperature ranging from 8.2 to 18.8ºC were reported by Dullemont et al. (2006) to be 5.3 on average, while Hijnen et al. (2007) adjusted this to 4.7 using a mass balance approach that considered the number of dosed organisms, organisms found in the filtrate and organisms found in the filter bed. The latter approach was reported to better represent non-detect results reported as zero in the earlier study. Hijnen and Medema (2007b) completed a literature review of the microbial elimination capacity of slow sand filtration and concluded that average log removals of 4.9 and 4.8 for Giardia and Cryptosporidium, respectively, are achievable. Fogel et al. (1993) reported much lower full-scale results at low temperature (0.5ºC) of 1.2 log for Giardia and 0.3 log for Cryptosporidium. The authors noted that the sand used had a uniformity coefficient of 3.5 instead of the original specification of between 1.8 and 2.2. The low temperature may also have affected performance, as Schijven et al. (2013) reported that the removal of microorganisms by slow sand filtration is most sensitive to changes in temperature and the age of the schmutzdecke.

In summary, when properly designed, constructed, operated and maintained, slow sand filtration can be effective in removing (oo)cysts from a contaminated water source over a wide range of operating conditions. Because of the essential biological action of slow sand filtration, it is important to allow a filter to mature for a period after scraping/cleaning is performed (Collins et al., 1991; Edzwald et al., 1998; Heller and Ladeira Alves de Brito, 2006). Other precautions may also be required to mitigate low temperature impacts (Schijven et al., 2013).

Diatomaceous earth filtration consists of a pressure filter using a medium made from diatoms. The filtration process involves precoating the septum with diatomaceous earth filter media (i.e., filter cake) and pumping raw water combined with a bodyfeed through the media. Factors that determine performance include diatomite grade, precoat rate and thickness, bodyfeed rate, filtration rate and particle characteristics, such as size, shape, concentration and surface charge, making this a complex process to operate (Logsdon et al., 1981; Langé et al., 1986; Ongerth and Hutton, 1997). Logsdon et al. (1981) reported that Giardia log removals ranged from 2.19 to 4.15. Pyper (1985) reported a log removal of 3.52 for Giardia. Langé et al. (1986) reported complete removal of Giardia in 30 of 31 pilot-scale tests and a log removal of 3.12 for the one result with detectable concentrations. Schuler and Ghosh (1990) and Schuler et al. (1991) reported complete removal of Giardia when diatomaceous earth filtration was operated correctly; removal of Cryptosporidium was also excellent (>3 log) but complete removal was seldom achieved (spiked dose = 1,200 to 3,000 oocysts/L) except when alum was used with finer grades of diatomaceous earth. Ongerth and Hutton (1997, 2001) reported >6 log removal for Cryptosporidium. Diatomaceous earth filtration is an accepted technology, but it is reported to be more labour-, energy- and materials-intensive compared to slow sand filtration and less feasible for sources that have highly variable source water quality (Pyper, 1985).

7.1.2.3 Membrane filtration

Four types of pressure-driven membranes are currently used in drinking water treatment: microfiltration (MF), ultrafiltration (UF), nanofiltration (NF) and reverse osmosis (RO). Membranes are generally classified by the type of substances they remove, operating pressure and pore size or molecular weight cut-off (MWCO). MF and UF are referred to as low-pressure membranes and are used for particle/pathogen removal. The predominant removal mechanism is straining or size exclusion. NF and RO are referred to as high-pressure membranes and are used for the removal of organics (e.g., dissolved NOM) and inorganics (e.g., sodium, chloride, calcium, magnesium). The predominant removal mechanism is differences in solubility or diffusivity. The general classes of membranes, their sizes and the substances that are removed are discussed in Kawamura (2000), AWWA (2005, 2011a) and MWH (2012).

Membranes are typically made of polymeric materials; however, there is limited application of ceramic membranes (AWWA, 2005; Kommineni et al., 2010; Huck and Sozański, 2011; MWH, 2012). Although NF and RO membranes are capable of removing (oo)cysts, they are not typically designed for verifiable removal (ANSI/AWWA, 2010). However, Kruithof et al. (2001) did report using conductivity and sulfate to monitor the integrity of RO membranes. MF and UF membranes, which are designed for verifiable removal, are most commonly used for microbial removal.

Jacangelo et al. (1991) reported log removals of 4.0 to 5.1 for G. muris at pilot-scale for four source waters of varying quality using UF membranes (MWCO = 100 kilodaltons). The authors noted that membrane integrity was critical to assuring process efficacy as G. muris was detected in the filtrate when the membranes were compromised (e.g., broken fibre). Jacangelo et al. (1997) evaluated the removal of G. muris and C. parvum at pilot-scale for three source waters of varying quality using MF membranes (pore size = 0.1 µm and 0.2 µm) and UF membranes (MWCO = 100, 300 and 500 kilodaltons). Both MF and UF membranes removed G. muris and C. parvum to detection limits, provided the membrane was intact. Consequently, the reported log removals (6 to 7 for both G. muris and C. parvum) were a function of the spiked (oo)cyst concentration. Trimboli et al. (2001) challenge tested a full-scale 36 ML/d MF plant using Bacillus spores. The authors reported complete rejection of the Bacillus spores and >5.9 log removal. Mi et al. (2005) challenge tested two full-scale membrane plants (one MF and one UF) with compromised integrity using Bacillus spores. In the full-scale MF plant, five tests were conducted on 1 of 6 racks of MF membranes. Testing was conducted by cutting 0, 1, 3, 6 and 8 fibres, representing integrity breaches of 0 to 0.0025%. Log removals of >6.4, 6.0, 5.3, 4.8 and 4.6 were reported for 0, 1, 3, 6 and 8 cut fibres, respectively. In the full-scale UF plant, three tests were conducted on 1 of 11 racks of UF membranes. Testing was conducted by cutting 0, 1 and 3 fibres, representing integrity breaches of 0 to 0.0009%. Log removals of >5.1, 4.8 and 4.5 were reported for 0, 1 and 3 cut fibres, respectively. Additional summaries of pathogen removal studies are given in U.S. EPA (2001).

Low-pressure membranes are well established as a drinking water treatment process, and it is generally accepted that intact membranes can achieve greater than 3 log removal of Giardia and Cryptosporidium. Integrity breaches, however, can compromise membrane removal capability. It is therefore critical that regular monitoring of membrane integrity be conducted. Guo et al. (2010) provide a comprehensive review of integrity monitoring methods for low-pressure membranes. Although numerous methods exist, the pressure decay test and diffusive air flow test are most commonly used to assess membrane integrity. Both involve pressurizing one side of the membrane barrier and measuring air flow through defects under defined conditions (Johnson, 1998). If the pressure decay rate exceeds the site-specific control limit set for the system, diagnostic testing and repair should be conducted. The results of the integrity test should also be converted to an equivalent log removal value (LRV) using methods described in U.S. EPA (2005b) or ASTM (2010). Most commercially available MF/UF systems are equipped to conduct automated pressure decay tests and calculate the LRV.

It is noteworthy that all membranes become fouled over the course of operation and, consequently, the flux (flow per unit area) for a given pressure differential can substantially decrease (AWWA, 2005; MWH, 2012). Regular backwashing and periodic chemical cleaning, using proper foulant-based cleaning chemicals, are required to remove accumulated foulants. Health Canada strongly recommends that cleaning chemicals be certified by an accredited certification body as meeting NSF International (NSF)/American National Standards Institute (ANSI) Standard 60 Drinking Water Treatment Chemicals – Health Effects (NSF/ANSI, 2017). When the flux can no longer be restored to baseline conditions, the membranes must be replaced (Alspach et al., 2014). Chemical cleaning regimes can create harsh conditions for the membrane material (D’Souza and Mawson, 2005). Direct integrity testing following a chemical clean is suggested to ensure membranes remain intact; direct integrity testing should be conducted in accordance with the manufacturer’s recommendations.

7.1.2.4 Riverbank filtration

Riverbank filtration (RBF) involves locating vertical or horizontal water supply wells near a river to use the riverbank and adjacent aquifer as a natural filter to remove particles and pathogens, micro-pollutants and other organic and inorganic compounds, including biodegradable compounds such as NOM and ammonia (Piet and Zoeteman, 1980; Bize et al., 1981; Kuehn and Mueller, 2000; Ray et al., 2002a, 2002b; Gollnitz et al., 2004; Emelko et al., 2010b; U.S. EPA, 2010). As water proceeds to the groundwater table, contaminant concentrations are lowered because of physical filtration, geochemical reactions, biodegradation, predation and through dilution with groundwater (Kuehn and Mueller, 2000; Ray et al., 2002a; Gollnitz et al., 2003; Weiss et al., 2003; Partinoudi and Collins, 2007; Hubbs, 2010; Tufenkji and Emelko, 2011). Ray et al. (2002a) state that properly designed and operated RBF plants can provide 4-log Cryptosporidium removal.

The RBF process is complex and influenced by a number of factors that can vary between sites. Site-specific factors include the characteristics, composition and thickness of the aquifer materials (e.g., grain size distribution, grain surface characteristics, metal oxide content), river water quality (e.g., number and size of particles/microorganisms, NOM concentration, ammonia, nutrients, and other pollutants), riverbed sediments, filtration velocity and distance, temperature, pH, ionic strength, microbe/sediment surface charge, oxygen concentration and groundwater dilution effects (Sontheimer, 1980; McDowell-Boyer et al., 1986; Johnson and Logan, 1996; Gollnitz et al., 1997; Brush et al., 1998, 1999; Walker et al., 1998; Harter and Wagner, 2000; Kuehn and Mueller, 2000; Shaw et al., 2000; Tufenkji et al., 2002; Ray et al., 2002a; Wang et al., 2002; Tufenkji et al., 2004, 2006; Gollnitz et al., 2005; Hijnen et al., 2005; Tufenkji and Elimelech, 2005; Partinoudi and Collins, 2007; Abudalo et al., 2010; Faulkner et al., 2010; Harvey et al., 2010; Kim et al., 2010; Metge et al., 2010).

Schijven et al. (2002) reported log removals for various microorganisms at three RBF sites in the Netherlands and concluded that under relatively homogenous and steady-state conditions in a saturated sand aquifer, up to 8 log virus removal over a distance of 30 metres in about 25 days could be achieved, and greater log removals could be expected for bacteria, protozoa and algae under the same conditions. The authors cautioned that Cryptosporidium breakthrough to RBF wells has been known to occur (Welker et al., 1994; Hancock et al., 1998; Moulton-Hancock et al., 2000) and noted that public health protection could be compromised but data were insufficient to determine the health risk. Wang et al. (2002) conducted a two-year demonstration-scale study of a 20 mgd (80 ML/d) horizontal collector well drawing from a confined, glacial aquifer with medium, coarse and very coarse sand and gravel with little iron and manganese content. The overall filtration rate was estimated to range from 0.15 to 0.37 m/day. Due to the low number of Giardia and Cryptosporidium detections in the river water and collector well, log removals for these microorganisms could not be calculated. As a result, the removal efficiency of the RBF process was calculated using surrogates of pathogen transport. Average log removals of >7.1, >6.7 and >3.9 were reported for algae (MPA test), diatoms and aerobic spores, respectively. The authors reported that turbidity was reliably lowered to below 1.0 NTU. Gollnitz et al. (2003, 2004) conducted a 20-month full-scale study of an RBF facility drawing water from an alluvial sand and gravel aquifer. The rate of infiltration was estimated to be less than 0.07 m/h (1.7 m/day) the majority of the time except during seven high-river-stage events, when it increased to as high as 0.15 m/h (3.6 m/day). Because no Giardia or Cryptosporidium was detected in the production or flow path wells, log removals for these microorganisms could not be calculated. Log removals were calculated using surrogates for wells with short and long flowpaths from the river; log removals ranged from 4.8 (short flowpath) to 6.2 (long flowpath) for algae and 5.0 (short flowpath) to 4.9 (long flowpath) for aerobic spores. The authors reported that scour did not influence the RBF process during high-stage events and that turbidity was reliably lowered to below 1.0 NTU. Gupta et al. (2009) and Abbaszadegan et al. (2011) also reported that scour did not influence the removal of C. parvum in pilot-scale RBF experiments.

Weiss et al. (2003) conducted a two-year full-scale study of three RBF sites in the United States. As with other studies, data were insufficient to calculate log removals for Giardia and Cryptosporidium, and surrogates were used. Log removals for Clostridium and bacteriophage ranged from >3.2 to >3.4 and >1.9 to >3.3, respectively. Gollnitz et al. (2005) conducted a 24-month full-scale study of a sand and gravel aquifer with a hydraulic conductivity of 76.3 to 244 m/day and travel distances of approximately 6.1 m to in excess of 304.8 m. As with other RBF studies, the data were insufficient to calculate log removals for Giardia and Cryptosporidium, and surrogates were used. Log removals ranged from 3.70 to 5.50 for algae, 6.70 to 7.50 for diatoms and 1.50 to 2.10 for aerobic spores; log removals for total coliforms, E. coli and enterococci were in the order of 2 log. The authors reported that turbidity was reliably lowered to below 0.3 NTU. Partinoudi and Collins (2007) monitored four RBF sites at various frequencies for 8 to 20 months. Distances from the river ranged from 19.5 to 55 m for three sites using vertical RBF wells and 12.2 m for one site using horizontal collectors (caisson). Estimated travel times ranged from 0.75 to 5.5 days for the vertical wells and 1.0 day for the caisson. Log removals ranged from ≥2.01 to ≥4.54 for aerobic spores, from ≥2.72 to ≥3.61 for total coliforms and from ≥1.00 to ≥1.74 for E. coli. The authors cautioned that log removals were low due to low concentrations of microorganisms in the surface waters and non-detection in the riverbank filtered water.

In summary, a substantial amount of research has been conducted to explain the role of various physical, chemical and biological factors in microbial transport and removal in natural subsurface environments; however, predicting and quantifying those removals is difficult (Emelko et al., 2003b; Tufenkji, 2007). Given sufficient flow path length and time, RBF can improve microbial water quality to levels protective of public health (Schijven et al., 2002). Farkas et al. (2015) found that antibodies to Cryptosporidium were lower in communities practising RBF than in communities using surface water, which supports the view that RBF contributes to improved microbial water quality.

Microbial reductions/removals should be assessed on a site-specific basis due to the number of factors that affect the RBF process. Sinclair et al. (2012) describes a process to select one or several representative surrogate(s) for natural (or engineered) systems.

Consolidated aquifers, fractured bedrock, karst limestone and gravel aquifers are typically not eligible to receive RBF credits (Berger, 2002; Wang et al., 2002; U.S. EPA, 2010; Nova Scotia Environment, 2012). Groundwater velocities can be high in these aquifers, and faecal contamination can travel long distances over a short period of time (Harvey et al., 2008; Khaldi et al., 2011). Management of groundwater resources in karst and fractured bedrock should not be conducted in the same way as in sand and gravel aquifers (Crowe et al., 2003). It is important that subsurface sources be classified, as numerous outbreaks have been linked to the consumption of untreated water from these sources (Wallender et al., 2014; U.S. EPA, 2016).

7.1.2.5 Physical log removal credits for treatment barriers

Drinking water treatment plants that meet the turbidity limits established in the Guideline Technical Document on turbidity (Health Canada, 2012d) can apply the average removal credits for Giardia and Cryptosporidium given in Table 6. Alternatively, log removal rates can be established on the basis of demonstrated performance or pilot studies. For RBF, the jurisdiction having authority should be consulted for site-specific requirements.

The physical log removal credits can be combined with the disinfection credits (see section 7.1.3) to meet overall treatment goals. For example, if an overall 5 log (99.999%) Cryptosporidium reduction is required for a given system, and conventional filtration provides 3 log removal, then the remaining 2 log reduction must be achieved through another barrier such as primary disinfection.
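The following sketch illustrates this credit arithmetic using the Cryptosporidium removal credits from Table 6 (below). The overall requirement shown is the hypothetical example from the preceding paragraph; actual requirements and credits are set by the jurisdiction having authority.

```python
# Sketch of the credit arithmetic described above: the remaining log reduction that
# primary disinfection must supply once the physical removal credit (Table 6) is applied.
# The 5 log requirement is the hypothetical example from the text.

TABLE_6_CRYPTO_CREDITS = {          # transcribed from Table 6 (log removal credits)
    "conventional filtration": 3.0,
    "direct filtration": 2.5,
    "slow sand filtration": 3.0,
    "diatomaceous earth filtration": 3.0,
}

def remaining_disinfection_credit(overall_requirement, barrier):
    removal_credit = TABLE_6_CRYPTO_CREDITS[barrier]
    return max(overall_requirement - removal_credit, 0.0)

# Example from the text: 5 log overall Cryptosporidium reduction with conventional filtration.
print(remaining_disinfection_credit(5.0, "conventional filtration"))  # 2.0 log from disinfection
```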

Table 6. Cryptosporidium and Giardia removal credits for various treatment technologies

Treatment barrier | Cryptosporidium removal credit | Giardia removal credit
Conventional filtration(a) | 3 log(b) | 3 log(b)
Direct filtration(a) | 2.5 log(b) | 2.5 log(b)
Slow sand filtration(a) | 3 log(b) | 3 log(b)
Diatomaceous earth filtration(a) | 3 log(b) | 3 log(b)
Microfiltration and ultrafiltration(a) | Demonstration and challenge testing(c) | Demonstration and challenge testing(c)
Nanofiltration and reverse osmosis(a) | Demonstration and challenge testing(c)(d) | Demonstration and challenge testing(c)(d)
Riverbank filtration | Site-specific determination(e) | Site-specific determination(e)

Table 6 footnotes:
(a) Credits are awarded when in compliance with the individual filter effluent turbidity specified in the Guidelines for Canadian Drinking Water Quality (Health Canada, 2012d).
(b) Values from Health Canada, 2012d.
(c) Removal efficiency demonstrated through challenge testing and verified by direct integrity testing.
(d) NF/RO membranes do not currently come equipped with direct integrity testing capability; acceptable verification methods should be approved by the jurisdiction having authority.
(e) As required by the jurisdiction having authority.

7.1.3 Inactivation

Primary disinfection should provide any remaining log reduction credits necessary to meet treatment goals. Primary disinfection is required to protect public health by killing or inactivating harmful protozoa, bacteria and viruses whereas secondary disinfection is used to maintain a residual in the distribution system. It is important that operators understand and maintain required CT/IT values for primary disinfection (see sections 7.1.3.1 and 7.1.3.2), in addition to chlorine residual requirements for secondary disinfection.

Primary disinfection is typically applied after treatment processes that remove particles and NOM. This strategy helps to ensure efficient inactivation of pathogens and minimizes the formation of disinfection by-products (DBPs). It is important to note that when describing microbial disinfection of drinking water, the term “inactivation” is used to indicate that the pathogen is non-infectious and unable to replicate in a suitable host, although it may still be present.

Five disinfectants are commonly used in drinking water treatment: free chlorine, monochloramine, ozone, chlorine dioxide and UV light. All are chemical oxidants except UV light, which uses electromagnetic radiation. Chemical disinfectants inactivate microorganisms by destroying or damaging cellular structures, metabolism, biosynthesis and growth, whereas UV light damages pathogens’ nucleic acid, which prevents their replication such that they are unable to complete cycles of infection.

Free chlorine is the most common chemical used for primary disinfection because it is widely available, is relatively inexpensive and provides a residual that can also be used for secondary disinfection to maintain water quality in the distribution system. Free chlorine, however, requires relatively high concentrations and/or contact times to inactivate Giardia, and it cannot inactivate Cryptosporidium at the doses and contact times commonly used in water treatment (Korich et al., 1990). As a result, treatment systems that use free chlorine for primary disinfection must remove or inactivate Cryptosporidium using an additional treatment barrier (e.g., filtration or alternate disinfectant such as ozone or UV light). The use of monochloramine tends to be restricted to secondary disinfection (i.e., residual maintenance) due to low oxidation potential. Ozone and chlorine dioxide are effective disinfectants against Giardia and Cryptosporidium, although they are typically more expensive and complicated to implement, particularly for small systems. As ozone decays rapidly after being applied, it does not provide a residual and cannot be used for secondary disinfection. Chlorine dioxide is also not recommended for secondary disinfection because of its relatively rapid decay (Health Canada, 2008a). Clancy et al. (1998, 2002) demonstrated that UV disinfection is highly effective for inactivating Giardia and multiple strains of Cryptosporidium parvum. Johnson et al. (2005) confirmed that Cryptosporidium hominis is also susceptible to UV disinfection.

7.1.3.1 Chemical disinfection

The efficacy of chemical disinfectants can be predicted based on knowledge of the residual concentration of a specific disinfectant and the factors that influence its performance, mainly temperature, pH, contact time and the level of disinfection required (AWWA, 1991). This relationship is commonly referred to as the CT concept, where CT is the product of “C” (the residual concentration of disinfectant, measured in mg/L) and “T” (the disinfectant contact time, measured in minutes) for a specific microorganism under defined conditions (e.g., temperature and pH). To account for disinfectant decay, the residual concentration is usually determined at the exit of the contact chamber rather than using the applied dose or initial concentration. Also, the contact time T is often calculated using a T10 value, which is defined as the detention time at which 90% of the water meets or exceeds the required contact time. The T10 value can be estimated by multiplying the theoretical hydraulic detention time (i.e., tank volume divided by flow rate) by the baffling factor of the contact chamber. The U.S. EPA (1991) provides baffling factors for sample contact chambers. Alternatively, a hydraulic tracer test can be conducted to determine the actual contact time under plant flow conditions. The T value depends on the hydraulic characteristics of the treatment installation as constructed. Improving the hydraulics (i.e., increasing the baffling factor) is often a more effective way to achieve CT requirements than increasing the disinfectant dose and can be accomplished through physical modifications such as the addition of baffles to the contact chamber or basin.
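The following sketch illustrates the CT calculation described above using hypothetical contact chamber values; it is a simplified illustration and does not replace a hydraulic tracer test or the applicable CT tables.

```python
# Minimal sketch of the CT concept: CT is the product of the residual disinfectant
# concentration (C) and the contact time (T), with T commonly taken as T10
# (theoretical hydraulic detention time x baffling factor). All values are hypothetical.

def t10_minutes(tank_volume_m3, flow_m3_per_min, baffling_factor):
    """T10 estimated from theoretical detention time and the contact chamber baffling factor."""
    theoretical_detention = tank_volume_m3 / flow_m3_per_min  # minutes
    return theoretical_detention * baffling_factor

def ct_achieved(residual_mg_per_L, t10_min):
    """CT in (mg/L)·min, using the residual measured at the exit of the contact chamber."""
    return residual_mg_per_L * t10_min

# Hypothetical chlorine contact tank: 400 m3, 10 m3/min, baffling factor 0.5,
# free chlorine residual of 1.0 mg/L at the outlet.
t10 = t10_minutes(400, 10, 0.5)   # 20 minutes
print(ct_achieved(1.0, t10))      # 20.0 (mg/L)·min
```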

CT tables for the inactivation of Giardia can be found in U.S. EPA (1991) while CT tables for the inactivation of Cryptosporidium can be found in U.S. EPA (2006a). Selected CT values are presented in Table 7 for 2 log (99%) inactivation of Giardia and Cryptosporidium using chlorine, monochloramine, chlorine dioxide and ozone. The CT values confirm that monochloramine has limited oxidation potential, as much higher concentrations and/or contact times are required to achieve the same degree of Giardia inactivation. Consequently, it is not recommended as a primary disinfectant for protozoa. The CT values also demonstrate that Cryptosporidium is many times more resistant than Giardia to chemical disinfectants, particularly at cold temperatures (Oppenheimer et al., 2000). In general, CT requirements for both Giardia and Cryptosporidium increase two- to three-fold for every 10°C drop in temperature (Hoff, 1986; U.S. EPA, 1991). When water temperatures are close to 0°C, as is often the case in winter in Canada, the efficacy of disinfection is reduced, and an increased disinfectant concentration and/or contact time is required to achieve the same level of inactivation. Thus, it is important to confirm that inactivation requirements can be achieved for minimum temperature conditions.
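One common way to convert an achieved CT into a log inactivation credit is to pro-rate it against a tabulated CT value for a reference inactivation level, as in the following sketch. The operating values are hypothetical, and the applicable CT tables and the requirements of the responsible drinking water authority should always be consulted.

```python
# Illustrative pro-rating of an achieved CT against a tabulated CT value to estimate
# a log inactivation credit. Operating values are hypothetical; always consult the
# applicable CT tables (e.g., Table 7) for the relevant temperature and pH.

def log_inactivation_credit(ct_achieved, ct_table_value, table_log_level):
    """Pro-rate achieved CT against the tabulated CT for the stated log level."""
    return table_log_level * (ct_achieved / ct_table_value)

# Free chlorine at 5 degrees C, pH 7.5: Table 7 lists CT = 99 (mg/L)·min for 2 log Giardia.
achieved = 1.0 * 60   # 1.0 mg/L residual x 60 min T10
print(round(log_inactivation_credit(achieved, 99, 2.0), 2))  # ~1.21 log Giardia credit
```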

The effectiveness of some disinfectants is also dependent on pH. When using free chlorine, increasing the pH from 6 to 9 reduces the level of Giardia inactivation by a factor of 3 (U.S. EPA, 1991). On the other hand, pH has been shown to have little effect when using chlorine dioxide or ozone (U.S. EPA, 2006a).

Table 7. CT values for 99% (2 log) inactivation of Giardia and Cryptosporidium by various disinfectants at 0.5°C, 1°C, 5°C and 25°C

Temp. (°C) | Free chlorine [pH 7.5, residual 1 mg/L]: Giardia(a) / Cryptosporidium | Monochloramine [pH 6–9]: Giardia(a) / Cryptosporidium | Chlorine dioxide: Giardia(a) / Cryptosporidium(b) | Ozone: Giardia(a) / Cryptosporidium(b)
0.5 | 169 / -- | -- / -- | -- / -- | -- / --
1 | -- / -- | 2,535 / -- | 42 / 1,200 | 1.9 / 46
5 | 99 / -- | 1,470 / -- | 17 / 858 | 1.3 / 32
25 | 25 / 7,200(c) | 500 / 7,200(c) | 7.3 / 150 | 0.32 / 4.9

Table 7 footnotes:
(a) U.S. EPA (1991)
(b) U.S. EPA (2006a)
(c) Korich et al. (1990)

Reducing turbidity is an important prerequisite in the inactivation of Giardia, Cryptosporidium and other microorganisms, as chemical disinfection may be inhibited by particles. Additionally, the material causing turbidity can exert a disinfectant demand and reduce the effectiveness of chemical disinfection. LeChevallier et al. (1981) reported an eight-fold decrease in free chlorine disinfection efficiency when turbidity increased from 1 to 10 NTU. The effect of turbidity on treatment efficacy is further discussed in the guideline technical document on turbidity (Health Canada, 2012d).

It is important for operational conditions such as temperature, pH and turbidity to be considered when selecting a chemical disinfectant, as these can have a major impact on the inactivation of pathogens; temperature and pH can also influence chemical DBP formation.

Chemical disinfection can result in the formation of DBPs, some of which pose a health risk. The most commonly used disinfectant, chlorine, reacts with NOM to form trihalomethanes (THMs) and haloacetic acids (HAAs), along with other halogenated organic compounds (Rook, 1976; Krasner et al., 2006). N-nitrosodimethylamine (NDMA) may also be formed for systems that use monochloramine and, to a lesser extent, free chlorine. For systems that use commercially available or on-site generated hypochlorite solutions, bromate may be formed (Health Canada, 2016). The use of chlorine dioxide and ozone can also result in the formation of inorganic DBPs, such as chlorite/chlorate and bromate, respectively. When selecting a chemical disinfectant, the potential impact of DBPs should be considered, but it is essential that efforts made to minimize DBP formation not compromise the effectiveness of disinfection. More information can be obtained from the appropriate guideline technical documents (Health Canada, 2006, 2008a, 2008b, 2011b, 2016).

7.1.3.2 Ultraviolet disinfection

For UV disinfection, the product of light intensity “I” (measured in mW/cm2 or W/m2) and time “T” (measured in seconds) results in a computed dose (fluence) in mJ/cm2 for a specific microorganism. This relationship is referred to as the IT concept. UV disinfection can be achieved using low pressure (LP) lamps, which emit UV light at essentially a single (monochromatic) wavelength (~254 nm), or medium pressure (MP) lamps, which emit radiation across a broader (polychromatic) spectrum. Ultraviolet light-emitting diodes (UV-LEDs) are an emerging technology for UV water treatment (Wright et al., 2012). However, in a review of published studies on the application of UV-LEDs, Song et al. (2016) concluded that a standard method for the UV-dose determination of UV-LEDs is needed to reduce the inconsistent and incomparable dose-response data currently available in the literature.
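The following sketch illustrates the IT calculation with hypothetical reactor values and compares the computed dose with the Cryptosporidium requirements in Table 8 (below). In practice, dose delivery must be established through validation testing rather than a simple intensity and time product.

```python
# Minimal sketch of the IT concept: UV dose (fluence, mJ/cm2) is the product of
# intensity I (mW/cm2) and exposure time T (s). The delivered dose is then compared
# with the tabulated requirement (Table 8). Reactor values are hypothetical; full-scale
# dose delivery must be established through validation testing, not this simple product.

UV_DOSE_TABLE_8_CRYPTO = {  # mJ/cm2 for Cryptosporidium (transcribed from Table 8)
    0.5: 1.6, 1.0: 2.5, 1.5: 3.9, 2.0: 5.8, 2.5: 8.5, 3.0: 12, 3.5: 15, 4.0: 22,
}

def uv_dose(intensity_mW_cm2, exposure_s):
    return intensity_mW_cm2 * exposure_s  # mW·s/cm2 = mJ/cm2

delivered = uv_dose(4.0, 3.0)                   # hypothetical reactor: 4 mW/cm2 for 3 s = 12 mJ/cm2
print(delivered >= UV_DOSE_TABLE_8_CRYPTO[3.0]) # True: meets the 3 log Cryptosporidium requirement
```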

Studies have shown that relatively low UV doses can achieve substantial inactivation of protozoa (Clancy et al., 1998; Bukhari et al., 1999; Craik et al., 2000, 2001; Belosevic et al., 2001; Drescher et al., 2001; Linden et al., 2001, 2002; Shin et al., 2001; Campbell and Wallis, 2002; Mofidi et al., 2002; Rochelle et al., 2002). Based on these and other studies, the U.S. EPA developed UV dose requirements for Giardia lamblia and Cryptosporidium, as outlined in Table 8 (U.S. EPA, 2006a). However, when selecting an appropriate dose, the repair of UV-induced damage by Giardia lamblia should be considered, as discussed below. Minimum temperature conditions should also be considered, as low temperature can reduce light intensity “I” and affect alarm setpoints. The system should be appropriately designed for these conditions; otherwise, an increase in “T” may be necessary during minimum temperature conditions (Oosterveld, 2017). In addition, the accumulation of materials on the quartz sleeves (i.e., fouling) can interfere with the disinfection process by absorbing radiation that would otherwise be used to inactivate pathogens (Wait et al., 2005, 2007; Wait and Blatchley, 2010). Iron and calcium tend to comprise the bulk of the foulant material, although aluminum, manganese and zinc may also contribute (Wait and Blatchley, 2010). Wait et al. (2005) found that fouling potential cannot be predicted on the basis of mineral content alone, as the redox potential significantly influences the fouling process. In particular, minimal fouling was reported for untreated groundwaters with elevated iron and moderate to high calcium concentrations (iron = 0.9 to 6.1 mg/L; calcium = 91 to 236 mg/L), whereas following chlorination, foulant accumulation increased ten-fold and irradiance was reduced to 0% within three days (Wait et al., 2005, 2007). Talbot et al. (2011) reported significant fouling of unwiped MP UV lamp sleeves in a raw surface water supply when iron and manganese concentrations were approximately 200 µg/L and 40 µg/L, respectively, whereas unwiped LP lamp sleeves experienced no significant fouling. Wait et al. (2005, 2007) recommend that UV reactors be placed prior to oxidation steps to minimize fouling or after precipitated particulates have been removed. Pilot testing may be required to assess fouling potential. Foulant material must be removed to maintain high UV reactor performance (Bolton and Cotton, 2008).

Several researchers have confirmed that repair of UV-induced damage is not a concern for Cryptosporidium parvum oocysts when irradiated by LP or MP lamps even at very low doses (1–3 mJ/cm2; Shin et al., 2001; Oguma et al., 2001; Zimmer et al., 2003; Rochelle et al., 2004). For Giardia lamblia cysts, researchers report no restoration of infectivity following exposure to LP UV doses of 16 and 40 mJ/cm2, but significant repair activity when exposed to a LP UV dose of 1 mJ/cm2 (Shin and Linden, 2015). However, no data exist for UV doses between 1 and 16 mJ/cm2 for Giardia cysts. In contrast, no repair was detected when Giardia lamblia cysts were exposed to a dose of 1 mJ/cm2 of MP UV irradiation (Shin et al., 2010). In addition, Li et al. (2008) reported that Giardia lamblia trophozoites may survive and potentially reactivate following exposure to LP UV doses up to 10 mJ/cm2; at LP UV doses of 20 and 40 mJ/cm2 the evidence of survival or reactivation was ambiguous and statistically inconclusive. Although this study was conducted with trophozoites instead of cysts, the implication of the study is that a minimum dose is needed to avoid repair of UV-induced damage to Giardia lamblia cysts. The U.S. EPA UV dose requirements were not based on the possibility of Giardia repair. However, dark repair likely occurred during the studies and therefore would have been inherently considered. More research is needed to confirm the exact dose at which Giardia can repair LP UV-induced damage (Linden, 2017). Until further research is conducted, the repair of UV-induced damage by Giardia lamblia should be considered and a dose >10 mJ/cm2, as reported by Li et al. (2008), is recommended for LP UV reactors. Several jurisdictions require a minimum UV dose of 40 mJ/cm2 on the basis that 4 log inactivation can be achieved for many harmful pathogens, including bacteria, viruses and (oo)cysts (ÖNORM, 2001, 2003; DVGW, 2006a, 2006b, 2006c; Ontario Ministry of Environment, 2006; Nova Scotia Environment, 2012). The jurisdiction having authority should be contacted to confirm dose requirements.

In practice, the UV dose delivered in full-scale treatment plants depends on a number of factors, including the hydraulic profile within the reactor, flow rate, the UV transmittance (UVT) of the water, UV intensity, lamp output, lamp placement, lamp aging, fouling and microbe inactivation kinetics (U.S. EPA, 2006c; Bolton and Cotton, 2008). Validation testing should be conducted to determine the operating conditions under which the reactor will deliver the required UV dose. Several approaches to UV validation testing are available and are discussed in ÖNORM (2001, 2003), DVGW (2006a, 2006b, 2006c) and U.S. EPA (2006c). These approaches are based on biodosimetric testing to determine the log inactivation of specific challenge microorganisms for a specific reactor in combination with known fluence-response relationships. Continuous monitoring with regularly calibrated sensors should be conducted to verify that the unit remains within validated conditions and is delivering the required dose. Operational issues should also be considered to ensure performance is not compromised (e.g., start-up, failure shutdown, lamp fouling and cleaning, UV sensor maintenance) (U.S. EPA, 2006c).

Table 8. UV dose (mJ/cm2) requirements for up to 4 log (99.99%) inactivation of (oo)cysts (U.S. EPA, 2006a)
Log inactivation    Cryptosporidium dose (mJ/cm2)    Giardia lamblia dose (mJ/cm2)
0.5                 1.6                              1.5
1                   2.5                              2.1
1.5                 3.9                              3.0
2                   5.8                              5.2
2.5                 8.5                              7.7
3                   12                               11
3.5                 15                               15
4                   22                               22
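
To illustrate how the Table 8 values can be applied, the following Python sketch looks up the dose required for a target log inactivation credit and compares it with a delivered dose estimated as intensity multiplied by exposure time. The reactor intensity and exposure time in the example are hypothetical; in practice, the delivered dose must be established through reactor validation and confirmed by continuous monitoring, as noted above.

```python
# A minimal sketch (not a validated dose-monitoring algorithm) of applying the
# Table 8 dose requirements. The intensity and exposure time below are
# hypothetical example values.

TABLE_8_DOSE_MJ_CM2 = {  # log inactivation credit -> (Cryptosporidium, Giardia lamblia)
    0.5: (1.6, 1.5),
    1.0: (2.5, 2.1),
    1.5: (3.9, 3.0),
    2.0: (5.8, 5.2),
    2.5: (8.5, 7.7),
    3.0: (12.0, 11.0),
    3.5: (15.0, 15.0),
    4.0: (22.0, 22.0),
}

def required_dose(target_log: float, organism: str) -> float:
    """Return the Table 8 dose for the smallest tabulated credit >= target_log."""
    column = 0 if organism == "Cryptosporidium" else 1
    for credit in sorted(TABLE_8_DOSE_MJ_CM2):
        if credit >= target_log:
            return TABLE_8_DOSE_MJ_CM2[credit][column]
    raise ValueError("Table 8 only covers credits up to 4 log")

def delivered_dose(avg_intensity_mw_cm2: float, exposure_s: float) -> float:
    """UV dose = average intensity x exposure time (1 mW.s/cm2 = 1 mJ/cm2)."""
    return avg_intensity_mw_cm2 * exposure_s

dose = delivered_dose(avg_intensity_mw_cm2=0.8, exposure_s=15.0)    # 12 mJ/cm2
needed = required_dose(target_log=3.0, organism="Giardia lamblia")  # 11 mJ/cm2
print(f"delivered {dose:.0f} mJ/cm2, required {needed:.0f} mJ/cm2, adequate: {dose >= needed}")
```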

UV disinfection is usually applied after particle removal barriers, such as filtration, to minimize shielding of pathogens by suspended particles. Several studies have examined the effect of particles on UV disinfection efficacy, and most have concluded that the UV dose–response of microorganisms is not affected by variations in turbidity up to 10 NTU (Christensen and Linden, 2002; Oppenheimer et al., 2002; Batch et al., 2004; Mamane-Gravetz and Linden, 2004; Passantino et al., 2004; Amoah et al., 2005). In contrast, Mahmud et al. (2006) reported lower inactivation levels during sub-optimal coagulation (turbidity between 1 and 2 NTU), which was attributed to particle aggregation or clumping that allowed shielding of (oo)cysts. In addition, Templeton et al. (2007) found lower inactivation of bacteriophages during the filter-ripening and end-of-filter-run periods when turbidity was >0.3 NTU. Templeton et al. (2005) found that UV-absorbing organic particles (i.e., humic substances) shielded particle-associated bacteriophages from UV light, whereas inorganic kaolin clay particles did not; the authors therefore concluded that particle characteristics were more relevant than turbidity level. For groundwater supplies with elevated iron content, Templeton et al. (2006) found that iron oxide precipitate in air-oxidized groundwater samples could interfere with UV disinfection. The extent of protection of (oo)cysts from UV light is more likely to depend on the particle type (e.g., size, structure, chemical composition), the number of large particles (e.g., 25 µm), the level of aggregation of the (oo)cysts with the particulate matter and the desired inactivation level (Caron et al., 2007; Hargy and Landry, 2007; Templeton et al., 2008; Kollu and Örmeci, 2012). As a result, utilities should strive to maintain optimum filtration conditions upstream of UV disinfection (Templeton et al., 2007).

Minimal disinfection by-product formation is expected from UV light (Hijnen et al., 2006; Bolton and Cotton, 2008). However, Wang et al. (2015) reported chlorate and bromate formation for advanced oxidation processes (AOPs) using UV/chlorine (UV >1000 mJ/cm2; free chlorine = 5 to 10 mg/L) and UV/hydrogen peroxide (data not shown). The literature also suggests nitrite formation from nitrate; however, Sharpless and Linden (2001) reported less than 0.1 mg/L nitrite-nitrogen formed with a nitrate-nitrogen concentration of 10 mg/L when dosed up to 400 mJ/cm2. The authors concluded that nitrite formation is unlikely to pose a health concern during UV disinfection using MP lamps. As with chemical disinfectants, the potential impact of DBPs should be considered when using UV. It is essential, however, that efforts made to minimize DBP formation not compromise the effectiveness of disinfection. More information can be obtained from the guideline technical documents on chlorite and chlorate and on bromate (Health Canada, 2008a, 2016).

7.1.3.3 Multi-disinfectant strategy

A multiple disinfectant strategy involving two or more primary disinfection steps (i.e., sequential combination of disinfectants) is effective for inactivating protozoa, along with other microorganisms, in drinking water. For example, UV light and free chlorine are complementary disinfection processes that together can inactivate protozoa, viruses and bacteria. As UV light is highly effective for inactivating protozoa (but less effective for viruses), and chlorine is highly effective for inactivating bacteria and viruses (but less effective for protozoa), the multi-disinfectant strategy allows for the use of lower doses of chlorine and, consequently, decreased formation of DBPs. In some treatment plants, ozone is applied for the removal of taste and odour compounds, followed by chlorine disinfection. In such cases, both ozone and chlorine disinfection may potentially be credited towards meeting the overall treatment goals, provided CT requirements are achieved (see section 7.1.3.1).
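
The following minimal sketch illustrates the credit-stacking idea described above: log inactivation credits from sequential barriers are summed and compared against the minimum 3 log treatment goal for protozoa. The per-barrier credit values are hypothetical placeholders; actual credits depend on the validated UV dose, the CT achieved and the requirements of the jurisdiction having authority.

```python
# A minimal sketch of summing log inactivation credits from sequential
# disinfection barriers against the minimum 3 log protozoa treatment goal.
# The per-barrier credits below are hypothetical placeholders.

PROTOZOA_GOAL_LOG = 3.0

protozoa_credits = {
    "UV (validated reactor)": 3.0,        # hypothetical credited log inactivation
    "Ozone (taste and odour step)": 0.5,  # hypothetical, only if CT is achieved
    "Free chlorine": 0.0,                 # negligible credit for Cryptosporidium
}

total_credit = sum(protozoa_credits.values())
print(f"total protozoa credit = {total_credit:.1f} log "
      f"(goal of {PROTOZOA_GOAL_LOG:.0f} log met: {total_credit >= PROTOZOA_GOAL_LOG})")
```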

7.1.4 Distribution system

A well-maintained distribution system is a critical component of the multiple barrier approach to provide safe drinking water (Fisher et al., 2000). Howe et al. (2002) reported that oocysts remained detectable at low concentrations for up to 19 days after a cryptosporidiosis outbreak triggered the replacement of a contaminated source with an uncontaminated source. It was hypothesized that the slow decline in oocyst concentrations after the change in water source was due to oocysts being released from the biofilm on the surface of distribution system pipes. No additional cases of infection, however, were reported from this exposure. Warnecke (2006) confirmed that distribution system pipes are capable of accumulating and releasing oocysts over an extended period of time. Cross connections and backflow issues have also been linked to outbreaks (see section 5.3.2 and Appendix B).

As the potential exists for oocysts to become attached to pipe biofilms, accumulate in the distribution system and subsequently detach, source water protection measures, treatment optimization, maintaining the physical/hydraulic integrity of the distribution system and minimizing negative- or low-pressure events are key to limiting the entry of (oo)cysts into the distribution system (Karim et al., 2003; Friedman et al., 2004; Fleming et al., 2006; Angles et al., 2007).

Distribution system water quality should be regularly monitored (e.g., microbial indicators, chlorine residual, turbidity, pH), operations/maintenance programs should be in place (e.g., watermain cleaning, cross-connection control, asset management) and strict hygiene should be practiced during watermain repairs to ensure drinking water is transported to the consumer with minimum loss of quality (Kirmeyer et al., 2001, 2014). AWWA (2015) encourages the use of online chlorine residual monitoring in the distribution system.

7.2 Residential scale

Municipal treatment of drinking water is designed to reduce the presence of disease-causing organisms and associated health risks to an acceptable or safe level. As a result, the use of residential-scale treatment devices on municipally treated water is generally not necessary, but is based primarily on individual choice. In cases where an individual household or semi-public system obtains its drinking water from a private supply, the microbiological quality of the source should be determined and the vulnerability of the source to contamination should be assessed. Surface water is not recommended as a private or semi-public water supply unless it is properly filtered and disinfected and monitored for water quality. Well water supplies can also be contaminated and may require treatment. Numerous outbreaks have been linked to the consumption of untreated well water contaminated with enteric protozoa (Wallender et al., 2014).

When drinking water is obtained from a private well, the vulnerability of the source to faecal contamination should be assessed. Although it is difficult for well owners to conduct a detailed assessment of the vulnerability of the well to faecal contamination, steps can be taken to minimize the likelihood of a well becoming contaminated. General guidance on well construction, maintenance, protection and testing is typically available from provincial/territorial jurisdictions. Well owners should have an understanding of the well construction, the type of aquifer material surrounding the well and the location of the well in relation to sources of faecal contamination (e.g., septic systems, sanitary sewers, animal waste). This information can be obtained from records provided to the well owner during well and septic system construction, as well as from well log databases, aquifer vulnerability maps and regional groundwater assessments that are generally available from provincial/territorial jurisdictions. If insufficient information is available to determine whether a well is vulnerable to faecal contamination, treatment of the well water is a way to reduce risk.

Health Canada does not recommend specific brands of drinking water treatment devices, but it strongly recommends that consumers use devices that have been certified by an accredited certification body as meeting the appropriate NSF/ANSI drinking water treatment unit standards. These standards have been designed to safeguard drinking water by helping to ensure the material safety and performance of products that come into contact with drinking water. Certification organizations provide assurance that a product or service conforms to applicable standards. In Canada, several organizations have been accredited by the Standards Council of Canada (SCC) to certify drinking water devices and materials as meeting the appropriate NSF/ANSI standards (SCC, 2016).

An up-to-date list of accredited certification organizations can be obtained directly from the SCC (2016).

Private and semi-public supplies that use filtration can refer to several NSF/ANSI standards for the removal of enteric protozoa. These include NSF/ANSI: Standard 53 (Drinking Water Treatment Units – Health Effects) (NSF/ANSI, 2016a), Standard 58 (Reverse Osmosis Drinking Water Treatment Systems) (NSF/ANSI, 2016b) and Standard 62 (Drinking Water Distillation Systems) (NSF/ANSI, 2016c). These standards require 3 log removal or better in order to be certified to a cyst reduction claim. Reverse osmosis (RO) and distillation systems certified to these standards are intended for point-of-use (POU) installation only. This is because water treated by an RO or distillation system may be corrosive to internal plumbing components. These systems also require larger quantities of influent water to obtain the required volume of drinking water and are generally not practical for point-of-entry (POE) installation.

Currently, units certified to NSF/ANSI Standard 53 are generally not capable of achieving the larger capacity required for some semi-public systems (15–800 gpm). Thus, drinking water systems requiring larger capacity may consider two treatment options: ultrafiltration or cartridge filters.

Ultrafiltration units that are certified to NSF/ANSI Standard 419 (Public Drinking Water Equipment Performance – Filtration) (NSF/ANSI, 2015) can achieve greater than 3 log removal of Cryptosporidium. These ultrafiltration units may include direct integrity testing to confirm membranes remain intact for Cryptosporidium log reductions. As membrane integrity breaches can compromise protozoa removal capability, it is important that regular monitoring of membrane integrity be conducted. Previous certification under the U.S. EPA ETV program may be acceptable pending transition to NSF/ANSI Standard 419 certification, provided the manufacturer can demonstrate that the product that was tested under this program has not changed. Water system owners should request the product-specific certification results when considering these systems, as well as establish performance criteria (e.g., protozoa log reduction) and treated water quality objectives to minimize fouling (e.g., organic carbon, colour, iron, manganese, alkalinity and hardness) and to ensure turbidity criteria are met. Pre-treatment may be necessary depending on source water quality.

Cartridge filters that are certified to NSF/ANSI Standard 419 for Cryptosporidium removal (NSF/ANSI, 2015) can also be used for systems that require a larger capacity. Individual jurisdictions may also choose to approve cartridge filters that have been third-party challenge tested for Cryptosporidium removal (GLUMRB, 2012); however, they should also ensure that these filters are certified to NSF/ANSI Standard 61 (NSF/ANSI, 2016d) for contact with potable water. Cartridge filters are accepted as an alternative technology for compliance with the filtration requirement of the U.S. Surface Water Treatment Rule (U.S. EPA, 1991, 2006a). Cartridge filters are disposable filters that are operated up to the maximum rated pressure differential (typically 30 psi) and then replaced. As a result, there is no direct integrity testing capability.

Private and semi-public supplies that use disinfection typically rely on chlorine or UV light because of their availability and relative ease of operation. It is important to note that inactivation of Giardia using free chlorine requires relatively high concentrations and/or contact times, and chlorination is not effective for the inactivation of Cryptosporidium. Treatment units meeting NSF/ANSI Standard 55 for Ultraviolet Disinfection Systems (Class A) are designed to inactivate microorganisms, including bacteria, viruses and (oo)cysts, in contaminated water (NSF/ANSI, 2016e); they are not designed to treat wastewater or water contaminated with raw sewage, and they should be installed on water that is visually clear. Class B systems are not intended for the disinfection of microbiologically unsafe water. Class B systems are only certified for supplemental bactericidal treatment of disinfected public drinking water or other drinking water that has been tested and deemed acceptable for human consumption by the jurisdiction having authority. Class B systems are designed to reduce normally occurring non-pathogenic nuisance microorganisms only. When applying UV light in systems with moderate or high levels of hardness or mineral content, such as groundwater, scaling or fouling of the UV lamp surface is a common problem and may require pre-treatment.

Private and semi-public supplies that use chlorine as part of a multi-disinfectant strategy (see section 7.1.3.3) should use products that are certified as meeting NSF/ANSI Standard 60 (Drinking Water Treatment Chemicals – Health Effects) (NSF/ANSI, 2017) and, for hypochlorite solutions, follow the handling and storage recommendations outlined in the guideline technical document on bromate (Health Canada, 2016).

Periodic testing for E. coli and total coliforms by an accredited laboratory should be conducted on both the water entering the treatment device and the treated water to verify that the treatment device is effective. Treatment devices lose their removal capacity through usage and time and need to be maintained and/or replaced. Consumers should verify the expected longevity of the components in their treatment device according to the manufacturer’s recommendations and establish a clearly defined maintenance schedule. Treatment devices should be inspected and serviced in accordance with the maintenance schedule and manufacturer’s recommendations.

8.0 Health effects

The health effects associated with exposure to Giardia and Cryptosporidium, like those associated with other pathogens, depend upon features of the host, pathogen and environment. The host's immune status, the (oo)cyst's infectivity and the degree of exposure (i.e., number of (oo)cysts consumed) are all key determinants of infection and illness. Infection with Giardia or Cryptosporidium can result in both acute and chronic health effects, which are discussed in the following sections.

8.1 Giardia

8.1.1 Infection

Theoretically, a single cyst is sufficient, at least under some circumstances, to cause infection. However, studies have shown that the ID50 (the dose required for infection to be observed in 50% of test subjects) is more than a single cyst and is dependent on the virulence of the particular strain. Human adult volunteer feeding trials suggest that the ID50 of Giardia is around 50 cysts (Hibler et al., 1987), although some individuals can become infected at a much lower dose (Rendtorff, 1978; Stachan and Kunstýr, 1983). The ID50 of Giardia in humans can also be extrapolated from dose–response curves. Based on this approach, the ID50 for Giardia in humans is around 35 cysts (Rose and Gerba, 1991), which is comparable to the ID50 reported above. Giardia strains that are well adapted to their hosts (e.g., by serial passage) can frequently infect with lower numbers of cysts (Hibler et al., 1987). For example, Rendtorff (1978) reported an ID50 of 19 cysts when using human-source cysts in volunteers.
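
For context, the extrapolated value of around 35 cysts is consistent with the exponential dose–response model described in section 9.3.3: setting the probability of infection to 0.5 and solving gives ID50 = ln(2)/r. A minimal sketch, assuming the Giardia r value of 0.0199 used later in this document:

```python
import math

# Under the exponential dose-response model (section 9.3.3), the probability of
# infection from a dose of N cysts is P = 1 - exp(-r * N). Setting P = 0.5 and
# solving for N gives ID50 = ln(2) / r.
r_giardia = 0.0199                 # Rose and Gerba (1991)
id50 = math.log(2) / r_giardia
print(f"ID50 ~ {id50:.0f} cysts")  # ~35 cysts, consistent with the text above
```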

The prepatent period (time between ingestion of cysts and excretion of new cysts) for giardiasis is 6–16 days (Rendtorff, 1978; Stachan and Kunstýr, 1983; Nash et al., 1987), although this can vary depending on the strain. Research with animal models has shown that smaller inocula result in longer prepatent periods but do not influence the resulting parasite burden (Belosevic and Faubert, 1983).

8.1.2 Pathogenesis and immune response

The specific mechanisms by which Giardia causes illness are not well understood, and no specific virulence factors have been identified. Some suggest that Giardia causes mechanical irritation or mucosal injury by attaching to the brush border of the intestinal tract. Others have proposed that Giardia attachment results in repopulation of the intestinal epithelium by relatively immature enterocytes with reduced absorptive capacities (leading to diarrhea).

The host–parasite relationship is complex, and Giardia has been shown to be versatile in the expression of antigens (Nash, 1994), so universal, lasting immunity is improbable. Humoral immune response is revealed by increased levels of circulating antibodies (immunoglobulin G [IgG] and immunoglobulin M [IgM]) and secretion of antibodies (immunoglobulin A [IgA]) in milk, saliva and possibly intestinal mucus. These antibodies may play a role in eliminating disease (Heyworth, 1988), but lasting immunity has not been demonstrated. Very little is known about cellular immunity, but spontaneous killing of trophozoites by human peripheral blood monocytes has been described (denHollander et al., 1988).

8.1.3 Symptoms and treatment

Typically, Giardia is non-invasive and results in asymptomatic infections. Based on U.S. data, 40% of individuals will develop symptomatic illness after infection with Giardia (Nash et al., 1987). Symptomatic giardiasis can result in nausea, anorexia, an uneasiness in the upper intestine, malaise and occasionally low-grade fever or chills. The onset of diarrhea is usually sudden and explosive, with pale, watery, greasy and foul-smelling stools (Wolfe, 1984; MSSS, 2015). The acute phase of the infection commonly resolves spontaneously, and organisms generally disappear from the faeces. Assemblage A has been associated with mild, intermittent diarrhea, whereas assemblage B has been linked to severe, acute or persistent diarrhea (Homan and Mank, 2001; Read et al., 2002). Giardia infection can also lead to lactase deficiency (i.e., lactose intolerance) and general malabsorptive syndrome, and some recent research suggests that it could also lead to irritable bowel syndrome or chronic fatigue syndrome in some individuals (Cotton et al., 2011; Wensaas et al., 2012; Hanevik et al., 2014). Some patients become asymptomatic cyst passers for a period of time and have no further clinical manifestations. Other patients, particularly children, suffer recurring bouts of the disease, which may persist for months or years (Lengerich et al., 1994). In the United States, an estimated 4,600 persons per year are hospitalized for severe giardiasis, a rate similar to that of shigellosis (Lengerich et al., 1994). The median length of hospital stay is four days.

Giardiasis can be treated using a number of drugs, including metronidazole, quinacrine, furazolidone, tinidazole, ornidazole, nitazoxanide and nimorazole. Olson et al. (1994) showed that the potential for a vaccine exists, but infections and symptoms are only attenuated, and prevention of infection is not feasible at this time.

8.2 Cryptosporidium

8.2.1 Infection

Dose–response information for Cryptosporidium, using human volunteer feeding trials involving immunocompetent individuals, is available. As is the case for Giardia and other pathogens, a single organism can potentially cause infection, but studies have only demonstrated infection being caused by more than one organism (DuPont et al., 1995; Okhuysen et al., 1998, 2002; Chappell et al., 1999, 2006). Studies suggest that the ID50 of Cryptosporidium is somewhere between 9 and 2,066 oocysts (DuPont et al., 1995; Okhuysen et al., 1998, 1999, 2002; Chappell et al., 1999, 2006; Messner et al., 2001), indicating that Cryptosporidium isolates can differ significantly in their infectivity and ability to cause symptomatic illness. The TAMU isolate of C. parvum (originally isolated from a foal), for example, was shown to have an ID50 of 9 oocysts and an illness attack rate of 86%, compared with the UCP isolate of C. parvum (isolated from a calf), which had an ID50 of 1,042 oocysts and an illness attack rate of 59% (Okhuysen et al., 1999). In contrast, the Iowa and Moredun isolates of C. parvum had ID50s of 132 and approximately 300 oocysts, respectively, whereas illness attack rates were similar (i.e., 55–65%) (DuPont et al., 1995; Okhuysen et al., 2002). Based on a meta-analysis of these feeding studies, the ID50s of the TAMU, UCP and Iowa isolates were estimated to be 12.1, 2,066 and 132 oocysts, respectively (Messner et al., 2001). The genetic basis for these differences is not known, although a number of virulence factors have been identified (Okhuysen and Chappell, 2002). In a separate meta-analysis using the TAMU, UCP and Iowa human study data, the probability of infection from ingesting a single infectious oocyst was estimated to range from 4% to 16% (U.S. EPA, 2006a). This estimate is supported by outbreak data, including observations made during the 1993 Milwaukee outbreak (Gupta and Haas, 2004). A recent re-analysis of available dose–response data suggests that Cryptosporidium may be more infectious than previously determined (Messner and Berger, 2016); however, further investigation is needed.

The prepatent period for cryptosporidiosis is four to nine days (Ma et al., 1985; DuPont et al., 1995; Okhuysen et al., 1999, 2002), although this can vary depending on the isolate.

8.2.2 Pathogenesis and immune response

Infections of Cryptosporidium spp. in the human intestine are known to cause at least transient damage to the mucosa, including villous atrophy and lengthening of the crypt (Tzipori, 1983); however, the molecular mechanisms by which Cryptosporidium causes this damage are unknown. Several molecules are thought to mediate its mobility, attachment and invasion of host cells, including glycoproteins, lectins and other protein complexes, antigens and ligands (Okhuysen and Chappell, 2002; Tzipori and Ward, 2002). Most of the pathological data available have come from AIDS patients, and the presence of other opportunistic pathogens has made assessment of damage attributable to Cryptosporidium spp. difficult.

The primary mechanism of host defence appears to be cellular immunity (McDonald et al., 2000; Lean et al., 2002; Riggs, 2002), although humoral immunity is also known to be involved (Riggs, 2002; Okhuysen et al., 2004; Priest et al., 2006). Studies using animal models have demonstrated the importance of helper (CD4+) T cells, interferon gamma (IFN-γ) and interleukin 12 (IL-12) in recovery from cryptosporidiosis (Riggs, 2002). Antibody responses against certain glycoproteins involved in Cryptosporidium adhesion have been demonstrated (Riggs, 2002).

It is not clear whether prior exposure to Cryptosporidium provides protection against future infections or disease. Okhuysen et al. (1998) reported that initial exposure to Cryptosporidium was inadequate to protect against future bouts of cryptosporidiosis. Although the rates of diarrhea were similar after each of the exposures, the severity of diarrhea was lower after re-exposure. Chappell et al. (1999) reported that volunteers with pre-existing C. parvum antibodies (suggesting previous infection) exhibited a greater resistance to infection, as demonstrated by a significant increase in the ID50, compared with those who were antibody negative. However, in contrast to the earlier findings (Okhuysen et al., 1998), the severity of diarrhea (defined by the number of episodes and duration of the illness) was greater among the subjects presumed previously infected.

8.2.3 Symptoms and treatment

Individuals infected with Cryptosporidium are more likely to develop symptomatic illness than those infected with Giardia (Macler and Regli, 1993; Okhuysen et al., 1998, 1999). The most common symptom associated with cryptosporidiosis is diarrhea, characterized by very watery, non-bloody stools. The volume of diarrhea can be extreme, with 3 L/day being common in immunocompetent hosts and with reports of up to 17 L/day in immunocompromised patients (Navin and Juranek, 1984). This symptom can be accompanied by cramping, nausea, vomiting (particularly in children), low-grade fever (below 39°C), anorexia and dehydration. Extraintestinal cryptosporidiosis (i.e., in the lungs, middle ear, pancreas, etc.) and death have been reported, primarily among persons with AIDS (Farthing, 2000; Mercado et al., 2007), but are considered rare.

The duration of infection is dependent on the condition of the immune system (Juranek, 1995) and can be broken down into three categories: 1) immunocompetent individuals, who clear the infection in 7–14 days; 2) AIDS patients or others with severely weakened immune systems (i.e., individuals with CD4 cell counts <180 cells/mm3), who in most reported cases never completely clear the infection (it may develop into an infection with long bouts of remission followed by mild symptoms); and 3) individuals who are immunosuppressed because of chemotherapy, a short-term immune depression caused by illness (e.g., chickenpox) or malnutrition. In cases where the immunosuppression is not AIDS related, the infection usually clears (no oocyst excretion, and symptoms disappear) within 10–15 days of the onset of symptoms. However, there have been reported cases involving children in which the infection has persisted for up to 30 days.

The sensitivity of diagnosis of cryptosporidiosis by stool examination is low; as a result, oocyst excreters may be counted as negative prematurely. The application of more sensitive and rapid diagnostic tools, such as immunochromatographical lateral-flow assays, will help to reduce the number of false negatives (Cacciò and Pozio, 2006). Immunocompetent individuals usually carry the infection for a maximum of 30 days. With the exception of AIDS cases, individuals may continue to pass oocysts for up to 24 days. In an outbreak at a daycare facility, children shed oocysts for up to 5 weeks (Stehr-Green et al., 1987). The reported rate of asymptomatic infection is believed to be low, but a report on an outbreak at a daycare facility in Philadelphia, Pennsylvania, concluded that up to 11% of the children were asymptomatic (Alpert et al., 1986), and Ungar (1994) discussed three separate studies in daycare centres where the asymptomatic infection rate ranged from 67% to 100%. It has been suggested that many of these asymptomatic cases were mild cases that were incorrectly diagnosed (Navin and Juranek, 1984).

Nitazoxanide is the only drug approved for treatment of cryptosporidiosis in children and adults (Fox and Saravolatz, 2005), although more than 200 drugs have been tested both in vitro and in vivo (Tzipori, 1983; O’Donoghue, 1995; Armson et al., 2003; Cacciò and Pozio, 2006). This can be explained, in part, by the fact that most inhibitors target biochemical pathways resident in the apicoplast (plastid-derived organelle) (Wiesner and Seeber, 2005), a structure that C. parvum (Abrahamsen et al., 2004) and C. hominis (Xu et al., 2004) lack. This has led to the requirement to find novel targets for drug development, including compounds that target energy metabolism and lipid synthesis. Finding new drug targets is very expensive, and so there is also ongoing work investigating whether currently approved drugs on the market have anti-Cryptosporidial properties (Ryan and Hijjawi, 2015).

The complete genome sequences of several species of Cryptosporidium are available (Ryan and Hijjawi, 2015), and they are being used to help identify virulence determinants and mechanisms of pathogenesis, thereby facilitating the development of antimicrobials (Umejiego et al., 2004), vaccines (Wyatt et al., 2005; Boulter-Bitzer et al., 2007) and immunotherapies (Crabb, 1998; Enriquez and Riggs, 1998; Schaefer et al., 2000; Takashima et al., 2003) against Cryptosporidium.

9.0 Risk assessment

Quantitative microbial risk assessment (QMRA) is a process that uses mathematical modelling, source water quality data, treatment barrier information and pathogen-specific characteristics to estimate the burden of disease associated with exposure to selected pathogenic organisms in a drinking water source. QMRA can be used in two ways: as part of a source-to-tap approach for the management of a drinking water system or, as is the case for this document, to support the development of drinking water quality guidelines.

Further information and direction on how to use QMRA as part of a source-to-tap approach on a site-specific basis is published elsewhere (WHO, 2016; Health Canada, 2018). This guideline technical document will focus solely on using QMRA for the development of a drinking water quality guideline for enteric protozoa.

9.1 Health-based targets

Health-based targets are the "goalposts" or "benchmarks" that have to be met to ensure the safety of drinking water. In Canada, microbiological hazards are commonly addressed by two forms of targets: water quality targets and health-based treatment goals. An example of a water quality target is the bacteriological guideline for E. coli, which sets a maximum acceptable concentration of this organism in drinking water (Health Canada, 2012b). Health-based treatment goals specify a pathogen reduction that needs to be achieved by measures such as treatment processes (see section 7.0). Treatment goals assist in the selection of treatment barriers and should be defined in relation to source water quality (see section 9.3.2). The wide array of microbiological pathogens makes it impractical to measure all of the potential hazards; thus, treatment goals are generally framed in terms of categories of organisms (e.g., bacteria, viruses and protozoa) rather than individual pathogens. The health-based treatment goal for Giardia and Cryptosporidium is a minimum 3 log removal and/or inactivation of (oo)cysts. Surface waters may require a greater log removal and/or inactivation to maintain an acceptable level of risk.

9.2 Reference level of risk

The reference level of risk is the disease burden that is deemed tolerable or acceptable from exposure to drinking water. This value is used to set health-based treatment goals.

Risk levels have been expressed in several ways. The WHO Guidelines for Drinking-water Quality (WHO, 2011) use disability-adjusted life years (DALYs) as a unit of measure for risk. The basic principle of the DALY is to calculate a value that considers both the probability of experiencing an illness or injury and the impact of the associated health effects (Murray and Lopez, 1996a; Havelaar and Melse, 2003). The WHO (2011) guidelines adopt 10−6 DALY/person per year as a reference level of risk. The Australian National Guidelines for Water Recycling (NRMMC-EPHC, 2006) also cite this target. In contrast, other agencies set acceptable microbial risk levels based on the risk of infection and do not consider the probability or severity of associated health outcomes. For example, both the U.S. EPA and the Netherlands have used a health-based target of an annual risk of infection of less than 1/10,000 (10−4) persons (Regli et al., 1991; VROM, 2005).

The risk assessment in this guideline technical document estimates the disease burden in DALYs. There are several advantages to using this metric. DALYs take into account both the number of years lost due to mortality and the number of years lived with a disability (compared with the average healthy individual for the region) to determine the health impact associated with a single type of pathogenic organism. The use of DALYs also allows for comparison of health impacts between different pathogens and potentially between microbiological and some chemical hazards. Although no common health metric has been accepted internationally, DALYs have been used by numerous groups, and published, peer-reviewed information is available. The WHO (2011) reference level of 10−6 DALYs/person per year is used in this risk assessment as a tolerable level of risk.

9.3 Quantitative microbial risk assessment approach

The purpose of this document is to support the development of a health-based treatment goal for enteric protozoa in drinking water. QMRA is an important tool in developing such goals. It follows a common approach in risk assessment, which includes four components: hazard identification, exposure assessment, dose–response assessment and risk characterization. In this case, the risk is already characterized as a reference level of 1 × 10−6 DALYs/person per year. Thus, this risk assessment examines the treatment performance required to reduce enteric protozoa concentrations in source water to a level that will meet that health outcome, assuming a given source water quality, under set exposure conditions and using specific dose–response information.

9.3.1 Hazard identification

The enteric protozoa of most concern as human health hazards in Canadian drinking water sources are Giardia and Cryptosporidium. Other enteric protozoa, such as Toxoplasma gondii, Cyclospora cayetanensis, Entamoeba histolytica, and Blastocystis hominis, may also be found; however, they are not the main enteric protozoa of concern (see Appendix A). Although all enteric protozoa that may impact human health are identified, risk assessments do not usually consider each individual enteric protozoan. Instead, the risk assessment includes only specific enteric protozoa whose characteristics make them a good representative of all similar pathogenic protozoa. It is assumed that if the reference protozoa are controlled, this would ensure control of all other similar protozoa of concern. Ideally, reference protozoa will represent a worst-case combination of high occurrence, high concentration and long survival time in source water, low removal and/or inactivation during treatment and a high pathogenicity for all age groups. Giardia and Cryptosporidium have been selected as the reference protozoa for this risk assessment. These organisms can cause serious illness in immunocompetent and immunocompromised individuals. Illness caused by Cryptosporidium is more serious because it is capable of causing death, particularly in immunocompromised individuals, and extraintestinal (e.g., lung, pancreas) damage can occur. However, both organisms have high prevalence rates, potential to cause widespread disease, resistance to chlorine disinfection and published dose–response models. It is assumed that if the disease burden from Giardia and Cryptosporidium is reduced to a tolerable level, the risk from other enteric protozoa will also be at a tolerable level.

9.3.2 Exposure assessment

Exposure is determined as the dose of pathogens ingested by a consumer per day. The principal route of exposure considered in this risk assessment is consumption of drinking water. To assess exposure, both the concentration of Giardia or Cryptosporidium in the drinking water and the volume of water ingested need to be known or estimated.

9.3.2.1 Source water concentration estimates

To inform the development of health-based treatment goals, the QMRA process was conducted in reverse. The QMRA process was used to answer the following question: Given a reference level of risk of 1 × 10−6 DALYs/person per year, together with an average volume of water ingested (section 9.3.2.3) and the treatment reductions for the drinking water system (section 9.3.2.2), what are the associated concentrations of Giardia and Cryptosporidium in source water? The result is then considered to represent average concentrations of these enteric protozoa in source water. It is also assumed that the concentrations are corrected for method recovery and that all enteric protozoa detected are infectious to humans.

Average concentrations have been shown to be suitable for quantifying treatment targets for drinking water exposure (Petterson et al., 2015). When determining average source water concentrations, it is necessary to consider whether the presence of protozoa is continuous or intermittent, whether it follows seasonal patterns, and how rare events, such as droughts or floods, can impact levels. Short-term peaks in Giardia or Cryptosporidium concentrations may increase disease risks considerably and even trigger outbreaks of waterborne disease. Monitoring programs should be designed with these factors in mind in order to capture the variability that occurs in the water source (Dechesne and Soyeux, 2007). The MicroRisk project suggested that monthly sampling for one year should be conducted to establish baseline levels, and then at least two full events should be characterized to understand peak conditions (Medema et al., 2006). It should also be noted that for river sources receiving a high volume of wastewater treatment plant effluents, peak contamination events may occur during low flow conditions (Dechesne and Soyeux, 2007). The U.S. EPA monitoring requirements for Cryptosporidium also require monthly sampling, but for a period of two years (or more frequent sampling for one year), in order to establish a drinking water system's treatment requirements (U.S. EPA, 2006a). In addition to monitoring, uncertainty analysis should also be used as a means to help evaluate the estimated source water concentrations (Petterson et al., 2015). Further information on how to monitor or estimate pathogen concentrations in source water is provided in Health Canada (2018). Other factors that should be taken into consideration when determining source water concentrations, if available, are the recovery efficiencies of the Giardia and Cryptosporidium detection methods, which are much less than 100%, and whether the (oo)cysts found are of human health concern. As mentioned previously, recovery efficiencies should be determined routinely with (oo)cyst monitoring. However, most utilities will not have any indication of whether the (oo)cysts present are potentially a human health concern unless further analysis has been conducted (e.g., genotyping studies). In the absence of further analysis, all (oo)cysts should be considered potentially infectious to humans.
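
As a rough illustration of how monitoring results can be converted into a recovery-corrected average concentration, the sketch below divides each sample count by the product of the volume analyzed and the method recovery, then averages the results. The counts, volumes and recovery efficiencies shown are hypothetical, and this simple averaging does not replace the baseline and peak-event characterization described above.

```python
# A minimal sketch of recovery-corrected concentration estimation.
# Each sample: (oocysts counted, volume analyzed in litres, method recovery fraction).
# All values are hypothetical.

samples = [
    (3, 100.0, 0.40),
    (0, 100.0, 0.35),
    (7, 100.0, 0.45),
]

corrected_per_l = [count / (volume * recovery) for count, volume, recovery in samples]
mean_per_l = sum(corrected_per_l) / len(corrected_per_l)
print(f"recovery-corrected mean ~ {mean_per_l * 100:.1f} oocysts/100 L")
```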

9.3.2.2 Treatment reductions

Different combinations of log reductions achieved through treatment processes and source water concentrations of Giardia and Cryptosporidium were examined in this risk assessment and compared to a defined risk endpoint. It is important to note that treatment can be impacted by numerous factors (see section 7.0). Any Giardia or Cryptosporidium (oo)cysts that were not removed or inactivated during treatment are assumed to still be capable of causing infection and illness.

9.3.2.3 Volume of water ingested

For the volume of water ingested, only the unboiled amount of tap water consumed is considered, as boiling the water inactivates pathogens and including boiled water would overestimate exposure (Gale, 1996; Payment et al., 1997; WHO, 2011). In Canada, an average of approximately 1 L of unboiled tap water is consumed per person per day (Statistics Canada, 2004; Statistics Canada, 2008). Therefore, for estimating risk from pathogenic organisms, the risk assessment uses an average consumption of 1 L of water per person per day for determining exposure. This estimate is similar to consumption patterns in other developed nations (Westrell et al., 2006; Mons et al., 2007). WHO, in its Guidelines for Drinking-water Quality, also suggests using an estimate of 1 L for consumption of unboiled tap water (WHO, 2011). The treated drinking water concentration and the volume of water ingested can then be used to determine exposure, i.e., the dose of (oo)cysts consumed per person per day.

9.3.3 Dose–response assessment

The dose–response assessment uses dose–response models to estimate the probability of infection (Pinfection) and the risk of illness after exposure to (oo)cysts. This dose-response relationship can also be used, as is done in this assessment, to estimate the concentration of Giardia or Cryptosporidium associated with a specified level of risk under defined conditions. The dose–response data for Giardia and Cryptosporidium are best explained by the exponential model (Haas et al., 1999):

Equation 1:

Pinfection/day = 1 − e^(−r × µ × V)

The probability of infection at a certain estimated dose per day is calculated by subtracting from 1 the result of the following: the base of natural logarithm (e), raised to the exponent of the negative product of the fraction of ingested organisms that survive to initiate infection (r), number of organisms per litre in the ingested volume (µ) and the single volume of liquid ingested (V).

The exponential model mathematically describes the distribution of the individual probabilities of any one organism to survive and start infection, where V is the single volume of liquid ingested, µ is the number of organisms per litre in the ingested volume and r is the fraction of ingested organisms that survive to initiate infection. In this assessment, for Giardia, r = 0.0199 (Rose and Gerba, 1991), whereas for Cryptosporidium, r = 0.018 (Messner et al., 2001). The r parameter is derived from dose–response studies of healthy volunteers and may not adequately represent effects on sensitive subgroups, such as immunocompromised persons, young children or the elderly. Although the exponential model was chosen for this risk assessment, the Beta-Poisson dose–response model has also been used (Petterson et al., 2006; Schijven et al., 2011).
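
A minimal sketch of equation 1, using the r values stated above; the treated-water concentration in the example is hypothetical, and V is taken as 1 L per day (section 9.3.2.3):

```python
import math

def p_infection_per_day(mu_per_l: float, r: float, volume_l: float = 1.0) -> float:
    """Equation 1: daily probability of infection under the exponential model."""
    return 1.0 - math.exp(-r * mu_per_l * volume_l)

R_GIARDIA = 0.0199  # Rose and Gerba (1991)
R_CRYPTO = 0.018    # Messner et al. (2001)

# Hypothetical treated-water concentration of 2 x 10^-4 cysts/L, V = 1 L/day
print(p_infection_per_day(2e-4, R_GIARDIA))  # ~4 x 10^-6 per day
```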

An individual’s daily dose of organisms is estimated using the information from the exposure assessment (see section 9.3.2). An individual’s yearly probability of infection is estimated using equation (2), below. For this risk assessment, it is assumed that there is no secondary spread of infection.

Equation 2:

Pinfection/year = 1 − (1 − Pinfection/day)^365

The probability of infection per year is calculated by subtracting from 1 the result of the following: 1 minus the probability of infection (from equation 1), raised to the exponent of 365.

Not all infected individuals will develop a clinical illness. The risk of illness per year for an individual is estimated using equation (3):

Equation 3:

Risk of illness per year = Pinfection/year × S × I

The risk of illness is calculated by multiplying the probability of infection per year (from equation 2) by the proportion of the population susceptible to infection (S) and by the proportion of individuals who develop symptomatic illness after infection (I); the values used for S and I are given below.

where:
Pinfection/year = the probability of infection obtained from the exponential model
S = the proportion of the population susceptible to infection
I = the proportion of individuals who develop symptomatic illness after infection

The risk assessment is based on I values of 0.4 and 0.7 for Giardia (Nash et al., 1987) and Cryptosporidium (Okhuysen et al., 1998), respectively. S is assumed to be 1.
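
A minimal sketch chaining equations 2 and 3 with the S and I values stated above; the daily infection probability used as input is hypothetical:

```python
def p_infection_per_year(p_daily: float) -> float:
    """Equation 2: annual probability of infection from the daily probability."""
    return 1.0 - (1.0 - p_daily) ** 365

def risk_of_illness_per_year(p_yearly: float, s: float, i: float) -> float:
    """Equation 3: annual risk of illness = P_infection/year x S x I."""
    return p_yearly * s * i

# S = 1 for both organisms; I = 0.4 (Giardia) or 0.7 (Cryptosporidium)
p_day = 4.1e-6                        # hypothetical daily infection probability
p_year = p_infection_per_year(p_day)
print(risk_of_illness_per_year(p_year, s=1.0, i=0.4))  # ~6 x 10^-4 for Giardia
```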

To translate the risk of illness per year for an individual to a disease burden per person, this assessment uses the DALY as a common unit of risk. The key advantage of the DALY as a measure of public health is cited as its aggregate nature, combining life years lost (LYL) with years lived with disability (YLD) to calculate the disease burden. DALYs can be calculated as follows:

Equation 4:

DALYs = YLD + LYL

The health burden of gastroenteritis resulting from infection with Cryptosporidium and Giardia in drinking water is calculated by adding the years lived with a disability to the years of life lost. The years lived with a disability is the sum of the outcome fraction multiplied by the duration and the severity weight for each health outcome contributing to morbidity. The years of life lost is calculated by subtracting the age at death from the life expectancy and then multiplying this result by the severity weight.

where:
YLD = the sum of [(outcome fraction) × (duration) × (severity weight)] for each health outcome contributing to morbidity
LYL = [(life expectancy) − (age at death)] × severity weight

For Giardia and Cryptosporidium, the health effects vary in severity from mild diarrhea to more severe diarrhea and potentially death. It is important to note that, as no published mortality information is available for Giardia, this risk assessment assumes that the risk of death is the same as that for Cryptosporidium. The disease burden of gastroenteritis resulting from infection with Giardia and Cryptosporidium in drinking water is 1.67 DALYs/1,000 cases (1.67 × 10−3 DALY/case) (Table 9).

Table 9. Disease burden calculation for Giardia and Cryptosporidium
DALY component     Health outcome    Outcome fraction(a)    Duration of illness(b)                 Severity weight(c)    DALYs/case
Morbidity (YLD)    Mild diarrhea     0.99999                0.01918 year (7 days)                  0.067                 1.29 × 10−3
Mortality (LYL)    Death             0.00001                Life expectancy(d); age at death(e)    1                     3.90 × 10−4
Disease burden                                                                                                           1.67 × 10−3

Table 9 footnotes:
(a) Macler and Regli (1993).
(b) Havelaar and Melse (2003).
(c) Murray and Lopez (1996b).
(d) Life expectancy for the Canadian population = 80.88 years (Statistics Canada, 2012).
(e) Age at death is the mean weighted age of the population (assuming no difference in fatality rates between ages) = 38.98 years.
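
The disease burden per case in Table 9 can be approximately reproduced from the footnoted inputs, as in the sketch below. The simple product for the mortality term gives roughly 4.2 × 10−4 DALYs/case rather than the tabulated 3.90 × 10−4, so the published figures presumably incorporate rounding or weighting not shown here; the total nevertheless comes out close to 1.67 × 10−3 DALYs/case.

```python
# Approximate reproduction of the Table 9 disease burden per case.
# YLD = outcome fraction x duration x severity weight (mild diarrhea)
# LYL = outcome fraction x (life expectancy - age at death) x severity weight (death)

yld = 0.99999 * 0.01918 * 0.067          # ~1.29e-3 DALYs/case
lyl = 0.00001 * (80.88 - 38.98) * 1.0    # ~4.2e-4 DALYs/case (Table 9 lists 3.90e-4)
print(f"disease burden ~ {yld + lyl:.2e} DALYs/case")  # ~1.7e-3, close to 1.67e-3
```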

Using this disease burden (DALYs/case) and the risk of illness per year in an individual, the disease burden in DALYs/person per year can be estimated as follows:

Equation 5:

Disease burden (DALYs/person per year) = risk of illness per year × disease burden per case (DALYs/case)

The burden of disease in disability-adjusted life years per person per year is calculated by multiplying the risk of illness (from equation 3) by the health burden of gastroenteritis resulting from infection with Cryptosporidium and Giardia in drinking water, expressed in disability-adjusted life years per case (1.67 × 10−3).

As mentioned previously, since the disease burden was set to equal the reference level of risk, the DALY calculations are used to translate the reference level of risk into values for the dose-response models for Giardia and Cryptosporidium.

9.3.4 Risk characterization

In this risk assessment, the risk characterization step is used to determine a minimum health-based treatment goal to meet the reference level of risk.

As illustrated in figures 1 and 2, as the source water concentration of (oo)cysts increases, a greater log reduction is needed to continue to meet the reference level of risk. For example, when source waters have a Giardia concentration of 21 cysts/100 L and the treatment plant consistently achieves at least a 3 log reduction in cyst concentration, the burden of disease in the population would meet the reference level of 10−6 DALYs/person per year (less than 1 case/1,000 people per year). Similarly, for Cryptosporidium, a concentration of 13 oocysts/100 L of water would require the treatment plant to consistently achieve at least a 3 log reduction in oocyst concentration in order to meet the reference level of risk. These source water (oo)cyst concentrations fall within the range of average (oo)cyst concentrations that would typically be found in Canadian source waters (see section 5.0). For comparison, a concentration of 2,500 cysts/100 L or 900 oocysts/100 L would require approximately a 5 log reduction to meet the acceptable health burden. Although most surface water sources in Canada do not have average concentrations that would require a 5 log reduction, some do have maximum concentrations at this level (see section 5.0). Although in most cases, the peak levels will not be sustained for a long period of time, it is still important for drinking water providers to consider these peak events in their site-specific assessments to fully understand the potential risks to their drinking water (Health Canada, 2018).
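
The 21 cysts/100 L and 13 oocysts/100 L figures can be reproduced by running the QMRA in reverse, as described in section 9.3.2.1: starting from the reference level of 10−6 DALYs/person per year and working back through equations 5, 3, 2 and 1 to the source water concentration that a 3 log reduction can accommodate. The sketch below uses only parameter values stated in this document (r, I, S, the DALYs/case from Table 9 and 1 L daily consumption).

```python
import math

DALY_REFERENCE = 1e-6      # reference level, DALYs/person per year (section 9.2)
DALY_PER_CASE = 1.67e-3    # disease burden per case (Table 9)
VOLUME_L = 1.0             # unboiled tap water consumed per day (section 9.3.2.3)

def source_conc_per_100l(r: float, i: float, log_reduction: float, s: float = 1.0) -> float:
    """Work equations 5, 3, 2 and 1 backwards to the source water concentration."""
    risk_illness = DALY_REFERENCE / DALY_PER_CASE         # equation 5 rearranged
    p_year = risk_illness / (s * i)                       # equation 3 rearranged
    p_day = 1.0 - (1.0 - p_year) ** (1.0 / 365.0)         # equation 2 rearranged
    mu_treated = -math.log(1.0 - p_day) / (r * VOLUME_L)  # equation 1 rearranged
    return mu_treated * (10 ** log_reduction) * 100.0     # back-calculate source, per 100 L

print(f"Giardia:         {source_conc_per_100l(r=0.0199, i=0.4, log_reduction=3):.0f} cysts/100 L")
print(f"Cryptosporidium: {source_conc_per_100l(r=0.018, i=0.7, log_reduction=3):.0f} oocysts/100 L")
# ~21 cysts/100 L and ~13 oocysts/100 L, matching Figures 1 and 2
```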

Based on the enteric protozoa data available for Canadian surface water sources, a health-based treatment goal of 3 log reduction of Giardia and Cryptosporidium is a minimum requirement. However, additional removal/inactivation may be needed to meet treatment goals. A site-specific assessment should be done to determine the level of (oo)cyst reduction needed for a given source water. Monitoring source waters for Giardia and Cryptosporidium will result in the highest-quality site-specific assessment. However, if measurements are not possible, information obtained from routine system assessments and information on other water quality parameters can be used to help estimate the risk and/or level of faecal contamination in the source water. This information can then be used to help determine if more than the minimum level of treatment is required for enteric protozoa.

It is also important to understand the log reductions that can be achieved by the treatment plant when it is running under optimal conditions, and the impact of short- and long-term treatment upsets on log reductions. Understanding and planning for the variations that occur in both source water quality and in the treatment plant creates a more robust system that can include safety margins. It is also important to take into consideration the level of uncertainty that is inherent in carrying out a QMRA, to ensure that the treatment in place is producing water of an acceptable quality. A sensitivity analysis using a QMRA model can also help identify critical control points and their limits. Further information on site-specific assessments and the use of QMRA as a tool in a source-to-tap approach can be found elsewhere (Health Canada, 2018).

Figure 1. Health-based treatment goal for Giardia to meet an acceptable level of risk of 10−6 DALYs/person per year based on 1 L daily consumption of drinking water

The level of treatment required to meet an acceptable level of risk based on 1 L consumption for Giardia concentrations ranging from 21 cysts per 100 litres to 2100 cysts per 100 litres in raw water is presented graphically. The x-axis of the graph is the raw water concentrations of cysts per 100 litres using a log scale. The y-axis of the graph is the log removal using a linear scale. The relationship between the values on the x-axis and the values on the y-axis for the risk level of 10-6 DALY per person per year creates a diagonal line. Two examples of treatment requirements are illustrated on the graph using dotted lines. The first example draws a horizontal dotted line from the y-axis at 3 log removal. This dotted line intersects the diagonal line at 21 cysts per 100 litres of raw water. The second example draws a horizontal dotted line from the y-axis at 5 log removal. This dotted line intersects the diagonal line at 2100 cysts per 100 litres of raw water.

Figure 2. Health-based treatment goal for Cryptosporidium to meet an acceptable level of risk of 10−6 DALYs/person per year based on 1 L daily consumption of drinking water

The level of treatment required to meet an acceptable level of risk based on 1 L consumption for Cryptosporidium concentrations ranging from 13 oocysts per 100 litres to 900 oocysts per 100 litres in raw water is presented graphically. The x-axis of the graph is the raw water concentrations of oocysts per 100 litres using a log scale. The y-axis of the graph is the log removal using a linear scale. The relationship between the values on the x-axis and the values on the y-axis for the risk level of 10-6 DALY per person per year creates a diagonal line. Two examples of treatment requirements are illustrated on the graph using dotted lines. The first example draws a horizontal dotted line from the y-axis at 3 log removal. This dotted line intersects the diagonal line at 13 oocysts per 100 litres of raw water. The second example draws a horizontal dotted line from the y-axis at 5 log removal. This dotted line intersects the diagonal line at 900 oocysts per 100 litres of raw water.

9.4 International considerations

QMRA is increasingly being applied by international agencies and governments at all levels as the foundation for informed decision-making surrounding the health risks from pathogens in drinking water. WHO, the European Commission, the Netherlands, Australia and the United States have all made important advances in QMRA validation and methodology (Staatscourant, 2001; Medema et al., 2006; NRMMC-EPHC, 2006; U.S. EPA, 2006a,b; WHO, 2011). These agencies and governments have adopted approaches that use QMRA to inform the development of health targets (i.e., reference levels of risk or disease) and risk management (e.g., water safety plans, as described in WHO, 2011). Guidance documents on QMRA of drinking water have been published by the European Commission’s MicroRisk project (Medema et al., 2006), the U.S. EPA (2014), and the WHO (2016).

The Netherlands and the U.S. EPA provide two examples of QMRA-based regulatory approaches. In the Netherlands, consistent with the WHO approach, water suppliers must conduct a site-specific QMRA on all surface water supplies to determine whether the system can meet a specified level of risk. Dutch authorities can also require a QMRA of vulnerable groundwater supplies. In contrast, recent regulatory activity in the United States has seen the U.S. EPA assess the health risks from waterborne pathogens through QMRA and apply this information to set nationwide obligatory treatment performance requirements (U.S. EPA, 2006a,b). In general, drinking water systems must achieve a 3 log removal or inactivation of Giardia (U.S. EPA, 1989). To address risk from Cryptosporidium, drinking water systems must monitor their source water, calculate an average Cryptosporidium concentration and use those results to determine whether their source is vulnerable to contamination and requires additional treatment beyond a 2 log minimum. Water systems are classified into categories (“bins”) based on whether they are filtered or unfiltered systems; these bins specify additional removal or inactivation requirements for Cryptosporidium spp. (U.S. EPA, 2006a).

Health Canada and the Federal-Provincial-Territorial Committee on Drinking Water have chosen the same approach as WHO (2011), providing QMRA-based performance targets as minimum requirements. Site-specific QMRA is also recommended as a tool that can be used as part of a source-to-tap approach. A site-specific QMRA approach offers a number of advantages, including 1) the ability to compare the risk from representative groups of pathogens (e.g., viruses, protozoa, bacteria) in an overall assessment; 2) the transparency of assumptions; 3) the potential to account for variability and uncertainty in estimates; 4) the removal of hidden safety factors (these can be applied as a conscious choice by regulatory authorities at the end of the process, if desired); 5) the site-specific identification of critical control points and limits through sensitivity analysis; and 6) the clear implications of system management on a public health outcome. Further information on using QMRA for site-specific assessments is provided in Health Canada (2018).

10.0 Rationale

Giardia lamblia and numerous species and genotypes of Cryptosporidium are known to infect humans. These pathogens are excreted in the faeces of infected persons and animals and can be found in source water. Their occurrence in source water varies over time and can be significantly affected by extreme weather or spill/upset events (i.e., increases in (oo)cyst levels associated with these events). The best way to safeguard against the presence of hazardous levels of Giardia and Cryptosporidium in drinking water is based on the application of the source-to-tap approach, including source water protection and adequate treatment, as demonstrated using appropriate process monitoring, followed by the verification of the absence of indicator organisms in the treated water. The protection of public health is accomplished by setting health-based treatment goals. To set health-based treatment goals, a reference level of risk deemed tolerable or acceptable needs to be determined. The Federal-Provincial-Territorial Committee on Drinking Water has chosen a reference level of risk of 10−6 DALYs/person per year, which is consistent with the reference level adopted by WHO. This is a risk management decision that balances the estimated disease burden from Giardia and Cryptosporidium with the lack of information on the prevalence of these pathogens in source waters, limitations in disease surveillance and the variations in performance within different types of water treatment technologies.

In Canada, many surface water sources will have Giardia and Cryptosporidium concentrations in the range of 1 to 200 (oo)cysts/100 L of water. The QMRA approach used in this guideline technical document demonstrates that if a source water has a concentration of (oo)cysts at the lower end of this range—for example, 21 Giardia cysts/100 L and/or 13 Cryptosporidium oocysts/100 L—a water treatment plant would need to consistently achieve at least a 3 log reduction in (oo)cyst concentration in order to meet the reference level of 10−6 DALYs/person per year. Thus, where treatment is required for enteric protozoa, a minimum 3 log removal and/or inactivation of Giardia and Cryptosporidium has been established as a health-based treatment goal. Some surface waters in Canada may require more than the minimum treatment goal to meet the reference level of risk. Subsurface sources should be evaluated to determine whether the supply is vulnerable to contamination by enteric protozoa (i.e., GUDI). Subsurface sources determined to be GUDI should achieve a minimum 3 log removal and/or inactivation of Giardia and Cryptosporidium. Watersheds or aquifers that are used as sources of drinking water should be protected to the greatest extent possible by minimizing contamination from faecal waste.
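
The sketch below works through this type of calculation for the Giardia example above. The dose-response parameter, consumption volume, morbidity fraction and DALY weight are illustrative assumptions rather than the values used in this document's QMRA (see Health Canada, 2018), so the answer lands near, but not exactly at, 3 log.

```python
# A worked sketch of the calculation behind the treatment goal, using an
# exponential dose-response model and the Giardia example above (21 cysts/100 L).
# All parameter values below are illustrative assumptions, NOT the values used
# in this document's QMRA.
import math

r = 0.0199                 # exponential dose-response parameter for Giardia (illustrative)
litres_per_day = 1.0       # assumed daily consumption of unboiled tap water
p_ill_given_inf = 0.5      # assumed fraction of infections that lead to illness
dalys_per_case = 1.7e-3    # assumed health burden per case of illness
reference_level = 1e-6     # DALYs/person per year

def dalys_per_person_year(cysts_per_100L, log_reduction):
    dose = (cysts_per_100L / 100.0) * 10 ** (-log_reduction) * litres_per_day
    p_inf_daily = 1.0 - math.exp(-r * dose)
    p_inf_annual = 1.0 - (1.0 - p_inf_daily) ** 365
    return p_inf_annual * p_ill_given_inf * dalys_per_case

source = 21.0  # Giardia cysts/100 L
print("Disease burden at 3 log reduction:", dalys_per_person_year(source, 3.0))

# Increase the log reduction until the reference level is just met.
log_red = 0.0
while dalys_per_person_year(source, log_red) > reference_level:
    log_red += 0.01
print(f"Log reduction needed to meet 1e-6 DALYs/person per year: ~{log_red:.1f}")
```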

11.0 References

Appendix A: Other enteric waterborne protozoa of interest: Toxoplasma gondii, Cyclospora cayetanensis, Entamoeba histolytica, and Blastocystis hominis

Toxoplasma gondii is an obligate, intracellular parasite that affects almost all warm-blooded animals, including humans. It is usually transmitted by ingestion of tissue cysts through consumption of raw or undercooked infected meat, or by ingestion of sporulated oocysts through consumption of contaminated food or water or after handling contaminated soil or infected cat faeces. Oocysts are extremely resistant to environmental conditions, including drying, and appear to retain their infectivity for several months, even at temperatures as low as −5°C (Dubey, 1998). Although this organism tends to cause mild flu-like symptoms, it can be life-threatening for immunocompromised individuals and pregnant women. Infection can result in mental retardation, loss of vision, hearing impairment and mortality in congenitally infected children. Little is known about the distribution of this organism in water sources; however, oocysts have been reported to survive for up to 17 months in tap water. There have been six reported human outbreaks of toxoplasmosis linked to ingestion of contaminated soil and water, including an outbreak in British Columbia in 1995 (Karanis et al., 2007). This outbreak involved 110 acute cases, including 42 pregnant women and 11 neonates (Bowie et al., 1997), and was thought to be due to contamination of a water reservoir by domestic and wild cat faeces (Isaac-Renton et al., 1998; Aramini et al., 1999). Limited information is available on the efficacy of water treatment processes in removing or inactivating T. gondii. Because of its size (10-12 μm), it should be readily removed by physical removal processes (i.e., filtration). It is resistant to chlorine (AWWA, 2006) but has been shown to be susceptible to UV disinfection (Ware et al., 2010). Ware et al. (2010) reported 3 log inactivation of T. gondii oocysts at 10 mJ/cm² using LP lamps. The authors cautioned that the study was conducted using oocysts from laboratory-infected cats under ideal conditions. Environmental samples were not evaluated, and additional research was recommended to confirm the susceptibility of T. gondii to UV disinfection.

Cyclospora cayetanensis is an obligate, intracellular coccidian parasite whose only natural host is humans (Eberhard et al., 2000). Cyclosporiasis has been reported worldwide but appears to be endemic throughout the tropics (Soave, 1996). Exact routes of transmission have yet to be elucidated; however, person-to-person transmission is unlikely, as unsporulated oocysts are shed in faeces and require a period of maturation before becoming infective. Transmission is likely through food and water that have been contaminated with human faeces. Cyclospora cayetanensis has been detected in environmental samples, including water and wastewater, but detection still represents a challenge; few prevalence studies exist owing to the lack of sensitive methods, including methods to assess viability and infectivity. Cyclospora cayetanensis infection causes symptoms that mimic those caused by Cryptosporidium (e.g., nausea, anorexia, diarrhea). Illness is usually self-limiting, but long-term health effects have been reported, including Reiter syndrome. Epidemiological evidence strongly suggests that water can transmit C. cayetanensis. The first outbreak of cyclosporiasis to be associated with drinking water occurred in 1990 among hospital staff in Chicago, Illinois (Karanis et al., 2007) and was linked to a chlorinated water supply, suggesting that C. cayetanensis is resistant to the levels of chlorine used in drinking water treatment. The efficacy of drinking water treatment processes for the removal and/or inactivation of C. cayetanensis has not been directly evaluated (AWWA, 2006). However, physical removal of C. cayetanensis by filtration processes should be at least as effective as for Cryptosporidium, given that C. cayetanensis oocysts are larger (8-10 μm).

Entamoeba histolytica is an obligate parasite that affects humans and other primates. Humans are the only reservoirs of significance, shedding trophozoites, cysts or both in their faeces. Entamoeba histolytica can be transmitted through ingestion of faecally contaminated water and food, but person-to-person contact is thought to be the primary route of transmission. Most infections are asymptomatic, but some can cause serious illness (i.e., amoebiasis). In the case of symptomatic infections, diarrhea, fever and abdominal pain are common. More serious health effects, including chronic colitis, liver abscesses and death, have been reported (Kucik et al., 2004). Entamoeba histolytica cysts are resistant to environmental degradation; however, their survival is primarily a function of temperature. Cysts are rapidly killed by modest heat and freezing (Gillin and Diamond, 1980). Although no waterborne outbreaks of amoebiasis have been reported in Canada, outbreaks have been reported in the United States and elsewhere (Karanis et al., 2007). Outbreaks have occurred when chlorinated water became contaminated with sewage (AWWA, 2006), suggesting that the cysts are resistant to chlorination. A CT value of 1.5 is reported for 1 log reduction at 19°C using ozone (National Research Council, 1980), meaning that E. histolytica is more resistant to ozonation than Giardia but not as resistant as Cryptosporidium. Because of its size (10-20 μm), it should be readily removed by physical removal processes (i.e., filtration). No published information was found regarding UV disinfection.
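
As a short illustration of how a CT value is applied (CT being the disinfectant residual concentration multiplied by contact time, typically expressed in mg·min/L), the sketch below back-calculates the contact time that would be needed at an assumed ozone residual. The residual concentration is a hypothetical value chosen only to show the arithmetic.

```python
# Illustrative use of a CT value (disinfectant concentration x contact time,
# typically mg·min/L). The ozone residual below is a hypothetical value chosen
# only to show the arithmetic; it is not a recommended dose.
ct_required = 1.5       # reported CT for 1 log reduction of E. histolytica at 19°C
ozone_residual = 0.3    # mg/L, assumed residual in the contact chamber
contact_time = ct_required / ozone_residual
print(f"Required contact time at 0.3 mg/L ozone: {contact_time:.1f} minutes")
```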

Blastocystis hominis is a unicellular protozoan that has been associated with diarrhea, nausea, abdominal pain, vomiting and bloating. Although B. hominis was first identified in the early 1900s, considerable uncertainty remains regarding the epidemiology of this organism. It is one of the most common intestinal parasites in humans, with a prevalence of 1.5% to 10% in developed countries and 50% to 60% in developing countries (Stenzel and Boreham, 1996). However, a large percentage of infections result in asymptomatic carriage of the organism, which has led to debate over its pathogenicity. Animals appear to be a significant reservoir of Blastocystis. Studies have shown that people in close contact with animals have a higher risk of infection (Rajah et al., 1999; Eroglu and Koltas, 2010). As with other waterborne enteric pathogens, the life cycle of Blastocystis includes a cyst form, which is responsible for its transmission via the faecal-oral route. The cysts of Blastocystis are 3 to 5 μm in size and have been reported to be resistant to the levels of chlorine used during drinking water treatment (Leelayoova et al., 2004). There is little information available on the removal and/or inactivation of Blastocystis by drinking water treatment processes. As the cyst size of Blastocystis is very similar to that of Cryptosporidium oocysts, drinking water filtration is expected to remove these cysts in a similar manner. No published information was found regarding UV disinfection.

Appendix B: Selected outbreaks related to Giardia and Cryptosporidium

Table B.1 – Selected (a) Giardia and Cryptosporidium outbreaks related to public, semi-public and private drinking water systems in Canada (1977–2001)
Date Location Causative agent Scope of outbreak Attributable causes References
Mar 20 – Apr 26, 2001 North Battleford, SK Cryptosporidium 25 laboratory-confirmed; 5,800–7,100 people estimated to have been affected
  • - Vulnerability of the North Saskatchewan River to contamination by Cryptosporidium in runoff (the drinking water intake is only 3.5 km downstream of the treated sewage outfall)
  • - Poor treatment performance (including ineffective turbidity removal)
Stirling et al., 2001
Jun 2 – Jul 12, 1996 Cranbrook, BC Cryptosporidium 29 laboratory-confirmed; 107 clinical; estimated 2,000 cases
  • - Livestock manure contamination of the unfiltered, chlorinated water supply
BCCDC, 1996;
Ong et al., 1997, 1999
Feb – May 1994 Temagami, ON Giardia 26 laboratory-confirmed; between 160 and 330 clinical
  • - Contamination from human sewage due to premature thaw in February and waste management problems
  • - Contamination from beaver
  • - Poor filtration performance
  • - Inadequate chlorine disinfection
Wallis et al., 1998
Feb – May 1993 Kitchener-Waterloo, ON Cryptosporidium 143 laboratory-confirmed
  • - Note: No epidemiological evidence reported to establish an association with drinking water
  • - New surface water treatment plant brought online in June 1992
  • - Spring run-off (increased turbidity)
  • - Recycling of filter backwash supernatant to the filters (concentrated oocysts from raw water), which challenged fine-particle removal in the treatment process
  • - Cryptosporidium detected in induced infiltration wells
Pett et al., 1993;
Welker et al., 1994
Jan – Apr 1990 Creston & Erikson, BC Giardia 124 laboratory-confirmed
  • - Unfiltered, unchlorinated surface water
  • - Beavers excreting large numbers of Giardia cysts into water supply
Isaac-Renton et al., 1993, 1994
Jun – Aug, Nov 1986 Penticton, BC Giardia 362 laboratory-confirmed; 3,100 estimated cases
  • - Unfiltered surface water supply using only chlorine for inactivation of Giardia
Moorehead et al., 1990
Nov – Dec 1985 Creston, BC Giardia 83 laboratory-confirmed
  • - Unfiltered, unchlorinated surface water
Isaac-Renton et al., 1993
Table B.1 footnotes
(a) These represent well-documented outbreaks.

Table B.2 – Selected (a) outbreaks of Giardia and Cryptosporidium worldwide (1954–2016)
Cause of outbreak Contributing factors References
Groundwater contamination
  • - sewage or livestock faecal contamination, often following heavy rainfall
  • - contaminated spring water
  • - previous identification of the well as potentially at risk of contamination
  • - improper set-back distances from septic systems and surface water sources
  • - improper classification as groundwater when the source should have been classified as GUDI
Brady & Wolfe (1974); CDC (1977a); Craun (1979); Herwaldt et al., 1992; Moore et al., 1993; Kramer et al., 1996; Solo-Gabriele & Neumeister, 1996; Craun et al., 1998; Furtado et al., 1998; Levy et al., 1998; Simth & Rose, 1990; Barwick et al., 2000; CDR, 2001; Howe et al., 2002; Lee et al., 2002; Liang et al., 2006; HPSC, 2007
Surface water contamination
  • - heavy rainfall preceding the contamination event
  • - sewer pipes leaking into the water source
  • - surface waters with chlorination as the only treatment, or no treatment
  • - sewage discharges, with or without aggravating factors (e.g., reduced dilution during the dry season, high levels of oocysts in the sewage)
  • - use of mountain streams to supplement water supplies with minimal or no treatment
Meyer, 1973; CDC, 1975; Barbour et al., 1976; CDC, 1977a,b; CDC, 1978; Craun, 1979; Hopkins et al., 1985; Wallis, 1987; Bryck et al., 1988; Erlandsen & Bemrick, 1988; Kent et al., 1988; McClure & McKenzie, 1988; Birkhead et al., 1989; Moorehead et al., 1990; Richardson et al., 1991; Herwaldt et al., 1992; Isaac-Renton et al., 1994; CDR, 1996; Solo-Gabriele & Neumeister, 1996; Wallis et al., 1996; Craun et al., 1998; Furtado et al., 1998; Levy et al., 1998; Hunter, 1999; Barwick et al., 2000; Wallis et al., 2001; Jennings & Rhatigan, 2002; Nygård et al., 2006; Nichols et al., 2006; Pelly et al., 2007; Mason et al., 2010; Chalmers et al., 2010; Widerström et al., 2014; Andersson et al., 2014; Guzman-Herrador et al., 2015; DeSilva et al., 2016
Inadequate treatment (b)
  • - treatment plant malfunctions, including pump failures and inadequate chlorination
  • - filter breakthrough, increased turbidity post-filters
  • - inadequate contact time due to high demand periods
  • - treatment deficiencies after maintenance work
  • - high turbidity waters with aging treatment works
  • - filtered and unfiltered water mixed to meet demand
Vernon, 1973; CDC, 1977b; CDC, 1978; Kirner et al., 1978; Craun, 1979; CDC, 1980; Hopkins et al., 1985; Navin et al., 1985; Erlandsen & Bemrick, 1988; Birkhead et al., 1989; Herwaldt et al., 1992; Moore et al., 1993; MacKenzie et al., 1994; Kramer et al., 1996; Solo-Gabriele & Neumeister, 1996; Pozio et al., 1997; Craun et al., 1998; Furtado et al., 1998; Levy et al., 1998; Perez et al., 2000; CDR, 2001; Stirling et al., 2001; Lee et al., 2002; Webber et al., 2002; O'Toole et al., 2004;
Distribution system issues
  • - cross-connections and backflow issues
  • - seepage from septic tanks/sewage into the distribution system
  • - animal carcass in the drinking water at treatment works
  • - deliberate contamination
Starko et al., 1980; Weniger et al., 1983; Jephcott et al., 1986; Neringer et al., 1987; Smith et al., 1989; Bell et al., 1991; Moore et al., 1993; Kramer et al., 1996; Kuroki et al., 1996; Craun et al., 1998; Barwick et al., 2000; Glaberman et al., 2002; Lee et al., 2002; Dalle et al., 2003; Chalmers, 2012; Hilborn et al., 2013; Moon et al., 2013; Braeye et al., 2015
Table B.2 footnotes
(a) Based on reviews by Schuster et al. (2005), Karanis et al. (2007), Moreira and Bondelind (2017), and Efstratiou et al. (2017).
(b) Includes only outbreaks where a problem was reported with the treatment system in place. Systems where no treatment was applied (e.g., unfiltered systems) are included elsewhere.

Appendix C: List of acronyms

AIDS
acquired immunodeficiency syndrome
ANSI
American National Standards Institute
CT
concentration × time
DALY
disability-adjusted life year
DAPI
4′,6-diamidino-2-phenylindole
DBP
disinfection by-product
DIC
differential interference contrast
DNA
deoxyribonucleic acid
EPA
Environmental Protection Agency (U.S.)
FACS
fluorescence-activated cell sorting
FISH
fluorescence in situ hybridization
GUDI
groundwater under the direct influence of surface water
HCT-8
human ileocaecal adenocarcinoma (cell line)
HIV
human immunodeficiency virus
ID50
median infective dose
IFA
immunofluorescence assay
IFN
interferon (e.g., IFN-γ)
Ig
immunoglobulin (e.g., IgA, IgG, IgM)
IL
interleukin (e.g., IL-12)
IMS
immunomagnetic separation
LT1ESWTR
Long Term 1 Enhanced Surface Water Treatment Rule (U.S.)
LT2ESWTR
Long Term 2 Enhanced Surface Water Treatment Rule (U.S.)
LYL
life years lost
MPA
microscopic particulate analysis
mRNA
messenger ribonucleic acid
NOM
natural organic matter
NSF
NSF International
NTU
nephelometric turbidity unit
PCR
polymerase chain reaction
PI
propidium iodide
QMRA
quantitative microbial risk assessment
qPCR
quantitative polymerase chain reaction
RBF
riverbank filtration
RFLP
restriction fragment length polymorphism
rRNA
ribosomal ribonucleic acid
RO
reverse osmosis
RT-PCR
reverse transcriptase polymerase chain reaction
SCC
Standards Council of Canada
T10
the detention time at which 90% of the water passing through the unit is retained within the basin
UF
ultrafiltration
UV
ultraviolet
WHO
World Health Organization
YLD
years lived with disability
