Guidelines for Canadian Drinking Water Quality: Guideline Technical Document – Turbidity

Part II. Science and Technical Considerations (continued)

7.0 Relationship between turbidity and water quality characteristics

7.1 Microbiological characteristics

The microbiological quality of water can be significantly affected by turbidity. Microorganisms are themselves considered particles, occupying size categories ranging from small particles to colloidal material (MWH, 2005). In the environment, microorganisms become intimately associated with soil and waste particles, either settling with them or attaching directly to particle surfaces. They are subsequently transported to waters through the same mechanisms as particles, including both environmental and treatment system mechanisms (see Table 2 above). As a result, turbidity measurements have become a useful indicator of water quality in raw water, treated water and the distribution system. Information on the rationale behind log removal credits can be found in Appendix B.

7.1.1  Relationship between turbidity and the presence of microorganisms

Surface waters may experience events leading to sudden fluctuations in turbidity and pathogen concentrations. Some of these events (e.g., heavy precipitation, wastewater discharges, flooding) may be unpredictable, while others (e.g., spring snowmelt, onset of a recognized rainy season) are seasonal and have some degree of predictability. It has been well documented that rainfall-mediated runoff can lead to significant increases in turbidity, faecal indicator and pathogen concentrations (Ferguson et al., 1996; Atherholt et al., 1998; Kistemann et al., 2002; Dorner et al., 2007). Atherholt et al. (1998) observed that rainfall-related increases in concentrations of Giardia cysts and Cryptosporidium oocysts in a New Jersey river watershed were significantly correlated with turbidity measurements. Curriero et al. (2001) and Naumova et al. (2005) reported evidence of statistically significant associations between elevated or extreme precipitation events and waterborne disease outbreaks in the United States and the United Kingdom, respectively. A general link between turbidity and pathogens may exist, such that where weather-influenced sources of contamination are known to exist in a watershed, a spike in raw water turbidity may serve as a warning of an increased pathogen challenge. However, low turbidity in surface waters does not automatically indicate the absence of pathogens. Research has not provided evidence of a consistent, direct correlation between surface water turbidity and pathogen concentrations. The strength of any association between the two depends on area-specific factors, such as the nature and degree of the pathogen (faecal) inputs and the soil types involved. Dorner et al. (2007) observed weak correlations between wet weather pathogen concentrations and turbidity measurements in the Grand River watershed (Ontario), concluding that the findings resulted from regional variations in pathogen sources. St. Pierre et al. (2009) similarly did not find strong associations between turbidity values and the prevalence and concentrations of Campylobacter spp., thermotolerant coliforms and E. coli in river and stream water samples from Quebec's Eastern Townships.

In groundwater, turbidity is generally low and is often associated with inorganic metal oxides, geological material such as clay particles, and macromolecular components of dissolved organic carbon such as humic acids (Puls et al., 1991; Backhus et al., 1993; WHO, 2011). In some cases, elevated turbidity observed during sampling of groundwater wells has been associated with an elevated clay mineral content (Puls et al., 1993) and with increases in metal concentrations, such as iron and aluminum (Abbott, 2007). The growth of iron and sulfur bacteria can also cause turbidity increases through the deposition of significant amounts of iron and sulfur precipitates and the production of bacterial slimes (APHA et al., 2012). More recently, a study of bedrock monitoring wells found that faecal indicator bacteria were present in samples collected at the onset of sampling; in this study, turbidity between 1 and 3 NTU corresponded with the presence of faecal coliforms and E. coli (Kozuskanich et al., 2011). Overall, there are insufficient scientific data to suggest a specific turbidity value that provides an indication of the presence of pathogens in groundwater. However, as a best practice, turbidity in drinking water systems supplied by groundwater should generally be below 1.0 NTU to ensure that turbidity does not interfere with disinfection and distribution of the water supply.

Treatment and disinfection methods are capable of producing drinking water with a negligible risk of disease transmission. Water of low turbidity is generally a good indication of treatment effectiveness, but specific turbidity values do not reflect the presence or absence of pathogens. A survey of Canadian municipal drinking water supplies for Giardia and Cryptosporidium (Wallis et al., 1993) indicated that cysts and oocysts could be detected in low numbers in treated water from 9 out of 10 municipalities, including in waters with turbidity measurements below the HBTL of 0.3 NTU. Positive samples were less common from municipalities that used filtration. Keswick et al. (1984) reported on the detection of enteric viruses in conventionally treated drinking water from a heavily polluted river source in the United States. Four of nine dry season samples meeting the turbidity standard that existed at that time (1.0 NTU), as well as the standards for total coliforms and residual chlorine, contained culturable virus (rotavirus and enteroviruses). None of the 14 rainy season samples met the turbidity, total coliform or residual chlorine standards, and all contained culturable virus. A collaborative survey of viruses in drinking water from three major urban areas in Canada (Montreal, Ottawa and Toronto) was conducted by Payment et al. (1984). Viruses were detected in 37-72% of raw water samples, but none were detected in finished water samples, all of which met the turbidity (1.0 NTU), bacteriological and residual chlorine limits.

Increased distribution system turbidity can be indicative of microbiological problems such as intrusion, biofilm detachment or the release of deposits. Several studies have documented correlations between increasing levels of plate count microorganisms and increased turbidity (Snead et al., 1980; Goshko et al., 1983; Haas et al., 1983). An increase in HPC bacteria can indicate a breakdown in a treatment barrier, deterioration in water quality or post-treatment contamination. Goshko et al. (1983) noted a positive correlation between turbidity and HPC in water samples collected from the distribution systems of several small community supplies. Similarly, Power and Nagy (1999) reported a positive correlation between increased turbidity and increased total coliform and HPC counts in a distribution system experiencing bacterial regrowth. Changes in pressure or flow can also contribute to the release of soft deposits, resuspension of sediments, detachment of biofilms or intrusion of external contaminants, which can increase turbidity and bacteria levels in the distribution system (LeChevallier et al., 2003; Lehtola et al., 2004, 2006). Turbidity measurements can be used as an indication of changes in distribution system conditions, but should not automatically be interpreted as an indication of an unsafe water supply. An early investigation of distribution system turbidity-bacteria relationships found no correlation between turbidity levels above or below 1.0 NTU and the frequency of coliform detection (Reilly and Kippin, 1983). In a study by Lehtola et al. (2004), soft deposits influencing turbidity and bacteria levels contained high numbers of HPC bacteria, but were negative when tested for coliform bacteria and Norwalk-like viruses.

Turbidity results do not provide an indication of water safety, but they are a useful indicator of the need to further investigate the cause of the turbidity. Operators should use turbidity measurements as a tool for monitoring plant operations and distribution system conditions. Other useful tests include those for coliform bacteria, HPC and disinfectant residuals.

7.1.2  Relationship between turbidity reduction and microorganism removal in treated water

Filtration is an important barrier in the production of safe drinking water. Depending on the type of filtration technology used, protozoa, bacteria, viruses and particles are removed by porous media through attachment to the filter grains, physical straining or biological mechanisms. Physical removal is particularly important for the enteric protozoa Giardia and Cryptosporidium and for enteric viruses. Cryptosporidium oocysts are not effectively inactivated by chlorine disinfection, and inactivation of Giardia cysts with free chlorine requires high concentrations or long contact times (Health Canada, 2012). Filtration is the most practical method for achieving high removals of these organisms. Enteric viruses can pass through most filtration barriers relatively easily because of their small size. Chemical coagulants are often used to produce larger virus-adsorbed flocs that can be more easily removed through filtration.

There is no precise relationship between the magnitude of turbidity reduction and the removal of pathogens. Some authors have documented positive and statistically significant correlations between turbidity reduction and parasite removal with rapid granular filtration methods (LeChevallier and Norton, 1992; Nieminski and Ongerth, 1995; Dugan et al., 2001). However, the bulk of the data indicate that the relationship is not proportional (i.e., not one-to-one). Dugan et al. (2002) reported that in optimized conventional filter challenge runs, turbidity reduction was consistently lower than oocyst removal. Huck et al. (2002) observed that two pilot plants of comparable design, each optimized to yield similarly low effluent turbidity (less than 0.1 NTU), demonstrated a 2-log difference in their Cryptosporidium removal capabilities. Patania et al. (1995) commented that turbidity reduction, particle removal and cyst or oocyst removal are each largely dependent upon their individual concentrations in source waters. As these can vary considerably from source to source, a universal relationship should not be expected (Patania et al., 1995). This is further complicated by the fact that coagulation/flocculation fundamentally changes particle numbers and characteristics, making it difficult to track the reduction of specific particles.
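For reference, the log removals discussed throughout this section are base-10 logarithms of the ratio of influent to effluent concentration. The short worked example below uses illustrative concentrations, not data from the studies cited above, to show why even a 2-log difference in removal corresponds to a large difference in the number of organisms passing a filter.

\[
\mathrm{log\ removal} = \log_{10}\!\left(\frac{C_{\mathrm{in}}}{C_{\mathrm{out}}}\right)
\]
% Hypothetical illustration: with an influent concentration of 10^5 oocysts/100 L,
% an effluent of 1 oocyst/100 L corresponds to log10(10^5/1) = 5-log removal, while
% an effluent of 100 oocysts/100 L corresponds to log10(10^5/10^2) = 3-log removal.
% The 2-log difference (comparable to that reported by Huck et al., 2002) thus
% represents a 100-fold difference in the concentration of oocysts passing the
% filter, even when both filters produce similarly low effluent turbidity.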

Inherent differences between treatment technologies and their operation can also contribute to the observed differences between turbidity and pathogen removal. For example, as discussed for slow sand filtration, aside from the differences in media and filtration rate, the schmutzdecke presents a complex biological community that is capable of degrading certain types of organic matter (MWH, 2005). Further, it has been suspected that in this biologically active region, predation of bacteria by certain protozoa (Lloyd, 1996; Hijnen et al., 2004; Unger and Collins, 2008) and predation of protozoan (oo)cysts by zooplankton (Bichai et al., 2009) may play a role in the removal of these organisms. It should similarly be recognized that the achievable water turbidity and the associated removal of different pathogen types also depend on the pretreatment and the filtration technology used.

In general, it has been demonstrated that good removals of Giardia cysts and Cryptosporidium oocysts can be achieved when water of low turbidity is produced. Various filtration technologies can achieve good turbidity reduction and good removal of protozoan (oo)cysts.

7.1.2.1 Conventional filtration

A considerable amount of research has been conducted on the filtered water turbidity achievable by conventional filtration and the attainable physical removal of pathogens. Published full-scale study data (LeChevallier et al., 1991; LeChevallier and Norton, 1992; Kelley et al., 1995; Nieminski and Ongerth, 1995; McTigue et al., 1998; Nieminski and Bellamy, 2000) and pilot-scale study data (Logsdon et al., 1985; Patania et al., 1995; McTigue et al., 1998; Dugan et al., 2001; Harrington et al., 2001; Emelko et al., 2003; Huck et al., 2004; Assavasilavasukul et al., 2008) indicate that conventional filtration can achieve Giardia cyst and Cryptosporidium oocyst removals from greater than 1.4 log to greater than 5 log and virus removals from 1.6 log to greater than 3 log.

Data from pilot-scale studies conducted by Patania et al. (1995) showed that a 5-log Cryptosporidium removal could be achieved with conventional filtration systems optimized for turbidity reduction and particle removal. Huck et al. (2002) and Emelko et al. (2003) similarly showed greater than 5-log Cryptosporidium removal for pilot-scale conventional systems under optimized conditions and stable filtration (filter effluent turbidities below 0.1 NTU). McTigue et al. (1998) conducted a survey of 100 full-scale water treatment plants in the United States with the purpose of developing nationwide estimates of treatment capabilities and pathogen removal. The median Cryptosporidium removal value reported for the 100 plants was 1.7 log. However, the authors commented that an estimate of the attainable Cryptosporidium removal was limited by the typically low raw water concentrations encountered in the study. Turbidity measurements were collected at only 52 of the plants participating in the survey, with the median turbidity for any filter run at any plant never exceeding 0.2 NTU.

A review of the available information on Cryptosporidium removals through granular media filtration was conducted by Emelko et al. (2005). It was concluded that the body of data taken as a whole suggests that, when optimized for turbidity removal, granular media filters can achieve Cryptosporidium removals of close to 3 log or better.

7.1.2.2 Direct filtration

Data from full-scale (Nieminski and Ongerth, 1995) and pilot-scale (West et al., 1994; Nieminski and Ongerth, 1995; Ongerth and Pecoraro, 1995; Patania et al., 1995; Brown and Emelko, 2009) investigations have shown that optimized direct filtration can achieve removals of Giardia and Cryptosporidium (oo)cysts ranging from 2 log to greater than 4 log. In full- and pilot-scale experiments, Nieminski and Ongerth (1995) observed removals of 3 log for Giardia cysts and close to 3 log for Cryptosporidium oocysts with filter effluent turbidities of 0.1-0.2 NTU. The authors commented that direct filtration, when optimized, can provide a degree of control of Giardia and Cryptosporidium comparable to that seen with conventional filtration. Brown and Emelko (2009) demonstrated that pilot-scale in-line filtration preceded by stable, optimized coagulation (less than 0.1 NTU) was capable of achieving 4-log median removal of Cryptosporidium oocysts. Suboptimal coagulation (50% dosage, 0.2-0.3 NTU) reduced median oocyst removal by 2-3 log (Brown and Emelko, 2009). In contrast, Patania et al. (1995) found that removals of Giardia and Cryptosporidium by direct filtration were 0.8-1.8 log lower than those attained by conventional filtration during pilot-scale studies. Part of this difference may reflect the absence of a sedimentation step in direct filtration: available data indicate that sedimentation alone can accomplish microorganism removals on the order of 0.5-1.0 log for Cryptosporidium (Kelley et al., 1995; Edzwald and Kelley, 1998; Dugan et al., 2001) and 0.5 to greater than 3 log for viruses (Rao et al., 1988; Payment and Franco, 1993; Havelaar et al., 1995).

7.1.2.3 Slow sand filtration

Information on achievable filtered water turbidity corresponding to pathogen removals has been lacking. Published pilot-scale studies have indicated average physical removal capabilities for well-operated slow sand filters ranging from greater than 3 log to greater than 4 log for both Giardia cysts and Cryptosporidium oocysts (Bellamy et al., 1985a; Schuler and Ghosh, 1991; Hall et al., 1994; Timms et al., 1995; Hijnen et al., 2007). Pilot-scale slow sand filtration removals of 1-2 log for enteroviruses (Slade, 1978) and less than 1 to 2.2 log for MS2 (Anderson et al., 2009) have been reported. Schuler and Ghosh (1991) noted that slow sand filtration can achieve greater than 3-log removal of Cryptosporidium and Giardia while concurrently attaining a filter effluent turbidity of 0.3 NTU. In a series of slow sand filtration pilot-scale experiments, Hall et al. (1994) observed Cryptosporidium removals of 2.8-4.3 log (mean 3.8 log) with filtered water turbidity values ranging from 0.2 to 0.4 NTU in three of the four filter trials. In another pilot-scale investigation, Bellamy et al. (1985a) observed greater than 99.9% (3-log) removal of Giardia cysts, but only a 27-39% reduction in turbidity. The low turbidity removal was attributed to fine clay particles native to the source water used in the test.

Recent Canadian studies have investigated the capability of multi-stage slow sand filters to remove turbidity (Anderson et al., 2006), Cryptosporidium and Giardia (DeLoyde et al., 2006) and viruses (MS2) (Anderson et al., 2009). Pilot-scale systems consisting of roughing filtration and two slow sand filters in series were used in the experiments. In the turbidity experiments (Anderson et al., 2006), filtered water turbidities achieved in the filter effluents (SSF1, SSF2) were less than 0.3 NTU in 90.4% (SSF1) and 98.7% (SSF2) of the measurements and less than 1.0 NTU in 99% of the measurements (SSF1 and SSF2). Raw water turbidity values observed during the study period were less than 5 NTU in 61% of the measurements; however, spikes exceeding 20 NTU and occasionally reaching 80 NTU were noted (Anderson et al., 2006). During the protozoan challenge testing (DeLoyde et al., 2006), reported log removals ranged from 2.0 log to greater than 5.2 log for Cryptosporidium and from greater than 2.4 log to greater than 4.9 log for Giardia. The authors noted that removals increased with increasing filter maturation and that Giardia cysts were removed to a greater degree than Cryptosporidium oocysts. Lastly, in MS2 removal experiments, Anderson et al. (2009) observed average log removals ranging from 0.1 to 0.2 log for the roughing filters and from 0.2 to 2.2 log for the slow sand filters. Slow sand removal of MS2 was shown to be less effective in cold water at a high filtration rate.

7.1.2.4 Diatomaceous earth filtration

A small number of pilot-scale studies have evaluated the effectiveness of diatomaceous earth filtration for the removal of Cryptosporidium and Giardia. Logsdon et al. (1981) reported 2- to 4-log reductions of Giardia cysts, with filter effluent turbidity values ranging from 0.31 to 0.76 NTU. Schuler and Ghosh (1990) found greater than 3-log reductions of both Cryptosporidium and Giardia, with filter effluent turbidity values consistently below 0.2 NTU. The authors observed higher log reductions for Giardia than for Cryptosporidium and noted that Cryptosporidium removals were further improved when a chemical coagulant (alum) was added. Ongerth and Hutton (1997) observed average removals of greater than 5 log to greater than 6 log during testing of Cryptosporidium removal provided by three different grades of diatomaceous earth (median pore sizes of 5.0, 7.0 and 13.0 µm). The efficiency of Cryptosporidium filtration was shown to improve with finer grades (smaller median pore sizes and lower permeability) of diatomaceous earth (Ongerth and Hutton, 1997). An indication of the corresponding turbidity values was not provided.

Ongerth and Hutton (2001) demonstrated average removals of Cryptosporidium greater than 6 log at filtration rates of 2.5 and 5 m/h, with effluent turbidity ranging from 0.06 to 0.12 NTU. This study also found that pressure vibrations caused by an undampened peristaltic feed pump increased the effluent turbidity to up to 0.4 NTU. Under these conditions, Cryptosporidium removals were still in the range of 5.0-5.8 log. Although particulate removal using diatomaceous earth filtration generally occurs through straining, several authors have demonstrated that Cryptosporidium and Giardia reductions can still be greater than 3 log even at effluent turbidities above 0.4 NTU (Logsdon et al., 1981; Ongerth and Hutton, 2001). Logsdon et al. (1981) attributed the observed combination of high cyst reduction and low turbidity reduction to very small particles that could pass through the filter cake while the larger cysts were strained out by the diatomaceous earth.

7.1.2.5 Bag and cartridge filtration

Bag and cartridge filters (pore size typically 0.2-10 µm) are capable of removing some protozoan cysts and oocysts, but bacteria and viruses are small enough to pass through. Published studies for these technologies have reported Cryptosporidium removals in the range of 0.5-3.6 log (U.S. EPA, 2006b).

7.1.2.6 Membrane filtration

Microfiltration (pore size: 0.1-10 µm) and ultrafiltration (pore size: 0.01-0.1 µm) represent the most commonly used membrane processes. Microfiltration membranes are effective in removing protozoa and most bacteria, but not viruses, unless preceded by a coagulation step. Published reports have indicated log removal capabilities on the order of greater than 4.0 log to greater than 6.0 log for Cryptosporidium and Giardia, 4.0 log to greater than 6.0 log for E. coli and 0.0-2.0 log for viruses (MS2) (Jacangelo et al., 1995; NSF, 2000a, 2002, 2003). Ultrafiltration membranes have pore sizes small enough to remove protozoa, bacteria and viruses. Reported log removal efficiencies have ranged from greater than 5.0 log to greater than 7.0 log for Cryptosporidium and Giardia, from 2 to greater than 6 log for viruses (MS2) and up to 7.0-8.0 log for E. coli (Jacangelo et al., 1991, 1995; NSF, 2000b,c).

Nanofiltration membranes (pore size: 0.5-2 nm) and reverse osmosis membranes are generally considered non-porous and are capable of rejecting particle sizes much smaller than those removed by ultrafiltration membranes. Comparatively fewer studies on the microorganism removal capabilities of these two technologies have been published. Lovins et al. (1999) reported variable results during microorganism challenge testing of two different nanofiltration technologies, one possessing a cellulose acetate membrane and the other a composite thin-film membrane. The composite thin-film membrane systems were shown to produce greater than 4-5 log removals of bacteria (Clostridium perfringens spores), protozoa (Cryptosporidium oocysts, Giardia cysts) and viruses (MS2), whereas the cellulose acetate membrane accomplished removals of less than 2 log for each of the three groups. For reverse osmosis membranes, Mi et al. (2004) reported bacteriophage MS2 removals of greater than 5 log, whereas Gagliardo et al. (1997a,b) observed removals of greater than 4.8 log for both Cryptosporidium and Giardia and from 2 log to greater than 5 log for viruses (MS2) during challenge testing experiments. Lozier et al. (2003) demonstrated from greater than 6-log to complete removal of MS2 using reverse osmosis membranes and between 3 and 5.5 log removal using nanofiltration membranes. Owen (1999) commented that imperfections in manufacturing and the inherent distribution of pore sizes for all separation processes can contribute to less than complete removal of microorganisms.

7.1.3  Optimized filtration

Optimizing treatment conditions for turbidity reduction and particle removal also optimizes cyst and oocyst removal (Hall et al., 1994; Ongerth and Pecoraro, 1995; Patania et al., 1995; Dugan et al., 2001; Huck et al., 2002). In a pilot study on the effects of coagulation conditions on Cryptosporidium removal by conventional treatment, Dugan et al. (2001) observed that log removals under optimized conditions (mean turbidity 0.08 NTU, range 0.02-0.15 NTU) were several orders of magnitude higher than those under suboptimal conditions (mean turbidity 0.31 NTU, range 0.13-0.66 NTU). Huck et al. (2001, 2002) examined the effects of suboptimal coagulation and end-of-run conditions at two conventional pilot plants. Suboptimal coagulation increased the mean turbidity from 0.05 NTU to 0.17 NTU, with a corresponding decrease in Cryptosporidium removal of approximately 2 log, to less than 1-log removal. Similarly, under end-of-run conditions, Cryptosporidium removals decreased from an average of 5.5 log to 2.0 log when the mean turbidity increased from 0.03 NTU to 0.21 NTU. Ongerth and Pecoraro (1995) reported that direct filtration removals of Giardia and Cryptosporidium declined by 1.5 log in a filtration run that was intentionally conducted under suboptimal coagulation conditions. The average filter effluent turbidity for the suboptimal run was 0.36 NTU, compared with 0.02-0.09 NTU for optimized runs. The filter ripening period has also been identified as a period of potentially increased risk of Cryptosporidium passage into filtered water. Amburgey et al. (2005) identified a general trend towards increased Cryptosporidium passage during the ripening period that corresponded with elevated turbidity in the filter effluent. Pilot studies also found that Cryptosporidium in backwash remnant water may pass through filters during ripening, resulting in up to a 1-log decrease in removal.

Filtration systems should be operated to reduce turbidity levels to the lowest levels possible. Water suppliers should strive to achieve a treated water turbidity target of 0.1 NTU at all times. Pilot studies with conventional treatment have demonstrated that removal of Giardia cysts and Cryptosporidium oocysts is maximized when treatment is optimized to meet a filter effluent turbidity goal of 0.1 NTU or less (Nieminski and Ongerth, 1995; Patania et al., 1995; Huck et al., 2002). This goal is consistent with the findings of the assessment behind the U.S. EPA's LT2ESWTR and with recommendations made by the PSW and the Programme d'excellence en eau potable (Program for Excellence in Drinking Water). The LT2ESWTR provides guidance on turbidity levels achieving the greatest Cryptosporidium log removals and specifies additional credit for public supplies meeting a filter effluent turbidity standard of 0.15 NTU with no individual filter measurement greater than 0.3 NTU in two consecutive measurements 15 minutes apart (U.S. EPA, 2006b). Similarly, among the criteria used by the PSW as representing excellence in water treatment is a filter effluent turbidity of less than 0.10 NTU in 95% of measurements with a maximum filtered water turbidity of 0.30 NTU.

There is a paucity of data pertaining to the additional pathogen removal achieved by other filtration types (e.g., slow sand, diatomaceous earth) when operating at very low turbidity levels, as these processes remove particles by mechanisms different from those of conventional filtration. In general, optimizing processes to reduce filter effluent turbidity as low as possible will maximize pathogen removal. Schuler and Ghosh (1990) observed that diatomaceous earth removals of oocysts and turbidity were further improved when a chemical coagulant (alum) was added. In experiments with a coarse grade of diatomaceous earth (median particle size 26 µm), the addition of alum at 0.01-0.02 g/g diatomaceous earth improved filter effluent turbidities from 0.16-0.20 NTU to 0.02-0.06 NTU and reduced detectable oocysts per 100 gallons (379 L) from 1-6 oocysts to 1 oocyst or fewer.

Increases in filter effluent turbidity during filter operations can signal the potential for the passage of unwanted organisms. Conventional filtration pilot studies have demonstrated that during periods of coagulation disruption or filter breakthrough, moderate increases in turbidity can be accompanied by increased passage of cysts or oocysts (Logsdon et al., 1985; Patania et al., 1995). Patania et al. (1995) reported that a rise in filter effluent turbidity from below 0.1 NTU to between 0.1 and 0.3 NTU corresponded to up to a 1-log decline in oocyst removal capability. Emelko et al. (2003) observed that during end-of-run and early breakthrough periods, removal of seeded oocysts declined by 2-3 log while turbidity levels were increasing but still below 0.1 NTU. It is evident from these studies that, even at turbidity levels below the HBTL, changes in effluent turbidity are an important indicator of the potential breakthrough of pathogenic organisms.

For membrane filtration, optimization is not focussed on reducing turbidity in the filter effluent; rather, the focus is on factors such as reducing fouling rates or trans-membrane pressure so that water production is optimized. However, ensuring that adequate pathogen removal occurs during filter operation is linked with verifying that membranes are intact, using a combination of indirect and direct integrity monitoring.

In general, it is difficult to link various degrees of integrity loss in membrane filters with specific reductions in pathogen log removals (U.S. EPA, 2001b). Some studies have, however, demonstrated that minor breaches of integrity can have significant impacts on pathogen removal. Jacangelo et al. (1997) examined the impact of artificially breached ultrafiltration and microfiltration pilot plants on the removal of protozoa. The results indicated that when one fibre was cut per membrane module, removals of Giardia and Cryptosporidium decreased from greater than 6 log to between 1.2 and 4.2 log. The authors noted that membranes with the highest number of total fibres per unit and a transverse flow mode were the least affected by a membrane integrity breach. The corresponding turbidity increases following the membrane integrity breaches also varied with each membrane unit. In some cases, significantly lower log removals (approximately 2.0-2.5 log) of Giardia and Cryptosporidium were observed when turbidity increased from approximately 0.03 NTU to 0.07 NTU. Kitis et al. (2003) studied the impact of compromised reverse osmosis and nanofiltration units on the removal of MS2. The results indicated that log removals decreased from greater than 6.0 for the intact membranes to 2.9 and less than 2.0 for reverse osmosis and nanofiltration, respectively.
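To put such changes in perspective, the log removal value (LRV) can be related to the fraction of organisms passing the membrane. The calculation below is a simple illustration using round numbers consistent with those reported above; it is not taken from the cited studies.

\[
\frac{C_{\mathrm{out}}}{C_{\mathrm{in}}} = 10^{-\mathrm{LRV}}
\]
% For an intact membrane with LRV = 6, about 1 organism in 10^6 passes (10^-6).
% If a breach reduces the LRV to 1.2, the fraction passing rises to 10^-1.2,
% roughly 0.06 (about 6%), i.e., on the order of a 60 000-fold increase in the
% number of organisms reaching the filtrate for the same challenge concentration.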

Since it is difficult to quantify the microbial risk associated with various levels of membrane integrity loss, utilities should focus on the immediate detection of any breach in membrane integrity as a way to minimize the potential for breakthrough of pathogenic organisms (U.S. EPA, 2001b).

7.1.4  Log removal credits

A table outlining the average potential removal credits estimated for Giardia, Cryptosporidium and viruses when treated water meets the turbidity HBTL specified in this document can be found in Appendix B.

7.1.5  Effect of turbidity on disinfection

Disinfection of drinking water supplies requires contact of the disinfectant with microorganisms at an appropriate concentration and for an appropriate contact time. If there is interference with contact, then the effectiveness of inactivation will be reduced. In some cases, this reduction can be significant.

Particulate matter can protect viruses, bacteria and protozoa from the effects of disinfection either by reducing the transmittance of the disinfectant (e.g., scattering ultraviolet [UV] light or reacting with oxidizing disinfectants) or by shielding microorganisms that have become attached to or enmeshed within the particle surface.

Factors important in influencing particle-organism associations include the type of microorganism, the particle chemical composition and the respective sizes and surface charges of each. Both organic and inorganic sources of turbidity can protect microorganisms from disinfection (WHO, 1984). Organic turbidity sources appear to provide greater interference with chlorine disinfection, whereas particles with high UV absorbance (either organic or inorganic in nature) can provide better protection against UV inactivation. Large particles (greater than several micrometres) can provide protection for bacteria (Berman et al., 1988; Kollu and Örmeci, 2012), whereas very small particles (less than 1 µm) are capable of harbouring viruses (Hejkal et al., 1979).

In studies involving chlorine disinfection, Hoff (1978) reported that disinfection curves for chlorine inactivation of polioviruses adsorbed to bentonite clay (7.1 NTU) or precipitated with aluminum phosphate (5.0 NTU) were similar to those for virus alone (0.15-0.28 NTU) and showed no indication of a protective effect. In contrast, the data showed a pronounced protective effect for virus associated with HEp-2 (human carcinoma) cells at a turbidity of 1.4 NTU. Free virus (0.15 NTU) exposed to chlorine (3.0 mg/L) was reduced by more than 5 log in under 2 minutes, whereas cell-associated virus (1.4 NTU) exposed to chlorine (2.0-3.0 mg/L) showed only a 3-log reduction in virus numbers after 5 minutes. A longer-term experiment conducted by the authors showed that roughly 20 minutes was required for a 5-log inactivation of cell-associated virus by chlorine (3.0 mg/L initial concentration), compared with less than 2 minutes for a 5-log reduction of free virus. The chlorine concentration in the cell-associated suspension declined from an initial concentration of 3.0 mg/L to 1.5 mg/L after 20 minutes. Ohgaki and Mongkonsiri (1990) examined the effects of virus-floc associations on chlorine disinfection efficiency. RNA coliphage Qβ entrapped within alum flocs (turbidity values not indicated) was observed to be noticeably protected from chlorine disinfection. The T99 value (contact time required for 99% inactivation) for flocculated phage (43 seconds) was roughly 3 times that for freely dispersed phage (13 seconds) at a chlorine concentration of 0.4 mg/L. Barbeau et al. (2004) reported that kaolinite clay turbidity at 5 NTU did not have a significant impact on the inactivation of MS2 phage and Bacillus subtilis spores with free chlorine. LeChevallier et al. (1981) studied the efficiency of chlorination in inactivating coliforms in unfiltered water supplies and found a negative correlation with turbidity; a model predicted that an increase in turbidity from 1 NTU to 10 NTU would result in an eight-fold decrease in disinfection efficiency at a fixed chlorine dose.
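The T99 values reported by Ohgaki and Mongkonsiri (1990) can be interpreted using simple first-order (Chick-Watson) inactivation kinetics. The calculation below is an illustrative sketch assuming first-order behaviour and a constant free chlorine residual of 0.4 mg/L; it is not a calculation presented in the original study.

\[
\ln\!\left(\frac{N}{N_0}\right) = -k\,C\,t
\quad\Rightarrow\quad
T_{99} = \frac{\ln(100)}{k\,C}
\]
% Rearranging for the effective rate constant, k = ln(100)/(C x T99):
%   freely dispersed phage: k ~= 4.61 / (0.4 mg/L x 13 s) ~= 0.89 L/(mg*s)
%   floc-associated phage:  k ~= 4.61 / (0.4 mg/L x 43 s) ~= 0.27 L/(mg*s)
% Under these assumptions, floc association lowers the effective inactivation rate
% constant by a factor of about 3.3, i.e., the CT required for a given level of
% inactivation is about 3.3 times higher for the floc-associated phage.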

Chlorine dioxide studies conducted by Scarpino et al. (1979) suggested that adsorption of poliovirus to bentonite had a definite protective effect on inactivation at turbidities greater than 3 NTU and temperatures of 25°C. Virus associated with cells demonstrated no protective effect at turbidities from 1 to 3 NTU and temperatures from 5°C to 25°C.

In a study involving ozone, Sproul et al. (1979) showed that alum and bentonite clay afforded little protection to E. coli, poliovirus or coxsackievirus at 1.0 and 5.0 NTU, whereas faecal material and, in particular, HEp-2 cells did provide protection.

Studies conducted with UV light have investigated the potential for turbidity particles to obstruct light transmission. Batch et al. (2004) reported that when UV absorbance was accounted for in the dose calculation, relatively low turbidity (0.1-0.3 NTU) had no significant effect on UV inactivation of MS2 in finished drinking water samples. Passantino et al. (2004) reported that UV disinfection was effective in inactivating MS2 seeded into unfiltered surface waters with turbidities ranging from 0.1 to 10 NTU, as well as into synthetic waters with 12 NTU montmorillonite clay turbidity and algal concentrations of 42 000 cells per millilitre. The authors did observe that, compared with synthetic waters, natural waters required slightly higher UV doses to maintain the same level of phage inactivation (Passantino et al., 2004). In studying the effects of particles on UV transmission, Christensen and Linden (2003) noted that the mean UV dose delivered in raw water samples decreased by 5-33% when turbidity increased from 1 to 10 NTU. The authors further commented that increases in turbidity due to raw water spikes or filter breakthrough can negatively affect delivery of the UV dose and may compromise disinfection if not properly accounted and adjusted for during operation. Reporting on the results of a research project conducted for the City of Winnipeg, Wobma et al. (2004) noted that sediments from natural source waters caused no significant change in MS2 inactivation or UV reactor performance at turbidity levels up to 4.0 NTU; however, a turbidity level of 13 NTU was reported to interfere with UV disinfection efficiency. As part of the same study, the authors noted that plankton counts affected UV transmission and dose delivery, and that a higher dose was required to achieve the same level of MS2 inactivation as in samples with lower plankton counts.
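As an illustration of how absorbance can be "accounted for in the dose calculation," a standard approach in bench-scale (collimated beam) dosimetry is to average the germicidal intensity over the depth of the water layer using the Beer-Lambert law. The expression and numbers below are a generic sketch of that correction, not values drawn from the studies cited above.

\[
I(z) = I_0 \cdot 10^{-a z},
\qquad
I_{\mathrm{avg}} = \frac{I_0\left(1 - 10^{-aL}\right)}{a\,L\,\ln 10}
\]
% where a is the decadic UV absorption coefficient of the water (cm^-1) and L is
% the depth of the water layer (cm). For example, with a = 0.125 cm^-1 (about 75%
% UV transmittance through 1 cm) and L = 1 cm, I_avg ~= 0.87 I_0, so the delivered
% dose (I_avg x exposure time) is roughly 13% lower than an estimate that ignores
% absorbance. This correction addresses absorbance only; scattering by turbidity
% particles and shielding of enmeshed organisms are separate effects.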

The influence of turbidity particles in shielding microorganisms from UV inactivation has also been studied. It has been suggested that specific steps to enmesh the organisms with the particles (e.g., by encouraging coagulation) are necessary in order to investigate the impact of particle association on disinfection (Templeton et al., 2005). Templeton et al. (2005) observed that humic acid flocs provided a measure of protection of MS2 and T4 phage from UV inactivation, whereas kaolin clay flocs had a negligible effect. Log inactivations for MS2 phage (UV doses of 40 and 80 mJ/cm²) and T4 phage (UV doses of 2 and 7 mJ/cm²) reported for the control samples (no humic acid or coagulant added) were at least 1 log greater than those reported for the humic acid flocculated samples (coagulation with either alum or ferric chloride). Turbidity values in this study were elevated, ranging from 70 to 100 NTU. The authors noted that humic acid particles absorb very strongly in the UV range, whereas kaolin clay particles have been observed to have low UV absorbance. It was therefore concluded that the UV-absorbing content of particles is important for the survival of particle-associated viruses during UV disinfection (Templeton et al., 2005).

Templeton et al. (2007) also investigated the passage of particle-associated phage through a dual-media (anthracite-sand) filter and the effect on UV disinfection. It was reported that humic acid-MS2 flocs reduced UV inactivation by a statistically significant degree, whereas kaolin-MS2 flocs did not. Documented median UV log reductions were on the order of 0.5-1.5 log lower for humic acid flocculated samples than for kaolin flocculated samples across the various filtration cycle stages (ripening, stable operation, end of cycle). Turbidities were less than 0.3 NTU for samples collected during stable operation and between 0.3 and 1.0 NTU for samples collected during ripening and at the end of the filter cycle. The authors also noted that in unfiltered influent samples (4.4-9.4 NTU), UV disinfection of phage in the presence of humic acid flocs was reduced by a statistically significant degree (roughly 0.5 log) compared with particle-free water (Templeton et al., 2005, 2007). Templeton et al. (2008) reviewed the evidence on the impacts of particle-associated viruses on disinfection processes. The authors postulated that the effectiveness of chemical and UV disinfection may be dictated more by the particle type (size, structure, chemical composition) and the disinfectant applied than by the turbidity values themselves.

Liu et al. (2007) reported that natural turbidity particles (i.e., no coagulation or filtration) at 12-32 NTU had little to no impact on UV inactivation of spiked E. coli, whereas the presence of flocs led to significantly lower UV disinfection (an average reduction in inactivation of greater than 1 log). From these findings, the authors cautioned that UV disinfection could potentially be compromised if flocculated particles were to pass through filtration and reach the UV unit. Gao and Hu (2012) similarly reported on the potential for floc particles to interfere with UV disinfection, noting a statistically significant protective effect of particles on enmeshed MS2 phage at turbidity values of 5 NTU.

Work by Mamane-Gravetz and Linden (2004) demonstrated that Bacillus spores within aggregates of montmorillonite clay (alum coagulation) could be protected from UV irradiation to a statistically significant degree compared with non-aggregated spores (no alum). It was also suggested that the protective effects were more pronounced in natural surface waters than in laboratory-simulated water. Differences in UV log inactivations reported between aggregated and non-aggregated spores were 1.4-1.6 log for natural waters and 0.2-0.4 log for simulated waters.

The effects of particles and aggregation under conditions similar to their natural state have also been investigated. Several authors have reported that natural particulate matter and aggregates can lower the rate of UV inactivation, but that this occurs mostly at elevated particle concentrations and is less significant in waters with low turbidity (Amoah et al., 2005; Caron et al., 2007; Liu et al., 2007). The report from a study conducted for the Greater Vancouver Regional District in its Coquitlam watershed indicated that UV light was effective in achieving the target log reduction of greater than 3 log for Cryptosporidium at turbidity levels as high as 50 NTU (Clancy Environmental Consultants, Inc., 2007). The authors noted that the test organisms interacted with the particulate material through passive contact. The data also demonstrated evidence of shielding of MS2 phage and heterotrophic bacteria from UV disinfection, as UV-exposed particles were shown to release viable organisms when subjected to a dissociation procedure. Although the impact on UV inactivation was shown to be minimal, the authors similarly stated that caution would be recommended if a turbidity spike from an organic fraction of suspended material were to occur. Amoah et al. (2005) reported that an increase in turbidity to 20 NTU from naturally occurring particulate matter was associated with reductions in UV inactivation on the order of 0.8 log for Cryptosporidium and 0.4 log for Giardia. The effect of particles at turbidity values of less than 10 NTU was small and difficult to measure. It was noted that, although a high level of inactivation of both parasites was achieved, complete elimination of infectivity was observed in only a small number of exposure trials. The authors concluded that interactions with natural particulate matter could result in reduced UV inactivation, but that the extent of the effect would depend on the characteristics of the turbidity particles native to individual source waters.

Using a synthetic system designed and controlled to simulate natural particle bioflocculation, Kollu and Örmeci (2012) observed that the presence of flocs was a significant factor influencing E. coli inactivation at high UV doses (40 and 80 mJ/cm²) and for larger particles (11 and 25 µm). It was also demonstrated that, even in the absence of particles, self-aggregated E. coli could survive exposure to a UV dose of 80 mJ/cm².

In a study specific to groundwater, Templeton et al. (2006) examined the impact of iron particles on the inactivation of bacteriophages by UV light. Both raw and preserved (amended with ethylenediaminetetraacetic acid to prevent iron precipitation) groundwater samples having a mean iron content of 1.3 mg/L were air oxidized, seeded with either bacteriophage MS2 or T4, and exposed to UV light. The authors observed that turbidity-causing iron oxide particles in the raw sample (2.7 NTU) protected both phages from UV inactivation to a small but statistically significant degree. Log inactivations in the raw samples were 0.2 log lower for MS2 at a UV dose of 40 mJ/cm² and roughly 0.5 log lower for T4 at a UV dose of 2 mJ/cm², compared with the preserved samples (Templeton et al., 2006). The authors commented that the fact that iron particles demonstrated an effect at relatively low turbidity suggests that some types of inorganic particles may be capable of protecting viruses from UV inactivation. It was further noted that iron is a strong absorber of UV light (Templeton et al., 2006).

The potential for compromising disinfection can exist where a spike in turbidity occurs or if a particle aggregate or floc harbouring microbial pathogens were to pass through the system to reach the UV unit (Clancy Environmental Consultants Inc., 2007; Liu et al., 2007). Factors influencing the UV disinfection process include the number, size and chemical composition of particles, the nature and degree of the associated microorganisms and the type of disinfectant being applied (Templeton et al., 2005, 2008; Caron et al., 2007). It is recognized that different particles may differ in their ability to harbour and protect pathogens, and that the characteristics of the native particulate matter can differ considerably between water sources.

A turbidity value of 1.0 NTU for water entering the distribution system is recommended. Historically, 1.0 NTU or less has been recommended to ensure the effectiveness of disinfection (LeChevallier et al., 1981; WHO, 1984). Available data indicate that disinfection technologies can be effective in inactivating microorganisms even when turbidity is present at higher values. It has also been proposed that the shielding of microorganisms by particles is a complex process that is not completely understood and that turbidity alone is not sufficient to explain (Caron et al., 2007). As a result, the current body of evidence is not sufficient to support general conclusions about the turbidity levels that affect disinfectant efficacy. The recommended value of 1.0 NTU represents a risk management decision based on the assessment that it improves treatment and disinfection, is easily achievable and reflects best management practice as part of an overall commitment to risk reduction. Water suppliers are advised to strive for turbidity that is as low as possible at all points in the supply chain.

7.1.6  Effect of turbidity on microbiological testing

The presence of turbidity particles can interfere with the detection and enumeration of bacteriological indicators of water quality (E. coli, total coliforms, HPC bacteria).

When testing for bacteriological indicators using membrane filtration, high turbidity can hinder filtration and possibly prevent the analysis of an appropriate sample volume. Additionally, turbidity particles with multiple bacterial cells attached to their surface disrupt the one cell-one colony principle of bacterial counting: a single colony can develop from a particle carrying many cells, leading to an underestimation of bacterial numbers. Deposits of particulate material on the membrane filter can also interfere with bacterial counts by causing poor colony development (i.e., colonies running together) or by masking desired diagnostic reactions (e.g., the development of colour or fluorescence). In an early study of the effects of turbidity on coliform recovery, Herson and Victoreen (1980) observed that turbidity was not a hindrance to the growth of coliforms on membrane filters, but did result in colonies being more difficult to recognize.

With the multiple-tube fermentation method, bacteria associated with particles can also interfere with the determination of the most probable number, similarly resulting in an underestimation of the true counts.

Routine pathogen testing of raw and finished waters is typically not practised, but collection and analysis of source water samples can be performed to provide information on organism concentrations and appropriate treatment requirements. When using methods for detection of enteric viruses and protozoa, turbidity can cause the clogging of filters, difficulties with the recovery of particle-adsorbed organisms and interference effects during the microscopic examination of protozoa (U.S. EPA, 2001a, 2006a).

7.2 Chemical characteristics

The chemical quality of water can also be influenced by turbidity. Table 1 above summarizes some common effects on water chemistry caused by different turbidity types. Inorganic turbidity particles can affect water pH, alkalinity, metal concentrations and the activity of some disinfectants. Plants using aluminum salts as coagulants may generate increased residual particulate aluminum concentrations. Investigations have found that when treatment is properly optimized, low filtered water turbidity (less than 0.15 NTU) results in a very low aluminum residual (Letterman and Driscoll, 1988; Jekel, 1991).

Because of their adsorption capacity, clay particles can entrap inorganic and organic compounds, including metals and pesticides, and can influence the water's chemical characteristics. The reduction of turbidity particles through conventional filtration or equivalent technologies minimizes the contribution of particulate concentrations to the chemical characteristics of water. Technical information on specific chemical contaminants can be found in the individual Health Canada guideline technical documents.

Organic turbidity particles can similarly affect pH, alkalinity and water chemistry. Among the particle types, the natural organic matter (humic) component of turbidity is considered the most important in terms of its ability to affect chemical water quality. Natural organic matter is able to bind substantial amounts of metals and hydrous oxides, forming complexes. A review of metal-natural organic matter (humate) complexes, the mechanism of their formation and their properties is provided by Schnitzer and Khan (1972). Organic chemicals such as pesticides can also adsorb onto natural organic matter particulates. The bonding in some chemical-natural organic matter complexes can be strong enough to interfere with the recovery and detection of the chemical contaminants.

The chlorination of water containing organic matter can produce disinfection by-products (DBPs), notably trihalomethanes (THMs) and haloacetic acids (HAAs), groups of chemical compounds that may have health implications for humans. Strategies for addressing turbidity also have implications for controlling the potential formation of DBPs. These include removing precursors by optimizing filtration and limiting formation by modifying disinfection strategies (adapted chlorine application or the use of alternative disinfectants). Vrijenhoek et al. (1998) examined the effectiveness of enhanced coagulation for removing particles and THM precursors. The authors noted that significantly more THM precursors were removed with enhanced coagulation at pH 5.5 and that removal of particles and turbidity increased substantially at alum doses above 20 mg/L.

7.3 Physical characteristics

Research has demonstrated that there is no specific relationship between turbidity reduction and particle removal (Patania et al., 1995; McTigue et al., 1998; Huck et al., 2002). The removal or reduction of either of these components depends on their respective concentrations in the raw water and through the treatment process. Only a general correlation exists between turbidity and particle counts, whereby reducing turbidity generally improves particle removal, and vice versa.

Turbidity measurements and particle counts are alike in function in that both are approximate indicators of pathogen removal, but neither is an exact surrogate. Turbidity measurements have traditionally been relied upon as the main indicator of filtration performance because of their lower instrumentation costs, simplicity of information and ease of use. However, there is growing interest in the use of particle counters in monitoring water treatment efficiency. Similar to turbidity, particle counts have shown decreased removal during suboptimal filter operations, and small spikes in particle counts can coincide with increased pathogen passage (Huck et al., 2002). Particle counts can be viewed as an additional tool for water suppliers to further optimize treatment. For example, it has been suggested that with conventional treatment, particle counters, as a result of their sensitivity, can provide advance warning of turbidity problems during early filtration breakthrough (Huck et al., 2002).

A considerable body of evidence suggests that a large part of the colour in water arises from colloidal particles, whose physical and chemical properties allow them to remain suspended in the water rather than settling out or dissolving. Early electrophoretic studies (Black and Willems, 1961; Black and Christman, 1963) demonstrated the predominantly colloidal nature of colour in water, and it has been claimed that about 50% of colour is due to a "colloidal fraction" of natural organic matter (humic) substances (Pennanen, 1975). True colour is therefore defined as the colour of water from which turbidity has been removed (Sadar, 1998). Turbidity itself becomes increasingly visible at 5.0 NTU and above, a level that many consumers may find unacceptable.

The relationship between high turbidity, in both raw and filtered water, and taste and odour has also long been recognized (Atkins and Tomlinson, 1963). Algal growths, actinomycetes and their debris also contribute to taste and odour problems (Mackenthun and Keup, 1970).
