Page 7: Guidelines for Canadian Drinking Water Quality: Guideline Technical Document – Turbidity

Part II. Science and Technical Considerations (continued)

6.0 Treatment technology

Turbidity is reduced by removing particles from the water through several processes, including, but not limited to, settling, coagulation/flocculation, sedimentation, flotation, adsorption and filtration. Adequate filtration can be achieved by a variety of technologies: conventional and direct filtration, slow sand filtration, diatomaceous earth filtration, membrane filtration or an alternative proven filtration technology.

These technologies all employ monitoring of turbidity in the treated water as a tool for assessing the performance of the water treatment processes. However, the levels of filtered water turbidity that are achievable and the associated potential pathogen removal vary depending on the pretreatment and the filtration technology used. Therefore, a different HBTL will apply to each filtration technology. In addition, the interpretation and implications of turbidity monitoring results vary significantly between different filtration technologies. For example, determining the optimal effluent turbidity levels to maintain and interpreting variations in turbidity during filter operation differ between conventional filtration and slow sand filtration. This is because the two technologies rely on different turbidity removal mechanisms, and the relationship between turbidity reduction and pathogen reduction also differs.

There are many factors that affect the efficiency of turbidity reduction in filtration processes, depending on the type of technology that is being used. Some of these factors include source water quality, filtration rates, chemical pretreatment, filter media size/type and surface characteristics, filter run length, filter maturation, water temperature, filter integrity and backwashing procedures. Utilities need to identify the main factors that affect turbidity reduction for the filtration technology that is being used and optimize the process. Ensuring that filtration processes are performing optimally helps to increase the level of protection from potential contaminants, including pathogens, in the treated water (U.S. EPA, 1998b).

Although turbidity is not a direct indicator of the presence or absence of pathogens in treated water, it is recognized as the most readily measurable parameter to indicate filtration treatment effectiveness (U.S. EPA, 1998a). As such, extensive studies have been conducted on the use of turbidity as a performance and optimization indicator and its relationship to the removal of contaminants, such as pathogens, for a variety of filtration methods. This topic is discussed in greater detail in section 7.1.2. There has also been a significant amount of research examining the filtered water turbidity typically achieved by well-operated and well-maintained filtration plants and its relationship with the removal of pathogens. A discussion of filtered water turbidity levels and the average potential pathogen removal credits for the different filtration technologies that are discussed below is provided in Appendix B. In addition, guidance on the programs and methods that utilities can follow to achieve a lower turbidity target of 0.1 NTU is provided in Appendix C.

6.1 Conventional and direct filtration

The conventional filtration process generally includes chemical mixing, coagulation, flocculation, sedimentation (or dissolved air flotation) and rapid granular filtration. The direct filtration process includes coagulation and flocculation; however, no sedimentation or flotation is used, and flocculated water proceeds directly to filtration. While conventional filtration can be used on a wide range of source water qualities, direct filtration is typically limited to source water with turbidity that is below 15 NTU (MWH, 2005).

In conventional and direct filtration processes, particles are removed by physicochemical filtration. Chemical pretreatment using coagulants, pH adjustment and polymers is essential to conventional and direct filtration processes, destabilizing the negatively charged colloidal particles, such as clays, algae, cysts and viruses. This destabilization allows aggregation of particles to occur via chemical and van der Waals interactions, and the resulting particles are removed during sedimentation and/or filtration (Stumm and O'Melia, 1968; Stumm and Morgan, 1969; Logsdon, 2008). Aluminum and ferric salts are used as primary coagulants. Cationic and anionic polymers are most commonly used as flocculation aids, and both, along with non-ionic polymers, have been used as filter aids. The granular media filter is the most common type of filter used, and it may be a single-medium, dual-media or multi-media design. In both filtration processes, the effectiveness of particle removal is highly dependent on optimization of the chemical pretreatment (Cleasby et al., 1989; Logsdon, 2008). Filter loading rates generally range from 3.0 to 15 m/h, with some high-rate filters capable of 33 m/h (MWH, 2005).

All conventional and direct filtration plants should conduct continuous turbidity monitoring of filter effluent as an indicator of the performance of the treatment process. Continuous monitoring of the effluent turbidity from each individual filter as well as continuous monitoring of the combined filtered water turbidity from all filters are considered operational necessities in order to provide adequate performance data (Cleasby et al., 1992; U.S. EPA, 1998b; Logsdon et al., 2002; Logsdon, 2008). Continuous monitoring of individual filters has been identified as a key factor in achieving low-turbidity filtered water, enabling filter optimization and adequately detecting individual filter turbidity spikes (Cleasby et al., 1989; Renner and Hegg, 1997; U.S. EPA, 1998b). Continuous monitoring is necessary to ensure that each filter is functioning properly, to help determine when to end filter runs and to detect any short-term or rapid increases in turbidity that represent a process failure and a potential health risk. It also allows utilities to obtain a better understanding of routine filter performance, including establishing filter cycle trends and stable operation turbidity levels (Logsdon et al., 2002). In addition, comprehensive data on turbidity levels through all phases of the filter cycle trend are essential to be able to identify decreasing filter performance and to facilitate filter assessments and optimization programs.
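
To illustrate why monitoring each filter individually matters, the following sketch (with purely illustrative numbers and the simplifying assumption of equal flow from each filter; it is not taken from the cited references) shows how a turbidity spike on a single filter is largely masked in the combined effluent:

```python
# Minimal sketch (illustrative values, not from the cited references) of why
# individual-filter monitoring is needed in addition to combined-effluent
# monitoring: a spike on one filter is largely masked once its flow is blended
# with the flow from the other filters, assuming roughly equal flow per filter.

def combined_turbidity(filter_ntu):
    """Flow-weighted average assuming equal flow from each filter."""
    return sum(filter_ntu) / len(filter_ntu)

stable = [0.06, 0.07, 0.05, 0.06]    # four filters in stable operation
spiking = [0.06, 0.07, 0.05, 0.90]   # filter 4 experiences a turbidity spike

print(combined_turbidity(stable))    # ~0.06 NTU
print(combined_turbidity(spiking))   # ~0.27 NTU - the 0.90 NTU spike is diluted
```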

The turbidity of filtered water from conventional and direct filtration plants generally follows a characteristic pattern with distinct segments in which turbidity levels vary depending on the filter run time (Amirtharajah, 1988). A filter cycle includes a pre-ripening period in which the turbidity increases due to the influence of post-backwash remnants above and within the filter, followed by a ripening period in which the turbidity decreases and approaches the level maintained during the stable filter operation phase. If a filter is operated for a long enough period of time, the turbidity will eventually start to increase. This is referred to as the end-of-run and breakthrough phases, when ultimately the filtered water turbidity will reach a maximum value (Amirtharajah, 1988; Cleasby et al., 1989; Logsdon et al., 2002). Filter operation periods such as following backwashing and at the end-of-run are generally characterized by increases in turbidity and risk of the presence of pathogens in the filtered water (Huck et al., 2001; Amburgey et al., 2005; Emelko et al., 2005). In general, all filters should be designed so that the filtered water produced immediately after filter backwashing is directed into a waste stream ("filter-to-waste"). However, in cases where this is not possible, other techniques, such as enhanced backwashing, delayed start and gradual filtration rate increases, can mitigate the initial turbidity spike (Logsdon et al., 2002; Amburgey et al., 2003, 2004; Logsdon, 2008). Similarly, during the stable operation phase of filters, unexpected turbidity spikes (rapid increase and decrease in turbidity) may occur as a result of a variety of factors, such as coagulant dosage upsets, pH changes, hydraulic surges (i.e., filtration rate increases), source water turbidity spikes and other operational factors. These spikes can have a significant effect on the passage of pathogens into the filtered water and are discussed in greater detail in section 7.1.2 (Nieminski and Ongerth, 1995; Patania et al., 1995; Huck et al., 2002; Emelko et al., 2003, 2005; Emelko and Huck, 2004). As the risk of the presence of pathogens in filtered water increases during turbidity increases and spikes, it is essential that utilities immediately investigate and determine the cause of any changes in filtered water quality.
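
As a rough illustration of the filter cycle pattern described above, the following sketch tags readings from a turbidity trace as ripening, stable operation or end-of-run/breakthrough; the thresholds and trace values are hypothetical examples, not values from this guideline:

```python
# Minimal sketch (illustrative thresholds only) of labelling the filter-run
# phases described above from a turbidity trace: ripening just after backwash,
# stable operation, and end-of-run/breakthrough when turbidity trends upward.

def label_phases(trace, stable_ntu=0.10, breakthrough_ntu=0.30):
    """trace: list of (hours into run, NTU); thresholds are hypothetical examples."""
    labels = []
    ripening_done = False
    for hours, ntu in trace:
        if not ripening_done and ntu > stable_ntu:
            labels.append((hours, ntu, "ripening"))
        elif ntu <= stable_ntu:
            ripening_done = True
            labels.append((hours, ntu, "stable operation"))
        elif ntu >= breakthrough_ntu:
            labels.append((hours, ntu, "breakthrough - end run"))
        else:
            labels.append((hours, ntu, "end-of-run - turbidity rising"))
    return labels

trace = [(0.1, 0.25), (0.5, 0.12), (1, 0.07), (24, 0.06), (60, 0.15), (70, 0.35)]
for row in label_phases(trace):
    print(row)
```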

Utilities also need to ensure that the filtration process is sufficiently robust to consistently provide high-quality filtered water and ultimately to maximize public health protection. In general, a robust filtration process is one that performs well both under normal operating conditions and during periods when filters may be challenged, such as during high source water turbidity events or coagulation upsets (Huck and Coffey, 2002; Li and Huck, 2007); however, robust performance must be carefully defined, as recent studies have indicated that robust turbidity removal may not always be indicative of adequate pathogen removal by filtration (Emelko et al., 2003; Brown and Emelko, 2009). It is essential for utilities to monitor and understand the turbidity levels of each filter throughout its operation to ensure that both stable operation periods and periods when filtered water turbidity is expected to be higher are managed appropriately. Systems that are not optimized to keep stable operation turbidity levels as low as possible and to reduce the magnitude and likelihood of peaks or increases in turbidity are of particular concern with respect to the passage of pathogens into the filtered water.

6.1.1  Turbidity of conventional and direct filtration effluent

Conventional and direct filtration systems are capable of producing water with a turbidity of less than 0.3 NTU. Well-operated, optimized treatment plants have demonstrated that producing water with a turbidity of less than 0.1 NTU is achievable on an ongoing basis (U.S. EPA, 1997a, 1998a; McTigue et al., 1998; PSW, 2012b). These studies also indicated that maintaining a maximum filtered water turbidity level below 1.0 NTU is readily achievable for conventional and direct filtration plants. Therefore, meeting the HBTL for conventional and direct filtration systems is feasible, and it is expected that the majority of systems will already be meeting this value.

As part of the process for promulgating its Interim Enhanced Surface Water Treatment Rule (IESWTR), the U.S. EPA evaluated historical turbidity performance data from three large data sets encompassing conventional and direct filtration plants across the United States from 1995 to 1996. The analysis indicated that approximately 78% of the systems serving more than 10 000 people attained a 95th-percentile turbidity limit of 0.3 NTU. Maximum monthly turbidity values were below 1.0 NTU in over 94% of the systems that were evaluated (U.S. EPA, 1997a, 1998a). Similarly, a national assessment of particle removal using filtration conducted in 100 conventional and direct filtration treatment plants in the United States during the period from 1994 to 1996 indicated that the median filtered water turbidity did not exceed 0.2 NTU (McTigue et al., 1998). A more detailed examination of the turbidity of filtered water was conducted at a subset of filtration plants (52 plants) where turbidity measurements were taken by the researchers in addition to the turbidity data that were obtained from the filtration plant staff. These data indicated that over 90% of the plants attained 95th-percentile turbidity values below 0.3 NTU. Furthermore, over 85% of the plants did not exceed a maximum monthly turbidity value of 0.3 NTU. This study included a variety of treatment types (i.e., type of coagulant, filter media, etc.), source water characteristics and operating protocols. It should be noted that these data are from historical studies that evaluated the performance of filtration plants at the time, and they may not be indicative of the effluent turbidity levels that filtration plants may have been capable of achieving with appropriate system optimization.
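
As an illustration of how the 95th-percentile and maximum monthly statistics cited in these studies can be derived from routine monitoring data, the following sketch applies a simple nearest-rank percentile calculation to a hypothetical set of readings (the method and values are illustrative; reporting requirements may specify a different calculation):

```python
# Minimal sketch (assumed approach, not taken from the cited studies): computing
# the 95th-percentile and maximum turbidity from a set of filtered-water
# readings, the two statistics referenced throughout this section.

def percentile_95(values):
    """Simple nearest-rank 95th percentile; regulatory methods may differ."""
    ordered = sorted(values)
    rank = max(1, -(-95 * len(ordered) // 100))  # ceil(0.95 * n)
    return ordered[rank - 1]

monthly_readings = [0.08, 0.07, 0.09, 0.11, 0.10, 0.25, 0.09, 0.08]  # NTU, illustrative
print("95th percentile:", percentile_95(monthly_readings), "NTU")
print("Monthly maximum:", max(monthly_readings), "NTU")
```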

More recently, data collected by the Partnership for Safe Water (PSW) indicate that approximately 99% of participating surface water and GUDI filtration plants reported monthly 95th-percentile turbidities less than 0.2 NTU (PSW, 2012b). Similarly, 98% of the monthly maximum turbidity values were less than 0.3 NTU. The data in this report were collected from 404 treatment plants in the United States. The systems ranged in size from less than 3500 to over 700 000 people served. Although many of the largest utilities in the United States participate in the PSW, over 50% of the utilities that currently participate serve fewer than 100 000 people. These data indicate that well-operated conventional and direct filtration plants should not have difficulty operating below 0.3 NTU (U.S. EPA, 1997a, 1998a; McTigue et al., 1998; PSW, 2012b).

Several other historical studies examining either the design and operational practices or the performance of conventional filtration plants have demonstrated that producing filtered water with turbidity of less than 0.3 NTU is achievable for well-operated plants. The plants examined in these studies included a wide geographic coverage in the United States and Canada as well as a diversity of raw water types and system sizes (Cleasby et al., 1992; Consonery et al., 1997; Lusardi and Consonery, 1999; Statistics Canada, 2009).

A survey conducted in 2007 of Canadian conventional and direct filtration plants serving drinking water to over 12 million people reported that 79% of plants did not exceed an average treated water turbidity of 0.3 NTU and that 80% of the plants did not exceed a monthly maximum turbidity value of 1.0 NTU (Statistics Canada, 2009). Lusardi and Consonery (1999) conducted an evaluation of 75 conventional, direct and package filtration plants and found that the average 95th-percentile turbidity was 0.2 NTU and that over 90% of the plants did not exceed a maximum monthly turbidity of 1 NTU. The authors found that most plants consistently attained low turbidity levels despite limitations such as system size, plant age or high source water turbidity. In addition, the authors noted that the type of treatment plant (conventional, direct or package) did not have a significant effect on the annual average or maximum monthly filtered turbidity values that could be achieved. Conventional, direct and package treatment plants were all capable of achieving average annual turbidities of 0.2 NTU or lower. Other operational studies at specific plants have indicated that low turbidities in plant effluent are readily achievable when competent operations are in place (Logsdon et al., 2002). It should be noted that although the levels of turbidity reduction that can be achieved in conventional and direct filtration plants have been found to be comparable, other studies have found that particle count and pathogen reduction may be lower in direct filtration plants than in conventional filtration plants (Patania et al., 1995; McTigue et al., 1998). A more detailed discussion on direct filtration and pathogen removal can be found in section 7.1.2.

The U.S. EPA conducted an analysis of smaller systems serving fewer than 10 000 people to determine if these systems would be able to meet a turbidity limit of 0.3 NTU (U.S. EPA, 2000, 2002). Data indicated that approximately 46% of smaller systems across the United States were meeting a turbidity limit of 0.3 NTU and that approximately 70% of systems were meeting this limit for 9 months of the year. Similarly, maximum monthly turbidity values were below 1 NTU in 88% of the systems evaluated (U.S. EPA, 2000). Additional studies that were evaluated indicated that between 41% and 67% of smaller systems were meeting a turbidity limit of 0.3 NTU, including systems that were classified as package plants or "pre-engineered" systems (U.S. EPA, 2000). These data suggest that smaller systems may have more difficulty than larger systems in achieving low filtered water turbidity levels. It is recognized that smaller systems have financial and resource limitations that make the operation of filtration plants more difficult. DeMers and LeBlanc (2003) found that operations and maintenance were the primary factors limiting the achievement of low turbidity levels by smaller systems.

6.1.2  Factors affecting conventional and direct filtration effluent turbidity

Many factors can affect the efficiency of turbidity reduction in conventional and direct filtration systems. Some of the main factors, such as non-optimal or no coagulation, lack of filter-to-waste or non-optimized backwashing techniques, intermittent operation, sudden rate changes and operating filters after turbidity breakthrough, can have a significant impact on filtered water turbidity (AWWA, 1991). There is a significant body of reference material that utilities can use to ensure that operational procedures are designed to minimize effluent turbidity under plant-specific conditions (Renner and Hegg, 1997; U.S. EPA, 1998b, 1999, 2004; Logsdon et al., 2002; Logsdon, 2008). The main procedures that have been identified as key factors in ensuring that a filtration plant is well operated are (1) monitoring instrumentation; (2) monitoring filter run performance; (3) managing pretreatment; (4) optimizing backwash; and (5) inspecting filter media (Logsdon et al., 2002).

In addition to the studies that have examined the design and process conditions that can affect the efficiency of turbidity reduction, studies have also examined both the operational and administrative factors that can affect a filtration plant's ability to achieve low filtered water turbidity. In general, these studies demonstrated that the operational and administrative aspects of a plant are the key factors that contribute to successfully achieving low turbidity goals (less than 0.1 NTU) and that, in many cases, large capital expenditures are not required to achieve these goals. Operational factors, such as optimizing chemical pretreatment (coagulant dosage and pH), practising filter-to-waste or other techniques to mitigate the effect of the initial turbidity spike, using filter aids and optimizing filter backwash, are important in achieving low turbidity in finished water. Administrative factors, such as management and operator commitment to achieving low turbidity goals and good operator training procedures, were also identified as key factors (Cleasby et al., 1989; McTigue et al., 1998; Lusardi and Consonery, 1999; DeMers and LeBlanc, 2003).

McTigue et al. (1998) collected turbidity data from 52 filtration plants in order to assess the number of plants that were meeting a 0.1 NTU turbidity criterion. Of the plants that did not meet the 95th-percentile criterion of 0.1 NTU, the authors determined that the majority of the failures (54%) were due to ripening turbidity spikes or end-of-run turbidity spikes. Of the plants that did not meet the 99th-percentile criterion of 0.1 NTU, the failures were predominantly associated with only a ripening spike. The authors concluded that for many of the filter runs, a lower turbidity level could have been maintained by providing or extending a filter-to-waste period or ending filter runs sooner. These results are consistent with the goals of either filtering-to-waste or maintaining optimum filter performance following backwash, which are supported by the U.S. EPA, AWWA and PSW (Renner and Hegg, 1997; U.S. EPA, 1998b; PSW, 2011). Optimum filter backwash performance involves either conducting filter-to-waste until turbidity levels have returned to 0.1 NTU or less or minimizing the magnitude and duration of the post-backwash spike (ripening) to less than 0.3 NTU, with turbidity returning to 0.1 NTU in less than 15 minutes following backwash (Renner and Hegg, 1997). A number of strategies have been documented in the literature for minimizing filter ripening (post-backwash) turbidity levels (Cleasby et al., 1992; Logsdon et al., 2002, 2005a,b; Amburgey et al., 2003, 2004; Amburgey, 2005; Amburgey and Amirtharajah, 2005; Logsdon, 2008).

In a study of 75 filtration plants in Pennsylvania, variables such as source water quality, plant type and design, and operational parameters such as filtration rate and coagulant used were examined to determine their effects on filtered water turbidity (Lusardi and Consonery, 1999). Other parameters, such as population served and plant age, were also evaluated. Plants that did not use pretreatment with a coagulant did not achieve low turbidity in the filtered water. This is consistent with other studies that have demonstrated that pathogens are also not effectively removed in plants where pretreatment is not used. The study found that plants that did not use a coagulant, served small systems (less than 3300 people served) or treated water from streams had statistically higher turbidity values relative to all of the plants in the study. Plants using a coagulant were able to consistently achieve low turbidity levels regardless of limitations such as system size, plant age or high source water turbidity. For example, the average annual effluent turbidity for small systems was 0.25 NTU, and the maximum monthly value was 0.40 NTU. The authors suggested that variables such as commitment to achieving low turbidity goals, operator skill level and training were likely important in lowering turbidity levels and that lower levels could be achieved by optimizing operations without making major capital expenditures. Similar results were found in a study of the design and operational practices of 21 conventional filtration plants that enabled them to produce low-turbidity finished water. Some of the key factors that were identified included adopting a low-turbidity goal, optimizing chemical pretreatment, using filter aids and providing good operator training (Cleasby et al., 1989).

Similar results were obtained in a study evaluating the main factors that were limiting smaller systems in Louisiana from achieving optimized goals. The study found that for 53% of systems, operations and maintenance were the top factors limiting optimization, and for 43% of systems, administration was the top factor. In only a few cases (3.5%) were design factors identified as the major limiting factor (DeMers and LeBlanc, 2003). As part of the same study, six plants participated in performance-based training programs and made operational changes, such as installing individual filter turbidimeters, to achieve optimization. Following the training, the average turbidities of the six plants decreased from 0.40 NTU to 0.16 NTU.

6.1.3  Optimization of conventional and direct filtration

Over the last two decades, the use of a treated water turbidity goal of less than 0.1 NTU for individual filter effluent has been increasing as a way to improve the treatment of surface water or GUDI sources using conventional and direct filtration (Consonery et al., 1997; Renner and Hegg, 1997; U.S. EPA, 1998b; Lusardi and Consonery, 1999; Logsdon et al., 2002; PSW, 2012a,b). Extensive research and field studies support optimizing particle removal in conventional and direct filtration plants to maximize protection of public health from microbial contamination (Ongerth and Pecoraro, 1995; Patania et al., 1995; U.S. EPA, 1998b; Huck et al., 2000, 2001, 2002; Emelko et al., 2001b, 2003, 2005). As a result, it is current industry practice to take a proactive approach to plant optimization. This practice includes meeting lower turbidity goals in order to minimize consumers' risk from microbial pathogens. A more detailed discussion of filtration plant optimization and microbial pathogen removal can be found in section 7.1.2.

Data from several studies indicate that many plants have already been achieving filtered water turbidities below 0.1 NTU for a significant period of time (Cleasby et al., 1989; U.S. EPA, 1997a; McTigue et al., 1998; Pizzi, 1998; Lusardi and Consonery, 1999). An assessment of filtration plants across the United States indicated that in the mid-1990s, the median filtered water turbidity of 100 conventional and direct plants was 0.07 NTU. Data from this study also indicated that over 50% of plants were already achieving 95th-percentile turbidities less than 0.1 NTU (McTigue et al., 1998). Other studies have also demonstrated the ability of well-operated conventional and direct filtration plants to achieve filtered water turbidities below 0.1 NTU (Cleasby et al., 1989; PSW, 2012b).

To facilitate achieving a lower filtered water turbidity, many utilities now participate in voluntary optimization programs, such as the Programme d'excellence en eau potable (Program for excellence in drinking water),  the PSW and the U.S. EPA Composite Correction Program. The basis of these programs is for each utility to adopt operational and administrative practices that have been demonstrated to improve treatment plant performance (U.S. EPA, 1998b; PSW, 2007, 2011, 2012a). In most cases, treatment plant performance is significantly improved, including lowering effluent turbidity levels, without major capital expenditures (Renner et al., 1993; U.S. EPA, 1998b; Hegg et al., 2000; Ginley, 2006; PSW, 2012b). One of the main components of these programs is a self-assessment procedure in which the utility examines its plant operations and evaluates the level of plant performance with respect to turbidity goals set by the program. The procedure is systematic and results in the identification and correction of the factors that could limit the performance of the treatment plant. These programs have defined optimum filter performance in terms of achieving specific treated water quality goals. The first optimization goal is to achieve effluent turbidities on individual filters of 0.10 NTU or less 95% of the time. The second goal is to minimize the turbidity of the post-backwash filtered water "spike" to no greater than 0.30 NTU, with turbidity returning to below 0.10 NTU in less than 15 minutes following the backwash (Renner and Hegg, 1997; U.S. EPA, 1998b; AWWA, 2009; PSW, 2012b). Typical examples of actions that utilities can take to facilitate optimization are coagulant dosage and pH adjustments, filter run time modifications, slow start or delayed startup of filters following backwashing and extended terminal subfluidization backwash.
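
A minimal sketch of how these two optimization goals could be checked against individual-filter data is shown below; the function names and the example data are illustrative only and do not come from the cited programs:

```python
# Minimal sketch of checking the two optimization goals described above against
# individual-filter data; names, thresholds as defaults and example values are
# illustrative, not taken from the optimization programs cited in this section.

def meets_effluent_goal(ntu_values, goal=0.10, fraction=0.95):
    """Goal 1: individual filter effluent at or below `goal` NTU at least `fraction` of the time."""
    within = sum(1 for v in ntu_values if v <= goal)
    return within / len(ntu_values) >= fraction

def meets_backwash_goal(post_backwash_trace, spike_limit=0.30, recovery_ntu=0.10, recovery_min=15):
    """Goal 2: post-backwash spike no greater than `spike_limit` NTU, with turbidity
    returning to `recovery_ntu` NTU within `recovery_min` minutes of the backwash."""
    if max(ntu for _, ntu in post_backwash_trace) > spike_limit:
        return False
    return any(minute <= recovery_min and ntu <= recovery_ntu
               for minute, ntu in post_backwash_trace)

daily_readings = [0.07, 0.08, 0.06, 0.09, 0.12, 0.07, 0.08]        # NTU, illustrative
print(meets_effluent_goal(daily_readings))   # False: only ~86% of readings are at or below 0.10 NTU

# post_backwash_trace: (minutes since return to service, NTU)
trace = [(2, 0.22), (5, 0.18), (10, 0.12), (14, 0.09), (20, 0.07)]
print(meets_backwash_goal(trace))            # True for this illustrative trace
```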

A number of reports and studies have demonstrated that when utilities follow an optimization program or implement optimization tools, they are capable of significantly reducing filtered water turbidity. Data collected for the PSW indicate that in 2010-2011, U.S. filtration plants participating in the program reduced finished water turbidity by greater than 60% compared with baseline levels after following a filter self-assessment program for optimization. In addition, the data indicate that approximately 88% of the monthly 95th-percentile values were below 0.1 NTU (PSW, 2012b). Similarly, the Pennsylvania Department of Environmental Protection found that by identifying weaknesses and optimizing treatment at filtration plants, the number of plants that achieved a filtered water turbidity below 0.2 NTU increased from only 60% in 1988 to over 96% in 1996 (Consonery et al., 1997). Additional studies of the optimization of full-scale conventional filtration plants in North America and the United Kingdom have demonstrated that significant reductions in filtered water turbidity are possible by optimizing existing plants and that a filtered water turbidity goal of less than 0.1 NTU can be achieved consistently (Leland et al., 1993; Hegg et al., 2000; Bayley et al., 2001; Mazloum et al., 2003; Drachenberg et al., 2007). A variety of operational and administrative practices that can limit filtration plant performance have been identified in previous studies and typically include a lack of one or more of the following: optimized chemical pretreatment (coagulant dose and pH adjustment), filter-to-waste, optimized filter backwash, continuous individual filter monitoring, operator training and management commitment to plant performance (Cleasby et al., 1989; Renner et al., 1993; McTigue et al., 1998; Lusardi and Consonery, 1999; DeMers and LeBlanc, 2003).

6.2 Slow sand filtration

The slow sand filtration process generally consists of untreated water slowly flowing by gravity through a bed of submerged porous sand. Below the sand is a layer of gravel for support and an underdrain system that collects the filtered water. The hydraulic loading rates for typical slow sand filters are much lower than for rapid granular filtration and range between 0.05 and 0.4 m/h. In slow sand filtration, filter effectiveness depends on the formation of a schmutzdecke, a layer of bacteria, algae and other microorganisms on the surface of the sand, and the formation of a biological population (biopopulation) within the sand bed. As raw water passes through the sand bed, physical, chemical and biological mechanisms remove contaminants, with biological processes considered the most important removal mechanism. As particles are also physically strained, destabilization using coagulants is not required for slow sand filtration to be effective. Without any pretreatment, application of slow sand filtration is typically restricted to raw water sources with turbidity below 10 NTU, although some research indicates that raw water below 5 NTU is preferable (Cleasby, 1991; MWH, 2005; Logsdon, 2008). Effective filtration of raw water with higher turbidity levels has been demonstrated using different forms of pretreatment (Collins et al., 2005; Anderson et al., 2006; DeLoyde, 2007; Gottinger et al., 2011).

Similar to rapid granular filtration, slow sand filtration operates over a cycle. The cycles consist of a filtration stage and a regeneration stage. Breakthrough of turbidity typically does not occur in slow sand filtration, and the filters can operate until the head loss reaches the design limit. Terminal head loss can take weeks to months to occur, at which time the filter is drained and the top 1-2 cm of schmutzdecke is removed and either disposed of or cleaned for reuse (MWH, 2005; Logsdon, 2008). As is the case with conventional filtration, a "filter-to-waste" feature should be provided so that the filtered water immediately after filter cleaning is directed into a waste stream, because the initial improvement period can be as long as 1-2 days.

Although the rate of filtration is low for slow sand filtration, filter performance monitoring using turbidity is still an important tool for ensuring that filters are performing at an acceptable level. Turbidity levels in the filtered water may increase during operation owing to a number of factors, such as increases in raw water turbidity, increased hydraulic loading rate and decreased water temperature. As with conventional filtration, conducting continuous monitoring of individual filter turbidity allows utilities to obtain a better understanding of filter performance, including identifying factors that affect filtered water quality, such as temperature variations, filter maturity and source water turbidity fluctuations.

6.2.1 Turbidity of slow sand filtration effluent

Researchers have observed variation in the ability of slow sand filters to reduce turbidity; however, studies indicate that slow sand filtration plants are able to achieve filtered water turbidity below 1.0 NTU consistently. Studies have also shown that well-operated mature slow sand filters generally produce filtered water with turbidity levels below 0.5 NTU and often approaching 0.1 NTU (Cullen and Letterman, 1985; Collins et al., 1992; Riesenberg et al., 1995; Cleary et al., 2008; Kelkar et al., 2009).

Fox et al. (1984) found that when water was filtered at 0.12 m/h, after an initial ripening period had allowed the biopopulation to become established on new sand, the treated water turbidity was consistently less than 1.0 NTU. Raw water turbidity ranged from 0.2 to 10.0 NTU in this study. Cleasby et al. (1984) reported that typical effluent turbidity was 0.1 NTU, except during the first 2 days after scraping of the schmutzdecke, for raw water turbidity ranging from less than 1.0 to 30.0 NTU. Pyper (1985) observed slow sand-filtered water with effluent turbidity of 0.1 NTU or lower for 50% of measurements and 1.0 NTU or lower for 99% of measurements; raw water turbidity in this study ranged from 0.4 to 4.6 NTU. Several other studies of full-scale slow sand filtration plants have found that water treated using slow sand filtration can typically achieve turbidities below 0.3 NTU (Cullen and Letterman, 1985; Collins et al., 1992). Cullen and Letterman (1985) found that the average turbidity of filtered water from slow sand filtration plants was 0.25 NTU when influent turbidities ranged between 1 and 3 NTU.

Slezak and Sims (1984) reported that approximately 45% of the 27 full-scale slow sand filtration plants they surveyed produced filtered water turbidity of 0.4 NTU or less. The mean influent and effluent turbidity levels of all the plants were 4 NTU and 0.65 NTU, respectively. In an updated survey of 59 slow sand filtration plants in the United States in 1991, Sims and Slezak (1991) found that over 95% of filtration plants were producing water with turbidities less than 1 NTU. This survey also demonstrated that slow sand filtration plants can readily and consistently achieve effluent turbidity levels below 0.5 NTU, with approximately 80% of plants operating below this value (Barrett et al., 1991).

Other full-scale slow sand filtration studies have indicated that plants can maintain treated water turbidity levels well below 1 NTU. One study found that at plant startup, turbidity was initially 1.4 NTU; however, following a 6-month operation period, the average daily turbidity for the plant was 0.4 NTU. The authors attributed the high initial turbidity values to excessive fines in the filter sand. Effluent turbidity values were also observed to increase up to 0.75 NTU following scraping of the filters. Turbidity returned to below 0.5 NTU following approximately 2 months of filter ripening (Riesenberg et al., 1995). Kelkar et al. (2009) found that small-scale slow sand filtration systems were capable of reducing raw water turbidity between 5 and 8 NTU down to below 0.5 NTU at filtration rates of 0.1 and 0.2 m/h.

More recent studies on slow sand filtration have examined modifications to the process, including pretreatment using ozonation, roughing filters and post-treatment granular activated carbon, to increase the range of raw water quality that is suitable for slow sand filtration (Collins et al., 2005; Anderson et al., 2006; Jobb et al., 2007; Gottinger et al., 2011). Modified slow sand filtration using ozonation and roughing filtration followed by slow sand filtration has been shown to reduce turbidity to below 0.3 NTU, with effluent turbidity trending to below 0.1 NTU following 2 years of operation (Jobb et al., 2007). Other studies evaluating the reduction of turbidity using a modified slow sand filtration pilot plant demonstrated that raw water turbidity ranging from 1 to greater than 80 NTU could be reduced to below 0.1 NTU in up to 72% of the measurements and to below 0.3 NTU in 100% of the effluent water measurements. Increases in effluent turbidity were observed when raw water turbidity increased above 30 NTU during rain events (Anderson et al., 2006). However, rainfall events resulting in high turbidity in the raw water had less impact on filtered water turbidity with increasing filter maturity (Cleary et al., 2008). Full-scale plants have reported reduction of raw water turbidity above 5 NTU to below 0.1 NTU (Gottinger et al., 2011). Data from optimized slow sand filters installed at two small drinking water systems in Saskatchewan were provided by Gottinger et al. (2011). Finished water turbidity at the first plant (roughing filter, slow sand filtration, biologically active filtration) was less than 0.40 NTU in greater than 75% of the measurements. Finished water from the second plant (slow sand filtration and biologically active carbon filtration) had an average turbidity of less than 0.10 NTU.

6.2.2 Factors affecting slow sand filtration effluent turbidity

Slow sand filtration can readily achieve an effluent turbidity below 1.0 NTU and in many cases approaching 0.1 NTU. Particulate removal using slow sand filtration may not consistently be as high as with conventional filtration. However, reducing turbidity as low as possible, with a goal of 0.1 NTU, is an important factor in ensuring that a slow sand filtration plant has been properly designed and is being well operated. There are several factors in the design and operation of slow sand filters that affect the filtered water turbidity. Sand size and uniformity, hydraulic loading rate, filter maturity and water temperature can all affect the water quality of filter effluent (Bellamy et al., 1985a; Cleary et al., 2008; Logsdon, 2008).

Design parameters, such as media size and depth as well as how well the media have been washed prior to installation, can affect the turbidity of filtered water. Smaller-sized sand generally yields better particle removal, but also results in greater head loss during filter operation. Several studies have found that excessive fines or sand that has not been pre-washed can contribute to elevated turbidity in filtered water for several months following plant startup (Seelaus et al., 1986; Leland and Damewood, 1990; Riesenberg et al., 1995). The depth of the slow sand bed can affect slow sand filter performance once it has become too shallow as a result of sand removal during repeated scrapings. Once the bed depth has reached approximately 0.5 m, the filter should be re-sanded (Logsdon, 2008).

One of the main operational factors that utilities can adjust to control effluent turbidity in a slow sand filtration plant is the hydraulic loading rate. Several studies have indicated that increased hydraulic loading rates can result in increased effluent turbidity levels (Bellamy et al., 1985a; Riesenberg et al., 1995; Cleary et al., 2008). Bellamy et al. (1985a) found that increasing hydraulic loading rates from 0.12 to 0.40 m/h decreased the turbidity reduction from 32% to approximately 27%. Similarly, in a study of the performance of a full-scale slow sand filtration plant, effluent turbidity increased from approximately 0.5 NTU with an average filtration rate of 0.024 m/h in the winter and 0.10 m/h in the summer up to 0.8 NTU with filter operation at or near the maximum design capacity of 0.24 m/h (Riesenberg et al., 1995). A pilot-scale study conducted on a multi-stage slow sand filter also demonstrated that effluent turbidity was higher (increased from 0.3 to greater than 1 NTU) when the hydraulic loading rate was increased from 0.2 to 0.4 m/h and the raw water turbidity spiked to greater than 50 NTU. This observation was made during colder temperature periods of less than 10°C (Cleary et al., 2008). The authors noted that following several subsequent months of operation, the filter was capable of consistently achieving effluent turbidity levels below 0.3 NTU, even at a higher filtration rate of 0.4 m/h. This was attributed to the increased performance of a more mature slow sand filter and warmer temperatures (Cleary et al., 2008). In general, filter efficiency typically decreases with lower water temperatures, as the biological action in the filter decreases. Utilities will generally need to make adjustments, such as decreasing the hydraulic loading rate to the filters, during periods when the water temperature is lower, so that the overall filter performance is maintained (Logsdon, 2008).

Filter maturity is considered to be one of the most important aspects affecting slow sand filter performance (Barrett et al., 1991). Several studies have indicated that both turbidity and pathogen removal increase as the biological activity in the filter increases following filter maturation (Bellamy et al., 1985a,b; Anderson et al., 2006).

6.3 Diatomaceous earth filtration

Diatomaceous earth filters consist of a vessel that contains many filtration devices called filter elements or leaves. Filter elements can be placed in a pressure vessel, or they can be used in an open tank if a pump is provided with suction attached to the filter effluent piping to cause a pressure differential across the filter elements in the tank. The filter elements support a porous membrane or fabric referred to as a septum that holds the filter cake during filtration. Typical filtration rates are lower than for rapid granular filtration and range from 1.3 to 5 m/h (MWH, 2005). Without any pretreatment, application of diatomaceous earth filtration is typically limited to raw water sources with maximum turbidity values between 5 and 10 NTU. It is suggested that, when turbidity is caused by inert silt/clay particles, diatomaceous earth filtration may be applied to water at the higher end of the range; however, when the source of turbidity is organic or compressible particles such as alum or iron floc, the lower turbidity limit may be more appropriate (Fulton, 2000).

To begin a filter run, the septum is coated with a thin layer of diatomaceous earth (precoat) about 3 mm thick. To prevent turbid water from clogging the filter, a small amount of diatomaceous earth is continually added as body feed to maintain a permeable filter cake. Filtration occurs through the cake to the inside of the filter element, where the filtered water is collected in a channel and exits the element. As the filter run progresses, the body feed and raw water particles deposit at the surface of the cake, forming a new filtering surface and increasing the thickness of the cake. Particulate removal occurs primarily at the surface of the cake through straining, and particles as small as 1 µm can be removed, depending on the media used. Once the head loss across the filter cake becomes too great or the filter cake begins to slough, the filter is removed from service. The filter coat is then washed off using a backwash process and disposed of. New diatomaceous earth is applied, and the cycle starts again (MWH, 2005; Logsdon, 2008).

As discussed above, the diatomaceous earth filtration cycle includes the precoat step, filtration period and backwash. At the start of a filter run, there may be slightly elevated turbidity as the fine, inert diatomaceous earth matter that has not been stabilized in the precoat is sloughed into the filtered water. However, turbidity generally decreases slightly through the filter run as the cake thickness increases. Once the filter run has started, generally there is no breakthrough of turbidity as long as the flow that holds the cake to the septum is not interrupted. Disturbance of the filter cake as a result of hydraulic surges should generally result in termination of the filter run (Fulton, 2000).

Filter performance monitoring using turbidity is an important tool for ensuring that diatomaceous earth filters are performing at an acceptable level. Turbidity levels in the filtered water may increase during operation owing to a number of factors, such as uneven precoating and disturbances to the cake; continuous monitoring of the filter effluent is therefore required to ensure that the filters are operating well.

6.3.1  Turbidity of diatomaceous earth filtration effluent

As with slow sand filtration, well-operated diatomaceous earth filtration plants are readily capable of producing filtered water with turbidity of less than 1 NTU and in many cases can achieve filtered water turbidity below 0.1 NTU. Logsdon et al. (1981) reported that turbidity reductions of 56-78% to achieve filtered water with turbidity below 0.5 NTU were attained with a diatomaceous earth pilot plant when raw water turbidity ranged from 0.95 to 2.5 NTU. Pyper (1985) reported an average turbidity reduction of 75% with an effluent turbidity of 0.5 NTU. A study of the performance of a full-scale diatomaceous earth filtration plant found that raw water turbidity between 1 and 3 NTU was reduced to between 0.3 and 0.5 NTU in the filtered water effluent (Ongerth, 1990).

More recent pilot-scale studies evaluating particle and Cryptosporidium removal using diatomaceous earth filtration found that influent turbidity ranging from 0.7 to 1.1 NTU could readily be reduced to less than 0.1 NTU. Generally, the filter performance improved during the course of a run, with turbidity decreasing to less than 0.1 NTU approximately 20 minutes following the start of precoating and reaching less than 0.07 NTU after 200 minutes of filter operation (Ongerth and Hutton, 2001).

6.3.2 Factors affecting diatomaceous earth filtration effluent turbidity

Diatomaceous earth filtration can readily achieve an effluent turbidity below 1.0 NTU and in many cases approaching 0.1 NTU. Particulate removal using diatomaceous earth filtration may not be consistently as good as with conventional filtration. However, reducing turbidity as low as possible, with a goal of 0.1 NTU, is an important factor in ensuring that a diatomaceous earth filtration plant has been properly designed and is being well operated. There are several factors in the design and operation of diatomaceous earth filters that can affect the filtered water turbidity. Factors such as the diatomaceous earth size or grade, precoat thickness, hydraulic loading rate, pressure fluctuations and integrity of the cake can all affect the water quality of filter effluent (Lange et al., 1986; Fulton, 2000; Ongerth and Hutton, 2001).

Lange et al. (1986) found that the grade of diatomaceous earth used affected filter performance. For the finest diatomaceous earth grade with a median particle size of 7.5 µm, turbidity reduction was close to 100%; however, for coarser grades with a median particle size of 22 µm, a 10% reduction was observed. The authors noted that the source water turbidity varied between 4 and 10 NTU and was caused by colloidal clays. The authors also expected improved turbidity reduction if the source water turbidity was caused by larger particles. Schuler and Ghosh (1990) also demonstrated that the use of different grades of diatomaceous earth resulted in significant variations in turbidity reduction.

Lange et al. (1986) found a slight decline in turbidity reduction at higher hydraulic loading rates when a coarser-grade diatomaceous earth was used as the filter aid. However, other pilot-scale testing found that increasing hydraulic loading rates from 2.5 to 5.0 m/h resulted in a minor decrease in filtered water turbidity (Ongerth and Hutton, 2001). This study also found that pressure fluctuations at the filter (due to variations in the peristaltic pump) caused effluent turbidity to increase from approximately 0.05 to 0.2-0.4 NTU.

The use of pretreatment or the addition of other filter aids such as alum-coated diatomaceous earth or polymer has been shown to reduce the turbidity of effluent from diatomaceous earth filtration plants. In a pilot-scale study, the use of alum-coated diatomaceous earth resulted in turbidity removal ranging from 66% to 98.8%, in contrast to removal ranging from 11% to 17% for diatomaceous earth without alum coating (Lange et al., 1986). In a similar pilot-scale study, the addition of chemical coagulants and polymers to the precoat and body feeds resulted in improved turbidity reduction. This study found that influent turbidity less than 1 NTU could be reduced to below 0.1 NTU (Schuler and Ghosh, 1990).

In a full-scale study, the addition of a cationic polymer filter aid to an existing diatomaceous earth filtration plant achieved finished water turbidity levels of less than 0.1 NTU from influent values greater than 6 NTU. The finished water turbidity levels of the filtration plant prior to addition of the polymer averaged 0.3-0.4 NTU; however, the plant had difficulty operating at high raw water turbidity levels (greater than 10 NTU), which normally resulted in plant shutdown (Cartnick and Merwin, 2004). Other full-scale plant data have demonstrated that modifications to an existing plant, such as relocating the slurry feed line injection point as well as replacing the filter septums, could improve filtered water turbidity from close to 1 NTU to an average finished water level of 0.25 NTU (Sweed, 1999).

6.4 Membrane filtration

Four membrane treatment processes are currently used in the water industry: microfiltration, ultrafiltration, nanofiltration and reverse osmosis. The most appropriate type of membrane for water treatment depends on a number of factors, including targeted materials to be removed, source water quality characteristics, treated water quality requirements, membrane pore size, molecular weight cut-off, membrane materials and system/treatment configuration (Jacangelo, 1991). The distinction between the types of membrane processes can be subjective and can vary by membrane manufacturer; however, the following classifications can generally be made (MWH, 2005; AWWA, 2007):

  • Reverse osmosis: a high-pressure membrane process originally developed to remove salts from brackish water. The reverse osmosis process is based on diffusion of water through a semi-permeable membrane as a result of a concentration gradient. Reverse osmosis membranes are considered to be non-porous and are used to remove dissolved solids, such as sodium, chloride and nitrate, from water.
  • Nanofiltration: a low-pressure reverse osmosis process for the removal of larger cations (e.g., calcium and magnesium ions) and organic molecules. Nanofiltration membranes are also typically considered non-porous and are reported to reject particles in the size range of 0.5-2 nm.
  • Ultrafiltration: a lower-pressure membrane process characterized by a wide band of molecular weight cut-off and pore sizes for the removal of small colloids, particulates and, in some cases, viruses. Ultrafiltration membranes typically have a pore size range of 0.01-0.1 µm.
  • Microfiltration: a low operating pressure membrane process used to remove particulates, sediment, algae, protozoa and bacteria. Microfiltration membranes typically have a pore size range of 0.1-10 µm.

As with most filtration processes, microfiltration and ultrafiltration have a repeating filtration cycle. As the filtration cycle begins, water is filtered through the membrane, and solids begin to accumulate on the influent side of the membrane. As the amount of accumulated solids increases, the transmembrane pressure needed to maintain a constant flux increases. Filtration typically proceeds for a set period of time, or until a specified transmembrane pressure is reached. Backwashing is then initiated to remove the surface cake that has been deposited during the filtration cycle (AWWA, 2005; MWH, 2005).
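
The link between cake accumulation and transmembrane pressure during constant-flux operation can be illustrated with the standard resistance-in-series (Darcy) model; this model and the numerical values below are textbook assumptions used here for illustration and are not taken from this guideline:

```python
# Minimal sketch using the standard resistance-in-series (Darcy) model to show
# why transmembrane pressure (TMP) rises as cake solids accumulate during
# constant-flux operation; all values are illustrative assumptions.

def tmp_pa(flux_m_per_s, viscosity_pa_s, membrane_resistance, cake_resistance):
    """TMP = J * mu * (R_membrane + R_cake); resistances in 1/m."""
    return flux_m_per_s * viscosity_pa_s * (membrane_resistance + cake_resistance)

flux = 50 / 1000 / 3600          # 50 L/m2/h expressed in m/s
mu = 1.0e-3                      # Pa*s, water near 20 degrees C
r_membrane = 1.0e12              # 1/m, illustrative clean-membrane resistance
for r_cake in (0.0, 5.0e11, 1.0e12):   # cake resistance growing over the cycle
    print(f"R_cake={r_cake:.1e} 1/m -> TMP={tmp_pa(flux, mu, r_membrane, r_cake)/1000:.1f} kPa")
```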

In ultrafiltration and microfiltration, water is filtered through a thin wall of porous material. The main mechanism for removal of particulate matter is through straining or size exclusion, and the types of contaminants that are removed depend partially on the pore size or molecular weight cut-off of the membrane. Research has also demonstrated that contaminant removal is affected by adsorption and cake formation on the membrane (AWWA, 2005; MWH, 2005).

Reverse osmosis and nanofiltration are pressure-driven membrane processes that are based on preferential diffusion to achieve separation of dissolved solutes from water. Reverse osmosis and nanofiltration can also remove particulate matter, although these technologies are not intended specifically for this purpose. High particulate loading can cause these types of membranes to foul rapidly. Therefore, these processes include pretreatment to remove particulate matter in the raw water prior to reverse osmosis or nanofiltration. Filtration of particulates and dissolved solids occurs when high pressure is applied to the influent side of the membrane and water is forced through the membrane surface, while the particulates and a large percentage of dissolved solids are rejected. The pressure needed to operate reverse osmosis and nanofiltration systems is partially dependent on the total dissolved solids concentration and the temperature of the feedwater (MWH, 2005; AWWA, 2007). Reverse osmosis is a continuous filtration process with no periodic backwash cycle. Prefiltration and/or the addition of a scale-inhibiting chemical may be required to protect membranes from plugging, fouling or scaling. Typically, reverse osmosis and nanofiltration systems are preceded by filtration through 5-20 µm cartridge filters to reduce the particulate load on the membranes and achieve an influent water quality with turbidity below 1 NTU (AWWA, 2007). A "filter-to-waste" feature should be provided for initial startup and commissioning of the membrane system and for emergency diversion in the event of a membrane integrity breach.

In principle, membrane filtration provides an absolute barrier, removing any particle that is larger than the exclusion characteristic of the membrane system. However, any breach in the integrity of a membrane or leak in the system could allow particulate matter such as pathogens to pass through the filter. Broken membrane fibres, leaking o-rings and cracked glue joints are some of the integrity breaches that can result in the passage of microorganisms and other contaminants into the treated water. Therefore, integrity testing is an essential component of membrane filtration operation (U.S. EPA, 2001b, 2005).

Membrane integrity testing is based on procedures that can assess whether a membrane system is completely intact or there has been a breach or leak that is compromising the performance of the system. Integrity testing falls into two general categories: direct and indirect. Direct methods are procedures applied directly to the membrane or membrane module to determine whether there is an integrity breach and, if there is, its source. Indirect testing is a surrogate measure of integrity based on monitoring the water quality of the filtrate. Indirect testing is generally conducted continuously, whereas direct testing is conducted at a lower frequency, such as daily (U.S. EPA, 2001b, 2005). A wide range of direct (pressure-based tests, acoustic sensor tests, etc.) and indirect (turbidity, particle counting, surrogate challenge tests, etc.) tests are available for membrane filtration plants. Comprehensive reviews of the different testing methods are available (Sethi et al., 2004; Guo et al., 2010).

Monitoring turbidity in the effluent of membrane filtration systems is one possible indirect integrity testing method. As the quality of the effluent from membrane filtration systems is consistently very high and generally does not vary with raw water fluctuations, an increase in turbidity of the filtrate, as revealed by monitoring, can be indicative of an integrity problem. The use of turbidity monitoring as an indirect integrity test has several advantages and disadvantages over other test methods. The main advantages of turbidity monitoring are that it can be conducted continuously, measurements between meters are relatively consistent, operators are familiar with it and it has lower costs. The main disadvantage of turbidity monitoring is that standard nephelometric turbidity meters are relatively insensitive to minor breaches in membrane integrity compared with other indirect methods. The use of laser turbidimeters has been suggested as a better alternative for membrane filtration monitoring, as the detection limit is lower and laser turbidimeters have been shown in some cases to be significantly more sensitive than nephelometric meters in detecting membrane breaches (Banerjee et al., 2000, 2001; U.S. EPA, 2001b; AWWA, 2005). This is discussed in greater detail in sections 5.0 and 6.4.2.

It has been suggested that turbidity is not sufficient as the sole method for monitoring membrane filter integrity, because it does not provide adequate sensitivity for detecting small pinholes in membrane fibres and because of the lack of resolution between raw water and filtrate values (Sethi et al., 2004; AWWA, 2005; MWH, 2005; Gitis et al., 2006; Guo et al., 2010). In general, turbidity is accepted as part of an overall integrity monitoring program that includes both indirect and direct testing. The usefulness of turbidity monitoring in immediately identifying a major membrane system integrity breach has resulted in its widespread use in membrane filtration plants. Some organizations, such as the U.S. EPA, have identified turbidity monitoring as the default method for continuous indirect integrity monitoring unless an alternative method is approved by the state. As part of the U.S. Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR), continuous turbidity monitoring is required in addition to daily direct integrity testing as part of an overall system verification program (U.S. EPA, 2005, 2006b).

6.4.1 Turbidity of membrane filtration effluent

All membrane filtration processes are highly effective at reducing turbidity, provided that the membranes are intact. In general, microfiltration and ultrafiltration processes achieve filtered water turbidity of less than 0.1 NTU (Adham et al., 1996; Laine et al., 2000; AWWA, 2005; Guo et al., 2010). As reverse osmosis and nanofiltration processes are not used primarily for particulate removal in drinking water treatment, turbidity data for these types of systems are generally not reported; however, these processes can also achieve very low filtered water turbidity.

One of the main advantages of microfiltration and ultrafiltration systems is their ability to consistently produce low-turbidity filtrate. The American Water Works Association reported on the turbidity of filtrate from over 72 microfiltration and ultrafiltration plants between 1989 and 2001. The results indicated that microfiltration and ultrafiltration membranes produce very high-quality water regardless of the influent turbidity. The median filtrate turbidity for all of the plants was 0.06 NTU, and the median of the maximum reported filtrate turbidities was 0.08 NTU. The evaluation also demonstrated that turbidity reduction was similar with or without coagulant addition and across all membrane types and manufacturers (AWWA, 2005). A similar study of over 70 microfiltration and ultrafiltration plants worldwide determined that, regardless of the influent turbidity levels, microfiltration and ultrafiltration systems were capable of reducing turbidity to below 0.1 NTU (Adham et al., 1996). The U.S. EPA has also reported that most microfiltration and ultrafiltration systems consistently produce filtrate in the range of 0.03-0.07 NTU as measured by conventional turbidimeters (U.S. EPA, 2001b, 2005).

6.4.2 Factors affecting membrane filtration effluent turbidity

An intact membrane acts as an absolute barrier to any particle that is larger than the exclusion characteristic of the membrane system. However, any breach in the integrity of a membrane, or any leak in the system, could allow particulate matter into the filtrate and therefore increase its turbidity. There are many possible sources of breaches to the integrity of membrane filtration systems, such as holes in membrane fibres and leaking o-rings; however, the ability of turbidity monitoring to detect such breaches can vary significantly.

Adham et al. (1995) found that a number of factors can affect the sensitivity of turbidity monitoring for detecting a breach, including the type of microfiltration or ultrafiltration system, the number of modules linked to a single instrument, the number of fibres per module, the hydraulic configuration and other system-specific parameters. When the membranes were intact, the permeate turbidity of the four systems tested was in the range of 0.02-0.04 NTU. For the cross-flow membranes with between 500 and 2000 fibres, turbidity increases were easily discernible, even with a pinpoint integrity breach in a single fibre; however, for the transverse membrane with 22 400 fibres, no change in turbidity was observed under a variety of breach conditions. This study showed that dilution (i.e., the number of fibres) and module flow mode were important factors determining the ability of turbidity monitoring to detect minor integrity breaches.

Two other studies that reported integrity testing data for full-scale plants indicated that turbidity monitoring is limited in its ability to detect integrity breaches. Kothari and St. Peter (2000) demonstrated that cutting up to 200 fibres in a membrane rack resulted in an increase in turbidity of only 0.01 NTU. Similarly, Landsness (2001) reported an increase in turbidity from an intact membrane value of 0.024 NTU up to 0.037 NTU when 200 fibres were cut in the membrane. However, the turbidity in the entire train of eight racks did not show any change and remained at 0.012 NTU. Both of these studies demonstrated that the sensitivity of turbidity monitoring for detecting minor to moderate integrity breaches is limited.
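The limited response reported in these studies is consistent with a simple dilution argument: the water passing through a few breached fibres is blended with filtrate from thousands of intact fibres. The following is a minimal sketch of that mass balance, assuming each fibre carries roughly equal flow, breached fibres pass essentially unfiltered water and turbidity blends approximately in proportion to flow; all numerical values are illustrative and are not taken from the studies cited above.

```python
# Illustrative dilution estimate for a membrane module with a few breached fibres.
# Assumptions (for illustration only): each fibre carries roughly equal flow,
# breached fibres pass essentially raw water, and turbidity blends in proportion
# to flow. Real modules and real turbidity behaviour are more complex.

def blended_filtrate_turbidity(intact_ntu: float, raw_ntu: float,
                               breached_fibres: int, total_fibres: int) -> float:
    """Approximate combined filtrate turbidity for a partially breached module."""
    breach_fraction = breached_fibres / total_fibres
    return breach_fraction * raw_ntu + (1 - breach_fraction) * intact_ntu

# e.g., 200 breached fibres out of 20 000, raw water at 1 NTU, intact filtrate at 0.02 NTU
print(round(blended_filtrate_turbidity(0.02, 1.0, 200, 20_000), 3))  # ~0.03 NTU
```

Under these assumptions, even a few hundred breached fibres in a module with tens of thousands of fibres shift the blended turbidity by only a few hundredths of an NTU, which helps explain the limited sensitivity described above.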

Sethi et al. (2004) conducted a comprehensive assessment of low-pressure membrane integrity monitoring tools. The study evaluated several indirect and direct integrity testing methods in six ultrafiltration or microfiltration full-scale plants, including an analysis of the sensitivity, reliability, costs and implementation capacity of each method. The results indicated that both standard nephelometric turbidimeters and laser turbidimeters lacked the sensitivity to detect the levels of breach investigated in the study, which ranged from a single cut fibre up to 0.0025% of fibres cut in a rack. The authors suggested that less sensitive integrity monitoring methods, such as turbidity monitoring, should be considered warning methods for severe losses of integrity, rather than routine monitoring tools.

Data reported by Farahbakhsh et al. (2003) support the use of conventional turbidity monitoring as a method for detecting large integrity breaches in membrane systems. This study reported on the results of online integrity monitoring in which a major integrity breach of greater than 60 broken fibres out of 28 500 fibres in a membrane produced an increase in turbidity from 0.04 to 0.2 NTU.

In contrast to the Sethi et al. (2004) study, several authors have suggested that laser turbidimeters are a suitable alternative for membrane filtration monitoring, as their detection limit is lower and their sensitivity can be several orders of magnitude higher than that of conventional turbidimeters (Banerjee et al., 2000, 2001; U.S. EPA, 2001b; AWWA, 2005). Banerjee et al. (2000) demonstrated that a laser turbidimeter was capable of detecting an intentional breach in a microfiltration system: one cut fibre out of 5000 in a membrane cartridge was detected as an increase in turbidity from 14 mNTU (0.014 NTU) to over 250 mNTU (0.25 NTU). Laser turbidity systems equipped with sensors that can be installed on each membrane rack have also been used for the detection of integrity breaches at the module and fibre level (Naismith, 2005).

Since most membrane filtration systems consistently produce water with turbidity below 0.1 NTU, utilities should consider a sustained increase in turbidity above 0.1 NTU as an indicator of a potentially serious integrity breach. In general, when utilities are using turbidity monitoring for integrity testing, they should also use a more sensitive direct integrity testing method, such as pressure decay testing, to enable the detection and location of potential minor integrity breaches (Sethi et al., 2004; MWH, 2005).
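As an illustration of how such a criterion might be applied to continuous readings, the following is a minimal sketch, assuming 1-minute turbidity readings and an arbitrary 15-reading persistence window to distinguish a sustained increase from a brief spike or instrument noise; the sampling interval and window length are assumptions for illustration, not prescribed operational values.

```python
# Minimal sketch: flag a sustained filtrate turbidity exceedance above 0.1 NTU.
# The sampling interval and the persistence window are illustrative assumptions,
# not prescribed operational values.

from typing import Iterable

def sustained_exceedance(readings_ntu: Iterable[float],
                         threshold_ntu: float = 0.1,
                         consecutive_required: int = 15) -> bool:
    """Return True if the threshold is exceeded for the required number of
    consecutive readings, suggesting a possible integrity breach rather than
    a momentary spike."""
    run = 0
    for value in readings_ntu:
        run = run + 1 if value > threshold_ntu else 0
        if run >= consecutive_required:
            return True
    return False

# A brief spike does not trigger the flag; a sustained rise does.
print(sustained_exceedance([0.05, 0.25, 0.05] * 10))   # False
print(sustained_exceedance([0.05] * 5 + [0.15] * 20))  # True
```

In practice, any such flag would be followed up with a direct integrity test, such as a pressure decay test, to confirm and locate the breach, as noted above.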

6.5 Other technologies

6.5.1 Bag and cartridge filtration

Bag filtration and cartridge filtration are alternative technologies that can be used for the reduction of particulate matter, including turbidity, in drinking water. These technologies do not have a turbidity guideline value, as there is wide variation in the turbidity reduction that can be achieved, and studies have not found a relationship between turbidity, or other parameters such as media pore size or pressure drop, and the removal efficiency of protozoa (U.S. EPA, 2006b). However, as many small drinking water systems use bag and cartridge filtration technologies, a brief description of these processes and of turbidity reduction and monitoring is provided below.

Bag filtration and cartridge filtration are considered to be pressure-driven physical separation processes that remove particles greater than 1 µm using a porous filtration medium. Bag filters are typically constructed of a woven bag or fabric filtration medium that is placed in a pressure vessel. As water flows from the inside of the bag to the outside, contaminants are filtered out of the water. Cartridge filters are typically made of a semi-rigid or rigid wound filament that is housed in a pressure vessel in which water flows from the outside of the cartridge to the inside. Systems can be constructed with either single or multiple filters within one pressure vessel. It is recommended that all components used in bag and cartridge filters be certified under NSF International (NSF)/American National Standards Institute (ANSI) Standard 61: Drinking Water System Components--Health Effects. This standard ensures the material safety and performance of products that come into contact with drinking water (NSF/ANSI, 2012a).

Bag and cartridge filters remove particles in the water by physically screening those that are greater in size than the filter medium pore size. Bag filters typically have pore sizes that range from 1 to 40 µm, and those of cartridge filters typically range from 0.3 to 80 µm; therefore, selection of the type of filter that is most suitable for a system depends partially on the size of the particles and the level of turbidity in the source water (U.S. EPA, 1997b, 2003c).

Generally, bag and cartridge filters are used as primary filtration systems in small and very small drinking water systems; more recently, however, they have also been used in larger systems as secondary filtration processes, following existing primary filtration, to obtain supplemental contaminant removal. When bag or cartridge filtration is used as a primary filtration method, the raw water is often prefiltered to remove larger particles before the bag or cartridge filtration step. In some cases, bag and/or cartridge filters are placed in series, with larger pore size units (greater than 10 µm) placed first, followed by smaller pore size units (1-5 µm) as final filter units (U.S. EPA, 1997b). As a secondary filtration step, only smaller pore size units are used. Although bag and cartridge filters can accommodate some high-turbidity source water, the turbidity should generally be below 10 NTU for effective filtration (U.S. EPA, 1997b; Cleasby and Logsdon, 1999).

As there is a wide range in the pore sizes of bag and cartridge filters, the level of turbidity reduction is also highly variable. A study by Li et al. (1997) demonstrated that, depending on the type of bag filter used, turbidity removal could vary from 0.03-log to 1.89-log. Filtered water turbidity values in this study ranged from 0.14 to 9.87 NTU. Although turbidity has limitations as an indicator of filter failure in bag and cartridge filtration, it is nonetheless recommended as a performance indicator for these systems. The frequency of monitoring may vary depending on the source water quality; however, at a minimum, effluent turbidity should be monitored daily (Cleasby and Logsdon, 1999; U.S. EPA, 2003c).
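For reference, log removal is calculated as the base-10 logarithm of the ratio of influent to effluent turbidity. The short sketch below works through this arithmetic; the influent value of 10 NTU is assumed for illustration and is not taken from the Li et al. (1997) study.

```python
# Worked example of the log-removal arithmetic for turbidity across a filter.
# The influent value used here (10 NTU) is illustrative only.
import math

def log_removal(influent_ntu: float, effluent_ntu: float) -> float:
    """Base-10 log reduction in turbidity across a filter."""
    return math.log10(influent_ntu / effluent_ntu)

print(round(log_removal(10.0, 0.14), 2))  # ~1.85-log removal
print(round(log_removal(10.0, 9.87), 2))  # ~0.01-log removal (minimal reduction)
```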

6.5.2 Additional strategies

A number of additional strategies are available for reducing turbidity in source water. These include, but are not limited to, riverbank filtration, lime softening, pre-sedimentation and dual-stage filtration. Generally, these processes are used early in a drinking water treatment train to reduce the level of particulates in the water for subsequent treatment and to enhance the overall particulate removal capabilities of a plant. In most cases, turbidity can be used to monitor the effectiveness of these processes. More detailed discussions on the use of these technologies for turbidity reduction can be found in Kawamura (2000), Ray et al. (2002) and U.S. EPA (2010).

6.6 Residential-scale treatment

Generally, the use of drinking water treatment devices to provide additional treatment of municipally treated water is not recommended. In cases where an individual household obtains its drinking water from a private well, a residential drinking water treatment device may be an option for reducing turbidity levels in drinking water. It should be noted that microbiological contamination of a well water supply may occur in conjunction with routinely high turbidity measurements and/or sudden increases in turbidity. Therefore, the microbiological quality of the water should be considered prior to selection of a drinking water treatment device.

Health Canada does not recommend specific brands of private residential drinking water treatment devices, but it strongly recommends that consumers use devices that have been certified by an accredited certification body as meeting the appropriate NSF/ANSI drinking water treatment unit standards. These standards have been designed to safeguard drinking water by helping to ensure the material safety and performance of products that come into contact with drinking water. Certified devices for the reduction of turbidity from drinking water in residential systems generally rely on carbon filtration and reverse osmosis treatment processes.

Certification organizations provide assurance that a product conforms to applicable standards and must be accredited by the Standards Council of Canada (SCC). In Canada, the following organizations have been accredited by the SCC to certify drinking water treatment devices and materials as meeting NSF/ANSI standards:

  • Canadian Standards Association International (www.csa-international.org);
  • NSF International (www.nsf.org);
  • Water Quality Association (www.wqa.org);
  • Underwriters Laboratories Inc. (www.ul.com);
  • Quality Auditing Institute (www.qai.org); and
  • International Association of Plumbing & Mechanical Officials (www.iapmo.org).

An up-to-date list of accredited certification organizations can be obtained from the SCC (www.scc.ca).

NSF/ANSI Standard 53 (Drinking Water Treatment Units--Health Effects) is applicable to the reduction of turbidity in drinking water. For a drinking water treatment device to be certified to Standard 53, it must be capable of reducing a turbidity level of 11 NTU ± 1 NTU to not more than 0.5 NTU (NSF/ANSI, 2011).

NSF/ANSI Standard 58 (Reverse Osmosis Drinking Water Treatment Systems) is also applicable to the reduction of turbidity in drinking water. For a drinking water treatment device to be certified to Standard 58, it must be capable of reducing a turbidity level of 11 NTU ± 1 NTU to not more than 0.5 NTU (NSF/ANSI, 2012b). Certified reverse osmosis systems are intended for point-of-use installation only, as larger quantities of influent water are needed to obtain the required volume of treated water, which is generally not practical for residential-scale point-of-entry systems. In addition, water that has been treated using reverse osmosis may be corrosive to internal plumbing components, which is a further reason to install these devices at the point of use.
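As a quick check of what the certification criterion shared by Standards 53 and 58 implies, the short calculation below converts the nominal challenge level and the maximum allowable effluent turbidity into percent and log reductions; it is provided for illustration only and is not part of the standards' test protocols.

```python
# Implied minimum turbidity reduction under the NSF/ANSI 53 and 58 certification
# criterion (challenge water at 11 NTU +/- 1 NTU, effluent no more than 0.5 NTU).
import math

challenge_ntu = 11.0      # nominal challenge turbidity
max_effluent_ntu = 0.5    # maximum allowable effluent turbidity

percent_reduction = 100 * (1 - max_effluent_ntu / challenge_ntu)
log_reduction = math.log10(challenge_ntu / max_effluent_ntu)

print(round(percent_reduction, 1))  # ~95.5 %
print(round(log_reduction, 2))      # ~1.34-log
```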

Before a drinking water treatment device is installed, the water should be tested to determine general water chemistry and verify the level of turbidity. Periodic testing on-site by a water treatment specialist using a portable turbidimeter should be conducted on both the water entering the treatment device and the water it produces to verify that the treatment device is effective. Devices can lose removal capacity through usage and time and need to be maintained and/or replaced. Consumers should verify the expected longevity of the components in their treatment device as per the manufacturer's recommendations.
