Page 4: Guidelines for Canadian Drinking Water Quality: Guideline Technical Document – Turbidity
Part I: Overview and Application (continued)
3.0 Application of the guideline
Note: Specific guidance related to the implementation of drinking water guidelines should be obtained from the appropriate drinking water authority in the affected jurisdiction.
The turbidity limit that applies to a drinking water system depends on a variety of factors, including requirements to meet pathogen removal goals, the type of treatment technology used, and the location in the drinking water system. The HBTLs for individual filtration technologies were established so that the physical removal credits given for enteric viruses and protozoa, combined with disinfection, will achieve similar levels of public health protection across the various types of treatment systems. Generally, minimum treatment of supplies whose source is either surface water or groundwater under the direct influence of surface water (GUDI) should include adequate filtration (or technologies providing an equivalent log reduction credit) and disinfection. Surface water is defined as all waters open to the atmosphere and subject to surface runoff. GUDI is a groundwater supply that is vulnerable to surface water contamination or contamination by pathogens and, as such, should be treated as a surface water supply.
Turbidity has different implications for water quality and treatment depending on the nature of the particles involved and the location of the turbidity within the drinking water supply chain. An understanding of the type and source of the turbidity can be valuable when assessing the implications on the water quality or treatment. High turbidity measurements or measurement fluctuations can indicate inadequate water treatment, changes in source water quality, or disturbances in the distribution system.
As part of the multi-barrier approach to drinking water treatment, pathogen physical log removal credits should be used in conjunction with disinfection credits to meet or exceed overall treatment goals. Specific information pertaining to pathogen reduction requirements can be found in the guideline technical documents for enteric protozoa and enteric viruses. Because of the potential relationship between turbidity levels and microorganisms, this document should be read in conjunction with all guideline technical documents on microbiological parameters.
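The combination of physical removal credits and disinfection credits described above can be sketched as a simple accounting exercise. The function and the credit values below are hypothetical illustrations, not figures from this guideline; actual credits are assigned by the responsible authority for each technology and pathogen.

```python
# Illustrative sketch (assumed values, not regulatory figures): summing
# physical log removal credits and disinfection credits against overall
# treatment goals for each pathogen group.

TREATMENT_GOALS = {"protozoa": 3.0, "viruses": 4.0}  # minimum log reductions

def meets_goals(physical_credits, disinfection_credits, goals=TREATMENT_GOALS):
    """Return, per pathogen group, the total log credit and whether the goal is met."""
    result = {}
    for pathogen, goal in goals.items():
        total = (physical_credits.get(pathogen, 0.0)
                 + disinfection_credits.get(pathogen, 0.0))
        result[pathogen] = (total, total >= goal)
    return result

# Hypothetical example: filtration credits plus primary disinfection credits.
physical = {"protozoa": 3.0, "viruses": 2.0}
disinfection = {"protozoa": 0.0, "viruses": 2.0}
print(meets_goals(physical, disinfection))
```

A shortfall for any pathogen group would indicate that additional disinfection (or a different treatment train) is needed to meet or exceed the overall goal.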
3.1 System-specific guidance
3.1.1 Pathogen removal
Where turbidity reduction is required as part of a strategy to meet pathogen removal goals, filtration systems should be designed and operated to reduce turbidity levels as low as reasonably achievable. At a minimum, these systems should meet the HBTL applicable to their specific treatment technologies. The filtration technologies discussed all employ monitoring of turbidity in the treated water as a tool for assessing the performance of the water treatment processes. Since the levels of turbidity achievable in filtered water and the associated potential pathogen removal vary depending on the pretreatment and the filtration technology used, different HBTL have been established to apply to each of the treatment technologies. It should be noted that the interpretation and implications of turbidity monitoring results may vary significantly between different filtration technologies and treatment scenarios.
For all filtration technologies, the HBTL for systems that are filtering to meet pathogen removal goals apply specifically to the turbidity of effluent water from individual filters. However, it is recommended that both the individual filter effluent turbidity and the combined filter (or clearwell or tank) effluent turbidity be continuously monitored. Continuous monitoring of the effluent turbidity from each individual filter is necessary to (1) ensure that each filter is functioning properly; (2) help determine when to end filter runs; and (3) detect any short-term or rapid increases in turbidity that represent a process failure and a potential health risk. Continuous monitoring of the combined filter effluent turbidity in the clearwell or tank will help ensure that the quality of the water entering the distribution system has not deteriorated following filtration.
3.1.1.1 Interpretation of the guideline
The HBTLs apply to turbidity that is measured in the effluent of individual filters during the period of filter operation when the effluent water is being disinfected and distributed to consumers. They are based on meeting the specified values in the proportion of measurements indicated in Section 1.0 (95% or 99%) and also include "never to exceed" values.
Assessing whether a system's performance satisfies the HBTL in at least 95% or 99% of turbidity measurements per filter cycle or per month requires the collection of data over a period of time. The analysis of these data will then dictate whether further actions are needed to improve filter effluent turbidity. Action should be initiated if the applicable HBTL is exceeded. It is not the intent of the guideline to allow filters to operate above the HBTLs for reasons that can be foreseen, controlled or minimized. In many cases, systems will be able to remain below these HBTLs 100% of the time. Any turbidity reading above the "never to exceed" value should be addressed immediately.
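The two-part check described above can be sketched in code: a limit that must be met in at least the specified proportion of measurements, plus a "never to exceed" value. This is an illustrative sketch only; the function name and the example readings are assumed, and the thresholds would be those of the applicable filtration technology.

```python
# Hypothetical sketch of the two-part HBTL assessment: (1) the limit must be
# met in at least the specified fraction of measurements (95% or 99%), and
# (2) no single reading may exceed the "never to exceed" value.

def assess_hbtl(readings_ntu, limit, never_exceed, fraction=0.95):
    """Evaluate filter-effluent turbidity readings against an HBTL."""
    if not readings_ntu:
        raise ValueError("no readings supplied")
    within = sum(1 for r in readings_ntu if r <= limit)
    share = within / len(readings_ntu)
    meets_fraction = share >= fraction
    never_exceed_ok = max(readings_ntu) <= never_exceed
    return {
        "fraction_within_limit": share,
        "meets_fraction": meets_fraction,
        "never_exceed_ok": never_exceed_ok,
        "compliant": meets_fraction and never_exceed_ok,
    }

# Conventional/direct filtration example: limit 0.3 NTU (95% of measurements),
# never exceed 1.0 NTU. Here 9 of 10 readings meet 0.3 NTU (90%), so the
# 95% criterion is not met even though no reading exceeds 1.0 NTU.
readings = [0.08, 0.09, 0.12, 0.10, 0.28, 0.11, 0.09, 0.10, 0.35, 0.10]
print(assess_hbtl(readings, limit=0.3, never_exceed=1.0))
```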
The actions initiated to address exceedances of the HBTL will be dependent on site-specific considerations and should be determined by the responsible authority on a case-by-case basis, taking into account local knowledge of the system's capabilities and performance. Examples of possible actions may include conducting an investigation of filter performance or initiating corrective actions such as repairs, maintenance or removing the filter from service.
The turbidity target of 0.1 NTU provides the benchmark for assessing system optimization and for comparing improvements over time. Systems employing filtration for pathogen removal and meeting the applicable HBTL should strive to meet the treated water turbidity target of less than 0.1 NTU. Utilities should be aware that filters that meet the target of 0.1 NTU may achieve greater pathogen removal and improve overall public health protection. Developing and implementing a system optimization plan will improve filter performance and help ensure that systems are achieving the appropriate log removal credits.
3.1.1.2 Conventional and direct filtration
Conventional and direct filtration systems should strive to achieve a treated water turbidity target of less than 0.1 NTU at all times. Where this is not achievable or optimization has not yet been attained, it is considered acceptable for the treated water turbidity from individual filters to be less than or equal to 0.3 NTU. In general, all filters should be designed so that the filtered water produced immediately after filter backwashing is directed into a waste stream ("filter-to-waste"). Turbidity levels should be consistently kept below 0.3 NTU (with a target of less than 0.1 NTU) throughout the entire filter cycle, with the exception of the filter-to-waste period. However, it is recognized that some systems, such as those that are not filtering to waste, may not be able to achieve this value 100% of the time. Comparison of a system's performance with the HBTL for 95% of turbidity measurements per filter cycle or per month allows utilities to establish operational procedures that are effective for each individual system. However, utilities should be aware that any turbidity measurement above 0.3 NTU may result in lower pathogen removal. Waterworks systems using conventional or direct filtration should investigate and minimize any occurrences of turbidity levels above 0.3 NTU.
The value of 1.0 NTU is identified as "never to exceed" because readings above this value suggest a significant problem with filter performance and subsequent disinfection efficacy may be impacted. Any turbidity values above 1.0 NTU should be investigated and addressed immediately.
3.1.1.3 Slow sand and diatomaceous earth filtration
Slow sand and diatomaceous earth filtration systems should also strive to achieve a treated water turbidity target of less than 0.1 NTU at all times. Although particulate removal using slow sand filtration may not achieve the same turbidity levels as conventional filtration, reducing turbidity to as low as possible (target of 0.1 NTU) remains important and helps ensure that a slow sand filtration plant has been properly designed and is being well operated. Where this is not achievable or optimization has not yet been attained, it is considered acceptable for the treated water turbidity from individual filters to be less than or equal to 1.0 NTU. The value of 1.0 NTU is intended to apply throughout the entire filter cycle, with the exception of the filter-to-waste period. Slow sand filters should be operated to waste after starting or scraping until the filter effluent is consistently less than the standard required for the system. Waterworks systems using slow sand or diatomaceous earth filtration should investigate and minimize any occurrences of turbidity levels above 1.0 NTU.
Comparison of a system's performance with the HBTL for 95% of turbidity measurements per filter cycle or per month allows utilities to establish operational procedures that are effective for each individual system. Operators of slow sand and diatomaceous earth filtration systems should compare readings with operational monitoring records and flag any results above 1.0 NTU as exceedances of the HBTL. Utilities should be aware that any turbidity measurement above 1.0 NTU may result in lower pathogen removal and subsequent disinfection efficacy may be impacted.
The value of 3.0 NTU is stated as "never to exceed" because such significant exceedances suggest a major problem with performance. Any turbidity levels above 3.0 NTU should be investigated and addressed immediately.
3.1.1.4 Membrane filtration
Membrane filtration systems should reduce turbidity to as low as reasonably achievable. Turbidity measurements from membrane filter units should be below 0.1 NTU when membranes are intact and functioning properly. An individual membrane unit may be defined as a group of pressure vessels, cartridges or modules that are valved and isolated from the rest of the system for testing and maintenance. Any increase in turbidity above 0.1 NTU should be considered a potential breach in the integrity of either the membrane filtration unit or an individual filter cartridge. However, recognizing that the measurement of turbidity values below 0.1 NTU is more likely to be affected by the sensitivity of the turbidimeter to measurement error at lower turbidities, it may not be possible for 100% of measurements to be below this value. Therefore, comparison of a system's performance with the HBTL for 99% of turbidity measurements per filter operation period or per month allows utilities to establish operational procedures that are effective for each individual system. To allow systems some flexibility for addressing any uncertainty in turbidity measurements but also recognizing that any values above 0.1 NTU may represent an integrity breach, measurements greater than 0.1 NTU for a period of greater than 15 minutes should immediately trigger an investigation of the membrane unit integrity.
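The integrity trigger described above, readings above 0.1 NTU sustained for more than 15 minutes, can be sketched as a simple run-length check. This is an illustrative sketch assuming readings at a fixed interval (e.g., every five minutes, per the monitoring guidance in this document); the function name is hypothetical.

```python
# Hypothetical sketch of the membrane integrity trigger: flag when turbidity
# stays above 0.1 NTU for a period greater than 15 minutes. Assumes readings
# arrive at a fixed interval, expressed in minutes.

def integrity_alert(readings_ntu, interval_min=5, threshold=0.1,
                    max_duration_min=15):
    """Return True when consecutive exceedances span more than max_duration_min."""
    run = 0  # count of consecutive readings above the threshold
    for r in readings_ntu:
        if r > threshold:
            run += 1
            # (run - 1) intervals have elapsed since the first exceedance
            if (run - 1) * interval_min > max_duration_min:
                return True
        else:
            run = 0
    return False

# With 5-minute readings, four consecutive exceedances span exactly 15 minutes
# (no alert); a fifth pushes the run past 15 minutes and triggers an alert.
print(integrity_alert([0.05, 0.12, 0.13, 0.12, 0.11]))        # False
print(integrity_alert([0.05, 0.12, 0.13, 0.12, 0.11, 0.12]))  # True
```

An alert would prompt an immediate investigation of the membrane unit's integrity, for example through direct integrity testing of the isolated unit.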
3.1.2 Other systems
While this guideline technical document is intended primarily for systems using surface water sources or GUDI that are filtering to meet pathogen removal goals, it is also important to understand the nature of the turbidity and to control its levels in other systems. In some cases, systems may be filtering for reasons other than pathogen removal, such as removing disinfection by-product precursors, improving the effectiveness of subsequent disinfection, or ensuring consumer acceptance. In other cases, systems may have technologies other than filtration in place, such as ultraviolet (UV) disinfection, to provide reduction of certain pathogens. For these systems, a turbidity level of 1.0 NTU or less is recommended. Turbidity levels above this value may be acceptable depending on a variety of factors, including the source water quality, the nature of the particles causing the turbidity and the design and operation of the treatment system. This should be evaluated on a case-by-case basis to ensure that the appropriate pathogen inactivation is achieved. The responsible authority may choose to allow turbidity increases for individual systems, in light of a risk assessment that takes into account local knowledge of the system's capabilities and performance.
For systems that use groundwater that is not under the direct influence of surface water, which are considered less vulnerable to faecal contamination, turbidity should generally be below 1.0 NTU. Best practice for these systems includes appropriate well siting, construction and maintenance, as well as monitoring source water turbidity and ensuring that turbidity levels do not interfere with the disinfection and distribution of the water supply. In some cases, a less stringent value for turbidity may be acceptable if it is demonstrated that the system has a history of acceptable microbiological quality and that a higher turbidity value will not compromise disinfection. The responsible authority may choose to allow turbidity increases for individual systems, in light of a risk assessment that takes into account local knowledge of the system's capabilities and performance.
In keeping with the multi-barrier approach to drinking water quality management, systems using groundwater sources should:
- ensure that groundwater wells are properly constructed and maintained, are located in areas where there is minimum potential for contamination and have appropriate wellhead protection measures in place; these source water protection measures protect public health by reducing the risk of contamination of the drinking water source; and
- ensure that treatment is sufficient to achieve a 4-log reduction of viruses by disinfection where required; it is important to confirm that elevated turbidity levels will not compromise any disinfection process in place, including residual disinfection in the distribution system.
3.2 Distribution system
All drinking water systems should monitor and control turbidity throughout the entire distribution system including areas with long retention times, decreased disinfectant residual, or that have demonstrated deteriorating water quality. For effective operation of the distribution system, it is good practice to ensure that water entering the distribution system has turbidity levels below 1.0 NTU. Increases in distribution system turbidity can be indicative of deteriorating water quality and it is good practice to minimize turbidity fluctuations. Increases in turbidity can be sudden or can gradually increase over time. Although some variation in turbidity is normal, increases above typical turbidity levels measured during routine monitoring can provide an indication of potential contamination or stagnation. If an unusual, rapid, or unexpected increase in turbidity levels does occur, the system should be inspected and the cause determined.
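One simple way to operationalize "increases above typical turbidity levels measured during routine monitoring" is to compare each new reading against the statistics of the historical record. The sketch below is illustrative only; the tolerance factor and example readings are assumptions, and actual trigger levels should reflect each system's own monitoring history.

```python
# Hypothetical sketch: flag a distribution-system turbidity reading that rises
# unusually above the levels typically seen during routine monitoring.
from statistics import mean, stdev

def flag_unusual(history_ntu, new_reading, k=3.0):
    """Flag a reading more than k standard deviations above the historical mean."""
    baseline = mean(history_ntu)
    spread = stdev(history_ntu)  # sample standard deviation of routine readings
    return new_reading > baseline + k * spread

# Assumed routine monitoring record for one distribution-system location.
routine = [0.2, 0.25, 0.22, 0.3, 0.24, 0.26, 0.21, 0.28]
print(flag_unusual(routine, 0.27))  # within normal variation
print(flag_unusual(routine, 0.9))   # unusual increase: inspect and determine cause
```

A flagged reading would not by itself signal contamination, but it would prompt inspection of the system and comparison with other parameters such as disinfectant residual.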
Turbidity monitoring is used in conjunction with indicators of microbiological quality, including disinfectant residual and organisms such as Escherichia coli (E. coli), heterotrophic plate counts (HPC) and total coliforms, to verify that there is no evidence of a recent deterioration of the microbiological quality of water in the distribution system.
Turbidity increases can have different origins, which vary considerably in the threat they can pose to water quality and public health. It is therefore not possible to establish an across-the-board maximum value for turbidity in the distribution system that can be used to make public health decisions and be protective in all situations. The responsible authority may choose to allow turbidity increases for individual systems, in light of a risk assessment that takes into account local knowledge of the system's capabilities and performance.
3.3 Monitoring turbidity levels
While the primary focus of this document relates to monitoring turbidity at the treatment stage, turbidity can also be monitored in combination with other easily measured parameters from source to tap in order to better understand the status of the overall drinking water system and identify changing conditions. In many cases, changes in or exceedances of turbidity levels will trigger sampling for additional parameters that will help provide information on the status of the drinking water system.
3.3.1 Monitoring turbidity of source water
Monitoring turbidity levels in surface water and GUDI sources provides useful information that enhances overall system knowledge. Source water turbidity data are essential to ensure the appropriate design and operation of the treatment plant. Monitoring of the source water can identify changing source water conditions, such as a decline in source water quality, higher loadings of pathogens and increased challenges to filtration and disinfection. It also helps establish historical trends that characterize changing source water conditions.
Monitoring turbidity levels in groundwater sources provides key information for on-going health protection. Consistently low turbidity levels observed through varying seasons and weather conditions can help provide assurance that the well and aquifer remain less vulnerable to faecal contamination. On the other hand, observed increases in turbidity after a significant rain event, for example, can provide an indication of changes in the groundwater system in the vicinity of the well or a crack in the well casing, thereby prompting the operator to investigate and take corrective action.
3.3.2 Monitoring turbidity in treatment systems
For conventional and direct filtration (i.e., continuous feed of a coagulant with mixing ahead of filtration), source water turbidity levels should be measured at least daily just prior to the point of addition of treatment chemicals. Treated water turbidity levels from individual filters should be continuously measured (with an online turbidimeter) at intervals no longer than five minutes apart at a point in each individual filter effluent line. The combined filter effluent should also be monitored at some point downstream of the combined filter effluent line or the clearwell or tank. If turbidity monitoring occurs after the addition of some chemicals, such as lime, the chemical addition may increase the combined effluent turbidity relative to the turbidity of the water discharged from the filters.
For slow sand or diatomaceous earth filtration, treated water turbidity levels from individual filters should be continuously measured (with an online turbidimeter) at intervals no longer than five minutes apart at a point in each individual filter effluent line. The combined filter effluent should also be monitored at some point downstream of the combined filter effluent line or the clearwell or tank.
For membrane filtration, treated water turbidity levels from individual membrane units should be continuously measured (with an online turbidimeter) at intervals no longer than five minutes apart at a point in each individual filter effluent line. The combined filter effluent should also be monitored at some point downstream of the combined filter effluent line or the clearwell or tank. An individual membrane unit may be defined as a group of pressure vessels, cartridges or modules that are valved and isolated from the rest of the system for testing and maintenance. Consideration should be given in the design of membrane units to ensure that the level of sensitivity is sufficient to detect membrane breaches with turbidity monitoring or other integrity testing.
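The continuous monitoring guidance above specifies readings no longer than five minutes apart for each filter or membrane unit. A data-quality check for that interval can be sketched as follows; the function name and timestamps are illustrative assumptions, with timestamps expressed in minutes from the start of the record.

```python
# Hypothetical sketch: verify that online turbidimeter readings for a filter
# effluent line are no more than five minutes apart, per the guidance above.

def find_gaps(timestamps_min, max_interval=5):
    """Return (earlier, later) timestamp pairs where readings are too far apart."""
    gaps = []
    for earlier, later in zip(timestamps_min, timestamps_min[1:]):
        if later - earlier > max_interval:
            gaps.append((earlier, later))
    return gaps

# A 12-minute gap between t=10 and t=22 violates the 5-minute interval.
print(find_gaps([0, 5, 10, 22, 27, 32]))  # -> [(10, 22)]
```

Flagged gaps would indicate periods where the record cannot confirm filter performance, for example because of an instrument or data-logging failure.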
3.3.3 Monitoring turbidity within treated water storage and distribution systems
Monitoring turbidity in the distribution system can help identify areas where there may be changes to the water quality, such as biofilm growth, suspension of biofilms, release of corrosion products and disturbance of sediments. Monitoring turbidity in the distribution system may also provide an indication of potential contaminant intrusion from leaks, line breaks, pressure fluctuations or backflow. Turbidity within the distribution system can be monitored in conjunction with other parameters, such as pH, disinfectant residual and pressure, which also offer instant results on site. When integrated with routine monitoring activities in this way, deviations from normal conditions can be detected, and drinking water quality throughout the distribution system can be better understood. Similarly, turbidity measurements can inform maintenance schedules and aid in the detection of problems related to the condition of reservoirs, standpipes or holding tanks and infrastructure.
While such monitoring activities will aid in the detection of potential drinking water quality issues, decisions concerning corrective actions or the need for boil water advisories are made at the local or provincial/territorial levels. Such decisions would be based upon a risk management/risk assessment approach, taking into account other water quality parameters and site-specific knowledge. Increases in turbidity levels in the distribution system do not automatically signal the need for the issuance of a boil water advisory. However, unusual, rapid, or unexpected increases in distribution system turbidity can be indicative of deteriorating water quality and should be investigated.
3.4 Use of alternative filtration technology in drinking water systems
A filtration technology other than the technologies mentioned in section 1.0 may be used in a drinking water treatment plant. In cases where pathogen reduction goals need to be met, the treatment technologies selected, including disinfection, should reliably achieve a minimum 3-log reduction for Giardia lamblia cysts and Cryptosporidium oocysts and a minimum 4-log reduction for viruses. The turbidity values in section 1.0 may not be applicable to alternative technologies such as bag and cartridge filtration. Turbidity levels of filtered water from alternative technologies should be established by the responsible authority, taking into account data from challenge testing or other methods used to demonstrate the effectiveness of the filtration technology.
As options evolve through advancements in science and technology, including applications for small systems, waterworks are encouraged to apply validated improvements and optimize existing systems as a matter of best practice. Maintaining current knowledge of best practices and remaining aware of advancements in the drinking water industry are important aspects of the multi-barrier approach to safe drinking water.
3.5 Considerations for exempting drinking water systems from filtration requirements
While it is a fundamental recommendation that all surface water and GUDI sources be filtered prior to disinfection, the decision to exempt a waterworks from this requirement should be made by the appropriate authority based on site-specific considerations, including historical and ongoing monitoring data. The following summary provides a brief description of some of the main considerations relevant to the decision to exempt a waterworks from the filtration requirements:
- Vulnerabilities assessment: Ensure a detailed current understanding of hazards inherent to the water source. This may include sources of microbial or chemical contaminants, activities that may impact the water source and historic information on fluctuations in source water quality, which may affect the chosen approach to treatment over time. These characteristics of the watershed or wellhead area should be well documented and maintained in such a way as to inform ongoing risk management considerations.
- Source water protection: A thorough understanding of measures being taken by all stakeholders to protect the source water should be maintained and documented over time. This would include the policies and regulatory requirements of agencies such as conservation authorities, municipal and provincial governments and local stakeholder groups, as well as permitted activities or land use in the area, potential sources of contaminants and threats to source water quality.
- Inspection and verification: Undertake adequate inspection and preventative maintenance from source to tap on a regular basis. Activities should be well documented such that a history of maintenance, upgrades and optimization approaches can be demonstrated over time. This includes the verification of the proper function and integrity of monitoring devices, treatment and distribution components.
- Treatment: Whether or not filtration technology is in place, the drinking water treatment process must still achieve a minimum 3-log reduction of Giardia lamblia cysts and Cryptosporidium oocysts and a 4-log reduction of viruses. Utilities using surface water or GUDI that are considering not using filtration will need to treat source waters for all three types of organisms (protozoa, viruses and bacteria), using a multi-disinfectant strategy. A possible strategy includes (1) ultraviolet irradiation or ozone to inactivate cysts/oocysts, (2) chlorine to inactivate viruses, and (3) chlorine or chloramines to maintain a residual in the distribution system. Consideration may also be given to strategies that may enhance robustness at the treatment stage. For example, these may include pre-sedimentation or other control strategies for intermittent increases in source water turbidity. The drinking water treatment process will also need to be operated to minimize the formation of disinfection by-products.
- Distribution: The distribution system should be appropriately designed, maintained and monitored in accordance with established best practice, and a disinfectant residual should be maintained throughout the distribution system.
- Contingency or emergency response planning: Also recommended is a well-developed site-specific response plan for episodes of elevated source water turbidity brought about by extreme weather or other unforeseen changes in source water quality that may challenge the drinking water treatment system in place.