Turbidity at the levels defined in the scope of this test method is often monitored to help control processes, monitor the health and biology of aquatic environments, and determine the impact of changes in response to environmental events (weather events, floods, etc.). Turbidity is often undesirable in drinking water, plant effluent waters, water for food and beverage processing, and many other water-dependent manufacturing processes. Removal is often accomplished by coagulation, sedimentation, and various levels of filtration. Measurement of turbidity provides an indicator of contamination and is a vital measurement for monitoring the characteristics or quality of the sample's source or process.
This test method overlaps Test Method D6855 for the range of 1–5 TU. If the predominant measurement falls below 1.0 TU with occasional spikes above this value, Test Method D6855 may be more applicable. For measurements that are consistently above 1 TU, this test method is applicable.
This test method is suitable for turbidity such as that found in all waters that measure above 1 NTU. Examples include environmental waters (streams, rivers, lakes, reservoirs, estuaries), processes associated with water pollution control plants (wastewater treatment plants), and various industrial processes involving water with noticeable turbidity. For measurement of cleaner waters, refer to Test Method D6855.
The appropriate measurement range for a specific technology or instrument type is at or below 80 % of the full-scale capability of the respective instrument or technology. Measurements above this level may not be dependable.
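The 80 %-of-full-scale guideline above can be expressed as a simple check. This is an illustrative sketch, not part of the test method; the function name and example full-scale value are assumptions for the example.

```python
def within_recommended_range(reading_tu, full_scale_tu):
    """Return True if a reading is at or below 80 % of the instrument's
    full-scale capability, per the guideline in this section."""
    return reading_tu <= 0.8 * full_scale_tu

# Example: an instrument with a hypothetical 4000 TU full-scale range
# is dependable up to 3200 TU.
print(within_recommended_range(3000, 4000))  # True: within 80 % of full scale
print(within_recommended_range(3500, 4000))  # False: may not be dependable
```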
Dilutions of waters are not recommended, especially in the case of samples with rapidly settling particles (that is, sediments). It is recommended that an appropriate instrument design that covers the expected range be selected to avoid the need to perform dilutions.
Technologies described in this standard may not measure all aspects (absorption and scatter) of a sample. Some properties of the water, the suspended material, or both may interfere with the specific property of the sample that the particular instrument measures, such as the scattering of light.
Several different technologies are available for use in the measurement of high-level turbidity. Some technologies may be better suited for specific types of samples, depending on the application and measurement criteria. Refer to Table 1 and the flow chart in Appendix X1 to assist in selecting the best technology for the specific application.
When measuring high levels of turbidity, samples will often contain significant interferences such as absorbing particles, absorbance in the matrix, and rapidly settling particles. These may have a significant impact on how a measurement technology responds to changes in turbidity. It is often prudent to run a series of linear dilutions to determine whether the measured response is as expected relative to the dilution. In cases where the response to the dilution ratio is linear, the technology may be adequately accounting for the interferences. If the response is not as expected, another technology should be considered to determine whether a more accurate measurement can be obtained.
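The dilution-series linearity check described above can be sketched as follows. The 10 % acceptance tolerance is an assumption chosen for illustration, not a value specified by this test method; the function name and sample readings are likewise hypothetical.

```python
def dilution_linearity(readings, tolerance=0.10):
    """Check whether measured turbidity scales linearly with dilution.

    readings: list of (dilution_fraction, measured_tu) pairs, where
    dilution_fraction is 1.0 for the undiluted sample, 0.5 for a 1:1
    dilution, and so on. The 10 % tolerance is an assumed acceptance
    criterion for this sketch.
    """
    base = dict(readings)[1.0]  # undiluted measurement as reference
    for fraction, measured in readings:
        expected = base * fraction
        if abs(measured - expected) > tolerance * expected:
            return False  # non-linear response: consider another technology
    return True

# A response that tracks the dilution ratio within tolerance:
series = [(1.0, 2000.0), (0.5, 1010.0), (0.25, 495.0)]
print(dilution_linearity(series))  # True
```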
When reporting the measured result, appropriate units should also be attached. The units reflect the technology used to generate the measurement. The intention is to provide traceability for the technology used to generate the measured result and, if necessary, to allow more adequate comparison to historical data. Section 7 describes the technology on which each type of traceable reporting unit is based.
Table 1 contains the list of technologies and respective reporting units that will be traceable to that technology.
The methods in Table 1 can be broken into two distinct groups of designs based on the type of incident light source used. The first group uses a broad-band white light source or light sources that provide a spectral output in the 400–680 nm range. These include polychromatic light sources, such as those necessary to comply with regulatory method USEPA Method 180.1, but can also include monochromatic light sources if the respective wavelength falls within the specified range. The second group of instruments uses a near-IR monochromatic light source in the range of 780 to 900 nm. These designs are distinguishable by their reporting units, which always begin with the letter F.
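The two design groups above are distinguished by the wavelength of the incident light source, and a source outside both ranges is reported in TU with a subscripted wavelength (see 7.4.3). A minimal sketch of that classification, with descriptive group labels chosen for this example only:

```python
def light_source_group(wavelength_nm):
    """Classify an incident light source into one of the two design
    groups described in this section. Group labels are illustrative;
    the test method distinguishes the groups by their reporting units."""
    if 400 <= wavelength_nm <= 680:
        return "white/visible (400-680 nm)"  # includes USEPA Method 180.1 sources
    if 780 <= wavelength_nm <= 900:
        return "near-IR (780-900 nm)"        # reporting units begin with F
    return "outside defined ranges"          # report as TU with subscripted wavelength

print(light_source_group(860))  # near-IR (780-900 nm)
```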
For a specific design that falls outside of these reporting ranges, the turbidity should be reported in turbidity units (TU) with a subscripted wavelength value to characterize the light source that was used. See 7.4.3.
The designs listed in Table 1 are those currently identified by the ASTM subcommittee. Future designs that are not covered in this document may be incorporated into a future revision after review by the method subcommittee.
See Section 7 for more details regarding instrument designs.
Section 17 contains precision and bias data that incorporate the different classifications of technologies. The precision and bias section includes the overall data set from all laboratories, as well as smaller segments of this data set, to provide comparisons across the distinguishing technological features exhibited by the technologies represented in this test method.
This test method covers the measurement of samples collected from waters and analyzed using typical laboratory-based or portable instruments.
Scope

1.1 This test method covers the static determination of turbidity in water. Static refers to a sample that is removed from its source and tested in an isolated instrument. (See Section 4.)
1.2 This test method is applicable to the measurement of turbidities greater than 1.0 turbidity unit (TU). The upper end of the measurement range was left undefined because different technologies described in this test method can cover very different ranges. The round robin study covered the range of 0–4000 turbidity units because instrument verification in this range can typically be covered by standards that can be consistently reproduced.
1.3 Many of the turbidity units and instrument designs covered in this test method are numerically equivalent in calibration when a common calibration standard is applied across those designs listed in Table 1. Measurement of a common calibration standard of a defined value will also produce equivalent results across these technologies.
1.3.1 In this test method, calibration standards are often defined in NTU values, but the other assigned turbidity units, such as those in Table 1, are equivalent. For example, a 1 NTU formazin standard is also a 1 FNU, a 1 FAU, a 1 BU, and so forth.
1.4 This test method does not purport to cover all available technologies for high-level turbidity measurement.
1.5 This test method was tested on different natural waters and wastewater, and with standards that will serve as surrogates to samples. It is the user's responsibility to ensure the validity of this test method for waters of untested matrices.
1.6 Depending on the constituents within a high-level sample, the proposed sample preparation and measurement methods may or may not be applicable. Those samples with the highest particle densities typically prove to be the most difficult to measure. In these cases, an alternative measurement method such as the process monitoring method can be considered.
1.7 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use. Refer to the MSDSs for all chemicals used in this procedure.
TABLE 1 Summary of Known Instrument Designs, Applications, Ranges, and Reporting Units
Design and