Decades of data collection may be necessary to accurately detect low-magnitude trends in water quality indicators of natural systems. Compensating for a lack of time by liberal acceptance of Type I errors (false trends) distributes risk inequitably among stakeholders when there is a priori knowledge of the trend direction. An efficient water quality monitoring program requires a clear goal, an efficient sampling plan, and an efficient change detection method if it is to reduce risks to all stakeholders. We evaluated the effect of trend analysis frequency and sampling network size on the Average Time to Signal of a linear trend detection method. For each combination of effects evaluated, we normalized the average time to a false signal under conditions of no real trend. We calculated the Average Time to Signal using a statistical power function derived from stream chemical composition data from 67 mountain watersheds in two provinces of the Mid-Appalachian Region. We assumed a trend magnitude of 0.5% yr⁻¹. The Average Time to Signal was reduced when trends were evaluated more frequently and when a larger network was used, even though the average time to detection of a false signal was held constant. The Average Time to Signal ranged from ~15 to ~35 years, depending on the trend evaluation frequency and the number of sites assumed to be in the network. These results underscore the importance of long-term monitoring programs and demonstrate that trends in indicators can be detected sooner if the measurement program is optimally designed.
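The abstract does not give the underlying power function, so the following Python sketch only illustrates one plausible way to compute an Average Time to Signal for a one-sided linear trend test while holding the expected time to a false signal constant across evaluation frequencies. All numeric values (residual standard deviation, quarterly sampling, a 50-year false-signal horizon, a 0.5 power target) and the simplifying assumptions (independent sites, independent evaluations, normal approximation) are hypothetical and are not taken from the study.

```python
"""Illustrative sketch (not the authors' method): approximate Average Time to
Signal (ATS) for a one-sided linear trend test, as a function of trend
evaluation frequency and monitoring network size."""
import numpy as np
from scipy import stats


def slope_test_power(trend, sigma, n_sites, samples_per_year, years, alpha):
    """Approximate one-sided power to detect a linear trend of size `trend`
    (units per year) after `years` of monitoring, averaging `n_sites`
    independent sites each sampled `samples_per_year` times per year.
    Uses the normal approximation to the test on the regression slope."""
    n = int(round(samples_per_year * years))
    if n < 3:
        return 0.0
    t = np.arange(n) / samples_per_year           # sampling times in years
    s_tt = np.sum((t - t.mean()) ** 2)            # spread of sampling times
    se_slope = sigma / np.sqrt(n_sites * s_tt)    # std. error of fitted slope
    delta = trend / se_slope                      # noncentrality parameter
    z_crit = stats.norm.ppf(1.0 - alpha)          # one-sided critical value
    return float(stats.norm.cdf(delta - z_crit))


def average_time_to_signal(trend, sigma, n_sites, samples_per_year,
                           evals_per_year, false_signal_years=50.0,
                           power_target=0.5, max_years=100):
    """Crude ATS proxy: the first evaluation epoch at which test power reaches
    `power_target`, with the per-test alpha deflated so the expected time to a
    *false* signal is roughly `false_signal_years` no matter how often trends
    are evaluated (assumes independent evaluations, which is a simplification)."""
    alpha = 1.0 / (evals_per_year * false_signal_years)
    for k in range(1, int(max_years * evals_per_year) + 1):
        years = k / evals_per_year
        if slope_test_power(trend, sigma, n_sites,
                            samples_per_year, years, alpha) >= power_target:
            return years
    return np.inf


if __name__ == "__main__":
    # Hypothetical scenario: residual SD of 10 concentration units, a true
    # trend of 0.5 units per year, quarterly sampling at every site.
    for evals_per_year in (0.2, 1.0, 4.0):        # every 5 yr, yearly, quarterly
        for n_sites in (1, 10, 67):
            ats = average_time_to_signal(trend=0.5, sigma=10.0,
                                         n_sites=n_sites, samples_per_year=4,
                                         evals_per_year=evals_per_year)
            print(f"evals/yr={evals_per_year:>4}, sites={n_sites:>3}: "
                  f"ATS ~ {ats:5.1f} yr")
```

Under these assumptions, more frequent evaluation lowers the ATS even though each individual test must use a smaller alpha to keep the false-signal horizon fixed, and a larger network lowers it further by shrinking the standard error of the fitted slope; the actual magnitudes reported in the abstract depend on the site-specific power function derived from the Mid-Appalachian data.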
AGU Spring Meeting Abstracts
- Pub Date: May 2002
- Index terms: 6309 Decision making under uncertainty; 6324 Legislation and regulations; 1871 Surface water quality