Trends in the First Two Moments of Peak Flow Data in California
Abstract
The U.S. Geological Survey (USGS) is currently updating California's flood frequency statistics for the first time in 30 years. This long-overdue project is necessary not only to help protect lives and property, but also to support the effective planning, management, and use of the State's land and water resources, both of which are coming under unprecedented demand in the 21st century. The new study has the benefit of an additional 30 years of peak discharge data, as well as new methodologies for incorporating historical floods to more accurately predict the characteristics of flood frequency distributions. Yet with global climate change likely to affect long-term streamflow characteristics, a fundamental assumption of flood frequency analysis, stationarity (no systematic change over time) of the annual flood data, is being questioned. To test whether the first two moments (mean and standard deviation) required to fit probability distributions to annual peak flow data are stationary, trends in the moments over three different periods will be examined at currently operated or recently discontinued (through water year 2006) USGS stream-gaging stations. All stations included in this trend analysis are stream sites that the USGS database indicates have no upstream regulation or diversion effects, nor any significant effects due to urbanization. Thus these sites represent unregulated systems for which trends, if any, would most likely be the result of changing climate forcings. To ensure that the sampled moments are reasonably stable and reliable representatives of the population moments, a 30-year record period will be used to calculate the mean and standard deviation. The three periods selected for trend testing cover the past 30 years (1977-2006), the past 40 years (1967-2006), and the past 50 years (1957-2006).
For each successive year in the trend-test period, the mean and standard deviation were calculated on a sliding-window basis using only the previous 30 years of record. The 30-year trend analysis used 36 stations with at least 60 years of continuous peak-flow record, the 40-year trend test used 23 stations with at least 70 years of record, and the 50-year trend test used 10 stations with at least 80 years of record. No temporal or spatial biases were evident in the dataset of selected stations, even in the limited number used in the 50-year trend test. The Kendall tau test for monotonic trend was applied to the mean and standard deviation for each site and each of the three trend-test periods.
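The sliding-window moment calculation and the Kendall tau trend statistic described above can be sketched as follows. This is a minimal illustration, not the USGS procedure itself: the function names are hypothetical, the tau statistic below omits tie corrections and significance testing, and a real analysis would also screen the peak-flow record for gaps.

```python
import numpy as np

def sliding_moments(peaks, window=30):
    """Mean and sample standard deviation of each `window`-year sliding window.

    `peaks` is a 1-D array of annual peak discharges in water-year order;
    element i of the output summarizes years i .. i+window-1.
    """
    n = len(peaks) - window + 1
    means = np.array([peaks[i:i + window].mean() for i in range(n)])
    sds = np.array([peaks[i:i + window].std(ddof=1) for i in range(n)])
    return means, sds

def kendall_tau(series):
    """Kendall's tau of `series` against time (no tie correction).

    Counts concordant minus discordant pairs over all i < j; tau ranges
    from -1 (strictly decreasing) to +1 (strictly increasing).
    """
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            s += np.sign(series[j] - series[i])
    return 2.0 * s / (n * (n - 1))
```

For a station with a steadily increasing record, the window means rise monotonically and the tau statistic equals +1; a practical analysis would compare the statistic against its null distribution to assign a significance level.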
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2008
- Bibcode: 2008AGUFM.H21A0815B
- Keywords: 1821 Floods