Potential to Detect Nonstationary Reservoir System Performance Driven by Climate and Land Use
Abstract
Dynamic planning of water infrastructure requires identifying signals for adaptation, including measures of system performance linked to vulnerabilities. However, it remains a challenge to detect projected changes in performance outside the envelope of natural variability, and to identify whether such detections can be attributed to one or more uncertain drivers. Here we investigate these questions with a combination of ensemble simulation, non-parametric tests, and variance decomposition, demonstrated for a case study of the Sacramento-San Joaquin River Basin, California. We additionally train a logistic regression classifier to predict future detections given observed objectives. The scenario ensemble includes coupled climate and land use change through the end of the century, evaluated with a multi-reservoir simulation model to determine changes in water supply reliability and flooding metrics relative to the historical period (1951-2000). Results show that the reliability metric is far more likely to exhibit a significant change within the century, with the most severe scenarios tending to be detected earlier, reflecting long-term trends. Changes in flooding are often not detected due to natural variability, despite severe events in some scenarios. We find that the variance in detection times is largely attributable to the choice of climate model, rather than the emissions scenario or land use. Finally, the prediction model is more accurate for reliability than for flooding, though in both cases the model learns to associate more recent observations with detection. These findings underscore the importance of differentiating between long-term change and natural variability in identifying signals for adaptation.
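The detection step described above can be illustrated with a minimal sketch. This is not the authors' code: the reliability series, trend magnitude, window length, and significance threshold are all synthetic assumptions, and a Mann-Whitney U test stands in for the unspecified non-parametric test. The idea is to compare a trailing window of simulated annual performance against the 1951-2000 historical baseline and report the first year the difference is statistically significant.

```python
# Illustrative sketch (not the authors' method): detect the first year an
# annual performance metric departs significantly from its historical
# distribution, using a non-parametric Mann-Whitney U test on a moving window.
# The series, trend, window length, and alpha are all synthetic assumptions.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# Synthetic annual water-supply reliability, 1951-2100.
years = np.arange(1951, 2101)
natural = 0.9 + 0.02 * rng.standard_normal(years.size)        # natural variability
trend = np.where(years > 2000, -0.004 * (years - 2000), 0.0)  # decline after 2000
reliability = natural + trend

# Historical reference period, 1951-2000.
hist = reliability[(years >= 1951) & (years <= 2000)]

def detection_year(series, years, hist, window=30, alpha=0.05):
    """First post-2000 year whose trailing window differs significantly
    from the historical baseline (two-sided rank-sum test)."""
    for i in range(window, series.size):
        if years[i] <= 2000:
            continue
        recent = series[i - window:i]
        _, p = mannwhitneyu(recent, hist, alternative="two-sided")
        if p < alpha:
            return int(years[i])
    return None  # change never detected within the simulation horizon

print(detection_year(reliability, years, hist))
```

With a steeper synthetic trend the detection year moves earlier, mirroring the abstract's finding that the most severe scenarios tend to be detected first; with trend set to zero, natural variability alone rarely triggers a sustained detection.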
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2022
- Bibcode: 2022AGUFM.H42L1446C