Downscaling Climate Data from Distributed Archives
Abstract
Model refinement, the production of numerical estimates of climate change at higher resolution than climate models can currently provide, is an essential tool for decision makers and researchers in climate science. We describe here steps toward a general-purpose system for model refinement. We envision a system wherein multiple climate models, alone or in combination, can be used as predictors; multiple refinement methods, alone or in combination, can be deployed and trained, including evaluation within a perfect-model framework described below; time periods and locations of training can be chosen at will; and all of these options are provided as standard web services within the Earth System Grid Federation (ESGF), the global data infrastructure for the distribution of climate model output.

The perfect-model framework for systematic testing of model refinement using empirical-statistical downscaling (ESD) schemes is being developed at NOAA/GFDL under the National Climate Predictions and Projections Platform (NCPP) project. It follows the approach that Laprise and collaborators call the "big-brother" framework for evaluating dynamical downscaling. High-resolution model output serves as a "nature run", taking the place of observations in training the ESD scheme under test. The data are interpolated to a coarse grid (the "little brother"), and the ESD scheme attempts to downscale and bias-correct the "future", i.e., the period beyond the training interval. The ESD output can then be rigorously compared to the original nature run on a chosen list of metrics. Initial work was performed in collaboration with Texas Tech University: the high-resolution time-slice models that GFDL submitted to CMIP5 are used as training sets for the downscaling methods developed by Katharine Hayhoe and collaborators. The approach is being extended to other downscaling schemes, such as BCSD, the Delta method, quantile mapping, constructed analogs, and machine-learning algorithms, and, in future, to other model output as training data. Initial results were presented at the Quantitative Evaluation of Downscaling 2013 Workshop (QED-2013).

We will describe a software infrastructure wherein: (a) any CMIP5 high-resolution model output can be used as a training set; (b) any ESD scheme can be deployed using a standard template or API developed under the ExArch project; (c) the outputs of downscaling will also conform to CMIP5 standards and can be analyzed on the same footing as any CMIP5 output; (d) analysis services computing the chosen metrics can be run on the downscaled output; and (e) the infrastructure can be deployed "in-house" by the ESD group, or potentially run as a web service on any ESGF node.
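To make the perfect-model test concrete, the sketch below walks through the workflow on synthetic data, using empirical quantile mapping as a stand-in ESD scheme: a high-resolution "nature run" is degraded to a coarse "little brother", the mapping is trained only on the training period, and the downscaled "future" is scored against the withheld nature run. The synthetic series, the coarsening step, and the scores are illustrative assumptions, not the NCPP/GFDL implementation.

```python
# A minimal, self-contained sketch of the perfect-model ("big-brother") test,
# with empirical quantile mapping standing in for the ESD scheme under test.
# All data here are synthetic; names and choices are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# --- "Nature run": a high-resolution daily temperature series (synthetic) ---
n_train, n_future = 3650, 3650                      # 10 yr training, 10 yr "future"
t = np.arange(n_train + n_future)
nature = 15 + 10 * np.sin(2 * np.pi * t / 365.25) + rng.normal(0, 3, t.size)
nature[n_train:] += 1.5                             # imposed "future" warming signal

# --- "Little brother": degrade the nature run to mimic a coarse-grid model ---
# A running mean plus a constant bias stands in for interpolation to a coarse grid.
kernel = np.ones(15) / 15
coarse = np.convolve(nature, kernel, mode="same") - 2.0

# --- Train empirical quantile mapping on the training period only ---
quantiles = np.linspace(0.01, 0.99, 99)
q_coarse = np.quantile(coarse[:n_train], quantiles)
q_nature = np.quantile(nature[:n_train], quantiles)

def quantile_map(x):
    """Map coarse-model values onto the nature run's training-period quantiles."""
    return np.interp(x, q_coarse, q_nature)

# --- Downscale the withheld "future" and compare against the nature run ---
downscaled_future = quantile_map(coarse[n_train:])
truth_future = nature[n_train:]

rmse = np.sqrt(np.mean((downscaled_future - truth_future) ** 2))
bias = np.mean(downscaled_future - truth_future)
print(f"future-period RMSE: {rmse:.2f} K, mean bias: {bias:.2f} K")
```

In the actual framework the coarsening would be an interpolation of gridded model output rather than a running mean, and the evaluation would use the chosen list of metrics mentioned above rather than a single RMSE and bias score.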
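Item (b) above calls for a standard template or API so that any ESD scheme can be plugged into the workflow. The sketch below shows one hypothetical way such a template could look; the class names, method signatures, and registry are assumptions for illustration and do not reproduce the ExArch/NCPP interface.

```python
# A hypothetical plug-in template for ESD schemes. Nothing here is the actual
# ExArch/NCPP API; it only illustrates the "standard template" idea.

from abc import ABC, abstractmethod
import numpy as np

class DownscalingScheme(ABC):
    """Common interface: train on paired coarse/high-resolution series, then downscale."""

    @abstractmethod
    def train(self, coarse: np.ndarray, target: np.ndarray) -> None: ...

    @abstractmethod
    def downscale(self, coarse: np.ndarray) -> np.ndarray: ...

class DeltaScheme(DownscalingScheme):
    """Delta method: add the training-period mean offset to the coarse input."""

    def train(self, coarse, target):
        self.delta = float(np.mean(target) - np.mean(coarse))

    def downscale(self, coarse):
        return coarse + self.delta

# A simple registry lets a service layer select a scheme by name.
SCHEMES = {"delta": DeltaScheme}

def run_scheme(name, coarse_train, target_train, coarse_future):
    scheme = SCHEMES[name]()
    scheme.train(coarse_train, target_train)
    return scheme.downscale(coarse_future)
```

A scheme registered this way could be trained on any coarse/high-resolution pair and invoked by a web service on an ESGF node without the service needing to know the scheme's internals.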
- Publication:
- AGU Fall Meeting Abstracts
- Pub Date:
- December 2013
- Bibcode:
- 2013AGUFMIN23A1414R
- Keywords:
- 0500 COMPUTATIONAL GEOPHYSICS