Ricky Rood - The National Climate Predictions and Projections (NCPP) Platform: Development of Capacity to Support Planning and Management - NCPP aspires to provide a platform for evaluation of downscaled climate data to support applications in adaptation planning and resource management. An evaluation protocol has been used to provide metrics not only of temperature and precipitation but also of an array of indices identified as important by application experts. The evaluation supports the development of narrative translations, which application specialists describe as essential to improving the usability of climate information. NCPP builds tools to facilitate evaluation and the provision of services. These tools use service-based strategies in concert with evolving community standards to allow potential re-use across the wide range of applications represented by the USGCRP agencies. This talk describes the status of NCPP products and services and the emerging partnerships.
Galina Guentchev - Workshop on Quantitative Evaluation of Downscaling, August 12-16, 2013 – Developing community standards of evaluation and translation - The mission of the National Climate Predictions and Projections (NCPP) platform is to accelerate the provision of climate information at regional and local scales for use in adaptation planning and decision making through the collaborative participation of climate scientists and practitioners. A major focus of NCPP’s efforts is the development of a capability for objective and quantitative evaluation of downscaled climate information. We recognize the importance of focusing this evaluation effort on real-world applications and the necessity of working closely with the user community to deliver usable evaluations and guidance. This summer NCPP organized a workshop on Quantitative Evaluation of Downscaling (QED) (http://earthsystemcog.org/projects/downscaling-2013/). Workshop participants included representatives from downscaling teams; application partners from the health, ecological, agricultural, and water-resources impacts communities; and people working on data infrastructure, metadata, and standards development. During the workshop NCPP demonstrated capabilities for the provision of standardized quantitative evaluation of downscaled datasets for applications. The workshop provided a platform for collaborative work toward the further development of an evaluation standard and toward the identification of needs and gaps in the translation of downscaled climate data for applications. The presentation will focus on elucidating the motivation for the development of an evaluation framework, presenting elements of the evaluation environment, summarizing feedback from participants during and after the workshop, and highlighting workshop outcomes.
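To make the idea of standardized quantitative evaluation concrete, the sketch below compares a downscaled daily temperature series against observations using a mean bias, an RMSE, and one application-relevant derived index (an annual frost-day count, relevant to agricultural partners). The function name, the metric choices, and the frost-day threshold are illustrative assumptions, not NCPP's actual evaluation protocol.

```python
import numpy as np

def evaluate_downscaled(downscaled, observed, frost_threshold=0.0):
    """Compare a downscaled daily temperature series (deg C) against
    observations. Metric choices here are illustrative, not the NCPP
    evaluation protocol itself."""
    downscaled = np.asarray(downscaled, dtype=float)
    observed = np.asarray(observed, dtype=float)

    # Basic temperature metrics: mean bias and root-mean-square error.
    bias = float(np.mean(downscaled - observed))
    rmse = float(np.sqrt(np.mean((downscaled - observed) ** 2)))

    # Example application-relevant index: difference in the number of
    # days below the frost threshold (hypothetical choice of index).
    frost_day_diff = int(np.sum(downscaled < frost_threshold)
                         - np.sum(observed < frost_threshold))

    return {"bias": bias, "rmse": rmse, "frost_day_diff": frost_day_diff}

# Tiny worked example with synthetic daily temperatures.
obs = np.array([-2.0, 1.0, 3.0, -1.0, 5.0])
dsc = np.array([-1.0, 1.5, 2.5, 0.5, 5.5])
metrics = evaluate_downscaled(dsc, obs)
```

In practice an evaluation environment of this kind would compute many such indices over long records and many stations; the point of the sketch is only that index-based metrics reduce to simple, reproducible comparisons once the series are aligned.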
Keith Dixon - Statistical Downscaling: Testing if past performance is an indicator of future results - Though valuable for many purposes, the raw output of large-scale global climate models (GCMs) is often deemed inadequate for direct use in studies of projected regional- or local-scale climate impacts. For such applications, GCM shortcomings may include a lack of fine-scale detail and biases in the model-simulated climatology relative to observations. Informed by observational data sets, an empirical statistical downscaling (ESD) technique is often applied to refine GCM output in an attempt to account for GCM shortcomings (somewhat analogous to MOS for weather forecasts). One may assess how well an ESD method performs during the observational period via cross-validation. However, in the absence of observations of the future, a quantitative evaluation of ESD skill cannot be made directly for future simulations. Generally, there is an implicit assumption that the levels of ESD skill exhibited in the historical and future periods are similar. In effect, this assumes that statistical relationships between the GCM predictors and local-scale predictands remain constant over time, even as the climate is changing – what we refer to as the ‘stationarity assumption’. To check this assumption, we have developed a ‘Perfect Model’ experimental design that makes use of high-resolution GCM output in its raw and processed (smoothed/degraded) forms. An overview of the experimental design and a set of illustrative results will be presented.
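The logic of the perfect-model design can be sketched in a few lines: treat a high-resolution series as "truth", smooth it to stand in for coarse GCM output, train a statistical relationship in a "historical" window, and then measure skill in a "future" window where, unlike the real world, the truth is still known. The synthetic data, the moving-average coarsening, and the linear-regression downscaling below are all illustrative assumptions, not the actual experimental design.

```python
import numpy as np

rng = np.random.default_rng(0)

def coarsen(x, window=5):
    """Smooth a high-resolution series to mimic coarse GCM output
    (a stand-in for the real degrading/regridding procedure)."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def rmse(pred, target):
    return float(np.sqrt(np.mean((pred - target) ** 2)))

# Synthetic "perfect model" truth: a warming trend plus local-scale
# variability and noise. Purely illustrative numbers.
n = 2000
trend = np.linspace(0.0, 3.0, n)
truth = trend + np.sin(np.arange(n) * 0.3) + rng.normal(0.0, 0.5, n)
coarse = coarsen(truth)

# Split into a "historical" training period and a "future" test period.
half = n // 2
hist_x, hist_y = coarse[:half], truth[:half]
fut_x, fut_y = coarse[half:], truth[half:]

# Train a simple linear "downscaling" relationship on the historical
# period only, as one would with observations.
slope, intercept = np.polyfit(hist_x, hist_y, deg=1)

# Skill in the training period versus the (normally unobservable)
# future period; similar values would support stationarity here.
hist_rmse = rmse(slope * hist_x + intercept, hist_y)
fut_rmse = rmse(slope * fut_x + intercept, fut_y)
```

Because this toy truth keeps the predictor-predictand relationship fixed, the two RMSE values come out comparable; the interesting perfect-model cases are those where the relationship drifts under climate change and future skill degrades relative to the historical estimate.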