On Tuesday, January 11th, from 1-2pm ET, join Dr. Melissa Kenney (University of Minnesota, Institute on the Environment), Dr. Michael Gerst (University of Maryland, Earth System Science Interdisciplinary Center and Cooperative Institute for Climate and Satellites-Maryland), and Jon Gottschalck (NOAA Climate Prediction Center) to learn about their study of data visualization and how it has been implemented in the CPC Outlook products.
The study addressed the challenge of embedding accurate science in decision support tools and communicating it to the public. The difficulty arises partly because most scientific information contains multiple trends or patterns. Moreover, it is often unclear how to simplify a visualization, because a lack of stakeholder engagement makes it hard to know which trend or pattern should be highlighted as the key message. As a result, multiple trends are often shown, and complicated scientific graphics end up reproduced for public use. Uncertainty further complicates the use of scientific information: it adds at least one extra variable to be considered and displayed, and decision-makers and the public are less accustomed to reasoning with scientific uncertainty.
Over the past few years, the researchers have investigated these problems for global change, climate, and water information provided by, respectively, (1) the US Global Change Research Program (USGCRP) indicator suite and 3rd National Climate Assessment graphics, (2) the temperature and precipitation outlooks produced by the NOAA Climate Prediction Center (CPC), and (3) WaterWatch, WaterQualityWatch, and Groundwater Watch produced by the US Geological Survey (USGS). Tackling these problems requires integrating visualization science, decision science, and design theory. Using focus groups and control/treatment testing, this combination of fields makes it possible to better understand user needs, test whether current designs are meeting them, and compare current products against visualizations modified according to best-practice design principles.
Their results show that this three-step process can identify problems with current visualizations that are fixable within the typical constraints of legacy scientific products, such as a large, engaged user base and embedding within established institutional workflows. Furthermore, they outline how the process can be scaled and customized to the needs of organizations and their users and to the specifics of individual visualization products.
Learn more on Tuesday, January 11th, 1-2pm ET.