The Joint Effort for Data assimilation Integration (JEDI) is
a modern, modular, and generic software suite designed to support a wide range
of data assimilation applications. These design principles make JEDI
particularly well suited for atmospheric composition (AC) research and
operations, where the integration of diverse observations with model forecasts
is essential.
This presentation will review recent progress and
accomplishments in applying JEDI to AC studies across the scientific and
operational communities. Highlights include NASA efforts demonstrating the
potential of four-dimensional ensemble–variational (4DEnVar) methods with new
geostationary instruments such as TEMPO, implemented in the GEOS-CF system with
support for stretched grid geometries. Progress on emission inversion using
EnVar techniques will also be presented, along with plans and prototypes to enhance
scalability through artificial intelligence approaches. These developments open
the path from air quality monitoring toward carbon cycle applications. On the
NOAA side, work is advancing on the JEDI-based assimilation of AOD and PM2.5
observations in the GFS and AQFS systems. Additional efforts include NCAR
MPAS-JEDI developments with GOCART aerosols, assimilation of PACE OCI data, and
the use of Level-1b reflectance for aerosol applications. Finally, we will
highlight the increasing flexibility offered by JEDI’s modular modeling
interface, which enables more user-friendly observation–model comparisons and
facilitates complex studies such as observing system simulation experiments
(OSSEs). Together, these efforts demonstrate the growing capabilities and
opportunities of JEDI for advancing AC research and applications.
01/03/2025
Patricia Castellanos
Seminar postponed
The focus will be on the high-resolution Copernicus Arctic Regional Reanalysis (CARRA), which uses DMI's HARMONIE NWP model. Additionally, the HARMONIE-Climate system will be introduced, along with a few other Greenland ice sheet related themes.
11/19/2024
Andrea Molod
GEOS S2S version 3
The Planetary Boundary Layer (PBL) is essential to a number of Earth science priorities (including weather and climate prediction and air quality) as stated in the 2018 NASEM Earth Science Decadal Survey; however, it is very challenging to accurately represent PBL structure. As any single observing system can sample only a fraction of the global PBL space-time structure and physical aspects, the GEOS data assimilation system provides the critical capability to assimilate a wide range of observations in combination with model physics to provide improved PBL structure and forecasts. In this talk, I will briefly discuss factors affecting observation usage and provide an overview of some of the data assimilation research efforts to address these factors. I will also present our Decadal Survey Incubation (DSI) PBL data assimilation effort on developing a global PBL height analysis and monitoring capability using PBL height data from multiple observing systems, including radiosondes, GNSS-RO, the space-based lidars CALIPSO and CATS, the ground-based lidar network MPLNET, and wind profiler radars. The motivation and strategy of utilizing PBL height data to improve overall PBL data assimilation performance will also be discussed.
Comparison of GHG flux inventories to atmospheric measurements has often revealed discrepancies and has motivated an emerging consensus on the need for multi-scale, multi-method observations of GHG fluxes. Progress in measurement and monitoring methods is making this vision increasingly feasible. This presentation will review work that my group has done to advance the science of multi-scale, multi-method quantification of GHG fluxes. I will present examples of GHG flux quantification work at the scales of cities and gas basins, the challenges we are facing in understanding the discrepancies between inversions and inventories, and potential paths forward. I will also describe challenges we are facing at the regional to continental scale, describe the challenges associated with atmospheric transport that emerge at those scales, and present our attempts, to date, to address these challenges.
As a part of a major upgrade to our global NWP system, NOAA is developing a new land data assimilation system. This seminar will present these developments, focusing on an upgrade to our snow depth analysis and the introduction of our first global soil moisture and soil temperature analysis. The new snow depth analysis will be a variational data assimilation of station snow depth and satellite snow cover, using the JEDI software platform. Compared to our current snow depth analysis, the variational snow analysis improves both the initial snow states and low-level atmospheric temperatures over land. The new soil analysis will expand the current GSI Hybrid 4DEnVar assimilation of atmospheric observations to also assimilate screen-level temperature and humidity observations, and to also update the soil moisture and soil temperature states. This system improves the 6-hour low-level atmospheric forecasts, as measured by the data assimilation observation-minus-forecast statistics for the screen-level variables, while having a minimal, but positive, impact higher in the atmosphere. The above developments are a part of a long-term effort to unify our land and atmospheric data assimilation to use the same methods, allowing more information to be shared between the land and atmosphere updates, and introducing the possibility of strongly coupled land/atmosphere data assimilation. The potential advantages and challenges of strongly coupled land and atmospheric data assimilation will then also be discussed.
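For readers less familiar with the variational analysis referenced in this and several later abstracts, the quantity minimized is, in its generic textbook form (a sketch only, not the specific JEDI or GSI configuration used here):

\[
J(\mathbf{x}) \;=\; \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathrm T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)\;+\;\tfrac{1}{2}\,\big(\mathbf{y}-H(\mathbf{x})\big)^{\mathrm T}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big),
\]

where \(\mathbf{x}_b\) is the background state (e.g., the model snow depth), \(\mathbf{y}\) the observations (e.g., station snow depth and satellite snow cover), \(H\) the observation operator, and \(\mathbf{B}\) and \(\mathbf{R}\) the background- and observation-error covariances.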
The writing is on the proverbial wall of Earth's freshwater stores: ice sheets are melting, aquifers are emptying, reservoirs are drying, and glaciers are losing mass. Our "working capital" of freshwater is therefore depleting, undermining the human right to safe and clean drinking water and sanitation for the world's rapidly growing population. These trends may lead to an increasing reliance on other freshwater sources. While Earth's rivers hold only a tiny amount of storage, their mighty flow makes them the most renewable, most accessible, and hence most sustainable source of freshwater. The management of our freshwater portfolio may hence very well gradually include a "cash flow" perspective using this sustainable freshwater source. The powerful flow of rivers is also a great cause for concern because floods are consistently among the world's most disastrous natural hazards, ranking first in the number of events and in the number of people affected, second in economic cost, and fourth in total deaths. Yet surprisingly little is known about spatiotemporal variations of global surface water stores and fluxes, induced by both natural and anthropogenic processes.
In this seminar, we will discuss the state of global river modeling along with advances in uncertainty quantification and data assimilation. We will also have an opportunity to discuss the challenges and opportunities of open science for numerical modeling of Earth processes.
The eastern tropical Pacific cold tongue plays a major role in the global climate system. The strength of the cold tongue sets the zonal temperature gradient in the Pacific, coupling the ocean with the atmospheric Walker circulation. This coupling is an essential component of the El Niño Southern Oscillation (ENSO). The cold tongue is supplied with cold water by the equatorial undercurrent that follows the upward sloping thermocline to the east, transporting cold water towards the surface. As the thermocline shoals, its water undergoes the diabatic processes of water mass transformation (WMT) allowing for heat uptake from the surface into the ocean. Here, we examine WMT in the cold tongue region from a global high resolution ocean simulation with saved budget terms and from a regional ocean state estimate. We quantify each individual component of WMT (vertical mixing, horizontal mixing, eddy fluxes, solar penetration), and find that vertical mixing is the single most important contribution in the thermocline, while solar heating dominates close to the surface. We investigate how WMT changes from (sub)seasonal to interannual timescales. During El Niño events vertical mixing, and hence WMT as a whole, is much reduced, while during La Niña periods strong vertical mixing leads to strong WMT, thereby cooling the surface. This analysis demonstrates the enhancement of diabatic processes during cold events, which in turn enhances surface cooling in the cold tongue region. We compare the underlying model physics to available observations, highlighting existing biases and the need to gather observations that put constraints on the underlying ocean physics.
All-sky assimilation of brightness temperatures (BTs) from GOES-16 infrared water vapor channels (channels 8-10) is challenging because the sensitivity to cloud ice causes large nonlinear errors in the forecast and forward models. Our study begins with the examination of a bias correction (BC) scheme with a quartic polynomial of cloud predictors (the ASRBC4 scheme) when assimilating the all-sky BTs from GOES-16 channel 8 using the NCEP GSI-based 3D ensemble–variational hybrid data assimilation (DA) system with variational BC (VarBC). Results show that applying the ASRBC4 scheme alleviates the nonlinear conditional biases of all-sky scaled observation-minus-backgrounds (OmBs) with respect to the symmetric cloud proxy variable C̄, and leads to better WRF model track forecasts of tropical storm (TS) Cindy (2017) and Hurricane Laura (2020). The ASRBC4 scheme is then applied to GOES-16 channels 9 and 10. Following successful implementation, prior interchannel observation-error correlations (IOECs) among the three water vapor channels are estimated.
The IOECs exhibit sigmoid-function characteristics as a function of ln C̄, being lower and relatively invariant with respect to C̄ under clear-sky conditions, and higher in cloudy skies. These features of the IOECs can be understood from the Jacobians under different conditions. In addition, we discuss the eigenvalues and the condition numbers of the observation error covariance matrix with IOECs. Given the unique properties of the IOECs, sensitivity experiments and case studies on Hurricanes Laura (2020) and Ida (2021) were conducted. The results indicate that combining the assimilation of all-sky BTs from GOES-16 channel 10 with clear-sky BTs from the other water vapor channels (experiment Ch10_CLR) yields superior analyses and forecasts in most scenarios.
This study highlights the importance of properly addressing nonlinear biases in OmBs under cloudy skies and accounting for cloud-dependent IOECs when assimilating all-sky BTs from infrared channels in operational DA systems.
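To make the cloud-dependent bias correction idea above concrete, here is a minimal sketch (synthetic data and hypothetical variable names, not the operational ASRBC4/VarBC code): fit a quartic polynomial of the departures against the symmetric cloud predictor and remove it before assimilation.

```python
import numpy as np

# Hypothetical inputs: omb = observation-minus-background BTs [K],
# c_bar = symmetric cloud proxy variable (synthetic here).
rng = np.random.default_rng(0)
c_bar = rng.uniform(0.0, 1.0, 5000)
omb = 2.0 * c_bar**2 - 1.5 * c_bar + rng.normal(0.0, 0.5, c_bar.size)

# Fit a 4th-order polynomial of the conditional bias as a function of c_bar
# (analogous in spirit to a quartic cloud-predictor bias correction).
coeffs = np.polyfit(c_bar, omb, deg=4)
bias_estimate = np.polyval(coeffs, c_bar)

# Bias-corrected departures that would be passed on to the assimilation step.
omb_corrected = omb - bias_estimate
print("mean |OmB| before/after:", np.abs(omb).mean(), np.abs(omb_corrected).mean())
```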
Rapidly intensifying "flash droughts" are climate hazards that have significant impacts on agriculture and ecosystems. They have also proven to be difficult to predict in Earth System Model ensemble subseasonal-to-seasonal (S2S) forecast systems. At the same time, empirical S2S forecasts based on Earth Observation (EO) of land surface and vegetation conditions have shown promise. This suggests that vegetation-mediated land-atmosphere interactions play an important role in flash drought development, and that they could contribute meaningfully to predictability in some contexts. In this seminar I will summarize a set of studies designed to improve our diagnosis of flash drought, classify flash drought events in a process-relevant manner, characterize the role vegetation has played in seminal flash drought events, and build towards improved empirical and dynamically-based S2S flash drought forecasts.
The Paris Agreement was a watershed moment in providing a framework to address the mitigation of climate change. The Global Stocktake is a process, carried out every five years, to assess progress in greenhouse gas emission reductions in light of climate feedbacks and response. However, the relationship between emission commitments and concentration requirements is confounded by complex natural and anthropogenic biogeochemical processes modulated by climate feedbacks.
We investigate the prospects and challenges of mediating between emissions and concentrations along with the predictability of their trajectory. Our primary tool is the NASA Carbon Monitoring System Flux (CMS-Flux), which is an inverse modeling and data assimilation system that ingests a suite of observations across the carbon cycle to attribute atmospheric carbon variability to anthropogenic and biogeochemical processes.
We use this tool to address an essential question for the Stocktake: the predictability of the carbon cycle. We look at this question from several angles. We ingest data into a carbon cycle model using a Markov Chain Monte Carlo (MCMC) technique that explicitly incorporates non-Gaussian behavior and use those solutions to characterize the trajectory and predictability of terrestrial carbon dynamics. We further consider the coevolution of air quality and carbon in conjunction with an advanced chemical data assimilation system in light of an environmental Kuznets curve to assess the predictability of carbon given air quality emissions. We then consider predictability and observability within a hierarchical emergent constraint (HEC) framework, which is used to constrain carbon-climate feedbacks. These elements taken together are core components of a carbon attribution and prediction system needed to assess the efficacy of carbon mitigation strategies in the presence of a changing climate.
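As a schematic of the MCMC ingredient mentioned above, here is a toy Metropolis-Hastings sampler for a single flux scaling factor with a non-Gaussian (log-normal) prior. All names and numbers are illustrative and are not taken from CMS-Flux.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy problem: estimate a flux scaling factor s > 0 from a noisy pseudo-observation.
obs, obs_err = 1.3, 0.2              # pseudo-observation of the scaled flux
prior_mu, prior_sigma = 0.0, 0.5     # log-normal prior on s (non-Gaussian)

def log_post(s):
    if s <= 0:
        return -np.inf
    log_prior = -0.5 * ((np.log(s) - prior_mu) / prior_sigma) ** 2 - np.log(s)
    log_like = -0.5 * ((obs - s) / obs_err) ** 2
    return log_prior + log_like

samples, s = [], 1.0
for _ in range(20000):
    prop = s + rng.normal(0.0, 0.1)                      # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(s):
        s = prop                                          # accept the proposal
    samples.append(s)

print("posterior mean/std:", np.mean(samples[5000:]), np.std(samples[5000:]))
```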
Operational regional oceanography has developed rapidly in recent years. Expansion in observing systems, the development of innovative ocean observing technologies, and growing stakeholder needs continue to drive the development of higher resolution regional ocean analysis and forecast systems. As in global systems, the computational cost of data assimilation represents a significant fraction of the overall computational effort required in operational and near real-time regional forecast systems. Thus, there is a growing need to improve the performance and efficiency of ocean data assimilation systems without sacrificing accuracy. With this goal in mind, the saddle-point formulation of weak constraint four-dimensional variational (4D-Var) data assimilation has been developed for the Regional Ocean Modeling System (ROMS) and tested in the California Current System (CCS). Unlike the conventional forcing formulation of weak constraint 4D-Var, the saddle-point formulation can be efficiently parallelized in time, which can lead to a substantial increase in efficiency. The performance of the ROMS saddle-point 4D-Var algorithm will be presented and compared to that of the conventional dual forcing formulation, which is the current standard in ROMS. While the rate of convergence of the saddle-point formulation is slower than that of the forcing formulation, the increase in computational speed due to time-parallelization more than compensates for the additional inner-loop iterations required by the saddle-point algorithm in the CCS configuration considered here. Additional increases in performance can be achieved by running the 4D-Var inner-loop iterations at reduced model resolution and/or reduced arithmetic precision. The results presented here indicate that in high performance computing environments, the saddle-point formulation of 4D-Var has the potential to significantly outperform the forcing formulation for large data assimilation problems.
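For reference, one common way to write the saddle-point system for weak-constraint 4D-Var is the following block system (a generic sketch in the notation used in the data assimilation literature, not the ROMS-specific implementation):

\[
\begin{pmatrix}
\mathbf{D} & \mathbf{0} & \mathbf{L} \\
\mathbf{0} & \mathbf{R} & \mathbf{H} \\
\mathbf{L}^{\mathrm T} & \mathbf{H}^{\mathrm T} & \mathbf{0}
\end{pmatrix}
\begin{pmatrix} \boldsymbol{\lambda} \\ \boldsymbol{\mu} \\ \delta\mathbf{x} \end{pmatrix}
=
\begin{pmatrix} \mathbf{b} \\ \mathbf{d} \\ \mathbf{0} \end{pmatrix},
\]

where \(\mathbf{D}\) collects the background- and model-error covariances, \(\mathbf{R}\) is the observation-error covariance, \(\mathbf{H}\) the observation operators, \(\mathbf{L}\) couples the subwindow states through the tangent-linear model, \(\mathbf{b}\) and \(\mathbf{d}\) are background and observation departures, and \(\boldsymbol{\lambda}, \boldsymbol{\mu}\) are Lagrange multipliers. Because applications of \(\mathbf{L}\) and \(\mathbf{L}^{\mathrm T}\) act on each subwindow independently, the iterative solve can be parallelized in time, which is the source of the efficiency gain described above.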
The Data Science Group (606.3) at Goddard is a resource for accelerating science through the use of advanced computing techniques including AI, machine learning, parallel programming, and high-end computing. This science-focused presentation will showcase some projects using foundation models, deep learning, and other approaches to achieve the science objectives of the projects. We will introduce the MODIS-based SatVision foundation model developed here at Goddard as well as the "NASA Weather and Climate" foundation model currently under development at NASA Marshall. Other topics of interest will include training data generation, results validation, and multiple ML methods. Lastly, we will discuss how to access compute resources and how to work with the Data Science Group on projects.
In this talk I will explore two different perspectives of the role of marine surface sensible and latent heat fluxes in the climate system. One perspective holds that the role of ocean surface fluxes is to balance the global atmospheric energy loss by infrared emissions to space that is regulated by the horizontal and vertical organization of cloudiness. An alternative perspective considers the role of surface fluxes in regulating cloud organization that in turn affects infrared emissions to space as well as patterns of ocean heating and cooling.
Climate and forecast models use a variety of algorithms to estimate marine surface fluxes, which can impact model cloudiness and cloud system variability. I illustrate these differences by applying a simple set of surface flux diagnostics to a suite of CMIP6 simulations, and applying an offline surface flux "correction" to simulated fluxes to estimate their biases relative to the Fairall et al. (1996; 2003) state-of-the-art COARE3.6 bulk flux algorithm. These corrections suggest a sensitivity to the choice of flux algorithm of several modes of tropical rainfall, including the mean position of the intertropical convergence zone (ITCZ) and the Madden-Julian oscillation (MJO). Subsequent tests with the NCAR CESM2 confirm these sensitivities, and suggest that revising model surface flux algorithms to conform with those estimated with the COARE3.6 algorithm may reduce model biases in tropical rainfall. I conclude with a summary of ongoing efforts to update the CESM2 flux algorithm to include additional corrections for the ocean cool skin and diurnal warm layer as well as the effects of freshwater lenses, which appear to play an under-appreciated role in regulating the MJO lifecycle.
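For orientation, a minimal bulk-formula sketch of the surface fluxes being diagnosed here is shown below. It uses fixed exchange coefficients purely for illustration; the actual COARE3.6 algorithm iterates stability-dependent transfer coefficients and includes cool-skin and warm-layer physics.

```python
RHO_AIR = 1.2      # air density [kg m-3]
CP_AIR = 1004.0    # specific heat of air [J kg-1 K-1]
LV = 2.5e6         # latent heat of vaporization [J kg-1]
CH = CE = 1.2e-3   # fixed exchange coefficients (COARE makes these stability-dependent)

def bulk_fluxes(wind, t_sea, t_air, q_sea, q_air):
    """Sensible and latent heat fluxes [W m-2], positive from ocean to atmosphere."""
    shf = RHO_AIR * CP_AIR * CH * wind * (t_sea - t_air)
    lhf = RHO_AIR * LV * CE * wind * (q_sea - q_air)
    return shf, lhf

# Example: 7 m/s wind, 1 K air-sea temperature difference, 3 g/kg humidity difference.
print(bulk_fluxes(7.0, 300.0, 299.0, 0.022, 0.019))
```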
Cloud microphysics parameterizations, simplified representations of cloud particle populations and their evolutions, are a crucial but highly uncertain part of many climate models. Because of the multi-scale nature of clouds, a lack of governing microphysical equations, process nonlinearity and stochasticity, and limited direct observations, microphysics parameterization design and quantitative evaluation have historically been challenging. Here, I present work to identify, characterize, and reduce parameterization errors that will enable building better cloud microphysics parameterizations. Specifically I consider (a) parametric error driven by poorly constrained and unphysical parameters and (b) structural error driven by inadequate representation of cloud particle properties and processes via parameterization variables and functional forms. First, I show machine learning can make Bayesian parameter inference computationally tractable for computationally expensive 3D climate models. This methodology enables characterizing parametric error and distinguishing it from structural error. By varying the formulation of the parameter inference, we can further pinpoint origins of structural error, which allows for a clearer path toward parameterization improvement. The talk focuses on applications to a microphysics parameterization in a cloud resolving model, but these approaches are of interest for climate model parameterization development more generally. Second, I pair idealized modeling with in situ drop size distributions to interrogate ubiquitous structural assumptions governing warm rain initiation in CMIP6 models. The results suggest a need to reformulate the structure of parametrized warm rain initiation congruent with the Bayesian parameter inference approach.
Flash drought is characterized by the rapid intensification toward drought conditions and can lead to wide-ranging impacts, including agricultural yield loss, reduction in water resources, moisture stress on ecosystems, and increased risk of wildfires and heatwaves. Unlike conventional (slowly developing) drought, flash drought can rapidly desiccate land surface conditions in only the span of a few weeks and place excessive stress on the environment. Flash droughts present challenges for drought mitigation strategies as they often develop with limited warning, and their characteristics, evolution, and drivers are not well understood. To address some of the challenges related to flash drought development and their associated impacts, this presentation highlights 1) the temporal and spatial evolution of flash drought via case study analysis, 2) a regional and global climatology of flash drought occurrence, and 3) future projections of flash drought risk in a changing climate. These research tasks are addressed by using a combination of reanalysis datasets, satellite observations, and global climate models on local to global scales. While the results from this research highlight key advancements in our understanding of flash droughts, several future research pathways exist to develop monitoring techniques of flash drought, improve the predictability of these events, and untangle the complex interactions between flash drought and socioeconomic impacts.
In recent years, the severity, frequency, and spatial extent of extreme heat and precipitation events have surprised even climate scientists. The record-shattering 2021 Pacific Northwest (PNW) heatwave that led to deaths in the thousands and promoted wildfires affecting air quality throughout the continent, and the devastating 2022 Pakistan floods that killed over 1700 people and displaced more than 33 million, are two of the extreme examples. In this talk, I'll discuss the weather and climate conditions that led to the 2021 PNW heat extremes and the 2022 Pakistan extreme precipitation, and their future implications. I'll also discuss what these two events have in common in terms of their causal mechanisms and the role climate change may have played in both events.
As temperatures rise due to global warming, fundamental thermodynamics dictate that the hydrologic cycle will intensify in response. Most of this response arises from the exponentially increased propensity for warmer air to "hold" water vapor, which in turn raises the ceiling on how intense both precipitation and evaporation can become. Separately, these effects can account for increases in both heavy precipitation events and extreme drought events at opposite ends of the hydroclimate spectrum. When they occur in rapid succession, however, they give rise to "hydroclimate whiplash": sudden transitions between extremely wet and extremely dry states that can greatly amplify societal and ecological impacts. In this talk, I will review the recent literature on hydroclimate whiplash on a warming Earth and offer new evidence that large and nearly universal global increases in such volatility should be expected as the Earth continues to warm. Finally, I'll offer thoughts on some of the less obvious downstream consequences of a warmer, more volatile hydroclimate, from wildfires to snowstorms and even co-seismic hazards.
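The thermodynamic scaling invoked above is the Clausius-Clapeyron relation, under which saturation vapor pressure grows roughly exponentially with temperature (a standard result, stated here for reference rather than taken from the abstract):

\[
\frac{1}{e_s}\frac{de_s}{dT} \;=\; \frac{L_v}{R_v T^2} \;\approx\; 0.065\ \mathrm{K^{-1}} \quad \text{at } T \approx 288\ \mathrm{K},
\]

i.e., roughly a 6-7% increase in water-holding capacity per kelvin of warming near present-day surface temperatures.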
Water availability is drastically changing and fluctuating the world over, including in the western U.S., creating challenges for water managers who manage systems that are designed and operated based on assumptions of past variability. More and more, it's clear that the stationarity paradigm of water management no longer applies. This requires new approaches, providing NASA an unprecedented opportunity to directly benefit water management in the western U.S. The NASA Western Water Applications Office (WWAO), within the Earth Action Water Resources Program, aims to address the pressing needs and challenges of Western water resource managers through sustained engagement with the community to understand decision contexts, identification of gaps and needs in monitoring and information, implementation of projects that address the needs, and assistance in transitioning successful projects to operational use. This talk provides an overview of NASA's WWAO, along with a description of activities and highlights of current projects and past successes.
Join me for an insightful exploration of Earth science data management from the perspective of NASA Headquarters. In this talk, I'll share personal experiences alongside a historical overview of NASA's approach to handling Earth Science Data Systems and some perspectives on its future. We'll discuss the practical challenges of data acquisition, management, and distribution, highlighting the role of the system in ensuring effective stewardship of valuable scientific resources. Together, we'll examine the intersection of technology and innovation in our ongoing journey to explore and better understand our planet.
Gravity waves are ubiquitous in the Earth’s atmosphere. While they are generated predominantly in the troposphere by e.g., flow over orography and strong convective events in the tropics, their largest impact on the circulation is in the middle atmosphere. There they attain large amplitudes and on breaking/saturation drive important features that modulate atmospheric teleconnections, which are a major source of predictability for surface weather and climate. Such features in the stratosphere include the quasi-biennial oscillation, the polar vortices, and the subtropical jets above the tropopause.
Because gravity waves have a broad wavelength spectrum, most models rely on gravity wave parameterizations to simulate their effect on the large-scale flow. However, global models that explicitly resolve gravity waves are now possible due to recent technological advancements. Using ECMWF IFS global simulations at horizontal grid-spacing ranging from 10 km to 1 km, in which gravity waves are partially or fully resolved, this talk elucidates the following questions relating to the representation of gravity waves in the stratosphere: i) At what horizontal grid-spacings do we expect to resolve the whole gravity wave spectrum so that parameterizations of gravity waves are no longer needed? And ii) What can we learn from km-scale models with regards to the parametrization design at lower resolution? Finally, the impact of making the hydrostatic approximation (as is currently done at ECMWF) on resolved gravity waves is discussed.
It is well-documented that the atmospheric circulation changes in response to increased CO2, although the responses differ among aspects of the circulation, and the causes of these differences are not well understood. More recently, studies have also shown that the atmospheric response can be not only a nonlinear, but also non-monotonic, function of CO2 forcing. We begin by showing that this nonlinearity in the atmospheric circulation response occurs more broadly across the Coupled Model Intercomparison Project (CMIP) Phase 6 archive and occurs in association with a collapse of the Atlantic Meridional Overturning Circulation (AMOC).
To illustrate this last point, we then isolate the climate impacts of a weakened AMOC using a unique ensemble of Shared Socioeconomic Pathway (SSP) 2-4.5 integrations performed using the CMIP6 version of the NASA Goddard Institute for Space Studies ModelE (E2.1). In these runs, internal variability alone results in a spontaneous bifurcation of the ocean flow, wherein two out of ten ensemble members exhibit a complete AMOC collapse, while the other eight recover at various stages despite identical forcing of each ensemble member and no externally prescribed freshwater perturbation. We show that an AMOC collapse results in an abrupt northward shift and strengthening of the Northern Hemisphere (NH) Hadley Cell and intensification of the northern midlatitude eddy-driven jet. Comparisons with a set of coupled atmosphere-ocean abrupt CO2 experiments spanning 1-5xCO2 reveal that this response to an AMOC collapse results in a nonlinear shift in the NH circulation moving from 2xCO2 to 3xCO2. Slab-ocean versions of these experiments, by comparison, do not capture this nonlinear behavior. Finally, time permitting, we show that the CO2 forcing at which this nonlinearity occurs can be influenced by stratospheric ozone feedbacks. Overall, our results suggest that changes in ocean heat flux convergences associated with an AMOC collapse — while highly uncertain — can result in profound changes in the NH atmospheric circulation.
The integrated observing system for air quality is made up of ground-based, airborne, and satellite observations that are interpreted with models to provide the best understanding of atmospheric composition. Airborne field campaigns provide the opportunity to exercise the entire observing system. Field campaign modeling work will be presented from prior field campaigns including the NASA SEAC4RS, ATom, and KORUS-AQ missions with the goal of illustrating how multiple observational perspectives are invaluable for improving models. Specific examples will illustrate improvements to emissions inventories, constraints on model oxidation capacity, and missing physical processes that impact simulated surface air quality.
The January 2022 Hunga Tonga-Hunga Ha'apai eruption was one of the most explosive volcanic events of the modern era, producing a vertical plume that peaked more than 50 km above the Earth. The initial explosion and subsequent plume triggered atmospheric waves that propagated around the world multiple times. A global-scale wave response of this magnitude from a single source has not previously been observed. Here we show the details of this response, using a comprehensive set of satellite and ground-based observations to quantify it from surface to ionosphere. A broad spectrum of waves was triggered by the initial explosion, including Lamb waves propagating at very high phase speeds of 318.2 ± 6 m/s at surface level and between 308 ± 5 to 319 ± 4 m/s in the stratosphere, and gravity waves propagating at 238 ± 3 to 269 ± 3 m/s in the stratosphere. Gravity waves at sub-ionospheric heights have not previously been observed propagating at this speed or over the whole Earth from a single source. Latent heat release from the plume remained the most significant individual gravity wave source worldwide for more than 12 h, producing circular wavefronts visible across the Pacific basin in satellite observations. A single source dominating such a large region is also unique in the observational record. The Hunga Tonga eruption represents a key natural experiment in how the atmosphere responds to a sudden point-source-driven state change, which will be of use for improving weather and climate models.
Accurate, physically-based precipitation retrieval over global land surfaces is an important goal of the now 10-year-old NASA/JAXA Global Precipitation Measurement Mission (GPM). This is a challenging task for the passive microwave constellation, as the signal over radiometrically warm land surfaces at microwave frequencies is indirect, typically requiring an inferred relationship between an observed scattering signal and precipitation at the surface. GPM, which includes a core satellite with collocated radiometer and dual-frequency radar, along with a constellation of partner radiometers, is an excellent tool for testing and validating improved passive retrievals. The operational GPM passive microwave retrieval scheme, the Goddard Profiling Algorithm (GPROF), is a Bayesian probabilistic scheme, utilizing an a priori database constructed using the GPM core satellite along with radiative transfer to expand to the full constellation. This operational technique, along with the emergence of AI/deep learning technology, makes the quality of the a priori database of utmost importance for the implementation and training of the schemes. For accuracy in the radiative transfer calculations required for creating the databases, emissivity is a key variable. In contrast to the radiometrically cold ocean surface, land emissivity in the microwave is large with highly dynamic variability. An accurate understanding of the instantaneous, dynamic emissivity in terms of the associated surface properties is necessary for a physically based retrieval of precipitation and other geophysical variables over land. This presentation will introduce and discuss various avenues for emissivity retrieval and modeling along with implementations of emissivity information for atmospheric retrieval and modeling applications, with a focus on precipitation.
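A stripped-down sketch of the Bayesian a priori database averaging described above is shown below, with Gaussian observation weights. The database, channel errors, and variable names are entirely synthetic and are not taken from the operational GPROF code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical a priori database: N candidate profiles, each with simulated
# brightness temperatures (n_chan) and an associated surface precipitation rate.
n_db, n_chan = 10000, 5
db_tb = rng.normal(250.0, 15.0, (n_db, n_chan))          # simulated TBs [K]
db_precip = rng.gamma(shape=0.5, scale=2.0, size=n_db)   # precip rate [mm/h]
obs_err = np.full(n_chan, 2.0)                           # assumed channel errors [K]

def bayesian_retrieval(obs_tb):
    """Posterior-mean precipitation: database entries weighted by TB closeness."""
    d2 = np.sum(((db_tb - obs_tb) / obs_err) ** 2, axis=1)
    weights = np.exp(-0.5 * d2)
    return np.sum(weights * db_precip) / np.sum(weights)

# Retrieve using one database entry as the "observation" (sanity check).
print(bayesian_retrieval(db_tb[0]))
```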
Clouds play a central role in the global climate system through the modulation of Earth's energy flows and as a mediator of precipitation. In the Arctic, clouds are also a major player; however, the processes that govern cloud evolution in the Arctic differ from those in most other regions of the globe. Thus, clouds are a key "wildcard" within the Arctic climate system that could have a substantial influence on the Arctic climate system response to anthropogenic forcing. What is the role of clouds within the phenomenon known as Arctic Amplification? This question does not have a clear answer. Clouds seem to be central to the important processes driving Arctic Amplification (e.g., the atmospheric response to sea ice loss and airmass transformation); however, feedback analysis studies indicate that the net cloud feedback in the Arctic is small. This seminar discusses the role of clouds within Arctic Amplification processes, reviewing aspects of what we know about Arctic clouds and the uncertainties that limit our ability to model them. Results from recently published and ongoing work are presented that provide an observationally based estimate of the cloud-sea ice feedback and evaluate cloud properties within models. The goal of this presentation is to ignite a discussion and new collaborations around the best approaches to resolving uncertainties related to the role of clouds within the Arctic system and how to better represent Arctic clouds in models.
Some of the most consequential outcomes of global warming for societies and ecosystems are changes in extreme events. Comparing 2000-2019 with 1980-1999, extreme temperature and flood events have more than doubled globally while the number of disastrous storms and droughts has increased by 30-50%. While the nonlinear increase in latent energy with warmer surface air temperature may explain the global increasing trends in weather extremes, credible projections of the regional changes in extreme events and changes in different types of extreme events remain challenging, partly because of model limitations in simulating the extreme events. In this seminar, I will discuss some recent advances in modeling extreme events and their future changes using a hierarchy of models spanning simple conceptual models to computationally intensive global storm-resolving models. Examples including modeling of mesoscale convective systems, atmospheric rivers, and hurricanes will be highlighted.
Reliable probabilistic forecasts about the potential for warmer, colder, wetter, or drier conditions at a few weeks to several seasons lead are valuable for routine planning and resource management. Many sectors would benefit from these predictions, including emergency management, public health, energy, water management, agriculture, and marine fisheries. The ability to make skillful forecasts on subseasonal (2-4 week) and seasonal (1-12 months) lead-times depends on understanding and harnessing the predictive capabilities of key sources of predictability. I will present results from research which are advancing our predictive capabilities on these timescales through understanding sources of predictability.
First, I will demonstrate the current state of subseasonal prediction skill and the benefit of a multi-model ensemble using the national, multi-model, research to operations project called The Subseasonal Experiment (SubX). The SubX models show skill for temperature and precipitation three weeks ahead of time in specific regions. The SubX multi-model ensemble mean is more skillful than any individual model overall. While average skill at individual gridpoints on subseasonal timescales is relatively low, there is potential for more skillful predictions through identification of forecasts of opportunity. However, subseasonal precipitation predictions remain a significant challenge.
In the second part of this presentation, I investigate sources of predictability for South-East US (SEUS) precipitation using explainable machine learning. We investigate the predictability of the sign of daily SEUS precipitation anomalies associated with large-scale climate variability where the predictors are perfectly known. Indices of climate phenomena (e.g., NAO, AMO, PDO, ENSO, MJO, etc.) produce neither accurate nor reliable predictions, indicating that the indices themselves are not good predictors. A convolutional neural network using gridded fields as predictors is reliable and more accurate than the index-based models. Using explainable machine learning we identify which variables and gridpoints of the input fields are most relevant for confident and correct predictions. Our results show that the local circulation is most important, as represented by the maximum relevance of 850-hPa geopotential heights and zonal winds to making skillful, high-probability predictions. Corresponding composite anomalies identify connections between SEUS precipitation and the El Niño Southern Oscillation during winter and the Atlantic Multidecadal Oscillation and North American Subtropical High during summer.
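A minimal sketch of the kind of convolutional classifier described above is given below, using PyTorch (the framework, architecture, grid size, and field choices here are illustrative assumptions, not the study's actual model):

```python
import torch
import torch.nn as nn

class SignOfPrecipCNN(nn.Module):
    """Toy CNN: gridded predictor fields -> probability of a positive precip anomaly."""
    def __init__(self, n_fields=2):  # e.g., 850-hPa geopotential height and zonal wind
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_fields, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 1)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return torch.sigmoid(self.classifier(z))

model = SignOfPrecipCNN()
fake_batch = torch.randn(8, 2, 32, 64)   # 8 samples, 2 fields, 32x64 grid
print(model(fake_batch).shape)           # -> torch.Size([8, 1])
```

Relevance maps of the kind mentioned in the abstract are typically computed afterward with an attribution method applied to a trained model of this sort.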
In June 2021, the Pacific Northwest of the US and Canada experienced a heat wave of historic proportions. Many locations broke all-time high temperature records, often by several degrees, with severe human and ecological impacts. This presentation will describe and diagnose the meteorological conditions that caused this event to occur and be so severe. We use synoptic analysis and air parcel back trajectories to show that the atmospheric setup was similar to past severe heat waves, only much stronger. In particular, a record-breaking ridge of high pressure was critical in driving temperatures to such extreme values. Climate model projections of future changes in ridges in the region will also be explored.
In this talk I'll outline the challenges and opportunities of using data (from high-resolution models as well as observations) to inform the physics within global atmospheric models. I will present some progress we've made in the development and data-driven constraint of cloud microphysics parameterizations, as well as results from our work tuning the NASA GISS ModelE Global Climate Model with satellite observations, Bayesian inference, and machine learning. Most of this work leverages perturbed-parameter ensembles (PPEs) with machine learning surrogates (or ensembles of these surrogates), and Markov Chain Monte Carlo sampling for parameter estimation and uncertainty quantification. Despite some notable successes with this approach, many details of this inference workflow remain ad hoc, and there is a need for systematic evaluation and optimization of choices, a goal hampered by the extreme computational cost of global models. Additionally, the appropriate way to share information across scales, from laboratory studies through high-resolution large-eddy simulations all the way to global climate simulations, remains unclear. I'll summarize current work towards a more systematic calibration workflow, as well as outstanding challenges associated with data-driven constraint of parametric and structural uncertainties in physical models of atmospheric processes.
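A compact sketch of the PPE-plus-surrogate workflow mentioned above follows: a cheap emulator is trained on the perturbed-parameter ensemble and then used to score many candidate parameter settings against a target. The model, parameters, target, and surrogate choice are all illustrative stand-ins, not the workflow used in the talk.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)

def expensive_model(theta):
    """Stand-in for an expensive GCM diagnostic as a function of two parameters."""
    return np.sin(theta[:, 0]) + 0.5 * theta[:, 1] ** 2

# Perturbed-parameter ensemble: parameter vectors and the model's output metric.
theta_ppe = rng.uniform(-2.0, 2.0, (200, 2))
y_ppe = expensive_model(theta_ppe)

# Machine-learning surrogate of the model response.
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(theta_ppe, y_ppe)

# Cheap calibration: score a large candidate set against an "observed" target.
target, target_err = 0.8, 0.1
candidates = rng.uniform(-2.0, 2.0, (20000, 2))
misfit = (surrogate.predict(candidates) - target) ** 2 / target_err ** 2
print("best candidate parameters:", candidates[np.argmin(misfit)])
```

In practice the same surrogate can be embedded in an MCMC sampler (as in the abstract) rather than the simple screening shown here.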
Weather forecasters’ operational risk assessment and forecast decision-making environments are complex, dynamic, and increasingly focused on providing the best predictions and decision-support information for high-impact weather risks. New numerical weather prediction (NWP) and artificial intelligence / machine learning (AI/ML) guidance is constantly – and increasingly – being developed to support forecasters’ roles. To support the development and refinement of guidance that is most useful for forecasters, for nearly the last decade we have been conducting inter- and transdisciplinary research spanning social, physical, and computational science that is also user-based, meaning that it is driven by data collected with forecasters. This presentation will discuss results from a suite of NOAA- and NSF-funded research projects with National Weather Service forecasters that have focused on (a) eliciting forecasters’ perceptions of and needs for high-resolution ensemble guidance, (b) developing probabilistic timing guidance for winter and fire weather forecasting, and (c) understanding forecasters’ perspectives on the trustworthiness of AI/ML and guiding development of AI/ML tools accordingly.
Despite its simple definition (a period of abnormally dry conditions), drought is a complex phenomenon, affected by a diverse array of physical and biological processes, some of which may become increasingly important with climate change. Climate models can provide a critically important tool for understanding the role of these processes in driving drought variability in the past and future, especially in the context of climate change. In this talk, I will discuss several case studies, demonstrating how we have used climate model experiments to investigate and inform our understanding of past and future drought events in western North America. These include the impact of land degradation on the Dust Bowl drought of the 1930s; how the naturally occurring 1950s drought would intensify in a warmer world; and the likely increased risk of megadroughts in the future analogous to the recent multi-decadal event in Southwestern North America. These studies highlight the utility of climate models for investigating observed drought events, and the insights they can provide on how anthropogenic processes can influence drought risk and severity.
Ice sheets are subject to the whims of a variable climate system and the complexity of small-scale glaciological processes. Simulating the effects of these processes in physical ice sheet models is computationally expensive and subject to considerable process uncertainty. In this talk, I explore a hierarchy of stochastic approaches, ranging from the mathematical to the intensely computational, for the problem of capturing variability in a range of processes that force ice sheets. Stochastic approaches recognize that ice sheets integrate the effects of rapidly varying processes, such that these processes can be equivalently represented by their statistics rather than their detailed deterministic dynamics. Stochastic methods constitute a parallel strategy for making progress on the outstanding problems with current deterministic ice sheet modeling and reveal some new problems as well. I discuss how we apply these stochastic modeling methods to two particularly challenging issues: (1) quantifying uncertainty in projections of future ice sheet change, and (2) quantifying the human contribution to past ice sheet change.
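To illustrate the stochastic-forcing idea in the abstract above, here is a minimal example in which a rapidly varying forcing is replaced by an AR(1) process matched to a chosen variance and decorrelation time, and then integrated by the "ice sheet". The parameters are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# AR(1) surrogate for a rapidly varying forcing (e.g., an annual mass balance anomaly).
n_years, tau, sigma = 500, 5.0, 1.0        # record length [yr], decorrelation time [yr], std dev
phi = np.exp(-1.0 / tau)                   # lag-1 autocorrelation implied by tau
noise_std = sigma * np.sqrt(1.0 - phi**2)  # innovation amplitude that preserves the variance

forcing = np.zeros(n_years)
for t in range(1, n_years):
    forcing[t] = phi * forcing[t - 1] + noise_std * rng.normal()

# An ice sheet integrates this forcing, so its response depends on the forcing's
# statistics (variance, memory) rather than on the detailed deterministic sequence.
cumulative = np.cumsum(forcing)
print(forcing.std(), cumulative[-1])
```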
The relative merits of different ocean observation systems (moored buoys, Argo, satellites, XBTs, and others) are evaluated by their impact on ocean analyses and subseasonal forecast skill. Several ocean analyses were performed in which different ocean observation platforms were withheld from the assimilation, in addition to one ocean analysis in which all observations were assimilated. These ocean analysis products are then used for initializing a set of subseasonal forecasts to evaluate the impact of the different ocean analysis states on forecast skill. Results from the NASA GMAO and the European Centre for Medium-Range Weather Forecasts (ECMWF) assimilation systems and ensemble prediction systems' experiments will be presented to highlight changes in the ocean analysis states in the tropical Indian and Pacific Oceans and their impact on forecast skill from weather to subseasonal timescales. Coupled air-sea interaction processes relevant to weather and intraseasonal variability in the Earth's climate system are inadequately represented in regional and global coupled models. These inaccuracies could be related to either poor parameterization of model physics or insufficient model resolution to resolve the critical processes. New efforts in observations, process understanding, and translation into weather and climate models are necessary for improvements in the simulation and prediction of intraseasonal variability and associated weather events. We will discuss the merits of different observation platforms in this context and future observation and model improvement pathways.
Disturbance, rising atmospheric CO2 and changing climate are rapidly shifting the functional status quo of terrestrial ecosystems, yet the myriad process responses involved confound our ability to resolve the future trajectory of terrestrial ecosystem carbon reservoirs. As a result, the state-of-the-art biogeochemical models fundamentally disagree on the sign and magnitude of the land C sink in coming decades. Repeat observations of the Earth’s terrestrial ecosystems—vegetation states, greenhouse gas exchanges, disturbances and their inextricable links to the water and energy cycles—hold the clues to how ecosystem scale C cycling is responding to a changing environment. Fusing observational knowledge with process modelling is therefore an urgent priority for advancing decadal projections of the land C sink. I will show how using Bayesian inference to constrain biosphere model processes using multi-decadal satellite and ground observation records can provide the necessary quantitative insights on the evolution of the land C sink, carbon-water interactions, the CO2 fertilization effect and the magnitude and sign of terrestrial carbon-climate feedbacks.
For many years the most effective way to assimilate data into a numerical weather prediction model has been four dimensional variational assimilation (4D-Var). One of the difficulties with 4D-Var has been the development and maintenance of the linear model, approximately tangent linear to the full model, for evolving perturbations. The linear model is particularly problematic for physical parameterisations. We present a new method, the hybrid tangent linear model, which solves most of these long-standing issues.
Reference: T.J. Payne, "A Hybrid Differential-Ensemble Linear Forecast Model for 4D-Var", Monthly Weather Review (2021), doi: https://doi.org/10.1175/MWR-D-20-0088.1.
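As a small illustration of what "approximately tangent linear to the full model" means in practice, the sketch below runs the standard tangent-linear correctness check on a toy nonlinear model. Nothing here is specific to the hybrid tangent linear model of the reference; the model and step sizes are invented for the example.

```python
def nonlinear_model(x, dt=0.01, nsteps=100):
    """Toy nonlinear 'forecast model' (logistic growth), stepped forward in time."""
    for _ in range(nsteps):
        x = x + dt * x * (1.0 - x)
    return x

def tangent_linear(x, dx, dt=0.01, nsteps=100):
    """Hand-derived tangent-linear model of the same integration."""
    for _ in range(nsteps):
        dx = dx + dt * (1.0 - 2.0 * x) * dx   # linearized increment about the trajectory
        x = x + dt * x * (1.0 - x)            # advance the reference trajectory
    return dx

# The TLM-evolved perturbation should closely match the nonlinear difference
# for a small initial perturbation.
x0, dx0 = 0.2, 1e-3
nl_diff = nonlinear_model(x0 + dx0) - nonlinear_model(x0)
tl_pert = tangent_linear(x0, dx0)
print("nonlinear difference vs TLM perturbation:", nl_diff, tl_pert)
```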
This talk will explore the tropical cyclone projections in the CMIP6 models. Standard environmental proxies for tropical cyclone activity, such as potential intensity, genesis indices, and the ventilation index have been calculated in the CMIP6 ensemble. First, I will discuss how the global historical climatological patterns of these environmental proxies in the CMIP6 models compare with the ERA5 reanalysis climatology and show the systematic biases across the CMIP6 models. Then, the expected range of future projections of these proxies will be shown for three future scenarios, namely ssp245, ssp370 and ssp585, for the end of the 21st century.
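For reference, one of the proxies named above, the ventilation index, is commonly defined (following Tang and Emanuel) as the environmental wind shear times the nondimensional midlevel entropy deficit, divided by the potential intensity:

\[
\mathrm{VI} \;=\; \frac{u_{\mathrm{shear}}\,\chi_m}{u_{\mathrm{PI}}},
\]

with lower values indicating environments more favorable for tropical cyclone development and intensification.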
The role of ENSO diversity in the modulation of TC environmental proxies will also be discussed, as model biases in simulating ENSO diversity can lead to significant model differences, both in present and future climates. The role of ENSO diversity in shaping the tropical cyclone-ENSO relationship in present and future climates will be explored.
In the last part of the talk, a statistical-downscaling model that generates synthetic tropical cyclones from the large-scale fields of reanalyses and climate models will be presented. I will show the results obtained when downscaling the CMIP6 models and what we can learn from them.
High-resolution numerical weather prediction models are a useful tool for obtaining a better understanding of the processes related to cloud and precipitation formation; however, the spatial extent and life cycle of simulated clouds are quite sensitive to the assumptions made by model parameterization schemes. The main motivation for this work is to increase understanding of the processes leading to convective initiation (CI) through the application of object-based evaluations commonly used in geostationary satellite nowcasting studies. By taking advantage of new high temporal resolution data from the GOES-16 Advanced Baseline Imager (ABI), we track and evaluate the life cycle of CI events produced by high-resolution (500 m) Weather Research and Forecasting (WRF) model simulations employing different microphysical and land surface parameterization schemes. The work presented will focus on a case study in the southeastern United States; however, the satellite-based methodology for tracking and analyzing convection is applicable globally. The southeast U.S. poses an especially challenging forecast problem related to convective storm initiation and the upscale development of convective storms during the summer months when synoptic-scale forcing is typically weak. Using the object-based methodology, cloud properties derived from individual cloud objects are examined and assessed using infrared brightness temperatures from the 5-min GOES-16 imagery. Cloud objects are tracked over time and related to observed clouds reaching CI to examine the impacts of microphysical and land surface assumptions on CI onset, cloud extent, longevity, and growth rate.
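A minimal sketch of the object-identification step that underlies this kind of analysis is given below: threshold an infrared brightness-temperature field and label connected cloud objects. The array, threshold, and object properties here are synthetic illustrations, not the study's configuration.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

# Synthetic IR brightness-temperature field [K]; cold pixels indicate high cloud tops.
tb = 290.0 + rng.normal(0.0, 5.0, (200, 200))
tb[60:80, 100:130] -= 60.0            # implant a "growing convective cloud"

cloud_mask = tb < 250.0               # simple cold-cloud threshold
labels, n_objects = ndimage.label(cloud_mask)

# Per-object properties that could be tracked frame to frame (area, minimum TB).
idx = range(1, n_objects + 1)
areas = ndimage.sum(cloud_mask, labels, index=idx)
min_tb = ndimage.minimum(tb, labels, index=idx)
print(n_objects, areas[:3], min_tb[:3])
```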
First half: The early to mid-Pliocene (5.3–3 Myr), characterized by warmer temperatures and a CO2 concentration similar to that at present, is considered a useful analog for future warming scenarios. Geological evidence suggests that at that time, modern-day desert regions in the South-West US, including Death Valley in California, received higher levels of rainfall and supported large lakes. These wetter conditions have been difficult to reconcile with model projections of 21st-century drying over the same areas. We show that this discrepancy between past evidence and future projections may be due to the models missing an important feedback: increasing sea surface temperature (SST) due to a weakening of the California coastal upwelling leads to wetter conditions over nearby land, and wetter land leads to a weakening of the wind that forces the upwelling. The mechanism and consequences are discussed. [Work led by Minmin Fu].
Second half: Increases in extreme weather events are an important possible consequence of anthropogenic climate change (ACC), yet it is famously difficult to attribute individual events to ACC. We are motivated by recent attribution studies by the "World Weather Attribution Project" (WWAP) based on fitting the observed record to extreme value distribution functions and making the distribution parameters a function of the observed global mean surface temperature (GMST). We re-examine three attribution cases, suggesting modifications that may increase our confidence in the meaningfulness of the attribution results. We test (1) whether an extreme value distribution (vs. a normal or log-normal distribution) is required by the data, (2) whether the addition of a GMST dependence of the distribution parameters is justified statistically, and (3) whether the errors in the GMST dependence allow a meaningful attribution. We find that the uncertainty in the GMST dependence tends to make it difficult to make a confident attribution and that natural variability can lead to a seeming dependence on GMST that does not reflect ACC and may lead to wrong attribution conclusions. [With Peter Sherman and Peter Huybers]
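A sketch of the statistical construction being re-examined above follows: a maximum-likelihood GEV fit in which the location parameter depends linearly on GMST. The data here are synthetic, the parameterization is only one common choice, and the model-selection and uncertainty tests the abstract describes would be additional steps on top of this.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(11)

# Synthetic record: annual maximum temperature anomalies plus a GMST covariate.
years = np.arange(1950, 2023)
gmst = 0.01 * (years - 1950) + rng.normal(0.0, 0.05, years.size)
annmax = genextreme.rvs(c=-0.1, loc=1.0 + 2.0 * gmst, scale=0.5, random_state=1)

def neg_loglik(params):
    """Negative log-likelihood of a GEV whose location varies linearly with GMST."""
    mu0, mu1, log_sigma, shape = params
    return -np.sum(genextreme.logpdf(annmax, c=shape,
                                     loc=mu0 + mu1 * gmst,
                                     scale=np.exp(log_sigma)))

fit = minimize(neg_loglik, x0=[1.0, 0.0, np.log(0.5), -0.1], method="Nelder-Mead")
mu0, mu1, log_sigma, shape = fit.x
print("fitted GMST dependence of the location parameter:", mu1)
```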
The primary objective of NOAA's Quantitative Observing System Assessment Program (QOSAP) is to improve quantitative and objective assessment capabilities to evaluate operational and future observation system impacts and trade-offs to assess and to prioritize NOAA’s observing system architecture. Observing System Simulation Experiments (OSSEs) are a vital tool as part of the assessment of future observing systems. The OSSE system used as part of these studies will be discussed, outlining both advantages and caveats for their use. In addition, examples of how the OSSE system has been used to provide guidance on future observing system plans will be provided.
Across the broader scientific community, rapid advances in machine learning (ML), and deep learning (DL) in particular, have inspired researchers to consider how these tools might enable new science advances that previously would have been unattainable. The appeal in ML usage stems in part from the ability of ML to model complex nonlinear systems, and in part from recent algorithmic and computational advances (such as graphics processing units, i.e., GPUs) that have improved and accelerated DL model training.
For the field of subseasonal-to-seasonal (S2S) prediction (timescales of two weeks to two months), skillful prediction of precipitation remains very difficult. Predictability stemming from atmospheric initial conditions is substantially reduced beyond approximately two weeks and the ocean generally does not offer added predictability until a trajectory reaches the seasonal timescale. Imperfect initial conditions and model systematic errors also contribute to the difficulty of deterministic initialized forecasts. Ensemble forecasting has helped assess forecast spread in relation to initial condition errors, but the high cost of running global initialized forecasts precludes the creation of many ensemble members. These challenges motivate the use of ML and DL methods for S2S prediction.
Two approaches to S2S prediction research using the Community Earth System Model version 2 (CESM2) will be highlighted: (1) a predictability study and (2) a bias correction approach. The first study focuses on assessing the predictability of North American weather regimes, which are persistent large-scale atmospheric patterns that can imprint on anomalous surface precipitation. Various Earth system components, such as the atmosphere and land, will be used to assess contributions to predictability. The second study focuses on the use of DL models for offline bias correction of S2S forecasts of global precipitation. The DL model architectures include image-to-image approaches (e.g., U-Net), which enable learning of spatial patterns and displacement errors in CESM precipitation fields using various convolutional and pooling layers. We also show how DL methods can be leveraged to create a large ensemble of subseasonal forecasts.
Atmospheric rivers (ARs), tropical storms (TSs), and mesoscale convective systems (MCSs) are important weather phenomena that often threaten society through heavy precipitation and strong winds. Despite their potentially vital role in global and regional hydrological cycles, their contributions to long-term mean and extreme precipitation have not been systematically explored at the global scale. Using observational and reanalysis data, and NOAA’s Geophysical Fluid Dynamics Laboratory’s new high-resolution global climate model, we quantify that despite their occasional (13%) occurrence globally, AR, TS, and MCS days together account for ∼55% of global mean precipitation and ∼75% of extreme precipitation with daily rates exceeding its local 99th percentile. The model reproduces well the observed percentage of mean and extreme precipitation associated with AR, TS, and MCS days. In an idealized global warming simulation with a homogeneous SST increase of 4 K, the modeled changes in global mean and regional distribution of precipitation correspond well with changes in AR/TS/MCS precipitation. Globally, the frequency of AR days increases and migrates toward higher latitudes while the frequency of TS days increases over the central Pacific and part of the south Indian Ocean with a decrease elsewhere. The frequency of MCS days tends to increase over parts of the equatorial western and eastern Pacific warm pools and high latitudes and decreases over most parts of the tropics and subtropics. The AR/TS/MCS mean precipitation intensity increases by ∼5%/K due primarily to precipitation increases in the top 25% of AR/TS/MCS days with the heaviest precipitation, which are dominated by the thermodynamic component with the dynamic and microphysical components playing a secondary role.
Surface-based observations, such as those from conventional surface stations, radiosondes, aircraft and weather radar, provide invaluable information in both global and regional assimilation systems. At the Met Office, the Assimilation of Surface-based Observations Group is responsible for carrying out research to improve and develop the assimilation of the wide range of surface-based observations, and to increase their impact in operational numerical weather prediction systems. This seminar will present some of the recent research and operational updates provided by members of the Assimilation of Surface-based Observations Group including: the assimilation of sonde descents, direct assimilation of radar reflectivity, the increased use of roadside sensor data, understanding the impact of aircraft observations and an overview of our upgrade to the observation processing system.
The US operational global data assimilation system cycles with a six-hourly cadence, which is not frequent enough to handle the rapid error growth associated with fast-moving hurricanes or other storms. This motivates development of an hourly-updating global data assimilation system, but observational data latency can be a barrier. Two methods are presented to overcome this challenge: "catch-up cycles", in which a 1-hourly system is reinitialized from a 6-hourly system that has assimilated high-latency observations; and "overlapping assimilation windows", in which the system is updated hourly with new observations valid in the past three hours. The performance of these methods is assessed in a near-operational setup using the Global Forecast System by comparing short-term forecasts to in-situ observations. Methods to control high-frequency noise induced by applying analysis increments every hour are also discussed.
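A schematic of the overlapping-window idea described above (not the operational implementation, and the observation records here are invented): at each hourly cycle, select observations valid within the past three hours that have actually arrived, so a late-arriving observation missed by one cycle can still be used by a later one.

```python
from datetime import datetime, timedelta

def select_obs_for_cycle(obs, cycle_time, window_hours=3):
    """Observations valid in (cycle_time - window, cycle_time] that have
    arrived by cycle_time (i.e., data latency is accounted for)."""
    window_start = cycle_time - timedelta(hours=window_hours)
    return [o for o in obs
            if window_start < o["valid_time"] <= cycle_time
            and o["receipt_time"] <= cycle_time]

# Toy example: an observation valid at 05:30 but received at 06:40 is missed by the
# 06:00 cycle, yet is still usable at 07:00 because the hourly windows overlap.
obs = [{"valid_time": datetime(2024, 1, 1, 5, 30),
        "receipt_time": datetime(2024, 1, 1, 6, 40)}]
print(len(select_obs_for_cycle(obs, datetime(2024, 1, 1, 6))),
      len(select_obs_for_cycle(obs, datetime(2024, 1, 1, 7))))
```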
Air quality managers have long relied on atmospheric chemistry measurements and models to support decision-making. Today an even wider audience of policy, planning, and advocacy organizations is interested in air quality and climate data. By collaborating with these new communities, especially in energy, health, and environmental justice, scientists can expand the impact of existing knowledge, data, and tools. The evolution of satellite data for air quality and health applications highlights these opportunities, with lessons learned over the past 10 years through initiatives of the NASA Applied Sciences Program. Atmospheric models play an important role in interpreting satellite data, connecting emissions and impacts, and answering "what if?" questions relevant to policy and planning. Advanced regional models, including the EPA Community Multiscale Air Quality (CMAQ) model, are well-suited to policy applications, but may not be appropriate for all user needs. Simpler, reduced-form models, including COBRA, InMAP, and AERMOD, complement complex models to support a wider range of partners and problems. Traditional scientific frameworks are evolving to better support engagement and to expand the benefits of science to new issues and communities. Still, challenges remain, especially for early-career scientists balancing academic milestones with "real-world" engagement and societal impact.
Food insecurity and increased human migration are two major challenges facing society in the coming decades due to a range of factors including climate change, political instability, and economic inequality. To improve society's response to these challenges, it is important to advance our understanding of food systems and human migration dynamics, as well as our capacity to observe these dynamics. I will present recent research findings using empirical and modeling analyses to understand the structure and response of these systems. Then I will briefly discuss key opportunities to integrate NASA's Earth observations into food systems and migration analyses to generate new knowledge and to support stakeholders (e.g., governments, institutions, and individuals) as they work to improve food security and migration outcomes.