Abstract:
Accurate estimates of soil moisture are critical for several applications, including flood risk mitigation, drought monitoring, weather forecasting, and water resource management. Soil moisture estimates from in-situ, model, and remote sensing platforms, however, often differ substantially, owing to factors such as representativeness differences, measurement noise, and uncertainties in models and their parameterizations. Soil moisture from land surface models (LSMs) is a strongly model-specific quantity, and as a result there are often large biases between modeled and observed soil moisture datasets. Calibration of LSM parameters is a typical strategy to reduce such differences. Traditional calibration methods, however, are computationally expensive, which limits their application over large spatial extents at fine resolution. Here, we demonstrate a computationally efficient, machine learning-based calibration procedure for the Noah-Multiparameterization (Noah-MP) LSM to fit Soil Moisture Active Passive (SMAP) surface soil moisture. The calibration employs deep learning (DL) time series models, specifically long short-term memory (LSTM) networks. We first train a highly efficient LSTM surrogate model to emulate Noah-MP as closely as possible over the contiguous United States (CONUS), using Noah-MP forcings, attributes, and parameters as inputs and Noah-MP simulated soil moisture as the target. The calibration is then performed within the differentiable parameter learning (dPL) framework, in which a second LSTM network is trained to map forcings and attributes to Noah-MP parameters such that the surrogate output driven by these learned parameters best matches SMAP soil moisture. Because dPL uses a global loss function, the resulting parameter sets for Noah-MP are spatially coherent, extrapolate better in space, and improve predictions not only of the calibrated variable but also of uncalibrated variables. The efficient surrogate model and training by backpropagation yield a highly efficient calibration framework that is orders of magnitude faster than traditional approaches. A further advantage of the DL-based approach is its ability to learn and represent underlying non-stationarities, which is often difficult for traditional process-based models with fixed conceptual representations.
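The two-stage procedure summarized above can be illustrated with a minimal sketch. This is not the authors' code: the PyTorch framework, the SurrogateLSTM and ParamLSTM classes, the RMSE loss, the sigmoid scaling of parameters to [0, 1], and the forcings_attrs / smap_obs tensors are all assumptions made for illustration; the stage-1 supervised training loop for the surrogate is omitted as standard.

```python
import torch
import torch.nn as nn

class SurrogateLSTM(nn.Module):
    """Stage 1 (hypothetical): emulates Noah-MP soil moisture from forcings,
    attributes, and parameters concatenated along the feature dimension."""
    def __init__(self, n_inputs, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (pixel, time, n_inputs)
        h, _ = self.lstm(x)
        return self.head(h).squeeze(-1)   # surface soil moisture time series

class ParamLSTM(nn.Module):
    """Stage 2 (hypothetical): maps forcings and attributes to a static set of
    Noah-MP parameters per pixel, scaled to [0, 1]."""
    def __init__(self, n_inputs, n_params, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_params)

    def forward(self, x):
        h, _ = self.lstm(x)
        return torch.sigmoid(self.head(h[:, -1]))  # one parameter set per pixel

def dpl_step(param_net, surrogate, forcings_attrs, smap_obs, optimizer):
    """One dPL update: only param_net is optimized; the trained surrogate is
    frozen (its parameters are not in the optimizer) but stays differentiable,
    so the loss against SMAP backpropagates through it into param_net."""
    params = param_net(forcings_attrs)                        # (pixel, n_params)
    t = forcings_attrs.shape[1]
    params_t = params.unsqueeze(1).expand(-1, t, -1)          # repeat over time
    sim = surrogate(torch.cat([forcings_attrs, params_t], dim=-1))
    loss = torch.sqrt(nn.functional.mse_loss(sim, smap_obs))  # global RMSE loss
    optimizer.zero_grad(); loss.backward(); optimizer.step()
    return loss.item()
```

In this sketch the surrogate's input size must equal the number of forcing/attribute features plus n_params, and the optimizer is constructed over param_net.parameters() only; because the mapping from inputs to parameters is learned jointly over all pixels with a single global loss, nearby pixels with similar attributes receive similar parameters, which is the mechanism behind the spatial coherence noted in the abstract.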