09 Nov 2023
Status: this preprint is currently under review for the journal HESS.

When ancient numerical demons meet physics-informed machine learning: adjoint-based gradients for implicit differentiable modeling

Yalan Song, Wouter J. M. Knoben, Martyn P. Clark, Dapeng Feng, Kathryn E. Lawson, and Chaopeng Shen

Abstract. Recent advances in differentiable modeling, a genre of physics-informed machine learning that trains neural networks (NNs) together with process-based equations, have shown promise in enhancing hydrologic models' accuracy, interpretability, and knowledge-discovery potential. Current differentiable models are efficient for NN-based parameter regionalization, but the simple explicit numerical schemes paired with sequential calculations (operator splitting) can incur large numerical errors whose impacts on the models' representation power and learned parameters are unclear. Implicit schemes, however, cannot rely on automatic differentiation to calculate gradients because of potential gradient vanishing and high memory demand. Here we propose a "discretize-then-optimize" adjoint method to enable differentiable implicit numerical schemes, for the first time, for large-scale hydrologic modeling. The adjoint model demonstrates comprehensively improved performance, with Kling-Gupta efficiency coefficients, peak-flow and low-flow metrics, and evapotranspiration estimates that moderately surpass those of the already-competitive explicit model. This shows that the previous sequential-calculation approach had a detrimental impact on the model's ability to represent hydrologic dynamics. Furthermore, with a structural update that describes capillary rise, the adjoint model better describes baseflow in arid regions and produces low and peak flows that outperform even pure machine learning methods such as long short-term memory networks. The adjoint model rectified some parameter distortions but did not alter spatial parameter distributions, demonstrating the robustness of regionalized parameterization. Despite higher computational expenses and modest improvements, the adjoint model's success removes the barrier for complex implicit schemes to enrich differentiable modeling in hydrology.
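The core idea behind the "discretize-then-optimize" adjoint approach can be illustrated on a toy problem: for an implicit time step defined by a residual equation g(S_new, k) = 0, the gradient of the solution with respect to a parameter follows from the implicit function theorem, so automatic differentiation never needs to unroll the iterative solver. The sketch below is a minimal, hypothetical illustration (scalar storage S, linear decay ODE, backward Euler) and is not the paper's actual implementation:

```python
import numpy as np

def residual(S_new, S_old, k, dt):
    # Backward-Euler residual for the toy ODE dS/dt = -k * S:
    # g(S_new) = S_new - S_old + dt * k * S_new = 0
    return S_new - S_old + dt * k * S_new

def implicit_step(S_old, k, dt, iters=20):
    # Newton iteration on the residual (converges in one step here since the
    # toy problem is linear, but the same pattern handles nonlinear residuals)
    S_new = S_old
    for _ in range(iters):
        g = residual(S_new, S_old, k, dt)
        dg_dS = 1.0 + dt * k          # partial of g w.r.t. S_new
        S_new = S_new - g / dg_dS
    return S_new

def adjoint_grad(S_new, k, dt):
    # Discretize-then-optimize adjoint via the implicit function theorem:
    # dS_new/dk = -(dg/dS_new)^{-1} * (dg/dk).
    # No backpropagation through the Newton iterations is required, which
    # avoids the memory growth of differentiating an unrolled solver.
    dg_dS = 1.0 + dt * k
    dg_dk = dt * S_new
    return -dg_dk / dg_dS

S_old, k, dt = 1.0, 0.5, 0.1
S_new = implicit_step(S_old, k, dt)
grad = adjoint_grad(S_new, k, dt)

# Cross-check against a central finite difference in the parameter k
eps = 1e-6
fd = (implicit_step(S_old, k + eps, dt) - implicit_step(S_old, k - eps, dt)) / (2 * eps)
print(abs(grad - fd) < 1e-8)  # True
```

In a full hydrologic model the scalar residual becomes a vector system and the division by dg_dS becomes a linear solve with the Jacobian, but the memory advantage is the same: only the converged state, not the solver's iteration history, enters the gradient computation.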


Status: open (until 04 Jan 2024)




Total article views: 223 (including HTML, PDF, and XML)

Latest update: 30 Nov 2023
Short summary
Wouldn't it be nice to have both the accuracy of neural networks (NNs) and the interpretability of process-based models (PBMs)? Differentiable modeling gives you the best of both worlds by connecting NNs with PBMs. Previously, however, a major obstacle was that iterative solution schemes ran into excessive memory use when differentiated. This paper presents the adjoint method, which frees iterative solvers from this limitation. This is the first time the adjoint method has been applied to large-scale hydrologic modeling.