5:15 PM - 6:45 PM
[MGI24-P02] Adjoint models using automatic differentiation
Keywords: Variational Data Assimilation, Machine Learning, Gradient, Sensitivity
Adjoint models enable calculation of the sensitivity of the cost function to initial conditions or model parameters by performing the integration backward in time. The foundation of variational data assimilation is mathematically rigorous, and the procedure for constructing an adjoint model is well-established. However, writing an adjoint by hand is tedious and error-prone. Traditional source-to-source compilers significantly reduce the labour, but their output still requires human examination. Recent advances in machine learning, which rely on adjoints for backpropagation, have promoted the development of frameworks and tools that typically include automatic differentiation (AD). Graph-based compilers such as JAX and Enzyme trace operators to construct a computational graph for gradient calculation. In this study, we apply JAX and Enzyme to generate adjoints of the three-variable Lorenz-63 model. The AD adjoints exactly reproduce the solution of the hand-written code and run at comparable speed owing to compiler optimization.
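A minimal sketch of the approach described above, using JAX reverse-mode AD to obtain the sensitivity of a cost function to the Lorenz-63 initial condition. The RK4 scheme, step count, parameter values, and observation vector here are illustrative assumptions, not the study's actual configuration.

```python
import jax
import jax.numpy as jnp

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz-63 tendencies with the standard parameter values."""
    x, y, z = state
    return jnp.array([sigma * (y - x), x * (rho - z), x * y - beta * z])

def integrate(x0, dt=0.01, n_steps=100):
    """Forward integration with a simple fixed-step RK4 scheme (illustrative)."""
    def step(state, _):
        k1 = lorenz63(state)
        k2 = lorenz63(state + 0.5 * dt * k1)
        k3 = lorenz63(state + 0.5 * dt * k2)
        k4 = lorenz63(state + dt * k3)
        return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4), None
    final, _ = jax.lax.scan(step, x0, None, length=n_steps)
    return final

# Toy cost function: squared misfit against a hypothetical observation.
obs = jnp.array([1.0, 2.0, 3.0])

def cost(x0):
    xf = integrate(x0)
    return 0.5 * jnp.sum((xf - obs) ** 2)

# Reverse-mode AD plays the role of the hand-written adjoint model:
# jax.grad traces the forward integration and runs it backward in time,
# returning the sensitivity dJ/dx0 of the cost to the initial condition.
adjoint = jax.grad(cost)
x0 = jnp.array([1.0, 1.0, 1.0])
sensitivity = adjoint(x0)
print(sensitivity)
```

Wrapping `adjoint` in `jax.jit` compiles the traced graph, which is what allows the AD adjoint to run at a speed comparable to hand-written code.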