Japan Geoscience Union Meeting 2024

Presentation information

[E] Oral

M (Multidisciplinary and Interdisciplinary) » M-GI General Geosciences, Information Geosciences & Simulations

[M-GI24] Data assimilation: A fundamental approach in geosciences

Thu. May 30, 2024, 10:45 AM - 12:00 PM, Room 104 (International Conference Hall, Makuhari Messe)

Convener: Shin'ya Nakano (The Institute of Statistical Mathematics), Yosuke Fujii (Meteorological Research Institute, Japan Meteorological Agency), Takemasa Miyoshi (RIKEN), Masayuki Kano (Graduate School of Science, Tohoku University); Chairperson: Daisuke Hotta (Meteorological Research Institute), Shin'ya Nakano (The Institute of Statistical Mathematics)

11:15 AM - 11:30 AM

[MGI24-08] Short-term forecast of the geomagnetic secular variation using recurrent neural networks trained by Kalman filter

*Sho Sato¹, Hiroaki Toh¹ (1. Graduate School of Science, Kyoto University)

Keywords: Geomagnetic Secular Variation, Time Series Forecast, Data Assimilation, Machine Learning

We present an application of machine learning models trained by the extended Kalman filter (EKF), an efficient data assimilation algorithm widely used in the geoscience community, to an actual geomagnetic problem. We employ a Recurrent Neural Network (RNN) model (Elman, 1990) to predict changes in the geomagnetic main field, known as secular variation (SV), over a 5-year range for use in the 14th generation of the International Geomagnetic Reference Field (IGRF-14).
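For reference, the standard EKF recursion for a state x_k with transition f and nonlinear observation operator h reads

\begin{aligned}
x_k^f &= f(x_{k-1}^a), \qquad P_k^f = F_{k-1} P_{k-1}^a F_{k-1}^\top + Q,\\
K_k &= P_k^f H_k^\top \bigl( H_k P_k^f H_k^\top + R \bigr)^{-1},\\
x_k^a &= x_k^f + K_k \bigl( y_k - h(x_k^f) \bigr),\\
P_k^a &= (I - K_k H_k)\, P_k^f,
\end{aligned}

where F_{k-1} = \partial f/\partial x and H_k = \partial h/\partial x are Jacobians, and Q and R are the process- and observation-noise covariances. When the EKF is applied to network training, the filter state is typically the vector of RNN weights, f is the identity (a random walk), and h is the network's input-to-output map; whether this exact formulation matches the one used here is our assumption, not stated in the abstract.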
To test the accuracy of 5-year predictions, hindcast results are examined for a learning window from 2004.50 to 2014.25. The training and test datasets for the RNN models are geomagnetic field snapshots derived from hourly means collected at geomagnetic observatories worldwide, together with CHAMP and Swarm-A low-Earth-orbit satellite data (MCM model; Ropp et al., 2020). These tests demonstrate that RNNs trained by the error backpropagation algorithm can accurately reproduce the training data but may fail to predict future SV. This problem, commonly known as overfitting, is one of the fundamental issues in deep neural networks. The EKF formulation of RNN training provides an alternative to backpropagation for updating the network weights. It allows the observation error to be included explicitly in the training process, which may prevent overfitting and enable efficient prediction of short-term SV.
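To illustrate how such EKF-based weight updates can be implemented, the following is a minimal NumPy sketch, not the authors' implementation: the RNN weights form the filter state with an identity (random-walk) transition, the network output is the nonlinear observation operator, and the Jacobian is approximated by finite differences (a deliberate simplification that also truncates the recurrent dependence on past weights). The network size, the covariances Q and R, and the toy sine series are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Tiny Elman RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h), y_t = W_hy h_t + b_y.
# All weights are packed into a single state vector w for the filter.
n_in, n_hid, n_out = 1, 8, 1
sizes = [(n_hid, n_in), (n_hid, n_hid), (n_hid,), (n_out, n_hid), (n_out,)]
n_w = sum(int(np.prod(s)) for s in sizes)

def unpack(w):
    parts, i = [], 0
    for s in sizes:
        n = int(np.prod(s))
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def forward(w, x_seq):
    """Run the RNN over an input window; return the final output."""
    W_xh, W_hh, b_h, W_hy, b_y = unpack(w)
    h = np.zeros(n_hid)
    for x in x_seq:
        h = np.tanh(W_xh @ np.atleast_1d(x) + W_hh @ h + b_h)
    return W_hy @ h + b_y

def jacobian(w, x_seq, eps=1e-6):
    """Finite-difference Jacobian dy/dw of the observation operator."""
    y0 = forward(w, x_seq)
    J = np.zeros((n_out, n_w))
    for j in range(n_w):
        wp = w.copy()
        wp[j] += eps
        J[:, j] = (forward(wp, x_seq) - y0) / eps
    return J

# EKF training: weights are the state, transition is the identity (random walk),
# and the RNN output is the nonlinear observation operator h(w).
w = 0.1 * rng.standard_normal(n_w)   # weight estimate (filter state)
P = np.eye(n_w)                      # weight-error covariance
Q = 1e-6 * np.eye(n_w)               # process noise: keeps the filter adaptive
R = 1e-2 * np.eye(n_out)             # observation-error covariance

def ekf_step(w, P, x_seq, y_obs):
    P_f = P + Q                          # forecast covariance (identity transition)
    H = jacobian(w, x_seq)               # linearized observation operator
    S = H @ P_f @ H.T + R                # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)     # Kalman gain
    w = w + K @ (y_obs - forward(w, x_seq))
    P = (np.eye(n_w) - K @ H) @ P_f
    return w, P

# Toy usage: learn a noisy sine, mapping a window of 10 past samples to the next.
t = np.arange(0.0, 20.0, 0.1)
series = np.sin(t) + 0.05 * rng.standard_normal(t.size)
for k in range(10, t.size):
    w, P = ekf_step(w, P, series[k - 10:k], series[k:k + 1])

Because the observation-error covariance R enters the gain directly, noisy targets pull the weights less strongly than in a plain least-squares fit, which is one intuition for why this scheme can be more robust to overfitting than backpropagation on the same data.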