1:45 PM - 3:15 PM
[SCG55-P05] Attempt to detect tsunami-generated magnetic variation using machine learning
Tsunami-generated electromagnetic (TEM) variation is caused by the motion of conductive sea water as tsunami waves propagate (e.g., Tyler, 2005). Lin et al. (2021) showed that TEM data allow us to estimate the tsunami wave height and the direction of tsunami wave propagation. These results indicate the potential of TEM variation for application to tsunami early-warning systems. However, TEM phenomena have been analyzed only for huge tsunami events, and the number of reported TEM observations is limited. This is because the signal-to-noise ratio of TEM variation is very small for small tsunami events, which makes it hard to identify the TEM variation visually in the observed data. Although some previous studies detected TEM variation in the frequency domain using wavelet analysis (e.g., Schnepf et al., 2016), identification of TEM components must still be done visually, as in the time domain, which has not improved the situation significantly.
To overcome this situation, we attempted to develop machine learning models that judge whether input magnetic data include TEM variation or not. To prepare training data, data without TEM variation were created by extracting 60-min three-component magnetic time series from seafloor magnetic data obtained in the Philippine Sea. Data with TEM variation were created by adding simulated TEM variation to real magnetic data prepared by the same procedure. In the simulation, we obtained synthetic TEM variation by feeding the results of tsunami simulation by COMCOT (Wang & Liu, 2006) into the TEM simulation by TMTGEM (Minami et al., 2017). The final data set consists of 24,000 labeled inputs, half of which include TEM variation. We used them to train the model and verify its performance. The simplest sequential model of Keras, a neural network library written in Python, was used as the machine learning model. We tried several different parameters in our model and compared their performance; the number of layers ranged from 3 to 5, and the number of neurons in each layer ranged from 16 to 64. As a result, a model consisting of 5 layers with 16 neurons per layer achieved an accuracy of about 57% on the test data set. In the presentation, we plan to report a comparison between our sequential model and different types of machine learning models, as well as details of the above results.
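The labeled data set described above could be assembled along the following lines. This is a minimal NumPy sketch, not the authors' code: the array sizes, the one-sample-per-minute rate, and the toy sinusoidal TEM waveform are all illustrative assumptions (in the study, the background windows come from seafloor magnetometer records and the TEM waveforms from TMTGEM simulations).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the real inputs (sizes are assumptions;
# the study used 24,000 windows in total).
n_windows = 100      # number of 60-min windows per class
n_samples = 60       # assuming one sample per minute
n_components = 3     # three-component magnetic field

# "Without TEM": windows cut from observed magnetic data (random here).
background = rng.normal(0.0, 1.0, size=(n_windows, n_samples, n_components))

# "With TEM": the same kind of background plus a simulated TEM waveform
# (a toy sinusoid here, standing in for TMTGEM output).
t = np.linspace(0.0, 1.0, n_samples)
tem_signal = 0.5 * np.sin(2 * np.pi * 3 * t)[None, :, None]
with_tem = rng.normal(0.0, 1.0, size=(n_windows, n_samples, n_components)) + tem_signal

# Stack into one labeled data set: label 1 = contains TEM variation.
X = np.concatenate([background, with_tem], axis=0)
y = np.concatenate([np.zeros(n_windows), np.ones(n_windows)])

# Shuffle before splitting into training and test sets.
order = rng.permutation(len(y))
X, y = X[order], y[order]
```

Each window would then be fed to a small Keras Sequential classifier (e.g., flattened 60x3 input, 3 to 5 Dense layers of 16 to 64 units, one sigmoid output) trained with a binary cross-entropy loss.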