Japan Geoscience Union Meeting 2023

Presentation Information

[J] Online Poster Presentation

Session ID A (Atmospheric and Hydrospheric Sciences) » A-AS Atmospheric Sciences, Meteorology & Atmospheric Environment

[A-AS08] Weather, Climate, and Environmental Sciences Advanced by High-Performance Computing

Mon. May 22, 2023, 13:45 - 15:15, Online Poster Zoom Room (1) (Online Poster)

Conveners: Hisashi Yashiro (National Institute for Environmental Studies), Tomoki Miyakawa (Atmosphere and Ocean Research Institute, The University of Tokyo), Chihiro Kodama (Japan Agency for Marine-Earth Science and Technology), Shigenori Otsuka (RIKEN Center for Computational Science)


On-site poster presentation date and time (May 21, 2023, 17:15 - 18:45)

13:45 - 15:15

[AAS08-P02] Toward 3D Precipitation Nowcasting by Fusing Numerical Weather Prediction, Data Assimilation, and AI: Application of Adversarial Training

*Shigenori Otsuka1, Takemasa Miyoshi1 (1. RIKEN Center for Computational Science)

Keywords: deep learning, data assimilation, numerical weather prediction, precipitation, nowcasting

Recent advances in deep learning have allowed us to seek new algorithms that predict precipitation from past weather-radar observations. Meanwhile, high-end supercomputers have enabled "big data assimilation": rapid-update numerical weather prediction at high spatiotemporal resolution that assimilates dense and frequent observations such as those from the Phased Array Weather Radar (PAWR) (e.g., Miyoshi et al. 2016a,b; Honda et al. 2022a,b). Nevertheless, neither deep learning nor big data assimilation is perfect. In conventional precipitation nowcasting, blending numerical weather prediction with extrapolation-based nowcasting is known to outperform either approach alone (e.g., Sun et al. 2014). Therefore, even in the era of deep learning and big data assimilation, combining these two cutting-edge technologies is a reasonable choice.
We have been testing a neural network based on the convolutional long short-term memory (ConvLSTM; Shi et al. 2015). Recently, adversarial training has been considered a promising technique for avoiding the blurring effect in deep learning-based precipitation nowcasting (Ravuri et al. 2021). Therefore, we applied adversarial training to a three-dimensional extension of ConvLSTM with PAWR data.
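As a rough illustration of the building block named above, the following minimal sketch shows one way a ConvLSTM cell can be extended to three-dimensional convolutions in PyTorch. The framework choice, module name, and channel sizes are assumptions for illustration, not the implementation used in this study.

```python
# Minimal sketch of a ConvLSTM cell extended to 3-D convolutions (PyTorch).
# Illustrative assumption only, not the implementation used in the study.
import torch
import torch.nn as nn

class ConvLSTM3DCell(nn.Module):
    def __init__(self, in_ch: int, hid_ch: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # A single 3-D convolution produces the input, forget, cell, and
        # output gates from the concatenated input and hidden state.
        self.gates = nn.Conv3d(in_ch + hid_ch, 4 * hid_ch, kernel_size, padding=pad)
        self.hid_ch = hid_ch

    def forward(self, x, state):
        # x: (batch, in_ch, depth, height, width); state: (h, c), each with hid_ch channels
        h, c = state
        i, f, g, o = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * torch.tanh(g)   # update the cell state
        h = o * torch.tanh(c)           # new hidden state
        return h, (h, c)
```

A forecasting network would stack such cells, unroll them over the input steps, and continue unrolling to generate the output steps.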
PAWR observations are converted to a Cartesian mesh at 250-m resolution. Data containing rainy pixels are cropped to 64 x 64 x 32, and the past 5 steps, at a time interval of 30 seconds, are fed to the network. The network generates the next 20 steps, i.e., forecasts every 30 seconds up to a 10-minute lead time, from which an adversarial loss and a pixelwise loss are computed.
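The sketch below illustrates how a pixelwise loss and an adversarial loss on the generated 20-step sequence can be combined into a single generator objective. The functions `generator` and `discriminator`, the tensor layout, the L1 pixelwise term, and the weight `lambda_adv` are hypothetical placeholders rather than the settings used in this study.

```python
# Hedged sketch of a combined generator loss (pixelwise + adversarial).
# `generator`, `discriminator`, the tensor layout, and `lambda_adv` are
# hypothetical placeholders, not the configuration reported in the abstract.
import torch
import torch.nn.functional as F

def generator_loss(generator, discriminator, past, future, lambda_adv=0.01):
    # past:   (batch, 5, 1, 32, 64, 64)  -- 5 observed steps at 30-s intervals
    # future: (batch, 20, 1, 32, 64, 64) -- 20 target steps (10-min lead time)
    pred = generator(past)                    # predicted 20-step sequence
    pixel_loss = F.l1_loss(pred, future)      # pixelwise term (L1 assumed here)
    # Adversarial term: the generator is rewarded when the discriminator,
    # conditioned on the full sequence, scores the prediction as real.
    fake_logits = discriminator(torch.cat([past, pred], dim=1))
    adv_loss = F.binary_cross_entropy_with_logits(
        fake_logits, torch.ones_like(fake_logits))
    return pixel_loss + lambda_adv * adv_loss
```

In a standard adversarial setup, the discriminator is trained in alternation with the opposite labels, so that it learns to separate observed sequences from generated ones.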
Preliminary results indicate that the adversarial loss enhances small-scale features compared with training without it. However, threat scores changed little between the trainings with and without the adversarial loss. In the future, numerical weather prediction output will be fed to the network to combine it with the deep learning-based prediction in a nonlinear manner.