16:00 〜 16:20
[3K5-IS-2b-02] Multi-Model Data Transfer by Knowledge Distillation for Enhancing Precipitation Nowcasting
[[Online]]
Keywords: precipitation nowcasting, knowledge distillation, U-Net architecture
Precipitation nowcasting refers to rapid, high-resolution prediction of rainfall within the next 2 hours, providing important benefits for areas such as air traffic control and emergency services. Recently, deep learning methods using only radar images have shown promising results for precipitation nowcasting without relying on physical models. However, these methods often overlook additional meteorological information, such as temperature, humidity, and cloud water content, contained in reanalysis data, thus limiting further improvements in prediction accuracy. In this research, we build upon the U-Net architecture to integrate radar data with reanalysis data for network training. Since reanalysis data are delayed and cannot be used for real-time forecasts, we apply a knowledge distillation approach to transfer information from a teacher model to a student model that does not require reanalysis data at prediction time. Our experiments show that the distilled student model outperforms the baseline model trained only on radar data in terms of MSE, CSI, and PSD, demonstrating the effectiveness of our method in improving forecast accuracy.
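The abstract describes training a radar-only student model against both the ground truth and a teacher model that additionally sees reanalysis data. A minimal sketch of such a distillation objective is below; the abstract does not specify the exact loss or its weighting, so the `alpha` blend, the function names, and the use of plain per-pixel MSE are all illustrative assumptions.

```python
def mse(pred, target):
    """Mean squared error over flattened prediction values."""
    assert len(pred) == len(target)
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def distillation_loss(student_pred, teacher_pred, target, alpha=0.5):
    """Hypothetical student objective: match the observed radar target,
    plus a soft term pulling the student toward the teacher's output,
    which encodes reanalysis information unavailable at inference time."""
    hard = mse(student_pred, target)        # supervised term vs. ground truth
    soft = mse(student_pred, teacher_pred)  # distillation term vs. teacher
    return hard + alpha * soft

# Toy example with flattened "radar frames" of 4 pixels
student = [0.2, 0.4, 0.1, 0.0]
teacher = [0.3, 0.5, 0.1, 0.0]
truth   = [0.3, 0.4, 0.0, 0.0]
loss = distillation_loss(student, teacher, truth, alpha=0.5)
```

At deployment only the student is run, so the reanalysis latency noted in the abstract never enters the real-time forecasting path.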