4:00 PM - 4:20 PM
[3K5-IS-2b-02] Multi-Model Data Transfer by Knowledge Distillation for Enhancing Precipitation Nowcasting
[Online]
Keywords: precipitation nowcasting, knowledge distillation, U-Net architecture
Precipitation nowcasting refers to rapid, high-resolution prediction of rainfall within the next two hours, providing important benefits for areas such as air traffic control and emergency services. Recently, deep learning methods using only radar images have shown promising results for precipitation nowcasting without relying on physical models. However, these methods often overlook additional meteorological information, such as temperature, humidity, and cloud water content, contained in reanalysis data, limiting further improvements in prediction accuracy. In this research, we build on the U-Net architecture to integrate radar data with reanalysis data for network training. Since reanalysis data are delayed and cannot be used for real-time forecasts, we apply a knowledge distillation approach to transfer information from a teacher model to a student model that does not require reanalysis data at prediction time. Our experiments show that the distilled student model outperforms a baseline model trained only on radar data in terms of MSE, CSI, and PSD, demonstrating the effectiveness of our method in improving forecast accuracy.
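The teacher-student transfer described in the abstract can be sketched as a training loss that combines a ground-truth term with a distillation term. Everything in this sketch is an illustrative assumption, not the authors' exact formulation: the function name, the weighting `alpha`, and the use of plain pixelwise MSE for both terms are placeholders (the paper may weight or define these differently).

```python
import numpy as np

def distillation_loss(student_pred, teacher_pred, target, alpha=0.5):
    """Combined loss for a radar-only student model (illustrative sketch).

    The teacher was trained with radar plus reanalysis inputs; the student
    sees radar only. `alpha` trades off matching the observed radar target
    against matching the teacher's prediction -- both terms are assumed to
    be pixelwise MSE here.
    """
    hard = np.mean((student_pred - target) ** 2)        # match ground truth
    soft = np.mean((student_pred - teacher_pred) ** 2)  # match teacher output
    return alpha * hard + (1.0 - alpha) * soft
```

At inference time only the student network is run, so no delayed reanalysis fields are needed for real-time forecasts; the reanalysis information influences the student only through the teacher during training.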