Japan Geoscience Union Meeting 2023

Presentation information

[E] Online Poster

A (Atmospheric and Hydrospheric Sciences) » A-TT Technology & Techniques

[A-TT29] Machine Learning Techniques in Weather, Climate, Ocean, Hydrology and Disease Predictions

Tue. May 23, 2023 10:45 AM - 12:15 PM Online Poster Zoom Room (7) (Online Poster)

Conveners: Venkata Ratnam Jayanthi (Application Laboratory, JAMSTEC), Patrick Martineau (Japan Agency for Marine-Earth Science and Technology), Takeshi Doi (JAMSTEC), Swadhin Behera (Application Laboratory, JAMSTEC, 3173-25 Showa-machi, Yokohama 236-0001)

On-site poster schedule (2023/5/22 17:15-18:45)

10:45 AM - 12:15 PM

[ATT29-P02] A Transformer Approach to Streamflow Long-Term Forecasting

*Jongho Kim1, Trung Duc Tran1 (1.University of Ulsan, Korea)

Keywords: Transformer, Long short-term memory (LSTM), Streamflow long-term forecasting, Hyper-parameter optimization, Attention mechanism

Streamflow forecasting is a vital yet challenging task: the accuracy of the forecasting model can greatly affect the effectiveness of operation and exploitation policies, with significant economic consequences. The difficulty arises from the multitude of factors that influence model performance. Both process-based and data-driven models face well-known obstacles, such as the need for large amounts of data and a decrease in accuracy as forecast lead time increases. In recent years, models based on long short-term memory (LSTM) networks have shown promising results in time-series forecasting. Nonetheless, implementing LSTM-based models remains challenging in terms of input predictor selection and hyperparameter optimization. This study proposes the use of the Transformer model instead. Transformers are particularly well suited to time-series forecasting because their attention mechanisms, which weigh the importance of different input timesteps when making predictions, effectively capture long-term dependencies in the data. To compare the Transformer with optimized LSTM-based models, daily streamflow from five stations in the dam system of the Han River Basin (South Korea) was used. The results indicate that the Transformer significantly outperforms the LSTM-based models for 7-day lead-time forecasting, with Nash–Sutcliffe Efficiency (NSE) scores that are on average 10-40% higher across lead times and a running time approximately 20% shorter. Using only streamflow and its lagged values as input, the Transformer achieved high performance, which opens up the potential for its application to other hydrological problems.
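The attention mechanism described above, which weighs the importance of different input timesteps, can be illustrated with a minimal scaled dot-product attention sketch over lagged streamflow features. This is an illustrative toy example, not the authors' implementation; the function names, vector dimensions, and toy values are all hypothetical:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(query, keys, values):
    """Single-query attention over T past timesteps.

    query: length-d feature vector for the current prediction step.
    keys:  T length-d vectors, one per lagged timestep.
    values: T feature vectors to be blended into a context vector.
    Returns (context, weights), where weights sum to 1 and indicate
    how strongly each past timestep influences the prediction.
    """
    d = len(query)
    # Similarity of the query to each past timestep, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted average of the value vectors
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return context, weights

# Toy lagged-streamflow embeddings: the first lag resembles the query,
# so it should receive the largest attention weight.
query = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]
values = [[1.0], [2.0], [3.0]]
context, weights = scaled_dot_product_attention(query, keys, values)
```

In a full Transformer these queries, keys, and values are learned linear projections of the input sequence, but the core weighting step is the same: timesteps whose keys align with the query dominate the context vector that feeds the prediction.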
Acknowledgment: This work was supported by the Korea Environment Industry & Technology Institute (KEITI) through the Water Management Program for Drought, funded by the Korea Ministry of Environment (2022003610003), and by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (NRF-2022R1A2C2008584).