JSAI2023

Presentation information

General Session » GS-2 Machine learning

[1T5-GS-2] Machine learning

Tue. Jun 6, 2023 5:00 PM - 6:20 PM Room T (Online)

Chair: 森 隼基 (NEC) [On-site]

5:00 PM - 5:20 PM

[1T5-GS-2-01] Improvement of Transformer-based Time Series Forecasting Model using Attention in Frequency Domain

〇Sohei Kodama1, Takuya Matsuzaki1 (1. Tokyo University of Science)

[Online]

Keywords:Time Series Forecasting, Transformer

The purpose of this research is to enable long-term forecasting of time series data with multiple seasonal variations at low computational cost and high accuracy. We use FEDformer (Zhou et al., 2022) as the baseline model. Because FEDformer performs attention in the frequency domain, it can capture periodicity even when multiple seasonal variations are present. It also reduces computation by sampling frequency components before performing matrix calculations in the frequency domain. However, because Zhou et al. neglected an important condition in this sampling, the reduction in computational cost is small. We demonstrate that sampling frequency components based on their amplitude maintains accuracy with a small number of samples. As a result, in long-term forecasting of time series data with multiple seasonal variations, we achieve higher accuracy than other models at lower computational cost.
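The core idea of amplitude-based frequency sampling can be illustrated with a minimal sketch (an illustrative assumption about the approach, not the paper's actual implementation): keep only the frequency components with the largest amplitudes, since they carry most of the signal's energy, and a series with a few seasonal periods is then recoverable from a handful of components.

```python
import numpy as np

def sample_top_amplitude(x, k):
    """Keep only the k frequency components of x with the largest amplitudes.

    Illustrative sketch of amplitude-based sampling (not FEDformer's code):
    large-amplitude components carry most of the signal's energy, so a
    small sample of them preserves accuracy.
    """
    spec = np.fft.rfft(x)
    keep = np.argsort(np.abs(spec))[-k:]  # indices of the k largest amplitudes
    sampled = np.zeros_like(spec)
    sampled[keep] = spec[keep]
    return np.fft.irfft(sampled, n=len(x))

# Toy series with two seasonal periods (24 and 168, e.g. daily and weekly);
# the length 1008 is a multiple of both, so each sinusoid occupies one bin.
t = np.arange(1008)
x = np.sin(2 * np.pi * t / 24) + 0.5 * np.sin(2 * np.pi * t / 168)
x_hat = sample_top_amplitude(x, k=4)  # 4 components suffice here
```

With only 4 of the 505 available frequency components retained, the reconstruction is essentially exact for this toy signal, which is the intuition behind keeping accuracy high while sampling few components.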
