Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
37th (2023)
Session ID : 1T5-GS-2-01

Improvement of Transformer-based Time Series Forecasting Model using Attention in Frequency Domain
*Sohei KODAMA, Takuya MATSUZAKI
Abstract

The purpose of this research is to enable accurate long-term forecasting of time series with multiple seasonal variations at a low computational cost. We use FEDformer (Zhou et al., 2022) as a baseline model. Because FEDformer performs attention in the frequency domain, it can capture periodicity even when multiple seasonal variations are present. It also reduces computation by sampling frequency components before performing matrix operations in the frequency domain. However, because Zhou et al. neglected an important condition in this sampling, the resulting reduction in computational cost is small. We demonstrate that sampling frequency components based on their amplitude maintains accuracy with a small number of samples. As a result, for long-term forecasting of time series with multiple seasonal variations, we achieve higher accuracy than other models at a lower computational cost.
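The key idea, selecting frequency components by amplitude rather than uniformly at random, can be illustrated with a minimal sketch. The function names (select_modes_by_amplitude, select_modes_random) and the toy two-season signal below are illustrative assumptions, not the authors' FEDformer implementation; the sketch only shows why a few high-amplitude modes preserve a multi-seasonal series better than the same number of randomly chosen modes.

```python
# Minimal sketch: amplitude-based frequency-mode selection vs. uniform random
# selection of modes. Names and signal are illustrative assumptions only.
import numpy as np

def select_modes_random(x_freq, num_modes, rng=None):
    """Pick num_modes frequency indices uniformly at random."""
    rng = np.random.default_rng() if rng is None else rng
    idx = rng.choice(x_freq.shape[-1], size=num_modes, replace=False)
    return np.sort(idx)

def select_modes_by_amplitude(x_freq, num_modes):
    """Pick the num_modes frequency indices with the largest amplitude."""
    amp = np.abs(x_freq)                      # amplitude of each frequency bin
    idx = np.argsort(amp)[::-1][:num_modes]   # top-k bins by amplitude
    return np.sort(idx)

def reconstruct(x_freq, idx, n):
    """Rebuild the series from only the selected frequency modes."""
    kept = np.zeros_like(x_freq)
    kept[idx] = x_freq[idx]
    return np.fft.irfft(kept, n=n)

# Toy series with two seasonal components (periods 24 and 168).
t = np.arange(1024)
x = np.sin(2 * np.pi * t / 24) + 0.5 * np.sin(2 * np.pi * t / 168)

x_freq = np.fft.rfft(x)                       # project into the frequency domain
modes_amp = select_modes_by_amplitude(x_freq, num_modes=8)
modes_rand = select_modes_random(x_freq, num_modes=8)

err_amp = np.mean((x - reconstruct(x_freq, modes_amp, len(x))) ** 2)
err_rand = np.mean((x - reconstruct(x_freq, modes_rand, len(x))) ** 2)
print(f"MSE with amplitude-based modes: {err_amp:.4f}")
print(f"MSE with random modes:          {err_rand:.4f}")
```

On such a signal, the eight largest-amplitude modes typically reconstruct the series almost exactly, whereas eight random modes usually miss the dominant periods; this mirrors the motivation for amplitude-based sampling described in the abstract.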

© 2023 The Japanese Society for Artificial Intelligence