This repository contains the official implementation of the paper *Introducing Spectral Attention for Long-Range Dependency in Time Series Forecasting* (NeurIPS 2024).
- Add the dataset txt file to the `/data` folder.
- Execute the following command in the terminal (example invocations are shown after the flag descriptions below):

```bash
python3 run_exp/itransformer.py --cuda 0 --data 0 --len 0 --basic --model 0
```
- `--cuda`: GPU index
- `--data`: Dataset index (refer to `run_exp/itransformer.py`)
- `--len`: Output length index (96 / 192 / 336 / 720)
- `--basic`: Include this flag to run the baseline. Omit for finetuning only.
- `--model`: Finetuning model index (refer to `run_exp/itransformer.py`). Omit for baseline training only.
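For example, based on the flag descriptions above, a baseline-only run includes `--basic` and omits `--model`, while a finetuning-only run does the opposite. The index values below are placeholders; check `run_exp/itransformer.py` for the actual dataset and model mappings.

```bash
# Baseline training only (hypothetical indices; see run_exp/itransformer.py for the mapping)
python3 run_exp/itransformer.py --cuda 0 --data 0 --len 0 --basic

# Finetuning only with model index 1 (also a hypothetical index)
python3 run_exp/itransformer.py --cuda 0 --data 0 --len 0 --model 1
```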
- `layers/Momentum.py`: Spectral Attention (SA) code
- `layers/Momentum_batch.py`: Batched Spectral Attention (BSA) code
- `layers/Momentum_learnable.py`: BSA with a learnable EMA smoothing factor (alpha)
- `run_exp/`: Scripts for running the project, including baseline training and finetuning (with hyperparameter search)
- `config.py`: Configuration file. Some attributes are automatically adjusted in other files to fit the dataset, prediction length, etc.
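The SA/BSA mechanism itself lives in the `layers/Momentum*.py` files listed above. As a rough illustration only (not the repository's code), the sketch below shows a generic exponential moving average over the time axis with a learnable smoothing factor alpha, the building block that the `Momentum_learnable.py` description refers to; the class and parameter names here are made up for the example.

```python
import torch
import torch.nn as nn

class LearnableEMA(nn.Module):
    """Illustrative EMA with a learnable smoothing factor.

    NOT the repository's Spectral Attention implementation (see layers/Momentum*.py);
    this only sketches the EMA-with-learnable-alpha idea mentioned above.
    """
    def __init__(self):
        super().__init__()
        # Unconstrained parameter mapped to alpha in (0, 1) via a sigmoid.
        self._alpha = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, features); smooth along the time axis.
        alpha = torch.sigmoid(self._alpha)
        state = x[:, 0, :]
        out = []
        for t in range(x.size(1)):
            state = alpha * x[:, t, :] + (1 - alpha) * state
            out.append(state)
        return torch.stack(out, dim=1)
```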
This project is built on the Time-Series-Library GitHub repository, with modifications. If you want to try other models, you can use the updated model Python files from that repository.
If you use this repository in your research, please cite:
```bibtex
@inproceedings{2024BSA,
  title={Introducing Spectral Attention for Long-Range Dependency in Time Series Forecasting},
  author={Kang, Bong Gyun and Lee, Dongjun and Kim, HyunGi and Chung, DoHyun and Yoon, Sungroh},
  booktitle={Advances in Neural Information Processing Systems},
  year={2024}
}
```
- Dongjun Lee ([email protected])
- Bong Gyun Kang ([email protected])