
DJLee1208/BSA_2024


Batched Spectral Attention (BSA)

This repository contains the official implementation of the paper: Introducing Spectral Attention for Long-Range Dependency in Time Series Forecasting.

Usage

  1. Place the dataset txt file in the /data folder.
  2. Execute the following command in the terminal:

```shell
python3 run_exp/itransformer.py --cuda 0 --data 0 --len 0 --basic --model 0
```
  • --cuda: GPU index
  • --data: Dataset index (refer to run_exp/itransformer.py)
  • --len: Prediction length index, selecting among 96 / 192 / 336 / 720
  • --basic: Include this flag to run the baseline. Omit for finetuning only.
  • --model: Finetuning model index (refer to run_exp/itransformer.py). Omit for baseline training only.
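
To make the flag combinations concrete, here is a sketch of a typical workflow; the specific dataset, length, and model indices are placeholders, so check run_exp/itransformer.py for the actual index mappings:

```shell
# Baseline training only: pass --basic and omit --model
python3 run_exp/itransformer.py --cuda 0 --data 0 --len 0 --basic

# Finetuning only: pass --model and omit --basic
python3 run_exp/itransformer.py --cuda 0 --data 0 --len 0 --model 0
```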

Key Code Files

  • layers/Momentum.py: Spectral Attention (SA) code
  • layers/Momentum_batch.py: Batched Spectral Attention (BSA) code
  • layers/Momentum_learnable.py: BSA with learnable EMA smoothing factor (alpha)
  • run_exp/: Contains scripts for running the project, including baseline training and finetuning (with hyperparameter search)
  • config.py: Configuration file. Some attributes are automatically adjusted in other files to fit the dataset, prediction length, etc.
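
The Momentum files above are built around exponential moving average (EMA) smoothing of features, controlled by the factor alpha. As a minimal, repo-agnostic sketch of that idea (the function name and default alpha are illustrative, not taken from the repository; the actual Spectral Attention logic lives in layers/Momentum.py):

```python
def ema_smooth(series, alpha=0.9):
    """Exponential moving average over a 1-D sequence of values.

    alpha in (0, 1) is the smoothing factor: values closer to 1 retain
    more history, giving the smoothed signal a longer effective memory,
    which is how EMA-style features capture long-range trends.
    """
    smoothed = []
    state = series[0]  # initialize the running state with the first value
    for x in series:
        # Blend the previous state with the new observation
        state = alpha * state + (1 - alpha) * x
        smoothed.append(state)
    return smoothed
```

In layers/Momentum_learnable.py, alpha is a learnable parameter rather than a fixed constant, so the model can tune its effective memory length during training.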

Acknowledgement

This project builds on the Time-Series-Library GitHub repository, with modifications. To try other models, you can therefore use the updated model Python files from that repository.

Citation

If you use this repository in your research, please cite:

@inproceedings{2024BSA,
  title={Introducing Spectral Attention for Long-Range Dependency in Time Series Forecasting},
  author={Kang, Bong Gyun and Lee, Dongjun and Kim, HyunGi and Chung, DoHyun and Yoon, Sungroh},
  booktitle={Advances in Neural Information Processing Systems},
  year={2024},
}

Contact
