Benchmarks for Out-of-Distribution Generalization in Time Series Tasks



arXiv Paper: WOODS: Benchmarks for Out-of-Distribution Generalization in Time Series Tasks

Authors: Jean-Christophe Gagnon-Audet, Kartik Ahuja, Mohammad-Javad Darvishi-Bayazi, Guillaume Dumas, Irina Rish

Abstract

Deep learning models often fail to generalize well under distribution shifts. Understanding and overcoming these failures have led to a new research field on Out-of-Distribution (OOD) generalization. Despite being extensively studied for static computer vision tasks, OOD generalization has been severely underexplored for time series tasks. To shine a light on this gap, we present WOODS: ten challenging time series benchmarks covering a diverse range of data modalities, such as videos, brain recordings, and smart device sensory signals. We revise the existing OOD generalization algorithms for time series tasks and evaluate them using our systematic framework. Our experiments show a large room for improvement for empirical risk minimization and OOD generalization algorithms on our datasets, thus underscoring the new challenges posed by time series tasks.


Quick Start

Download

Some datasets must be downloaded before they can be used. The download script fetches our preprocessed versions of the datasets, which are ready for use. If you want to look into the preprocessing yourself, check the fetch_and_preprocess.py script. You can specify a dataset to download, or provide no argument to download all of them. See the raw and preprocessed sizes of the datasets on the dataset page of the documentation.

python3 -m woods.scripts.download_datasets {dataset} \
        --data_path /path/to/data/directory

Important: The IEMOCAP dataset can only be obtained through their website after agreeing to their license agreement. Once the dataset is downloaded, you will need to preprocess it with the fetch_and_preprocess.py script. More details are available on the documentation page.

Train a model

Train a model using one objective on one dataset with one test environment. For datasets that require a download, provide the path to the data directory with --data_path.

python3 -m woods.scripts.main train \
        --dataset Spurious_Fourier \
        --objective ERM \
        --test_env 0 \
        --data_path /path/to/data/directory

Make a sweep

To launch a hyperparameter sweep, use the following script. Define a collection of Objectives and Datasets to investigate in combination. All test environments are investigated automatically. By default, the sweep performs a random search over 20 hyperparameter seeds with 3 trial seeds each. For more details on the usage of hparams_sweep, see the documentation.

python3 -m woods.scripts.hparams_sweep \
        --dataset Spurious_Fourier CAP \
        --objective ERM IRM \
        --save_path ./results \
        --launcher local \
        --data_path /path/to/data/directory
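As a minimal sketch of the default search space described above (nothing here is taken from the repository's internals; the seed counts are the defaults stated in the text), the sweep enumerates every hyperparameter-seed / trial-seed pair:

```python
from itertools import product

# Defaults described above: random search over 20 hyperparameter seeds,
# each evaluated with 3 trial seeds.
N_HPARAM_SEEDS = 20
N_TRIAL_SEEDS = 3

# Each (hparam_seed, trial_seed) pair is one training run, repeated for
# every objective / dataset / test-environment combination in the sweep.
runs = list(product(range(N_HPARAM_SEEDS), range(N_TRIAL_SEEDS)))

print(len(runs))  # 60 runs per (objective, dataset, test environment)
```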

When the sweep is complete, you can compile the results into neat tables; the --latex argument outputs a table that can be copied directly into a .tex document. For more details on the usage of compile_results, see the documentation.

python3 -m woods.scripts.compile_results \
        --mode tables \
        --results_dir ./results \
        --latex

Citation

@misc{WOODS,
  author = {Jean-Christophe Gagnon-Audet and Kartik Ahuja and Mohammad-Javad Darvishi-Bayazi and Guillaume Dumas and Irina Rish},
  title  = {WOODS: Benchmarks for Out-of-Distribution Generalization in Time Series Tasks},
  year   = {2022},
  eprint = {arXiv:2203.09978},
}
