Quick and Easy Outlier Detection for Time Series
Explore the docs » · View Demo · Report Bug · Request Feature
Adapting existing outlier detection and prediction methods into a time series outlier detection system is not a simple task. Good news: OATS has done the heavy lifting for you!
We present a straightforward interface to popular, state-of-the-art detection methods to assist you in your experiments. In addition to the models, we also offer several options for selecting a final threshold for predictions.
OATS seamlessly supports both univariate and multivariate time series regardless of the model choice and guarantees the same output shape, enabling a modular approach to time series anomaly detection.
- Install package via pip

      pip install pyoats

  ❗ Installing with an environment manager such as `conda`, `venv`, or `poetry` is highly encouraged, as this package contains deep learning frameworks.
- Clone the repo

      git clone https://github.com/georgian-io/pyoats.git && cd pyoats

- Build image

      docker build -t pyoats .

- Run Container

      # CPU only
      docker run -it pyoats

      # With GPU
      docker run -it --gpus all pyoats
- Clone the repo

      git clone https://github.com/georgian-io/pyoats.git && cd pyoats

- Install via Poetry

      poetry install
For a quick start, please refer to our blog or copy our Colab notebook!
Get anomaly scores:

    from oats.models import NHiTSModel

    # `train` and `test` are your training and test series (e.g., NumPy arrays)
    model = NHiTSModel(window=20, use_gpu=True)
    model.fit(train)
    scores = model.get_scores(test)
Get threshold:

    from oats.threshold import QuantileThreshold

    # flag points whose score exceeds the 99th percentile of the scores
    t = QuantileThreshold()
    threshold = t.get_threshold(scores, 0.99)
    anom = scores > threshold
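Putting the two snippets together, here is a minimal end-to-end sketch. The synthetic sine wave, the injected spike, the train/test split, and `use_gpu=False` are illustrative assumptions made here; the `NHiTSModel` and `QuantileThreshold` calls mirror the usage shown above.

```python
import numpy as np

from oats.models import NHiTSModel
from oats.threshold import QuantileThreshold

# Illustrative data: a noisy sine wave with a single spike injected into the test span.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 40 * np.pi, 2000)) + rng.normal(0, 0.1, 2000)
series[1500] += 5.0  # the anomaly we hope to flag
train, test = series[:1000], series[1000:]

# Score the test series with a predictive model (use_gpu=False keeps the sketch portable).
model = NHiTSModel(window=20, use_gpu=False)
model.fit(train)
scores = model.get_scores(test)

# Turn scores into binary predictions with a 99th-percentile threshold.
threshold = QuantileThreshold().get_threshold(scores, 0.99)
anomalies = scores > threshold
print(f"Flagged {int(anomalies.sum())} of {anomalies.size} scored points as anomalous.")
```

Because the models share the same `fit`/`get_scores` usage shown in the quick start, the thresholding half of this sketch should stay unchanged if you swap in another model from the table below.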
For more examples, please refer to the Documentation
For more details about the individual models, please refer to the Documentation or to this blog for a deeper explanation.
| Model | Type | Multivariate Support* | Requires Fitting | DL Framework Dependency | Paper | Reference Model |
|---|---|---|---|---|---|---|
| ARIMA | Predictive | | ✅ | | | `statsmodels.ARIMA` |
| FluxEV | Predictive | | ✅ | | 📝 | |
| LightGBM | Predictive | | ✅ | | | `darts.LightGBM` |
| Moving Average | Predictive | | | | | |
| N-BEATS | Predictive | | ✅ | ✅ | 📝 | `darts.NBEATS` |
| N-HiTS | Predictive | | ✅ | ✅ | 📝 | `darts.NHiTS` |
| RandomForest | Predictive | | ✅ | | | `darts.RandomForest` |
| Regression | Predictive | | ✅ | | | `darts.Regression` |
| RNN | Predictive | | ✅ | ✅ | | `darts.RNN` |
| Temporal Convolution Network | Predictive | | ✅ | ✅ | 📝 | `darts.TCN` |
| Temporal Fusion Transformers | Predictive | | ✅ | ✅ | 📝 | `darts.TFT` |
| Transformer | Predictive | | ✅ | ✅ | 📝 | `darts.Transformer` |
| Isolation Forest | Distance-Based | ✅ | ✅ | | | `pyod.IForest` |
| Matrix Profile | Distance-Based | ✅ | | | 📝 | `stumpy` |
| TranAD | Reconstruction-Based | | ✅ | ✅ | 📝 | `tranad` |
| Variational Autoencoder | Reconstruction-Based | | ✅ | ✅ | 📝 | `pyod.VAE` |
| Quantile | Rule-Based | | | | | |
* For models with native multivariate support (✅), the underlying model consumes multivariate series directly; for all other models, OATS still accepts multivariate input and returns scores of the same shape.
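To make the multivariate handling above concrete, here is a small sketch using the same quick-start model. The random 2-channel data and the assumption that `fit`/`get_scores` accept a `(time, channels)` NumPy array are illustrative; check the documentation for the exact input conventions.

```python
import numpy as np

from oats.models import NHiTSModel

# Illustrative 2-channel series; the (time, channels) layout is an assumption of this
# sketch, not something specified in the table above.
rng = np.random.default_rng(42)
train = rng.normal(size=(1000, 2))
test = rng.normal(size=(500, 2))

model = NHiTSModel(window=20, use_gpu=False)
model.fit(train)

# Per the README, the output shape is consistent regardless of model choice, so the
# thresholding code from the quick start works unchanged on multivariate scores.
scores = model.get_scores(test)
print(scores.shape)
```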
- Automatic hyper-parameter tuning
- More examples
- More preprocessors
- More models from `pyod`
See the open issues for a full list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement".
Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/amazing_feature`)
- Commit your Changes (`git commit -m 'Add some amazing_feature'`)
- Push to the Branch (`git push origin feature/amazing_feature`)
- Open a Pull Request
Distributed under the Apache 2.0 License. See `LICENSE` for more information.
Benjamin Ye
Project Link: https://github.com/georgian-io/oats
I would like to thank my colleagues from Georgian for all the help and advice provided along the way.
I'd also like to extend my gratitude to all the contributors at Darts (for time series prediction) and PyOD (for general outlier detection), whose projects have enabled a straightforward extension into the domain of time series anomaly detection.
Finally, it would be remiss of me not to mention DATA Lab @ Rice University, whose wonderful TODS package served as a major inspiration for this project. Please check them out, especially if you're looking for AutoML support.