Anomaly-Transformer (ICLR 2022 Spotlight)

Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy

Unsupervised detection of anomalies in time series is a challenging problem: the model must learn an informative representation and derive a distinguishable detection criterion. In this paper, we propose the Anomaly Transformer with three key designs (a minimal sketch of the discrepancy computation follows this list):

  • An inherently distinguishable criterion, the Association Discrepancy, for detection.
  • A new Anomaly-Attention mechanism to compute the Association Discrepancy.
  • A minimax strategy to amplify the normal-abnormal distinguishability of the Association Discrepancy.
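
As a concrete illustration of the first design, here is a minimal sketch of the Association Discrepancy for one attention layer. It assumes a window of length L, models the prior-association as a row-normalized Gaussian kernel over relative distances, takes the series-association to be the self-attention map, and measures their symmetrized KL divergence per time point. All names and shapes are illustrative assumptions, not the repo's API; in the paper, the minimax strategy alternately tightens and enlarges this quantity to sharpen the normal-abnormal contrast.

```python
# Minimal sketch of the Association Discrepancy (illustrative, not the repo's code).
import torch

def prior_association(L: int, sigma: torch.Tensor) -> torch.Tensor:
    """Row-normalized Gaussian kernel over |i - j|; sigma has shape (L,)."""
    idx = torch.arange(L, dtype=torch.float32)
    dist = (idx[None, :] - idx[:, None]).abs()          # (L, L) pairwise distances
    gauss = torch.exp(-0.5 * (dist / sigma[:, None]) ** 2)
    return gauss / gauss.sum(dim=-1, keepdim=True)      # each row sums to 1

def association_discrepancy(P: torch.Tensor, S: torch.Tensor,
                            eps: float = 1e-8) -> torch.Tensor:
    """Symmetrized KL divergence KL(P||S) + KL(S||P) per time point, shape (L,)."""
    kl_ps = (P * ((P + eps) / (S + eps)).log()).sum(dim=-1)
    kl_sp = (S * ((S + eps) / (P + eps)).log()).sum(dim=-1)
    return kl_ps + kl_sp

L = 100
sigma = torch.full((L,), 3.0)                  # per-position kernel width (learnable in practice)
P = prior_association(L, sigma)                # prior-association
S = torch.softmax(torch.randn(L, L), dim=-1)   # stand-in for a self-attention map
print(association_discrepancy(P, S).shape)     # torch.Size([100])
```

Intuitively, anomalies concentrate their attention on adjacent time points, so their series-association stays close to the local Gaussian prior and yields a small discrepancy; this contrast is what makes the quantity a usable detection criterion.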

Get Started

  1. Install Python 3.6 and PyTorch >= 1.4.0. (Thanks to Élise for contributing the environment fix; see this issue for details.)
  2. Download data. You can obtain four benchmarks from Google Cloud. All datasets are pre-processed. For the SWaT dataset, you can apply for access by following its official tutorial.
  3. Train and evaluate. We provide experiment scripts for all benchmarks under the folder ./scripts. You can reproduce the experimental results as follows:

```bash
bash ./scripts/SMD.sh
bash ./scripts/MSL.sh
bash ./scripts/SMAP.sh
bash ./scripts/PSM.sh
```

In particular, we use the adjustment operation proposed by Xu et al. (2018) for model evaluation (a sketch follows). If you have questions about this, please see this issue or email us.
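
For reference, the adjustment operation works segment-wise: if any point inside a ground-truth anomaly segment is detected, the whole segment counts as detected. Below is a minimal sketch, assuming binary arrays `gt` and `pred` of equal length (illustrative names, not the repo's evaluation code):

```python
# Minimal sketch of the point-adjustment step used for evaluation (illustrative).
import numpy as np

def point_adjust(gt: np.ndarray, pred: np.ndarray) -> np.ndarray:
    """Expand detections to cover any ground-truth segment they touch."""
    adjusted = pred.copy()
    i, n = 0, len(gt)
    while i < n:
        if gt[i] == 1:
            j = i
            while j < n and gt[j] == 1:   # scan to the end of this anomaly segment
                j += 1
            if adjusted[i:j].any():       # one hit anywhere in the segment...
                adjusted[i:j] = 1         # ...marks the whole segment as detected
            i = j
        else:
            i += 1
    return adjusted

gt   = np.array([0, 1, 1, 1, 0, 0, 1, 1, 0])
pred = np.array([0, 0, 1, 0, 0, 0, 0, 0, 0])
print(point_adjust(gt, pred))             # [0 1 1 1 0 0 0 0 0]
```

Precision, recall, and F1 are then computed on the adjusted predictions, which is the common protocol for these benchmarks.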

Main Result

We compare our model with 15 baselines, including THOC and InterFusion. Overall, Anomaly Transformer achieves state-of-the-art results across the benchmarks.

Citation

If you find this repo useful, please cite our paper.

@inproceedings{xu2022anomaly,
  title={Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy},
  author={Jiehui Xu and Haixu Wu and Jianmin Wang and Mingsheng Long},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=LzQQ89U1qm_}
}

Contact

If you have any questions, please contact wuhx23@mails.tsinghua.edu.cn.
