Anomaly Toolbox

Description

Anomaly Toolbox Powered by GANs.

This is the accompanying toolbox for the paper "A Survey on GANs for Anomaly Detection" (https://arxiv.org/pdf/1906.11632.pdf).

The toolbox lets the user explore the performance of different GAN-based architectures (referred to as "experiments" in our work). It also ships with ready-to-use datasets: MNIST, Corrupted MNIST, Surface Cracks, and MVTec AD (see the Datasets and Custom Datasets section below).

We provide the MNIST dataset because the original works use it extensively. We added the other datasets both because some of them are used by a particular architecture and because they constitute a good benchmark for the models we have implemented.

All the architectures were tested on commonly used datasets such as MNIST, FashionMNIST, CIFAR-10, and KDD99. Some were also tested on more specific datasets, such as an X-ray dataset that we could not include because the data is not publicly available (privacy reasons).

The user can create their own dataset and use it to test the models.

Quick Start

  • First things first, install the toolbox:
pip install anomaly-toolbox

Then choose which experiment to run. For example:

  • Run the GANomaly experiment (i.e., the GANomaly architecture) with hyperparameter tuning enabled, the pre-defined hyperparameters file hparams.json, and the MNIST dataset:
anomaly-box.py --experiment GANomalyExperiment --hps-path path/to/config/hparams.json --tuning True --dataset MNIST
  • Otherwise, you can run all the experiments using the pre-defined hyperparameters file hparams.json and the MNIST dataset:
anomaly-box.py --run-all --hps-path path/to/config/hparams.json --dataset MNIST

For any other information, feel free to check the help:

anomaly-box.py --help

Contribution

This work is completely open source, and we would appreciate any contribution to the code. Any merge request to enhance, correct or expand the work is welcome.

Notes

The model architectures inside the toolbox come from their respective papers, and we have tried to follow them as closely as possible. However, due to implementation issues, we occasionally had to make minor changes. As a result, some details, such as the number of layers or the kernel sizes, may differ from the originals.

Don't worry, though: the core ideas and purpose of each architecture remain intact.

Installation

pip install anomaly-toolbox

Usage

anomaly-box.py [OPTIONS]

Options:
  --experiment [AnoGANExperiment|DeScarGANExperiment|EGBADExperiment|GANomalyExperiment]
                                  Experiment to run.
  --hps-path PATH                 When running an experiment, the path of the
                                  JSON file where all the hyperparameters are
                                  located.  [required]
  --tuning BOOLEAN                If you want to use hyperparameters tuning,
                                  use 'True' here. Default is False.
  --dataset TEXT                  The dataset to use. Can be a ready to use
                                  dataset, or a .py file that implements the
                                  AnomalyDetectionDataset interface
                                  [required]
  --run-all BOOLEAN               Run all the available experiments
  --help                          Show this message and exit.

Datasets and Custom Datasets

The provided datasets are MNIST, Corrupted MNIST, Surface Cracks, and MVTec AD. They are automatically downloaded when the user selects one of the corresponding names: ["MNIST", "CorruptedMNIST", "SurfaceCracks", "MVTecAD"].
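
For example, one of these names can be passed directly to the --dataset option (the experiment and hyperparameters path below are purely illustrative):

anomaly-box.py --experiment EGBADExperiment --hps-path path/to/config/hparams.json --dataset MVTecAD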

The user can also add their own dataset. To do this, the new dataset must inherit from the AnomalyDetectionDataset abstract class and implement its own configure method. For a more detailed guide, the user can refer to the README.md file inside the src/anomaly_toolbox/datasets folder. Moreover, in the examples folder, the user can find a dummy.py module with the basic skeleton code to implement a dataset. A rough sketch of such a subclass follows.
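
The sketch below is only illustrative: the import path, the exact configure signature, and the split attribute names are assumptions made for this example, and it assumes a tf.data-based pipeline like the built-in datasets. The dummy.py module and the datasets README remain the authoritative references.

import tensorflow as tf

# NOTE: assumed import path; check the toolbox sources for the real one.
from anomaly_toolbox.datasets.dataset import AnomalyDetectionDataset


class MyDataset(AnomalyDetectionDataset):
    """Illustrative custom dataset built on top of tf.data."""

    def configure(self, batch_size: int = 32, **kwargs) -> None:
        # Load or generate your data here (files, TFRecords, in-memory arrays, ...).
        images = tf.random.normal((256, 28, 28, 1))  # placeholder images
        labels = tf.zeros((256,), dtype=tf.int32)    # placeholder labels: 0 = normal, 1 = anomalous

        dataset = (
            tf.data.Dataset.from_tensor_slices((images, labels))
            .shuffle(256)
            .batch(batch_size)
        )

        # Expose the splits consumed by the experiments
        # (attribute names here are illustrative).
        self._train = dataset
        self._test = dataset

Once the module is in place, it can be passed to the CLI through the --dataset option, which, as the help above states, also accepts a .py file implementing the AnomalyDetectionDataset interface (the path below is illustrative):

anomaly-box.py --experiment GANomalyExperiment --hps-path path/to/config/hparams.json --dataset path/to/my_dataset.py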
