Implementation of "Adversarial Purification with Score-based Generative Models", ICML 2021

Adversarial Purification with Score-based Generative Models

by Jongmin Yoon, Sung Ju Hwang, Juho Lee

This repository includes the official PyTorch implementation of our paper:

Adversarial Purification with Score-based Generative Models

Jongmin Yoon, Sung Ju Hwang, Juho Lee

The 38th International Conference on Machine Learning (ICML 2021)

ArXiv: https://arxiv.org/abs/2106.06041

What does our work do?

We propose a method that provides adversarial robustness to a neural network model against (stochastic) adversarial attacks by purifying inputs with an Energy-based Model (EBM) trained with Denoising Score Matching (DSM). We call this method Adversarial Denoising Purification (ADP).
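
To give a rough sense of the pipeline, here is a minimal sketch of noise injection followed by score-based denoising and classification. It is only an illustration: the noise level, number of steps, step size, and the score_net and classifier callables are placeholders, and the exact ADP update rule is the one implemented in purification/adp.py.

import torch

def purify_and_classify(x, score_net, classifier, sigma=0.25, n_steps=10, step_size=0.01):
    # Sketch only, NOT the exact ADP update rule from the paper.
    # score_net(x) is assumed to return an estimate of the score
    # (the gradient of the log-density of noisy data) at noise level sigma.
    x = x + sigma * torch.randn_like(x)        # inject Gaussian noise
    for _ in range(n_steps):
        with torch.no_grad():
            x = x + step_size * score_net(x)   # move toward higher data density
            x = x.clamp(0.0, 1.0)              # keep pixels in [0, 1]
    return classifier(x)                       # classify the purified input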

Running Codes

Dependency

Run the following command to install the Python packages required to run our code.

pip install -r requirements.txt

Running code

To run the experiments with adp.py or adp_decision.py, enter the following command.

python main.py --config <config-file>

As an example, the configuration file configs/cifar10_bpda_eot_sigma025_eot15.yml is provided in the repository.
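
Assuming the configuration path is passed relative to the repository root, the command for the provided example would be:

python main.py --config configs/cifar10_bpda_eot_sigma025_eot15.yml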

Attack and defense

For adversarial attacks, the classifier PGD attack and the BPDA+EOT attack are implemented in attacks/clf_pgd.py and attacks/bpda_strong.py, respectively. In the configuration file, set attack.attack_method to clf_pgd or bpda_strong to run the corresponding attack. For defense, the main ADP algorithm and ADP applied after detecting adversarial examples (Appendix F) are implemented in purification/adp.py and purification/adp_decision.py, respectively.
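
As an illustration only, the attack section of a configuration file might look as follows; the nesting and the numeric values are assumptions inferred from the dotted key names above, so consult the provided configs/cifar10_bpda_eot_sigma025_eot15.yml for the authoritative format.

attack:
  attack_method: bpda_strong   # or clf_pgd
  ptb: 8.0                     # perturbation scale, given in the [0, 255] pixel range
  alpha: 2.0                   # one-step attack scale, given in the [0, 255] pixel range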

Main components

File name               Explanation
main.py                 Executes the main code, initializing configurations and loggers.
runners/empirical.py    Attacks and purifies images to evaluate empirical adversarial robustness.
attacks/bpda_strong.py  Code for the BPDA+EOT attack.
purification/adp.py     Code for the main adversarial purification (ADP) algorithm.
ncsnv2/*                Code for training the EBM, i.e., NCSNv2 (paper, code).
networks/*              Classifier network architectures used in the experiments.
utils/*                 Utility files.

Notes

  • In the configuration files, the perturbation scale attack.ptb and the one-step attack scale attack.alpha are given in the [0, 255] pixel range; during execution they are rescaled to the [0, 1] pixel range in which the main experiments are performed (see the example after this list).
  • For training the EBM and the classifier models, we primarily used pre-existing methods, namely NCSNv2 and a WideResNet classifier; the repository we used for training the WideResNet classifier is linked here. Nevertheless, other classifiers, such as the pre-trained adversarially robust classifier implemented here, can also be used.
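
As a concrete illustration of this rescaling, continuing the hypothetical values from the configuration excerpt above:

ptb_config = 8.0              # attack.ptb as written in the config, [0, 255] scale
alpha_config = 2.0            # attack.alpha as written in the config, [0, 255] scale
eps = ptb_config / 255.0      # perturbation budget used at run time, [0, 1] scale
alpha = alpha_config / 255.0  # one-step attack scale used at run time, [0, 1] scale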

Reference

If you find our work useful for your research, please consider citing our paper:

@inproceedings{yoon2021advpur,
  title={Adversarial Purification with Score-based Generative Models},
  author={Jongmin Yoon and Sung Ju Hwang and Juho Lee},
  booktitle={Proceedings of The 38th International Conference on Machine Learning (ICML 2021)},
  year={2021},
}

Contact

For further details, please contact [email protected].

License

MIT
