Official PyTorch implementation of "Preemptive Image Robustification for Protecting Users against Man-in-the-Middle Adversarial Attacks" (AAAI 2022)

Overview


This is the code for reproducing the results of the paper Preemptive Image Robustification for Protecting Users against Man-in-the-Middle Adversarial Attacks accepted at AAAI 2022.

Requirements

All Python packages required are listed in requirements.txt. To install these packages, run the following commands.

conda create -n preempt-robust python=3.7
conda activate preempt-robust
pip install -r requirements.txt
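To confirm the installation succeeded (an optional sanity check, not part of the original instructions), you can verify that PyTorch imports and detects your GPU from a Python shell:

import torch
print(torch.__version__)          # installed PyTorch version
print(torch.cuda.is_available())  # True if a CUDA-capable GPU is visible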

Preparing CIFAR-10 data

Download the CIFAR-10 dataset from https://www.cs.toronto.edu/~kriz/cifar.html and place it in the directory ./data.
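If you prefer to fetch the dataset programmatically, torchvision can download and extract it into ./data (a minimal sketch; it assumes the scripts expect the standard cifar-10-batches-py layout that torchvision produces):

from torchvision.datasets import CIFAR10

# Downloads and extracts cifar-10-batches-py under ./data
CIFAR10(root="./data", train=True, download=True)
CIFAR10(root="./data", train=False, download=True)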

Pretrained models

We provide pre-trained checkpoints for adversarially trained models and preemptively robust models.

  • adv_l2: ℓ2 adversarially trained model with early stopping
  • adv_linf: ℓ∞ adversarially trained model with early stopping
  • preempt_robust_l2: ℓ2 preemptively robust model
  • preempt_robust_linf: ℓ∞ preemptively robust model

We also provide a pre-trained checkpoint for a model with randomized smoothing.

  • gaussian_0.1: model trained with additive Gaussian noise (σ = 0.1)

Shell scripts for downloading these checkpoints are located at ./checkpoints/cifar10/wideresnet/[train_type]/download.sh. You can run each script to download a checkpoint named ckpt.pt. To download all the checkpoints at once, run download_all_ckpts.sh. You can delete all the checkpoints by running delete_all_ckpts.sh.
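To inspect a downloaded checkpoint outside the provided scripts, torch.load works directly on ckpt.pt (a minimal sketch; whether the file stores a bare state_dict or a wrapper dictionary depends on the training code, so treat the key inspection below as an assumption):

import torch

ckpt = torch.load("./checkpoints/cifar10/wideresnet/preempt_robust_l2/ckpt.pt", map_location="cpu")
if isinstance(ckpt, dict):
    print(list(ckpt.keys())[:10])  # inspect the stored entries before loading into a model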

Preemptively robust training

To train preemptively robust classifiers, run the following commands.

1. ℓ2 threat model, ε = δ = 0.5

python train.py --config ./configs/cifar10_l2_model.yaml

2. ℓ∞ threat model, ε = δ = 8/255

python train.py --config ./configs/cifar10_linf_model.yaml
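Here ε bounds the adversary's perturbation and δ bounds the preemptive modification of the image, each measured in the corresponding norm. For reference only (this is an illustrative sketch, not code from this repository), projecting a perturbation onto an ℓ2 or ℓ∞ ball of a given radius looks like this:

import torch

def project(delta, radius, norm="l2"):
    # Keep each per-example perturbation inside the norm ball of the given radius
    if norm == "linf":
        return delta.clamp(-radius, radius)
    flat = delta.flatten(1)
    norms = flat.norm(p=2, dim=1, keepdim=True).clamp_min(1e-12)
    scale = (radius / norms).clamp(max=1.0)
    return (flat * scale).view_as(delta)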

Preemptive robustification and reconstruction algorithms

To generate preemptively robust images and their reconstructions, run the following commands. You can specify the classifier used for generating preemptively robust images by changing train_type in each yaml file (see the sketch after the commands below).

1. ℓ2 threat model, ε = δ = 0.5

python robustify.py --config ./configs/cifar10_l2.yaml
python reconstruct.py --config ./configs/cifar10_l2.yaml

2. ℓ∞ threat model, ε = δ = 8/255

python robustify.py --config ./configs/cifar10_linf.yaml
python reconstruct.py --config ./configs/cifar10_linf.yaml

3. ℓ2 threat model, smoothed network, ε = δ = 0.5

python robustify.py --config ./configs/cifar10_l2_rand.yaml
python reconstruct.py --config ./configs/cifar10_l2_rand.yaml
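Since train_type (and, for the attack scripts below, attack_type_eval and wbox_epsilon_p) lives in the yaml configs, you can edit the files by hand or script the change with PyYAML. A minimal sketch, assuming the option sits at the top level of the config and that the checkpoint directory names double as valid values (both are assumptions; check the config files):

import yaml

path = "./configs/cifar10_l2.yaml"
with open(path) as f:
    cfg = yaml.safe_load(f)

cfg["train_type"] = "preempt_robust_l2"  # assumed key/value; verify against the config

with open(path, "w") as f:
    yaml.safe_dump(cfg, f)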

Grey-box attacks on preemptively robustified images

To conduct grey-box attacks on preemptively robustified images, run the following commands. You can specify the attack type by changing attack_type_eval in each yaml file.

1. ℓ2 threat model, ε = δ = 0.5

python attack_grey_box.py --config ./configs/cifar10_l2.yaml

2. ℓ∞ threat model, ε = δ = 8/255

python attack_grey_box.py --config ./configs/cifar10_linf.yaml

3. ℓ2 threat model, smoothed network, ε = δ = 0.5

python attack_grey_box.py --config ./configs/cifar10_l2_rand.yaml

White-box attacks on preemptively robustified images

To conduct white-box attacks on preemptively robustified images, run the following commands. You can specify the attack type and its perturbation size by changing attack_type_eval and wbox_epsilon_p in each yaml file.

1. ℓ2 threat model, ε = δ = 0.5

python attack_white_box.py --config ./configs/cifar10_l2.yaml

2. ℓ∞ threat model, ε = δ = 8/255

python attack_white_box.py --config ./configs/cifar10_linf.yaml

3. ℓ2 threat model, smoothed network, ε = δ = 0.5

python attack_white_box.py --config ./configs/cifar10_l2_rand.yaml