Learnable Boundary Guided Adversarial Training (ICCV 2021)

This repository contains the implementation code for the ICCV 2021 paper:
Learnable Boundary Guided Adversarial Training (https://arxiv.org/pdf/2011.11164.pdf)

If you find this code or idea useful, please consider citing our work:

@article{cui2020learnable,
  title={Learnable boundary guided adversarial training},
  author={Cui, Jiequan and Liu, Shu and Wang, Liwei and Jia, Jiaya},
  journal={arXiv preprint arXiv:2011.11164},
  year={2020}
}

Overview

In this paper, we propose Learnable Boundary Guided Adversarial Training (LBGAT) to preserve high natural accuracy while achieving strong robustness for deep models. An interesting phenomenon in our exploration is that the natural classifier boundary can benefit model robustness to some degree, in contrast to previous work where improved robustness comes at the cost of degraded performance on natural data. Our method achieves new state-of-the-art robustness on CIFAR-100, without extra real or synthetic data, under the AutoAttack benchmark.
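
As a rough illustration, below is a minimal PyTorch sketch of the two-model, LBGAT0-style objective: a natural classifier is trained with cross-entropy on clean data while a robust classifier is pulled toward the natural classifier's logits on adversarial examples via an MSE term. The PGD inner loop, hyperparameters, and function names here are illustrative assumptions, not the official implementation; see the paper and the training scripts for the actual setup.

```python
# Sketch of an LBGAT0-style loss (illustrative; not the official code).
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """L-inf PGD on cross-entropy (illustrative inner maximization)."""
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1).detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad, = torch.autograd.grad(loss, x_adv)
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1)
    return x_adv.detach()

def lbgat_loss(robust_model, natural_model, x, y):
    """One LBGAT0-style loss evaluation for a clean batch (x, y)."""
    x_adv = pgd_attack(robust_model, x, y)
    logits_adv = robust_model(x_adv)   # robust branch on adversarial input
    logits_nat = natural_model(x)      # natural branch on clean input
    # Guide the robust model toward the learnable natural boundary,
    # while the natural model keeps fitting the clean data.
    return F.mse_loss(logits_adv, logits_nat) + F.cross_entropy(logits_nat, y)
```

In this sketch, the parameters of both networks would be updated jointly by a single optimizer, so the natural boundary itself remains learnable during training.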

Results and Pretrained models

Models are evaluated under the strongest AutoAttack (https://github.com/fra31/auto-attack) with epsilon = 0.031.

Our CIFAR-100 models:

| Model | Natural Acc | Robust Acc |
| --- | --- | --- |
| CIFAR-100-LBGAT0-wideresnet-34-10 | 70.25 | 27.16 |
| CIFAR-100-LBGAT6-wideresnet-34-10 | 60.64 | 29.33 |
| CIFAR-100-LBGAT6-wideresnet-34-20 | 62.55 | 30.20 |

Our CIFAR-10 models:

| Model | Natural Acc | Robust Acc |
| --- | --- | --- |
| CIFAR-10-LBGAT0-wideresnet-34-10 | 88.22 | 52.86 |
| CIFAR-10-LBGAT0-wideresnet-34-20 | 88.70 | 53.57 |
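
A checkpoint can be loaded along these lines. This is a hedged sketch: the filename, the import path `models.wideresnet`, and the constructor arguments are assumptions based on the TRADES-style WideResNet used in this line of work, and may differ from the released files.

```python
# Hypothetical checkpoint loading (filename and model path are assumptions).
import torch
from models.wideresnet import WideResNet  # import path assumed

model = WideResNet(depth=34, num_classes=100, widen_factor=10)
state = torch.load('cifar100_lbgat0_wideresnet34-10.pt', map_location='cpu')
model.load_state_dict(state)
model.eval()
```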

CIFAR-100 L-inf

Note: this is a partial list of results for comparison with methods that do not use additional data, as of 2020/11/25. The full list can be found at https://github.com/fra31/auto-attack. TRADES (alpha=6) was trained with the official open-source code at https://github.com/yaodongyu/TRADES.

| # | Method | Model | Natural Acc | Robust Acc (AutoAttack) |
| --- | --- | --- | --- | --- |
| 1 | LBGAT (Ours) | WRN-34-20 | 62.55 | 30.20 |
| 2 | (Gowal et al. 2020) | WRN-70-16 | 60.86 | 30.03 |
| 3 | LBGAT (Ours) | WRN-34-10 | 60.64 | 29.33 |
| 4 | (Wu et al. 2020) | WRN-34-10 | 60.38 | 28.86 |
| 5 | LBGAT (Ours) | WRN-34-10 | 70.25 | 27.16 |
| 6 | (Chen et al. 2020) | WRN-34-10 | 62.15 | 26.94 |
| 7 | TRADES (alpha=6) (Zhang et al. 2019) | WRN-34-10 | 56.50 | 26.87 |
| 8 | (Sitawarin et al. 2020) | WRN-34-10 | 62.82 | 24.57 |
| 9 | (Rice et al. 2020) | RN-18 | 53.83 | 18.95 |

CIFAR-10 L-inf

Note: this is a partial list of results for comparison with previously published methods that do not use additional data, as of 2020/11/25. The full list can be found at https://github.com/fra31/auto-attack. TRADES (alpha=6) was trained with the official open-source code at https://github.com/yaodongyu/TRADES. "*" denotes methods aimed at speeding up adversarial training.

| # | Method | Model | Natural Acc | Robust Acc (AutoAttack) |
| --- | --- | --- | --- | --- |
| 1 | LBGAT (Ours) | WRN-34-20 | 88.70 | 53.57 |
| 2 | (Zhang et al.) | WRN-34-10 | 84.52 | 53.51 |
| 3 | (Rice et al. 2020) | WRN-34-20 | 85.34 | 53.42 |
| 4 | LBGAT (Ours) | WRN-34-10 | 88.22 | 52.86 |
| 5 | (Qin et al. 2019) | WRN-40-8 | 86.28 | 52.84 |
| 6 | TRADES (alpha=6) (Zhang et al. 2019) | WRN-34-10 | 84.92 | 52.64 |
| 7 | (Chen et al. 2020b) | WRN-34-10 | 85.32 | 51.12 |
| 8 | (Sitawarin et al. 2020) | WRN-34-10 | 86.84 | 50.72 |
| 9 | (Engstrom et al. 2019) | RN-50 | 87.03 | 49.25 |
| 10 | (Kumari et al. 2019) | WRN-34-10 | 87.80 | 49.12 |
| 11 | (Mao et al. 2019) | WRN-34-10 | 86.21 | 47.41 |
| 12 | (Zhang et al. 2019a) | WRN-34-10 | 87.20 | 44.83 |
| 13 | AT (Madry et al. 2018) | WRN-34-10 | 87.14 | 44.04 |
| 14 | (Shafahi et al. 2019)* | WRN-34-10 | 86.11 | 41.47 |
| 15 | (Wang & Zhang 2019)* | WRN-28-10 | 92.80 | 29.35 |

Get Started

Before training, please create the directory `Logs` via the command `mkdir Logs`.

Training

```
bash sh/train_lbgat0_cifar100.sh
```

Evaluation

Before running the evaluation, please download the pretrained model.

```
bash sh/eval_autoattack.sh
```
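
For reference, a minimal standalone evaluation along the lines of what the script does might look as follows. This is a sketch assuming the `autoattack` package from https://github.com/fra31/auto-attack is installed; `model` and `test_loader` are placeholders for your trained classifier (returning logits) and a CIFAR test loader.

```python
# Standalone AutoAttack evaluation sketch; `model` and `test_loader`
# are placeholders, not names from this repository.
import torch
from autoattack import AutoAttack

model.eval()
adversary = AutoAttack(model, norm='Linf', eps=0.031, version='standard')

# AutoAttack expects the full test set as tensors.
xs, ys = zip(*[(x, y) for x, y in test_loader])
x_test, y_test = torch.cat(xs), torch.cat(ys)

x_adv = adversary.run_standard_evaluation(x_test, y_test, bs=128)
```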

Acknowledgements

This code is partly based on TRADES (https://github.com/yaodongyu/TRADES) and AutoAttack (https://github.com/fra31/auto-attack).

Contact

If you have any questions, feel free to contact us via email ([email protected]) or GitHub issues. Enjoy!
