Official implementation of Representer Point Selection via Local Jacobian Expansion for Post-hoc Classifier Explanation of Deep Neural Networks and Ensemble Models at NeurIPS 2021

Overview

Representer Point Selection via Local Jacobian Expansion for Classifier Explanation of Deep Neural Networks and Ensemble Models

This repository is the official implementation of Representer Point Selection via Local Jacobian Expansion for Classifier Explanation of Deep Neural Networks and Ensemble Models at NeurIPS 2021. (will update the link)

Introduction

We propose a novel sample-based explanation method for classifiers, based on a new derivation of representer points via a Taylor expansion of the local Jacobian matrix.
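At a high level, methods in this family decompose the model's prediction for a test point into contributions from individual training points. The schematic below is illustrative notation only, not the paper's exact RPS-LJE derivation (see the paper for how the weights are obtained from the local Jacobian expansion).

```latex
% Illustrative schematic of a representer-style decomposition (not the exact derivation):
% the prediction Phi(x_t, Theta) for a test point x_t is written as a sum of
% per-training-point terms, and the weight alpha_i attached to training point x_i
% is reported as its importance for that prediction.
\Phi(x_t, \Theta) \;\approx\; \sum_{i=1}^{n} k(x_t, x_i, \alpha_i)
```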

If you would like to cite this work, a sample BibTeX entry is as follows:

@inproceedings{yi2021representer,
 author = {Yi Sui and Ga Wu and Scott Sanner},
 booktitle = {Advances in Neural Information Processing Systems},
 title = {Representer Point Selection via Local Jacobian Expansion for Classifier Explanation of Deep Neural Networks and Ensemble Models},
 year = {2021}
}

Set up

To install requirements:

pip install -r requirements.txt

Change the root path in config.py to the path of this project:

project_root = #your path here
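For example, assuming the repository was cloned to /home/user/RPS_LJE (a hypothetical path), config.py would read:

```python
# config.py -- example value only; use the absolute path of your own checkout
project_root = "/home/user/RPS_LJE"
```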

Download the pre-trained models and calculated weights here

  • Download and unzip the saved_models_MODEL_NAME archive
  • Put the contents into the corresponding folder ("models/MODEL_NAME/saved_models"); see the extraction sketch below
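The snippet below is a minimal sketch of extracting a downloaded archive into the expected folder using only the Python standard library; the archive name saved_models_CNN.zip is an assumed example, so substitute the file you actually downloaded.

```python
import zipfile
from pathlib import Path

archive = Path("saved_models_CNN.zip")      # hypothetical filename of a downloaded archive
target = Path("models/CNN/saved_models")    # destination folder expected by the repo

target.mkdir(parents=True, exist_ok=True)
with zipfile.ZipFile(archive) as zf:
    zf.extractall(target)                   # place the archive contents under saved_models/
```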

Training

In our paper, we run experiments on three tasks:

  • CIFAR image classification with ResNet-20 (CNN)
  • IMDB sentiment classification with Bi-LSTM (RNN)
  • German credit analysis with XGBoost (Xgboost)

The models are implemented in the models directory with pre-trained weights under "models/MODEL_NAME/saved_models/base": ResNet (CNN), Bi-LSTM (RNN), and XGBoost.

To train the models in the paper, run the following commands:

python models/CNN/train.py --lr 0.01 --epochs 10 --saved_path saved_models/base
python models/RNN/train.py --lr 1e-3 --epochs 10 --saved_path saved_models/base --use_pretrained True
python models/Xgboost/train.py

Calculate weights

We implemented three different explainers: RPS-LJE, RPS-l2 (modified from the official RPS-l2 repository), and the Influence Function. To calculate the importance weights, run the following commands:

python explainer/calculate_ours_weights.py --model CNN --lr 0.01
python explainer/calculate_representer_weights.py --model RNN --lmbd 0.003 --epoch 3000
python explainer/calculate_influence.py --model Xgboost
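The three commands above pair each explainer with a different model purely for illustration. As a convenience, the sketch below loops over all three explainer scripts for a single model via subprocess, assuming each script accepts the --model flag shown above and reusing the same illustrative hyperparameter values.

```python
import subprocess

MODEL = "CNN"  # one of: CNN, RNN, Xgboost

# Assumption: each explainer script accepts the --model flag shown above;
# the remaining hyperparameters are the illustrative values from the README.
commands = [
    ["python", "explainer/calculate_ours_weights.py", "--model", MODEL, "--lr", "0.01"],
    ["python", "explainer/calculate_representer_weights.py", "--model", MODEL, "--lmbd", "0.003", "--epoch", "3000"],
    ["python", "explainer/calculate_influence.py", "--model", MODEL],
]

for cmd in commands:
    subprocess.run(cmd, check=True)  # stop early if any explainer fails
```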

Experiments

Dataset debugging experiment

To run the dataset debugging experiments, run the following commands:

python dataset_debugging/experiment_dataset_debugging_cnn.py --num_of_run 10 --flip_portion 0.2 --path ../models/CNN/saved_models/experiment_dataset_debugging --lr 1e-5
python dataset_debugging/experiment_dataset_debugging_cnn.py --num_of_run 10 --flip_portion 0.2 --path ../models/CNN/saved_models/experiment_dataset_debugging_fix_random_split --lr 1e-5 --seed 11

python dataset_debugging/experiment_dataset_debugging_rnn.py --num_of_run 10 --flip_portion 0.2 --path ../models/RNN/saved_models/experiment_dataset_debugging --lr 1e-5

python dataset_debugging/experiment_dataset_debugging_Xgboost.py --num_of_run 10 --flip_portion 0.3 --path ../models/Xgboost/saved_models/experiment_dataset_debugging --lr 1e-5

The trained models, intermediate outputs, explainer weights, and accuracies at each checkpoint are stored under the specified paths ("models/MODEL_NAME/saved_models/experiment_dataset_debugging"). To visualize the results, run the notebooks plot_res_cnn.ipynb, plot_res_cnn_fixed_random_split.ipynb, plot_res_rnn.ipynb, and plot_res_xgboost.ipynb. The resulting figures are saved under dataset_debugging/figs.
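For readers unfamiliar with the protocol, the sketch below illustrates a simplified version of the usual dataset-debugging loop: a fraction of training labels is flipped, a model is trained on the corrupted data, training points are ranked by the magnitude of their explainer weights, and the experiment measures how many flipped labels are recovered as points are inspected in ranked order. This is a conceptual illustration with assumed variable names, not the repository's implementation.

```python
import numpy as np

def debugging_curve(importance, flipped_mask, check_fractions=(0.1, 0.2, 0.3, 0.4, 0.5)):
    """Fraction of flipped labels recovered after inspecting the top-ranked training points.

    importance   -- per-training-point importance scores from an explainer (1-D array, assumed)
    flipped_mask -- boolean array marking which training labels were corrupted
    """
    order = np.argsort(-np.abs(importance))      # inspect the most important points first
    n = len(order)
    recovered = []
    for frac in check_fractions:
        checked = order[: int(frac * n)]
        recovered.append(flipped_mask[checked].sum() / flipped_mask.sum())
    return recovered
```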

Other experiments

All remaining experiments are in Jupyter notebooks organized under "models/MODEL_NAME/experiments": ResNet (CNN), Bi-LSTM (RNN), and XGBoost.

Figure: a comparison of explanations for image classification provided by Influence Function, RPS-l2, and RPS-LJE.
