Fortuitous Forgetting in Connectionist Networks

Introduction

This repository includes reference code for the paper Fortuitous Forgetting in Connectionist Networks (ICLR 2022).

@inproceedings{
  zhou2022fortuitous,
  title={Fortuitous Forgetting in Connectionist Networks},
  author={Hattie Zhou and Ankit Vani and Hugo Larochelle and Aaron Courville},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=ei3SY1_zYsE}
}

Targeted Forgetting

This code implements the experiments on partial weight perturbations and their effects on easy or hard examples. Scripts are stored in /targeted_forgetting.

To run KE-style forgetting:

python mixed_group_training.py --seed 1 --train_perc 0.1 --random_perc 0.1 --keep_perc 0.5 --train_iters 50000 --fname new_rand_reinit_train0.1_mislabel0.1 --no_wandb
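
As a rough illustration of the mechanism, the sketch below reinitializes a random subset of a layer's weights while keeping the rest; the layer shape, initializer, and helper name are illustrative stand-ins rather than code from mixed_group_training.py.

# Hedged sketch of KE-style forgetting: a random (1 - keep_perc) fraction of the
# weights is redrawn from a fresh initialization; the remaining weights are kept.
import torch
import torch.nn as nn

def ke_style_forget(layer: nn.Linear, keep_perc: float = 0.5) -> None:
    with torch.no_grad():
        fresh = torch.empty_like(layer.weight)
        nn.init.kaiming_uniform_(fresh)                       # stand-in for the original initializer
        keep_mask = torch.rand_like(layer.weight) < keep_perc
        layer.weight.copy_(torch.where(keep_mask, layer.weight, fresh))

layer = nn.Linear(128, 64)
ke_style_forget(layer, keep_perc=0.5)                         # mirrors --keep_perc 0.5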

To run IMP-style forgetting:

python mixed_group_training.py --seed 1 --train_perc 1 --random_perc 0.0 --keep_perc 0.3 --train_iters 50000 --weight_mask --reset_to_zero --rewind_to_init --margin_groups --fname new_weight_rewind_zero_train1_margin0.1 --no_wandb
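
For contrast, a hedged sketch of IMP-style forgetting follows, assuming --weight_mask selects the top keep_perc weights by magnitude, --rewind_to_init rewinds them to their stored initial values, and --reset_to_zero zeroes the rest; the names and shapes are illustrative only.

# Hedged sketch of IMP-style forgetting: keep the largest-magnitude weights
# (rewound to their saved initial values) and set the remaining weights to zero.
import torch
import torch.nn as nn

def imp_style_forget(layer: nn.Linear, init_weight: torch.Tensor, keep_perc: float = 0.3) -> None:
    with torch.no_grad():
        n_keep = int(keep_perc * layer.weight.numel())
        threshold = layer.weight.abs().flatten().kthvalue(layer.weight.numel() - n_keep).values
        keep_mask = layer.weight.abs() > threshold            # approximately the top n_keep weights
        layer.weight.copy_(torch.where(keep_mask, init_weight, torch.zeros_like(init_weight)))

layer = nn.Linear(128, 64)
init_copy = layer.weight.detach().clone()                     # snapshot taken at initialization
# ... training happens here ...
imp_style_forget(layer, init_copy, keep_perc=0.3)             # mirrors --keep_perc 0.3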

Later Layer Forgetting

This code builds upon the repository for Knowledge Evolution in Neural Networks. Scripts are stored in /llf_ke.

To run 10 generations of LLF on the Flower102 dataset:

python train_KE_cls.py --epochs 200 --num_generations 11 --name resetlayer4_flower_resnet18 --weight_decay 0.0001 --arch Split_ResNet18 --reset_layer_name layer4 --set Flower102 --data $DATA_DIR --no_wandb
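
The core operation is reinitializing a later block between generations. The sketch below uses torchvision's resnet18 as a stand-in for Split_ResNet18 and a hypothetical reset_layer helper; it is not the repository's training loop.

# Hedged sketch of later layer forgetting: at the start of each generation,
# modules under the named block (here "layer4") are reinitialized while the
# earlier layers keep their learned weights.
import torch.nn as nn
from torchvision.models import resnet18

def reset_layer(model: nn.Module, layer_name: str = "layer4") -> None:
    for name, module in model.named_modules():
        if name.startswith(layer_name) and hasattr(module, "reset_parameters"):
            module.reset_parameters()

model = resnet18(num_classes=102)                              # Flower102 has 102 classes
for generation in range(10):
    # ... train for --epochs epochs ...
    reset_layer(model, "layer4")                               # mirrors --reset_layer_name layer4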

To run 10 generations of KE:

python train_KE_cls.py --epochs 200 --num_generations 11 --name ke_kels_flower_resnet18 --weight_decay 0.0001 --arch Split_ResNet18 --split_rate 0.8 --split_mode kels --set Flower102 --data $DATA_DIR --no_wandb
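 
Knowledge Evolution instead re-learns a slice of every layer. As a loose approximation, the sketch below keeps a split_rate fraction of each weight tensor (the fit hypothesis) and reinitializes the remainder each generation; the exact KELS split used in the repository may differ.

# Hedged approximation of a KE generation step: carry over a split_rate
# fraction of each weight tensor and reinitialize the remaining slice.
import torch
import torch.nn as nn
from torchvision.models import resnet18

def ke_generation_reset(model: nn.Module, split_rate: float = 0.8) -> None:
    with torch.no_grad():
        for param in model.parameters():
            if param.dim() < 2:                                # skip biases and norm parameters
                continue
            fresh = torch.empty_like(param)
            nn.init.kaiming_normal_(fresh)
            n_keep = int(split_rate * param.shape[0])
            param[n_keep:] = fresh[n_keep:]                    # reinitialize the reset-hypothesis slice

ke_generation_reset(resnet18(num_classes=102), split_rate=0.8)  # mirrors --split_rate 0.8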

To run the long baseline, trained for the equivalent of 10 generations, on the Flower102 dataset:

python train_KE_cls.py --epochs 2200 --num_generations 1 --name resetlayer4_flower_resnet18_long2200 --weight_decay 0.0001 --arch Split_ResNet18 --reset_layer_name layer4 --set Flower102 --eval_intermediate_tst 200 --data $DATA_DIR --no_wandb

To run the freeze-later-layers experiment:

python train_KE_cls.py --epochs 200 --num_generations 11 --name resetlayer4_flower_resnet18_freeze_reset_layers --weight_decay 0.0001 --arch Split_ResNet18 --reset_layer_name layer4 --data $DATA_DIR --set Flower102 --reverse_freeze --freeze_non_reset --optimizer sgd_TEMP --no_wandb

To run the freeze-early-layers experiment:

python train_KE_cls.py --epochs 200 --num_generations 11 --name resetlayer4_flower_resnet18_freeze_nonreset_layers --weight_decay 0.0001 --arch Split_ResNet18 --reset_layer_name layer4 --data $DATA_DIR --set Flower102 --freeze_non_reset --optimizer sgd_TEMP --no_wandb

To run the freeze-later-layers experiment with a fixed seed:

python train_KE_cls.py --epochs 200 --num_generations 11 --name resetlayer4_flower_resnet18_freeze_reset_layers --weight_decay 0.0001 --arch Split_ResNet18 --reset_layer_name layer4 --data $DATA_DIR --set Flower102 --reverse_freeze --freeze_non_reset --optimizer sgd_TEMP --seed 0 --fix_seed --no_wandb
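
All three freezing runs boil down to excluding one group of parameters from optimization. A minimal sketch, assuming the freeze is implemented through requires_grad and using a hypothetical freeze_layers helper, is shown below.

# Hedged sketch of the freezing experiments: parameters inside the reset block
# (or, with the roles reversed, outside it) are excluded from optimization.
import torch.nn as nn
from torchvision.models import resnet18

def freeze_layers(model: nn.Module, layer_name: str = "layer4",
                  freeze_reset_block: bool = True) -> None:
    for name, param in model.named_parameters():
        in_reset_block = name.startswith(layer_name)
        param.requires_grad = in_reset_block != freeze_reset_block

model = resnet18(num_classes=102)
freeze_layers(model, "layer4", freeze_reset_block=True)       # freeze later layers
freeze_layers(model, "layer4", freeze_reset_block=False)      # freeze early layers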

Ease-of-teaching

This code builds upon the repository for Ease-of-Teaching and Language Structure from Emergent Communication. Scripts are stored in /ease_of_teaching.

To run the no-reset baseline:

python forget_train.py --fname baseline_no_reset --seed 0 --no_wandb

To run the reset-receiver baseline:

python forget_train.py --resetNum 50 --fname baseline_reset_receiver --seed 0 --reset_receiver --no_wandb

To run partial balanced forgetting (PBF):

python forget_train.py --resetNum 100 --fname same_weight_reinit_sender10_receiver10_reset100 --seed 0 --forget_sender --sender_keep_perc 0.1 --forget_receiver --receiver_keep_perc 0.1 --weight_mask --same_mask --no_wandb
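
A hedged sketch of the periodic partial reset behind PBF follows, assuming both agents keep roughly 10% of their weights at each of the evenly spaced resets; the GRU stand-ins and the reset schedule are illustrative rather than the ease_of_teaching architecture.

# Hedged sketch of partial balanced forgetting: at each of resetNum evenly
# spaced points, reinitialize most weights of both sender and receiver while
# keeping a small fraction (keep_perc) of each tensor.
import torch
import torch.nn as nn

def partial_reset(module: nn.Module, keep_perc: float = 0.1) -> None:
    with torch.no_grad():
        for param in module.parameters():
            fresh = torch.randn_like(param) * param.std()      # crude stand-in for re-initialization
            keep_mask = torch.rand_like(param) < keep_perc
            param.copy_(torch.where(keep_mask, param, fresh))

sender, receiver = nn.GRU(32, 64), nn.GRU(32, 64)              # stand-ins for the two agents
train_iters, reset_num = 300_000, 100
for it in range(train_iters):
    # ... one step of the communication game ...
    if (it + 1) % (train_iters // reset_num) == 0:
        partial_reset(sender, keep_perc=0.1)                   # mirrors --sender_keep_perc 0.1
        partial_reset(receiver, keep_perc=0.1)                 # mirrors --receiver_keep_perc 0.1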

To run the targeted forgetting experiments:

python mixed_language_forget_samebatch.py --group_vars same_mask weight_mask reset_to_zero keep_perc seed trainIters train_with_reset reset_every --seed 0 --keep_perc 0.5 --fname new_rand_reinit

python mixed_language_forget_samebatch.py --group_vars same_mask weight_mask reset_to_zero keep_perc seed trainIters train_with_reset reset_every --seed 0 --keep_perc 0.5 --fname same_weight_zero --same_mask --weight_mask --reset_to_zero
