Meta-SparseINR: Meta-Learning Sparse Implicit Neural Representations (NeurIPS 2021)

Official PyTorch implementation of "Meta-learning Sparse Implicit Neural Representations" (NeurIPS 2021) by Jaeho Lee*, Jihoon Tack*, Namhoon Lee, and Jinwoo Shin (*: equal contribution).

TL;DR: We develop a scalable method to learn sparse neural representations for a large set of signals.

Figure: Illustrations of (a) an implicit neural representation, (b) the standard pruning algorithm, which prunes and retrains the model for each signal considered, and (c) the proposed Meta-SparseINR procedure, which finds a sparse initial INR that can be trained further to fit each signal.
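
For context, an INR is a small coordinate-based network trained to memorize a single signal, e.g., mapping pixel coordinates to RGB values. Below is a minimal SIREN-style sketch in PyTorch; the architecture and hyperparameters are illustrative, not the exact model definition used in this repository.

import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """Linear layer followed by a sine activation (SIREN-style)."""
    def __init__(self, in_features, out_features, w0=30.0):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.w0 = w0

    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))

class INR(nn.Module):
    """Maps 2D pixel coordinates in [-1, 1]^2 to RGB values."""
    def __init__(self, hidden=256, depth=4):
        super().__init__()
        layers = [SineLayer(2, hidden)]
        layers += [SineLayer(hidden, hidden) for _ in range(depth - 2)]
        layers.append(nn.Linear(hidden, 3))
        self.net = nn.Sequential(*layers)

    def forward(self, coords):  # coords: (N, 2)
        return self.net(coords)  # (N, 3) RGB predictions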

1. Requirements

conda create -n inrprune python=3.7
conda activate inrprune

conda install pytorch torchvision cudatoolkit=11.1 -c pytorch -c nvidia

pip install torchmeta
pip install imageio einops tensorboardX

Datasets

  • Download the Imagenette and SDF datasets from the following page:
  • Place the downloaded datasets in the /data folder.

2. Training

Training option

The training commands below take the following option:

  • <DATASET>: {celeba,sdf,imagenette}
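
Conceptually, the meta_baseline stage below meta-learns a dense INR initialization across the training signals, and the metaprune stage alternates pruning with further meta-training. Below is a rough MAML-style sketch of one meta-training step; the repository uses torchmeta for its actual meta-learning loop, and this sketch assumes PyTorch 2.x (torch.func) with illustrative hyperparameters and helper names.

import torch
import torch.nn.functional as F
from torch.func import functional_call

def maml_outer_step(model, meta_opt, signals, inner_steps=2, inner_lr=1e-2):
    # One meta-update: adapt a copy of the initialization to each
    # signal (inner loop), then backpropagate the post-adaptation
    # loss into the shared initialization (outer loop).
    meta_opt.zero_grad()
    for coords, pixels in signals:  # one (coords, pixels) pair per signal
        fast = {k: v.clone() for k, v in model.named_parameters()}
        for _ in range(inner_steps):
            loss = F.mse_loss(functional_call(model, fast, (coords,)), pixels)
            grads = torch.autograd.grad(loss, list(fast.values()),
                                        create_graph=True)
            fast = {k: w - inner_lr * g
                    for (k, w), g in zip(fast.items(), grads)}
        # Outer loss: evaluate the adapted weights, backprop to the init.
        F.mse_loss(functional_call(model, fast, (coords,)), pixels).backward()
    meta_opt.step()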

Meta-SparseINR (ours)

# Train dense model first
python main.py --exp meta_baseline --epoch 150000 --data <DATASET>

# Iterative pruning (magnitude pruning)
python main.py --exp metaprune --epoch 30000 --pruner MP --amount 0.2 --data <DATASET>
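
Conceptually, each metaprune round removes 20% of the surviving weights (--amount 0.2) and then continues meta-training on the remainder. The sketch below illustrates one such pruning step using torch.nn.utils.prune; the repository implements its own pruner, so treat this as an approximation only.

import torch
import torch.nn.utils.prune as prune

def prune_inr(model, amount=0.2, random=False):
    # Gather the weight tensors of all linear layers.
    params = [(m, "weight") for m in model.modules()
              if isinstance(m, torch.nn.Linear)]
    # MP: drop the globally smallest-magnitude weights;
    # RP: drop a random fraction instead.
    method = prune.RandomUnstructured if random else prune.L1Unstructured
    prune.global_unstructured(params, pruning_method=method, amount=amount)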

Random Pruning

# Train dense model first
python main.py --exp meta_baseline --epoch 150000 --data <DATASET>

# Iterative pruning (random pruning)
python main.py --exp metaprune --epoch 30000 --pruner RP --amount 0.2 --data <DATASET>

Dense-Narrow

This baseline trains dense models of decreasing width, with widths chosen to match the parameter counts of the iteratively pruned models.

# Train dense model with a given width

# Shell script style
widthlist="230 206 184 164 148 132 118 106 94 84 76 68 60 54 48 44 38 34 32 28"
for width in $widthlist
do
    python main.py --exp meta_baseline --epoch 150000 --data <DATASET> --width $width --id width_$width
done
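
As a sanity check, consecutive widths in the list shrink by roughly √0.8 ≈ 0.894, so the dense parameter count drops by about 20% per step, mirroring the 0.2 pruning ratio. A small helper to verify, assuming a depth-4 MLP with 2D inputs and 3D (RGB) outputs (an assumption; the actual architecture may differ):

def n_params(width, depth=4, d_in=2, d_out=3):
    # Weights + biases of an MLP: d_in -> width -> ... -> width -> d_out.
    shapes = [(d_in, width)] + [(width, width)] * (depth - 2) + [(width, d_out)]
    return sum(i * o + o for i, o in shapes)

print(n_params(206) / n_params(230))  # ~0.8, i.e., one 20% pruning round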

3. Evaluation

Evaluation option

The evaluation commands below take the following options:

  • <DATASET>: {celeba,sdf,imagenette}
  • <OPT_TYPE>: {default,two_step_sgd}; default denotes the Adam optimizer with 100 adaptation steps.

We assume that all checkpoints have already been trained (see Section 2).

Meta-SparseINR (ours)

python eval.py --exp prune --pruner MP --data <DATASET> --opt_type <OPT_TYPE>

Baselines

# Random pruning
python eval.py --exp prune --pruner RP --data <DATASET> --opt_type <OPT_TYPE>

# Dense-Narrow
python eval.py --exp dense_narrow --data <DATASET> --opt_type <OPT_TYPE>

# MAML + One-Shot
python eval.py --exp one_shot --data <DATASET> --opt_type default

# MAML + IMP
python eval.py --exp imp --data <DATASET> --opt_type default

# Scratch
python eval.py --exp scratch --data <DATASET> --opt_type <OPT_TYPE>
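
With the default option, evaluation adapts the meta-learned (sparse) initialization to each test signal with Adam for 100 steps and reports PSNR. A minimal sketch, assuming pixel values in [0, 1] and an illustrative learning rate; in the actual method the pruning mask is kept fixed, so pruned weights stay zero during adaptation.

import torch
import torch.nn.functional as F

def adapt_and_score(model, coords, pixels, steps=100, lr=1e-4):
    # Fit the initialization to one signal, then measure reconstruction.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        loss = F.mse_loss(model(coords), pixels)
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        mse = F.mse_loss(model(coords), pixels)
    return (-10.0 * torch.log10(mse)).item()  # PSNR in dB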

4. Experimental Results

Quantitative comparisons and result figures are reported in the paper.

Citation

@inproceedings{lee2021meta,
  title={Meta-learning Sparse Implicit Neural Representations},
  author={Jaeho Lee and Jihoon Tack and Namhoon Lee and Jinwoo Shin},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}
