Official code for Dynamic Convolution Decomposition (DCD)

Overview

Revisiting Dynamic Convolution via Matrix Decomposition (ICLR 2021)

A PyTorch implementation of DCD; a short sketch of the decomposition follows the citation below. If you use this code in your research, please consider citing:

@article{li2021revisiting,
  title={Revisiting Dynamic Convolution via Matrix Decomposition},
  author={Li, Yunsheng and Chen, Yinpeng and Dai, Xiyang and Liu, Mengchen and Chen, Dongdong and Yu, Ye and Yuan, Lu and Liu, Zicheng and Chen, Mei and Vasconcelos, Nuno},
  journal={arXiv preprint arXiv:2103.08756},
  year={2021}
}
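
For orientation, the paper reformulates dynamic convolution as a matrix decomposition: roughly, the per-input weight is W(x) = Λ(x)W0 + PΦ(x)Q^T, where Λ(x) is a diagonal channel attention and Φ(x) is a small L×L dynamic channel-fusion matrix. The sketch below illustrates that idea for a 1×1 convolution; the class and argument names are hypothetical, and this is not the module shipped in this repository.

import torch
import torch.nn as nn

class DCD1x1(nn.Module):
    """Illustrative dynamic convolution decomposition for a 1x1 convolution:
    W(x) = Lambda(x) * W0 + P Phi(x) Q^T (hypothetical sketch, not the repo's module)."""

    def __init__(self, channels, latent=8, squeeze=16):
        super().__init__()
        self.latent = latent
        self.w0 = nn.Parameter(torch.randn(channels, channels) * 0.01)  # static weight W0
        self.P = nn.Parameter(torch.randn(channels, latent) * 0.01)
        self.Q = nn.Parameter(torch.randn(channels, latent) * 0.01)
        hidden = max(channels // squeeze, latent)
        # Small head that generates the diagonal of Lambda(x) and the L x L matrix Phi(x).
        self.gen = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(channels, hidden), nn.ReLU(inplace=True),
            nn.Linear(hidden, channels + latent * latent),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        theta = self.gen(x)                                    # (B, C + L*L)
        lam = torch.sigmoid(theta[:, :c])                      # diagonal of Lambda(x)
        phi = theta[:, c:].view(b, self.latent, self.latent)   # Phi(x)
        # Per-sample weight W(x) = Lambda(x) W0 + P Phi(x) Q^T, shape (B, C, C).
        weight = lam.unsqueeze(-1) * self.w0 + self.P @ phi @ self.Q.t()
        return torch.einsum('boc,bchw->bohw', weight, x)       # apply as a 1x1 convolution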

Requirements

  • Hardware: PC with NVIDIA Titan GPU.
  • Software: Ubuntu 16.04, CUDA 10.0, Anaconda3, PyTorch 1.0.0
  • Python package
    • conda install --quiet --yes pytorch==1.0.0 torchvision==0.2.1 cuda100 -c pytorch
    • pip install tensorboard tensorboardX pillow==6.1
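
After installing the packages, a quick sanity check confirms that the expected versions are in place and that the GPU is visible (the exact version strings will differ if you use a newer environment than the one listed above):

import torch
import torchvision

print(torch.__version__, torchvision.__version__)   # expected: 1.0.0 and 0.2.1 with the setup above
print(torch.cuda.is_available())                     # should be True on a machine with CUDA 10.0
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))             # e.g. an NVIDIA Titan GPU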

Evaluate DCD on ImageNet

The pre-trained models can be downloaded here: ResNet-50 and MobileNetV2x1.0.
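
To inspect a downloaded checkpoint outside of main.py, it can be opened with plain torch.load; note that the 'state_dict' key below is an assumption about how the .pth.tar file is organized, not something guaranteed by this repository:

import torch

# Load on CPU; checkpoints saved as .pth.tar typically wrap the weights in a dict
# (the 'state_dict' key here is an assumption).
ckpt = torch.load('/path/to/resnet50_dcd.pth.tar', map_location='cpu')
state_dict = ckpt.get('state_dict', ckpt) if isinstance(ckpt, dict) else ckpt
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))                 # peek at the first few parameter tensors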

DCD for ResNet-50

python main.py -a resnet50_dcd -d /path/to/imagenet/ -b 256 -c /path/to/output -j 48 --input-size 224 --dropout 0.1 --weight /path/to/resnet50_dcd.pth.tar --evaluate

DCD for MobileNetV2x1.0

python main.py -a mobilenetv2_dcd -d /path/to/imagenet/ -b 512 -c /path/to/output --width-mult 1.0 -j 48 --input-size 224 --dropout 0.1 --fc-squeeze 16 --weight mv2x1.0_dcd.pth.tar --evaluate

Train DCD on ImageNet

DCD for ResNet-50

CUDA_VISIBLE_DEVICES=0,1,2,3 python main.py -a resnet50_dcd -d /path/to/imagenet/ -b 256 --epochs 120 --lr-decay schedule --lr 0.1 --wd 1e-4 -c /path/to/output -j 48 --input-size 224 --label-smoothing 0.1 --dropout 0.1 --mixup 0.2

DCD for MobileNetV2x1.0

CUDA_VISIBLE_DEVICES=0,1,2,3 python main.py -a mobilenetv2_dcd -d /path/to/imagenet/ --epochs 300 --lr-decay cos --lr 0.1 --wd 2e-5 -c /path/to/output --width-mult 1.0 -j 48 --input-size 224 --label-smoothing 0.1 --dropout 0.2 -b 512 --mixup 0.2 --fc-squeeze 16
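
The --mixup and --label-smoothing flags refer to standard regularizers rather than anything DCD-specific. The sketch below shows how a mixup training step with alpha=0.2 is commonly written in PyTorch; it is an illustration, not the exact loss used in main.py:

import torch
import torch.nn.functional as F

def mixup_step(model, images, targets, alpha=0.2):
    # Sample a mixing coefficient from Beta(alpha, alpha) and mix random pairs of images.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(images.size(0), device=images.device)
    mixed = lam * images + (1.0 - lam) * images[perm]
    logits = model(mixed)
    # The loss is the same convex combination of the losses for the two sets of targets.
    loss = lam * F.cross_entropy(logits, targets) + (1.0 - lam) * F.cross_entropy(logits, targets[perm])
    return loss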