Collection of generative models in PyTorch.

Overview

pytorch-generative-model-collections

Original: [Tensorflow version]

PyTorch implementation of various GANs.

This repository was re-implemented with reference to tensorflow-generative-model-collections by Hwalsuk Lee.

I tried to keep this repository as close as possible to tensorflow-generative-model-collections, but some models are a little different.

This repository includes code for CPU-mode PyTorch, but I did not test it; I tested only in GPU-mode PyTorch.
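A minimal sketch of the usual CPU/GPU switch in PyTorch is shown below; the layer and tensor shapes are illustrative assumptions, not necessarily what this repository does:

import torch

# Pick the GPU when one is available, otherwise fall back to the CPU.
# (Illustrative only; the repository may instead expose an explicit GPU flag.)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(62, 1024).to(device)   # example module moved to the device
z = torch.randn(64, 62, device=device)         # example input created on the device
out = model(z)                                 # runs on GPU or CPU transparently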

Dataset

  • MNIST
  • Fashion-MNIST
  • CIFAR10
  • SVHN
  • STL10
  • LSUN-bed

I only tested the code on MNIST and Fashion-MNIST.

Generative Adversarial Networks (GANs)

Lists (Table is borrowed from tensorflow-generative-model-collections)

Name      Paper Link
GAN       Arxiv
LSGAN     Arxiv
WGAN      Arxiv
WGAN_GP   Arxiv
DRAGAN    Arxiv
CGAN      Arxiv
infoGAN   Arxiv
ACGAN     Arxiv
EBGAN     Arxiv
BEGAN     Arxiv

[Value-function images for each variant appear in the original table]
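For reference, the value functions of a few representative variants can be written out as follows; these are summarized from the respective papers rather than taken from this repository, so treat them as a summary, not the exact losses implemented here.

% Standard (minimax) GAN objective (Goodfellow et al.):
\min_G \max_D \; \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log D(x)\right]
             + \mathbb{E}_{z \sim p_z}\!\left[\log\left(1 - D(G(z))\right)\right]

% LSGAN replaces the log loss with a least-squares loss:
\min_D \; \tfrac{1}{2}\,\mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[(D(x) - 1)^2\right]
        + \tfrac{1}{2}\,\mathbb{E}_{z \sim p_z}\!\left[D(G(z))^2\right], \qquad
\min_G \; \tfrac{1}{2}\,\mathbb{E}_{z \sim p_z}\!\left[(D(G(z)) - 1)^2\right]

% WGAN replaces the JS divergence with the Wasserstein distance and a 1-Lipschitz critic:
\min_G \max_{\lVert D \rVert_L \le 1} \; \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[D(x)\right]
                                       - \mathbb{E}_{z \sim p_z}\!\left[D(G(z))\right]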

Variants of GAN structure (Figures are borrowed from tensorflow-generative-model-collections)

Results for mnist

The network architectures of the generator and discriminator are exactly the same as in the infoGAN paper.
For a fair comparison of the core ideas in all GAN variants, the network architecture is kept the same in every implementation except EBGAN and BEGAN. A small modification is made for EBGAN/BEGAN, since they adopt an auto-encoder structure for the discriminator, but I tried to keep the discriminator's capacity the same.
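As a rough sketch of that shared architecture (layer sizes follow the infoGAN paper's MNIST setup; the exact settings in this repository may differ slightly), the generator maps a latent vector to a 28x28 image roughly as follows:

import torch
import torch.nn as nn

# Sketch of the infoGAN-style MNIST generator: z (62-dim) -> 28x28 image.
# Layer sizes follow the infoGAN paper; this repository's code may differ in details.
class Generator(nn.Module):
    def __init__(self, z_dim=62):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(z_dim, 1024), nn.BatchNorm1d(1024), nn.ReLU(True),
            nn.Linear(1024, 128 * 7 * 7), nn.BatchNorm1d(128 * 7 * 7), nn.ReLU(True),
        )
        self.deconv = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 7x7 -> 14x14
            nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1),    # 14x14 -> 28x28
            nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 7, 7)
        return self.deconv(h)

The discriminator essentially mirrors this structure in reverse (two strided convolutions followed by two fully connected layers).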

The results below can be reproduced with the following command:

python main.py --dataset mnist --gan_type <TYPE> --epoch 50 --batch_size 64

Fixed generation

All results are generated from a fixed noise vector.
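A minimal sketch of how such a fixed-noise grid can be produced (sample count, latent dimension, and output path are illustrative assumptions):

import torch
from torchvision.utils import save_image

torch.manual_seed(0)
fixed_z = torch.randn(64, 62)          # one fixed latent batch, reused at every epoch

def save_fixed_samples(G, epoch, fixed_z=fixed_z):
    """Run the (trained) generator on the fixed noise and save an 8x8 image grid."""
    G.eval()
    with torch.no_grad():
        samples = G(fixed_z)           # expected shape: (64, 1, 28, 28)
    # assumes a results/ directory exists, matching the folder structure below
    save_image(samples, "results/epoch_%03d.png" % epoch, nrow=8)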

[Sample grids at Epoch 1 / Epoch 25 / Epoch 50 plus a training GIF for GAN, LSGAN, WGAN, WGAN_GP, DRAGAN, EBGAN, and BEGAN]

Conditional generation

Each row has the same noise vector and each column has the same label condition.
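A minimal sketch of how such a conditioned sample grid can be laid out (latent dimension, class count, and one-hot conditioning are assumptions; the repository's CGAN/ACGAN/infoGAN code may build it differently):

import torch
import torch.nn.functional as F

z_dim, n_classes = 62, 10
row_noise = torch.randn(n_classes, z_dim)                  # one noise vector per grid row
z = row_noise.repeat_interleave(n_classes, dim=0)          # each row's noise repeated across its columns
labels = torch.arange(n_classes).repeat(n_classes)         # column labels 0..9 in every row
y = F.one_hot(labels, n_classes).float()                   # one-hot label condition, shape (100, 10)

# A conditional generator would then be called roughly as G(z, y) and the
# 100 outputs saved as a 10x10 grid, giving the layout described above.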

[Sample grids at Epoch 1 / Epoch 25 / Epoch 50 plus a training GIF for CGAN, ACGAN, and infoGAN]

InfoGAN : Manipulating two continuous codes

All results have the same noise vector and label condition, but different continuous code vectors.
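A minimal sketch of sweeping one continuous code while holding everything else fixed (the sizes 62 / 10 / 2 and the concatenation order are assumptions based on the infoGAN MNIST setup):

import torch
import torch.nn.functional as F

# infoGAN splits the latent input into noise z, a discrete code (the class) and
# two continuous codes. To visualize a continuous code, keep z and the class fixed
# and sweep one code over a range such as [-1, 1].
n_steps = 10
z = torch.randn(1, 62).expand(n_steps, -1)                               # same noise for every sample
c_disc = F.one_hot(torch.zeros(n_steps, dtype=torch.long), 10).float()   # same class label
sweep = torch.linspace(-1.0, 1.0, n_steps).unsqueeze(1)                  # values for the first code
c_cont = torch.cat([sweep, torch.zeros(n_steps, 1)], dim=1)              # second code held at 0

# The generator input would then be something like torch.cat([z, c_cont, c_disc], dim=1),
# though the exact ordering depends on the implementation.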

[Sample grids at Epoch 1 / Epoch 25 / Epoch 50 plus a training GIF for infoGAN]

Loss plot

[Loss plots for GAN, LSGAN, WGAN, WGAN_GP, DRAGAN, EBGAN, BEGAN, CGAN, ACGAN, and infoGAN]

Results for fashion-mnist

The comments on the network architecture for mnist also apply here.
Fashion-mnist is a recently proposed dataset consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from 10 classes. (T-shirt/top, Trouser, Pullover, Dress, Coat, Sandal, Shirt, Sneaker, Bag, Ankle boot)
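A minimal sketch of loading Fashion-MNIST with torchvision (the repository's own dataloader.py may instead read the raw files from ./data/fashion-mnist):

import torch
from torchvision import datasets, transforms

# Illustrative loader using torchvision's built-in Fashion-MNIST dataset.
transform = transforms.ToTensor()                                  # 28x28 grayscale -> tensor in [0, 1]
train_set = datasets.FashionMNIST("data", train=True, download=True, transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)

for images, labels in loader:
    print(images.shape, labels.shape)                              # torch.Size([64, 1, 28, 28]) torch.Size([64])
    break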

The results below can be reproduced with the following command:

python main.py --dataset fashion-mnist --gan_type <TYPE> --epoch 50 --batch_size 64

Fixed generation

All results are generated from a fixed noise vector.

[Sample grids at Epoch 1 / Epoch 25 / Epoch 50 plus a training GIF for GAN, LSGAN, WGAN, WGAN_GP, DRAGAN, EBGAN, and BEGAN]

Conditional generation

Each row has the same noise vector and each column has the same label condition.

[Sample grids at Epoch 1 / Epoch 25 / Epoch 50 plus a training GIF for CGAN, ACGAN, and infoGAN]

InfoGAN : Manipulating two continuous codes

All results have the same noise vector and label condition, but different continuous code vectors.

[Sample grids at Epoch 1 / Epoch 25 / Epoch 50 plus a training GIF for infoGAN]

Loss plot

[Loss plots for GAN, LSGAN, WGAN, WGAN_GP, DRAGAN, EBGAN, BEGAN, CGAN, ACGAN, and infoGAN]

Folder structure

The following shows the basic folder structure.

├── main.py # gateway
├── data
│   ├── mnist # mnist data (not included in this repo)
│   ├── ...
│   ├── ...
│   └── fashion-mnist # fashion-mnist data (not included in this repo)
│
├── GAN.py # vanilla GAN
├── utils.py # utils
├── dataloader.py # dataloader
├── models # model files to be saved here
└── results # generation results to be saved here
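main.py is driven by flags such as --dataset, --gan_type, --epoch, and --batch_size, as in the commands above. A minimal sketch of such an entry point (the real script likely exposes more options):

import argparse

# Hypothetical minimal argument parser mirroring the flags used in the commands above.
def parse_args():
    parser = argparse.ArgumentParser(description="PyTorch generative model collections")
    parser.add_argument("--dataset", type=str, default="mnist",
                        choices=["mnist", "fashion-mnist", "cifar10", "svhn", "stl10", "lsun-bed"])
    parser.add_argument("--gan_type", type=str, default="GAN",
                        choices=["GAN", "LSGAN", "WGAN", "WGAN_GP", "DRAGAN",
                                 "CGAN", "infoGAN", "ACGAN", "EBGAN", "BEGAN"])
    parser.add_argument("--epoch", type=int, default=50)
    parser.add_argument("--batch_size", type=int, default=64)
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    print(args)   # the real script would construct and train the chosen model here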

Development Environment

  • Ubuntu 16.04 LTS
  • NVIDIA GTX 1080 ti
  • cuda 9.0
  • Python 3.5.2
  • pytorch 0.4.0
  • torchvision 0.2.1
  • numpy 1.14.3
  • matplotlib 2.2.2
  • imageio 2.3.0
  • scipy 1.1.0

Acknowledgements

This implementation is based on tensorflow-generative-model-collections and was tested with PyTorch 0.4.0 on Ubuntu 16.04 using a GPU.

Owner
Hyeonwoo Kang