Collection of generative models in Tensorflow

Overview

tensorflow-generative-model-collections

Tensorflow implementation of various GANs and VAEs.

Related Repositories

Pytorch version

A Pytorch version of this repository is available at https://github.com/znxlwm/pytorch-generative-model-collections

"Are GANs Created Equal? A Large-Scale Study" Paper

https://github.com/google/compare_gan is the code used in the paper.
It provides IS/FID scores and rich experimental results for all GAN variants.

Generative Adversarial Networks (GANs)

Lists

Name     | Paper Link
GAN      | Arxiv
LSGAN    | Arxiv
WGAN     | Arxiv
WGAN_GP  | Arxiv
DRAGAN   | Arxiv
CGAN     | Arxiv
infoGAN  | Arxiv
ACGAN    | Arxiv
EBGAN    | Arxiv
BEGAN    | Arxiv
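
The value functions in the original table are rendered as equation images. For reference, a few representative ones written out in the standard notation of the respective papers (a transcription, not the exact images) are:

% Vanilla GAN: minimax value function
\min_G \max_D \; \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log D(x)\right]
             + \mathbb{E}_{z \sim p_z}\!\left[\log\!\left(1 - D(G(z))\right)\right]

% LSGAN: least-squares losses for discriminator and generator
\min_D \; \tfrac{1}{2}\,\mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[(D(x) - 1)^2\right]
        + \tfrac{1}{2}\,\mathbb{E}_{z \sim p_z}\!\left[D(G(z))^2\right],
\qquad
\min_G \; \tfrac{1}{2}\,\mathbb{E}_{z \sim p_z}\!\left[(D(G(z)) - 1)^2\right]

% WGAN: Wasserstein critic, D constrained to be 1-Lipschitz
\min_G \max_{\lVert D \rVert_L \le 1} \;
  \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[D(x)\right]
  - \mathbb{E}_{z \sim p_z}\!\left[D(G(z))\right]

% WGAN_GP: adds a gradient penalty term to the critic loss
\lambda\,\mathbb{E}_{\hat{x}}\!\left[\left(\lVert \nabla_{\hat{x}} D(\hat{x}) \rVert_2 - 1\right)^2\right]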

Variants of GAN structure

Results for mnist

The network architecture of the generator and discriminator is exactly the same as in the infoGAN paper.
For a fair comparison of the core ideas in all GAN variants, the network architectures are kept the same except for EBGAN and BEGAN. A small modification is made for EBGAN/BEGAN, since they adopt an auto-encoder structure for the discriminator, but the capacity of the discriminator is kept comparable.
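
As a rough sketch of that shared architecture (written here with the TF1-style layers API; layer sizes follow the infoGAN MNIST setup, while the repo itself builds its networks from the small helpers in ops.py, so this is illustrative rather than the actual code):

import tensorflow as tf

def discriminator(x, is_training, reuse=False):
    # x: (batch, 28, 28, 1) images. Layer sizes follow the infoGAN MNIST setup.
    with tf.variable_scope("discriminator", reuse=reuse):
        net = tf.nn.leaky_relu(tf.layers.conv2d(x, 64, 4, strides=2, padding="same"), 0.2)
        net = tf.layers.conv2d(net, 128, 4, strides=2, padding="same")
        net = tf.nn.leaky_relu(tf.layers.batch_normalization(net, training=is_training), 0.2)
        net = tf.layers.flatten(net)
        net = tf.layers.dense(net, 1024)
        net = tf.nn.leaky_relu(tf.layers.batch_normalization(net, training=is_training), 0.2)
        logits = tf.layers.dense(net, 1)          # single output node for GAN-style discriminators
        return tf.nn.sigmoid(logits), logits

def generator(z, is_training, reuse=False):
    # z: (batch, z_dim) noise vectors mapped back up to 28x28x1 images.
    with tf.variable_scope("generator", reuse=reuse):
        net = tf.nn.relu(tf.layers.batch_normalization(tf.layers.dense(z, 1024), training=is_training))
        net = tf.nn.relu(tf.layers.batch_normalization(tf.layers.dense(net, 128 * 7 * 7), training=is_training))
        net = tf.reshape(net, [-1, 7, 7, 128])
        net = tf.layers.conv2d_transpose(net, 64, 4, strides=2, padding="same")
        net = tf.nn.relu(tf.layers.batch_normalization(net, training=is_training))
        net = tf.layers.conv2d_transpose(net, 1, 4, strides=2, padding="same")
        return tf.nn.sigmoid(net)                 # pixel values in [0, 1]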

The following results can be reproduced with command:

python main.py --dataset mnist --gan_type <TYPE> --epoch 25 --batch_size 64

Random generation

All results are randomly sampled.

(Sample grids for GAN, LSGAN, WGAN, WGAN_GP, DRAGAN, EBGAN, and BEGAN at epochs 2, 10, and 25.)

Conditional generation

Each row has the same noise vector and each column has the same label condition.

(Conditional samples from CGAN, ACGAN, and infoGAN at epochs 1, 10, and 25.)

InfoGAN: Manipulating two continuous codes

Results for fashion-mnist

The comments on network architecture for mnist also apply here.
Fashion-mnist is a recently proposed dataset consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from 10 classes. (T-shirt/top, Trouser, Pullover, Dress, Coat, Sandal, Shirt, Sneaker, Bag, Ankle boot)

The following results can be reproduced with command:

python main.py --dataset fashion-mnist --gan_type <TYPE> --epoch 40 --batch_size 64

Random generation

All results are randomly sampled.

(Sample grids for GAN, LSGAN, WGAN, WGAN_GP, DRAGAN, EBGAN, and BEGAN at epochs 1, 20, and 40.)

Conditional generation

Each row has the same noise vector and each column has the same label condition.

(Conditional samples from CGAN, ACGAN, and infoGAN at epochs 1, 20, and 40.)

Without hyper-parameter tuning from the mnist version, ACGAN and infoGAN do not work as well as CGAN.
ACGAN tends to fall into mode collapse.
infoGAN tends to ignore the noise vector, so different styles within the same class cannot be represented.

InfoGAN: Manipulating two continuous codes

Some results for celebA

(to be added)

Variational Auto-Encoders (VAEs)

Lists

Name | Paper Link
VAE  | Arxiv
CVAE | Arxiv
DVAE | Arxiv (to be added)
AAE  | Arxiv (to be added)
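
The loss functions in the original list also appear as equation images. For VAE the loss is the negative evidence lower bound (reconstruction term plus KL term), which in standard notation reads:

% VAE: negative ELBO for a single example x
\mathcal{L}(\theta, \phi; x) =
  -\,\mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]
  + D_{\mathrm{KL}}\!\left(q_\phi(z \mid x) \,\Vert\, p(z)\right)

CVAE uses the same objective with the encoder and decoder additionally conditioned on the label y, i.e. q_\phi(z \mid x, y) and p_\theta(x \mid z, y).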

Variants of VAE structure

Results for mnist

The network architecture of the decoder (generator) and encoder (discriminator) is exactly the same as in the infoGAN paper, except for the number of output nodes in the encoder (2 x z_dim for VAE, 1 for GAN).
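
As a sketch of what those 2 x z_dim encoder outputs are for, assuming a diagonal-Gaussian posterior and the usual reparameterization trick (the function below is illustrative, not taken from this repo):

import tensorflow as tf

def reparameterize(enc_out, z_dim):
    # enc_out: (batch, 2 * z_dim) -- the encoder's 2 x z_dim output nodes,
    # split into a mean and a log-variance for the Gaussian posterior q(z|x).
    mu, log_var = enc_out[:, :z_dim], enc_out[:, z_dim:]
    eps = tf.random_normal(tf.shape(mu))          # eps ~ N(0, I)
    z = mu + tf.exp(0.5 * log_var) * eps          # z ~ N(mu, sigma^2), differentiable w.r.t. mu and log_var
    # per-example KL(q(z|x) || N(0, I)) used in the VAE loss
    kl = 0.5 * tf.reduce_sum(tf.exp(log_var) + tf.square(mu) - 1.0 - log_var, axis=1)
    return z, kl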

The following results can be reproduced with command:

python main.py --dataset mnist --gan_type <TYPE> --epoch 25 --batch_size 64

Random generation

All results are randomly sampled.

(Sample grids for VAE and GAN at epochs 1, 10, and 25.)

Results of GAN are also given to compare images generated from VAE and GAN. The main difference (VAE generates smooth but blurry images, whereas GAN generates sharp images with artifacts) is clearly observed in the results.

Conditional generation

Each row has the same noise vector and each column has the same label condition.

(Conditional samples from CVAE and CGAN at epochs 1, 10, and 25.)

Results of CGAN are also given to compare images generated from CVAE and CGAN.

Learned manifold

The following results can be reproduced with command:

python main.py --dataset mnist --gan_type VAE --epoch 25 --batch_size 64 --dim_z 2

Note that the dimension of the noise vector z is 2.
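
Manifold figures of this kind are typically produced by decoding a regular grid over the 2-D latent space. A sketch of that procedure is below, where decode stands for a hypothetical function mapping a batch of z vectors to 28x28 images (not an actual function of this repo):

import numpy as np
from scipy.stats import norm

def manifold_grid(decode, n=20):
    # Walk an n x n grid over the 2-D latent space through the inverse Gaussian CDF,
    # decode each point, and tile the outputs into one large canvas image.
    grid = norm.ppf(np.linspace(0.05, 0.95, n))
    canvas = np.zeros((n * 28, n * 28))
    for i, zy in enumerate(grid):
        for j, zx in enumerate(grid):
            img = decode(np.array([[zx, zy]]))            # assumed shape (1, 28, 28, 1)
            canvas[i * 28:(i + 1) * 28, j * 28:(j + 1) * 28] = img[0, :, :, 0]
    return canvas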

(Learned 2-D manifolds from VAE at epochs 1, 10, and 25.)

Results for fashion-mnist

The comments on network architecture for mnist also apply here.

The following results can be reproduced with command:

python main.py --dataset fashion-mnist --gan_type <TYPE> --epoch 40 --batch_size 64

Random generation

All results are randomly sampled.

(Sample grids for VAE and GAN at epochs 1, 20, and 40.)

Results of GAN are also given to compare images generated from VAE and GAN.

Conditional generation

Each row has the same noise vector and each column has the same label condition.

(Conditional samples from CVAE and CGAN at epochs 1, 20, and 40.)

Results of CGAN are also given to compare images generated from CVAE and CGAN.

Learned manifold

The following results can be reproduced with command:

python main.py --dataset fashion-mnist --gan_type VAE --epoch 25 --batch_size 64 --dim_z 2

Note that the dimension of the noise vector z is 2.

(Learned 2-D manifolds from VAE at epochs 1, 10, and 25.)

Results for celebA

(to be added)

Folder structure

The following shows the basic folder structure.

├── main.py # gateway
├── data
│   ├── mnist # mnist data (not included in this repo)
│   |   ├── t10k-images-idx3-ubyte.gz
│   |   ├── t10k-labels-idx1-ubyte.gz
│   |   ├── train-images-idx3-ubyte.gz
│   |   └── train-labels-idx1-ubyte.gz
│   └── fashion-mnist # fashion-mnist data (not included in this repo)
│       ├── t10k-images-idx3-ubyte.gz
│       ├── t10k-labels-idx1-ubyte.gz
│       ├── train-images-idx3-ubyte.gz
│       └── train-labels-idx1-ubyte.gz
├── GAN.py # vanilla GAN
├── ops.py # some operations on layer
├── utils.py # utils
├── logs # log files for tensorboard to be saved here
└── checkpoint # model files to be saved here
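
The mnist / fashion-mnist files above are in the standard IDX format. A minimal loader along these lines (a sketch, not necessarily identical to the one in utils.py) reads them with plain numpy:

import gzip
import numpy as np

def load_idx_images(path):
    # IDX image files: 16-byte header, then uint8 pixels; reshaped to (N, 28, 28, 1) and scaled to [0, 1].
    with gzip.open(path, "rb") as f:
        data = np.frombuffer(f.read(), dtype=np.uint8, offset=16)
    return data.reshape(-1, 28, 28, 1).astype(np.float32) / 255.0

def load_idx_labels(path):
    # IDX label files: 8-byte header, then one uint8 label per example.
    with gzip.open(path, "rb") as f:
        return np.frombuffer(f.read(), dtype=np.uint8, offset=8)

# e.g. images = load_idx_images("data/mnist/train-images-idx3-ubyte.gz")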

Acknowledgements

This implementation is based on this repository and has been tested with Tensorflow 1.0+ on Windows 10 and Ubuntu 14.04.
