Code for CyGen, the novel generative modeling framework proposed in "On the Generative Utility of Cyclic Conditionals" (NeurIPS 2021)

Overview

On the Generative Utility of Cyclic Conditionals

This repository is the official implementation of "On the Generative Utility of Cyclic Conditionals" (NeurIPS 2021).

Chang Liu <[email protected]>, Haoyue Tang, Tao Qin, Jintao Wang, Tie-Yan Liu.
[Paper & Appendix] [Slides] [Video] [Poster]

Introduction

[Figure: graphical summary of CyGen]

Can two conditional models p(x|z) and q(z|x) that form a cycle uniquely determine a joint distribution p(x,z)? We develop a general theory to answer this question, including criteria for the two conditionals to correspond to a common joint (compatibility) and for that joint to be unique (determinacy). Since a generative model needs a generator (decoder / likelihood model) and also an encoder (inference model) for representation, the theory indicates that these two conditionals could already define a generative model p(x,z) without specifying a prior distribution p(z)! We call this novel generative modeling framework CyGen, and develop methods to establish its eligibility (compatibility and determinacy) and its usage (fitting and generating data) as a generative model.
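For intuition, a classical pointwise criterion from the conditional-specification literature (e.g., Arnold & Press, 1989) characterizes when such a pair admits a common joint; we state it here only for illustration, as the paper develops more general and practical criteria. The pair is compatible iff the conditional ratio factorizes,

    \[ \frac{p(x \mid z)}{q(z \mid x)} \;=\; u(x)\, v(z), \qquad \int u(x)\, \mathrm{d}x < \infty, \]

in which case a common joint is recovered, up to normalization, as

    \[ p(x, z) \;\propto\; u(x)\, q(z \mid x) \;=\; \frac{p(x \mid z)}{v(z)}, \]

so that u plays the role of the data marginal and 1/v that of the latent marginal.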

This codebase implements the CyGen methods as well as various baseline methods. The model architectures are based on the Sylvester flow (Householder version), and the experiment environments and setups follow FFJORD. Authorship is clarified in each file.

Requirements

The code requires Python >= 3.6 and is based on PyTorch. To install the requirements:

pip install -r requirements.txt

Usage

Run the run_toy.sh and run_image.sh scripts for the synthetic and real-world (i.e., MNIST and SVHN) experiments. See the commands in the script files, or run python3 main_[toy|image].py --help, for customized usage and hyperparameter tuning.

For the real-world experiments, downstream classification accuracy is evaluated during training. To evaluate the FID score, run python3 compute_gen_fid.py --load_dict=<path_to_model.pth>.
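For reference, downstream evaluation of this kind typically fits a simple classifier on the inferred latent codes. The following is a minimal hypothetical sketch, not the repo's implementation; the encode callable (e.g., the mean of the inference model q(z|x)) and the data arrays are assumptions:

    from sklearn.linear_model import LogisticRegression

    def latent_accuracy(encode, X_train, y_train, X_test, y_test):
        # encode: callable mapping data arrays to latent codes,
        # e.g. the mean of the inference model q(z|x).
        clf = LogisticRegression(max_iter=1000)
        clf.fit(encode(X_train), y_train)
        return clf.score(encode(X_test), y_test)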

Results

[Figure: CyGen synthetic results]

As a preview, we show the synthetic results here. CyGen achieves both high-quality data generation and well-separated latent clusters (i.e., a useful representation). This is because removing the need for a specified prior distribution avoids the manifold-mismatch and posterior-collapse problems. DAE (denoising auto-encoder) also does not need a prior, but its training method hurts determinacy. When pretrained as a VAE (i.e., CyGen(PT)), CyGen encodes the knowledge of a centered, centrosymmetric prior through the conditional models. See the paper for more results.
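To make the prior-free idea concrete, below is a minimal illustrative sketch (ours, not the repo's code, and not CyGen's actual generation method; see the paper for that). With a compatible Gaussian pair, alternately sampling the two conditionals is a Gibbs chain whose stationary distribution is the unique joint, so data can be generated from the conditionals alone, with no prior p(z) ever specified:

    import numpy as np

    rng = np.random.default_rng(0)
    rho = 0.8  # correlation of the implicit joint (a toy assumption)

    def sample_x_given_z(z):  # p(x|z) = N(rho*z, 1 - rho^2)
        return rho * z + np.sqrt(1 - rho**2) * rng.standard_normal()

    def sample_z_given_x(x):  # q(z|x) = N(rho*x, 1 - rho^2)
        return rho * x + np.sqrt(1 - rho**2) * rng.standard_normal()

    # Gibbs chain: the stationary distribution is the unique compatible
    # joint, here a standard bivariate Gaussian with correlation rho.
    x, z = 0.0, 0.0
    samples = []
    for t in range(5000):
        x = sample_x_given_z(z)
        z = sample_z_given_x(x)
        if t >= 1000:  # discard burn-in
            samples.append((x, z))

    samples = np.array(samples)
    print("empirical corr:", np.corrcoef(samples.T)[0, 1])  # close to rho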
