Imitating Deep Learning Dynamics via Locally Elastic Stochastic Differential Equations

Overview

This repo contains the official code for the NeurIPS 2021 paper Imitating Deep Learning Dynamics via Locally Elastic Stochastic Differential Equations by Jiayao Zhang, Hua Wang, and Weijie J. Su.

Discussions are welcome; please submit them via Discussions. You can also read the reviews on OpenReview.

@misc{zhang2021imitating,
      title={Imitating Deep Learning Dynamics via Locally Elastic Stochastic Differential Equations}, 
      author={Jiayao Zhang and Hua Wang and Weijie J. Su},
      year={2021},
      eprint={2110.05960},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Reproducing Experiments

Dependencies

We use Python 3.8 and PyTorch to train the neural nets. Please run pip install -r requirements.txt (ideally inside a virtual environment) to install the dependencies.

Datasets

We use GeoMNIST, a dataset of geometric shapes that we constructed, as well as CIFAR-10. GeoMNIST is lightweight and is generated on the fly when the simulation runs; CIFAR-10 is downloaded automatically via torchvision.
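
For reference, the snippet below is a minimal sketch of how CIFAR-10 is typically fetched through torchvision; it is not the repo's own loader (src/datasets.py handles dataset loading), and the root path and transform are placeholder choices.

```python
# Minimal sketch: fetching CIFAR-10 via torchvision.
# The repo's src/datasets.py wraps dataset loading itself; the root path
# and transform below are placeholder assumptions.
import torchvision
import torchvision.transforms as T

train_set = torchvision.datasets.CIFAR10(
    root="./data",            # download location (placeholder)
    train=True,
    download=True,            # fetch the archive if not already present
    transform=T.ToTensor(),   # images as [0, 1] float tensors
)
print(len(train_set))  # 50000 training images
```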

Code Structure

After installing the dependencies, one may work through the two Jupyter notebooks to run the experiments and reproduce the plots and figures. Below we outline the code structure.

.
├── LICENSE                         # code license
├── README.md                       # this file
├── LE-SDE Data Analysis.ipynb      # reproducing plots and figures
├── LE-SDE Experiments.ipynb        # reproducing experiments
├── src                             # source code
│   ├── data_analyzer.py            # processing experiment data
│   ├── datasets.py                 # generating and loading datasets
│   ├── models.py                   # definition of neural net models
│   ├── plotter.py                  # generating plots and figures
│   └── utils.py                    # utilities, including training pipelines
└── exp_data                        # experiment data
    ├── *.csv                       # dataframes from neural net training
    └── *.npy                       # numpy.ndarray storing LE-ODE simulations

More information on the .npy file format can be found in the NumPy documentation.
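
To get a feel for the experiment data, the sketch below shows one way to inspect these files; the file names are placeholders rather than files shipped with the repo.

```python
# Minimal sketch: inspecting the experiment data.
# The file names below are placeholders; substitute actual files from ./exp_data.
import numpy as np
import pandas as pd

ode_sim = np.load("exp_data/example_le_ode.npy")     # LE-ODE simulation array
train_df = pd.read_csv("exp_data/example_run.csv")   # training records from a run
print(ode_sim.shape)
print(train_df.head())
```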

Reproducing Figures

Experiment Data

Although all simulations can be run on your own machine, doing so is quite time-consuming. Data from our experiments can be downloaded from the following anonymous Dropbox links:

After downloading those tarballs, extract them into ./exp_data (or change the EXP_DIR variable in the notebooks accordingly).
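
If you prefer to unpack the archives programmatically, the sketch below uses Python's tarfile module; the archive name is a placeholder for whichever tarball you downloaded.

```python
# Minimal sketch: unpacking a downloaded tarball into ./exp_data.
# "le_sde_data.tar.gz" is a placeholder for the actual archive name.
import tarfile

with tarfile.open("le_sde_data.tar.gz", "r:gz") as tar:
    tar.extractall(path="./exp_data")
```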

Plotter

Once the experiment data are ready, simply follow LE-SDE Data Analysis.ipynb to reproduce all figures.
