Born-Infeld (BI) for AI: Energy-Conserving Descent (ECD) for Optimization


This repository contains the code for the BBI optimizer, introduced in the paper Born-Infeld (BI) for AI: Energy-Conserving Descent (ECD) for Optimization (arXiv:2201.11137). It is implemented in PyTorch.

The repository also includes the code needed to reproduce all the experiments presented in the paper. In particular:

  • The BBI optimizer is implemented in the file inflation.py (a minimal usage sketch is included after this list).

  • The Jupyter notebooks with the synthetic experiments are in the folder synthetic. All the notebooks already include their output, and text files with results are also included in the folder. In particular:

    • The notebook ackley.ipynb can be used to reproduce the results in Sec. 4.1.
    • The notebook zakharov.ipynb can be used to reproduce the results in Sec. 4.2.
    • The notebook multi_basin.ipynb can be used to reproduce the results in Sec. 4.3.
  • The ML benchmarks described in Sec. 4.5 can be found in the folders CIFAR and MNIST. The notebooks already include some results that can be inspected, but not all the statistics that build up the results in Table 2. In particular:

    • CIFAR: The notebook CIFAR-notebook.ipynb uses hyperopt to estimate the best hyperparameters for each optimizer and then performs a long run with the best estimated hyperparameters (an illustrative sketch of this hyperopt pattern is included after this list). The results can be analyzed with the notebook analysis-cifar.ipynb, which can also be used to generate more runs with the best hyperparameters to gather more statistics. The subfolder results already includes some runs that can be inspected.

    • MNIST: The notebooks mnist_scan_BBI.ipynb and mnist_scan_SGD.ipynb perform a grid scan using BBI and SGD, respectively, and gather some statistics. All the results are contained within the notebooks themselves.

  • The PDE experiments can be run by executing the script script-PDE.sh as

    bash script-PDE.sh
    

    This will solve the PDE outlined in Sec. 4.4 and App. C multiple times with the same initialization. The hyperparameters are also kept fixed and can be obtained from the script itself. In particular:

    • feature 1 means that an L2 regularization term is added to the loss.
    • seed specifies the random seed, which fixes the initialization of the network. Differences between runs are then due only to the random bounces, which are not affected by this choice of seed.

    The folder results already includes some runs. The runs performed in this way are not noisy, i.e., the set of points sampled from the domain is kept fixed. To randomly change the points every "epoch" (1000 iterations), edit the file experiments/PDE_PoissonD.py so that line 134 reads self.update_points = True.
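
Below is the minimal usage sketch of the optimizer mentioned above. It assumes that inflation.py exposes a class named BBI following the standard torch.optim.Optimizer interface; the hyperparameters shown are placeholders rather than the repository's actual argument names and defaults, which should be taken from inflation.py and the paper.

    # Minimal usage sketch. Assumptions: `inflation.py` exposes a `BBI` class
    # with the standard torch.optim.Optimizer interface; the hyperparameters
    # below are placeholders, not the repository's actual names or defaults.
    import torch
    import torch.nn as nn

    from inflation import BBI  # assumed import path

    model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
    criterion = nn.MSELoss()
    optimizer = BBI(model.parameters(), lr=1e-3)  # placeholder hyperparameters

    x, y = torch.randn(64, 10), torch.randn(64, 1)
    for step in range(100):
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()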
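
The hyperopt search pattern referenced above is sketched here on a generic objective. The search space, metric, and evaluation budget are illustrative placeholders, not the values used in CIFAR-notebook.ipynb.

    # Illustrative hyperopt search. The objective, search space, and budget
    # are placeholders; they are not taken from CIFAR-notebook.ipynb.
    from hyperopt import STATUS_OK, Trials, fmin, hp, tpe

    def objective(params):
        # In the real notebook this would train the model with `params` and
        # return a validation metric; a dummy quadratic stands in here.
        val_loss = (params["lr"] - 1e-2) ** 2
        return {"loss": val_loss, "status": STATUS_OK}

    space = {"lr": hp.loguniform("lr", -10, 0)}  # lr roughly in [4.5e-5, 1.0]

    trials = Trials()
    best = fmin(fn=objective, space=space, algo=tpe.suggest,
                max_evals=50, trials=trials)
    print("best hyperparameters:", best)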

The code has been tested with Python 3.9, PyTorch 1.10, and hyperopt 0.2.5. We ran the synthetic experiments and MNIST on a six-core i7-9850H CPU with 16 GB of RAM, while we ran the CIFAR and PDE experiments on a pair of GPUs. We tested on both a pair of NVIDIA GeForce RTX 2080 Ti GPUs and a pair of NVIDIA Tesla V100-SXM2-16GB GPUs, each coupled with 32 GB of RAM and AMD EPYC 7502P CPUs.

The ResNet-18 code (in experiments/models) and the utils.py helper functions are adapted from https://github.com/kuangliu/pytorch-cifar (MIT License).

Owner: G. Bruno De Luca