UniNAS

A highly modular PyTorch framework with a focus on Neural Architecture Search (NAS).

Under development

(development happens mostly on our internal GitLab; we only push to GitHub every once in a while)

  • APIs may change
  • argparse arguments may be moved to more fitting classes
  • there may be incomplete or not-yet-working pieces of code
  • ...

Features

  • modular and therefore reusable
    • data set loading,
    • network building code and topologies,
    • methods to train architecture weights,
    • sets of operations (primitives),
    • weight initializers,
    • metrics,
    • ... and more
  • everything is configurable from the command line and/or config files
    • improved reproducibility, since detailed run configurations are saved and logged
    • powerful search network descriptions enable e.g. highly customizable weight sharing settings
    • the underlying argparse mechanism enables using a GUI for configurations
  • compare results of different methods in the same environment
  • import and export detailed network descriptions
  • integrate new methods and more with fairly little effort
  • NAS-Benchmark integration
    • NAS-Bench-201
  • ... and more

Where is this code from?

Except for a few pieces, the code is self-written. However, the official code of other projects is sometimes useful to learn from or to clear up details, and other frameworks can be used for their nice features.

Other meta-NAS frameworks

  • Deep Architect
    • highly customizable search spaces, hyperparameters, ...
    • the searchers (SMBO, MCTS, ...) focus on fully training (many) models and are not differentiable
  • D-X-Y NAS-Projects
  • Auto-PyTorch
    • stronger focus on model selection than on optimizing a single architecture
  • Vega
  • NNI

Repository notes

Dynamic argparse tree

Everything is an argument. Learning rate? Argument. Scheduler? Argument. The exact topology of a network, including how many cells of each type it has and whether they share their architecture weights? Also arguments.

This is enabled by the idea that every class in use (method, network, cells, regularizers, ...) can add arguments to argparse, including which further classes it requires (e.g. a method needs a network, which needs a stem).

It starts with the Main class adding a Task (cls_task), which itself adds all required components (cls_*).
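
A minimal sketch of the idea (purely illustrative; the method and attribute names below are hypothetical, not UniNAS's actual API):

import argparse

# purely illustrative sketch; add_arguments/required are hypothetical names
class SimpleTrainer:
    @classmethod
    def add_arguments(cls, parser: argparse.ArgumentParser):
        # each class contributes its own arguments, prefixed with its class name
        parser.add_argument("--SimpleTrainer.max_epochs", type=int, default=100)
        parser.add_argument("--SimpleTrainer.test_last", type=int, default=10)

class SomeTask:
    # a task declares which further components it requires; each of those
    # classes adds its own arguments (and possibly further requirements) in turn
    required = ["cls_trainer", "cls_data", "cls_method"]

parser = argparse.ArgumentParser()
SimpleTrainer.add_arguments(parser)
args = parser.parse_args(["--SimpleTrainer.max_epochs", "50"])
print(getattr(args, "SimpleTrainer.max_epochs"))  # 50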

To see all available (meta) arguments, run Main.list_all_arguments() in uninas/main.py
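
For example (assuming the uninas package is importable):

from uninas.main import Main

# print all available (meta) arguments of all registered classes
Main.list_all_arguments()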

Graphical user interface

Since putting together the arguments correctly is not trivial (and requires some familiarity with the code base), an easier approach is using a GUI.

Have a look at uninas/gui/tk_gui/main.py, a tkinter GUI frontend.

The GUI can automatically filter usable classes, display available arguments, and display tooltips, based solely on the argparse (meta) arguments implemented in the respective classes.

Some meta arguments take a single class name:

e.g. cls_task, cls_trainer, cls_data, cls_criterion, cls_method

The chosen classes define their own arguments, e.g.:

  • cls_trainer="SimpleTrainer"
  • SimpleTrainer.max_epochs=100
  • SimpleTrainer.test_last=10

Their names are also available as wildcards, which automatically resolve to the class name that was set for them:

  • cls_trainer="SimpleTrainer"
  • {cls_trainer}.max_epochs --> SimpleTrainer.max_epochs
  • {cls_trainer}.test_last --> SimpleTrainer.test_last
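
A minimal sketch of how this wildcard resolution could work (illustrative only, not UniNAS's actual implementation):

def resolve_wildcards(arguments: dict) -> dict:
    # replace each "{cls_xyz}" in the argument names with the class name
    # that was set for the meta argument "cls_xyz"
    resolved = {}
    for key, value in arguments.items():
        for meta, cls_name in arguments.items():
            if meta.startswith("cls_"):
                key = key.replace("{%s}" % meta, cls_name)
        resolved[key] = value
    return resolved

args = {
    "cls_trainer": "SimpleTrainer",
    "{cls_trainer}.max_epochs": "100",
    "{cls_trainer}.test_last": "10",
}
print(resolve_wildcards(args))
# {'cls_trainer': 'SimpleTrainer', 'SimpleTrainer.max_epochs': '100', 'SimpleTrainer.test_last': '10'}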

Some meta arguments take a comma-separated list of class names:

e.g. cls_metrics, cls_initializers, cls_regularizers, cls_optimizers, cls_schedulers

The chosen classes define their own arguments as well, but their names always include an index, e.g.:

  • cls_regularizers="DropOutRegularizer, DropPathRegularizer"
  • DropOutRegularizer#0.prob=0.5
  • DropPathRegularizer#1.max_prob=0.3
  • DropPathRegularizer#1.drop_id_paths=false

And they are also available as indexed wildcards:

  • cls_regularizers="DropOutRegularizer, DropPathRegularizer"
  • {cls_regularizers#0}.prob --> DropOutRegularizer#0.prob
  • {cls_regularizers#1}.max_prob --> DropPathRegularizer#1.max_prob
  • {cls_regularizers#1}.drop_id_paths --> DropPathRegularizer#1.drop_id_paths
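
The same sketch extended to indexed wildcards (again illustrative only, not UniNAS's actual implementation):

import re

def resolve_indexed_wildcards(arguments: dict) -> dict:
    # replace "{cls_xyz#i}" with the i-th class name in the comma-separated
    # list that was set for the meta argument "cls_xyz", keeping the index
    pattern = re.compile(r"\{(cls_\w+)#(\d+)\}")

    def substitute(match):
        meta, idx = match.group(1), int(match.group(2))
        cls_names = [s.strip() for s in arguments[meta].split(",")]
        return "%s#%d" % (cls_names[idx], idx)

    return {pattern.sub(substitute, key): value for key, value in arguments.items()}

args = {
    "cls_regularizers": "DropOutRegularizer, DropPathRegularizer",
    "{cls_regularizers#0}.prob": "0.5",
    "{cls_regularizers#1}.max_prob": "0.3",
}
print(resolve_indexed_wildcards(args))
# {'cls_regularizers': 'DropOutRegularizer, DropPathRegularizer',
#  'DropOutRegularizer#0.prob': '0.5', 'DropPathRegularizer#1.max_prob': '0.3'}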

Register

UniNAS makes heavy use of a registering mechanism (via decorators in uninas/register.py). Classes of the same type (e.g. optimizers, networks, ...) register themselves in a shared RegisterDict.

Registered classes can be accessed via their name in the Register, regardless of their actual location in the code. This enables e.g. saving network topologies as nested dictionaries, no matter how complicated they are, since the class names suffice to locate the classes in the code. (It also grants a certain amount of freedom when refactoring.)
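
A minimal sketch of such a decorator-based register (illustrative; the actual implementation in uninas/register.py is more elaborate):

class RegisterDict(dict):
    # used as a decorator: stores each class under its own name
    def __call__(self, cls):
        self[cls.__name__] = cls
        return cls

register_optimizers = RegisterDict()

@register_optimizers
class SGDOptimizer:
    pass

# classes can later be looked up by name alone, e.g. when a network
# topology is rebuilt from a saved nested dictionary
assert register_optimizers["SGDOptimizer"] is SGDOptimizer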

Exporting networks

(Trained) networks can easily be used by other PyTorch frameworks/scripts; see verify.py for a simple example.

Citation

The framework

We may create a whitepaper at some point.

@misc{kl2020uninas,
  author = {Kevin Alexander Laube},
  title = {UniNAS},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/cogsys-tuebingen/uninas}}
}

Inter-choice dependent super-network weights

  1. Train super-networks, e.g. via experiments/demo/inter_choice_weights/icw1_train_supernet_nats.py
    • you will need CIFAR-10, but you can also easily use fake data or download it
    • to generate SubImageNet, see uninas/utils/generate/data/subImageNet
  2. Evaluate the super-network, e.g. via experiments/demo/inter_choice_weights/icw2_eval_supernet.py
  3. View the evaluation results in the save dir, in TensorBoard, or plotted directly

@article{laube2021interchoice,
  title={Inter-choice dependent super-network weights},
  author={Kevin Alexander Laube and Andreas Zell},
  journal={arXiv preprint arXiv:2104.11522},
  year={2021}
}