PyTorch Performance Tuning, WandB, AMP, Multi-GPU, TensorRT, Triton

Overview

Plant Pathology 2020 FGVC7

Introduction

A deep learning model pipeline for training, experimentation, and deployment for the Kaggle competition Plant Pathology 2020, utilising:

  • PyTorch: a deep learning framework for high-performance AI research
  • Weights and Biases: a tool for experiment tracking, dataset versioning, and model management
  • Apex: a library to accelerate deep learning training with AMP, fused optimizers, and multi-GPU support
  • TensorRT: a high-performance neural network inference optimizer and runtime engine for production deployment
  • Triton Inference Server: inference serving software that simplifies the deployment of AI models at scale
  • Streamlit: a framework to quickly build highly interactive web applications for machine learning models

For a quick tutorial on each of these modules, check out the tutorials folder. Exploratory data analysis for the competition data can be found in the notebooks folder.

Structure

├── app                 # Interactive Streamlit app scripts
├── data                # Datasets
├── examples            # Assignments on PyTorch AMP and DDP
├── model               # Directory to save models for triton
├── notebooks           # EDA, Training, Model conversion, Inferencing and other utility notebooks
├── tutorials           # Tutorials on the modules used
└── requirements.txt    # Basic requirements

Usage

EDA: Data Evaluation

Data can be explored with the various visualization techniques provided in eda.ipynb in the notebooks folder.

Training the model

To train the PyTorch ResNet-50 model, use pytorch_train.ipynb.

The training code is inspired by the PyTorch Performance Tuning Guide; a minimal mixed-precision training step in that spirit is sketched below.
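
The sketch below illustrates the kind of mixed-precision training step the guide recommends, using PyTorch's native torch.cuda.amp; the repository itself uses Apex, and the model, optimizer, and class count here are assumptions rather than the exact training code.

import torch
from torch import nn
from torchvision import models

# Assumed setup: ResNet-50 with 4 output classes (Plant Pathology 2020 labels).
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = models.resnet50(num_classes=4).to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)
scaler = torch.cuda.amp.GradScaler(enabled=device.type == "cuda")

def train_step(images, labels):
    images = images.to(device, non_blocking=True)
    labels = labels.to(device, non_blocking=True)
    optimizer.zero_grad(set_to_none=True)              # tuning-guide tip: cheaper than zeroing grads
    with torch.cuda.amp.autocast(enabled=device.type == "cuda"):
        loss = criterion(model(images), labels)
    scaler.scale(loss).backward()                      # scale the loss to avoid FP16 underflow
    scaler.step(optimizer)
    scaler.update()
    return loss.item()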

Once the model is trained, you can also run model explainability using the shap library. A tutorial notebook for this can be found in the notebooks folder; a brief sketch is shown below.
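
A hedged sketch of such an explanation using shap.GradientExplainer; the background and test batches here are random placeholders, not the actual competition images.

import shap
import torch

# Placeholder background and test batches; in the notebook these would be real
# (preprocessed) leaf images from the dataset.
model.eval()
background = torch.randn(8, 3, 224, 224, device=device)
test_images = torch.randn(2, 3, 224, 224, device=device)

explainer = shap.GradientExplainer(model, background)
shap_values = explainer.shap_values(test_images)   # pixel-level attributions per output class
# The attributions can then be visualised with shap.image_plot (channels moved to the last axis).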

Model Conversion and Inferencing

Once you've trained the model, you will need to convert it to different formats for faster inference and easier deployment. You can convert the model to ONNX, TensorRT FP32, and TensorRT FP16 formats, which are optimised for faster inference, as well as to TorchScript. The procedure for converting and benchmarking all of these formats can be found in the notebooks folder; a minimal export sketch is shown below.
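
A minimal export sketch, assuming a 3x224x224 input and the output paths shown; the TensorRT FP32/FP16 engines are then typically built from the exported ONNX file (e.g. with trtexec or the TensorRT Python API), as detailed in the conversion notebooks.

import torch

model.eval()
dummy_input = torch.randn(1, 3, 224, 224, device=device)   # assumed input shape

# TorchScript via tracing (for the Triton PyTorch backend)
traced = torch.jit.trace(model, dummy_input)
traced.save("model/resnet50_torchscript.pt")

# ONNX export with a dynamic batch dimension (starting point for TensorRT engines)
torch.onnx.export(
    model,
    dummy_input,
    "model/resnet50.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
    opset_version=13,
)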

Model Deployment and Benchmarking

Now your models are ready to be deployed. For deployment, we utilise the Triton Inference Server. It provides an inferencing solution that lets deep learning models be easily deployed and integrated with various functionalities. It supports the HTTP and gRPC protocols, which allow clients to request inference from any model managed by the server. The deployment process is described in Triton Inference Server.md; a minimal Python client call is sketched below.
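
As an illustration, a client request over HTTP might look like the sketch below; the model name and the input/output tensor names are assumptions that must match the config.pbtxt of the deployed model.

import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Assumed: a preprocessed 1x3x224x224 FP32 image batch and tensor names "input"/"output".
image = np.random.rand(1, 3, 224, 224).astype(np.float32)
infer_input = httpclient.InferInput("input", list(image.shape), "FP32")
infer_input.set_data_from_numpy(image)

response = client.infer(
    model_name="resnet50_onnx",
    inputs=[infer_input],
    outputs=[httpclient.InferRequestedOutput("output")],
)
logits = response.as_numpy("output")
print(logits.argmax(axis=1))       # predicted class per image in the batch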

Once your inference server is up and running, the next step is to understand and optimise model performance. For this, you can use tools like perf_analyzer, which help you measure changes in performance as you experiment with different parameters.

Interactive Web App

To run the Streamlit app:

cd app/
streamlit run app.py

This will create a local server on which you can view the web application. The app contains the client side for the Triton Inference Server, along with an easy-to-use GUI.
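
A rough sketch of what such a front end looks like; query_triton is a hypothetical helper wrapping the Triton client call shown earlier, not a function from this repo.

import streamlit as st
from PIL import Image

st.title("Plant Pathology 2020 - Leaf Disease Classifier")

uploaded = st.file_uploader("Upload a leaf image", type=["jpg", "jpeg", "png"])
if uploaded is not None:
    image = Image.open(uploaded).convert("RGB")
    st.image(image, caption="Input leaf")
    # prediction = query_triton(image)   # hypothetical helper: preprocess and call Triton
    # st.write(f"Predicted class: {prediction}")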

Acknowledgement

This repository is built with references and code snippets from the NN Template by Luca Moschella.

Owner
Bharat Giddwani
B.Tech graduate || Deep learning / machine learning enthusiast. A passionate, avid learner.