Overview

pytorch-lr-scheduler

PyTorch implementation of some learning rate schedulers for deep learning researchers.

Usage

WarmupReduceLROnPlateauScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.warmup_reduce_lr_on_plateau_scheduler import WarmupReduceLROnPlateauScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000
    warmup_steps = 30000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = WarmupReduceLROnPlateauScheduler(
        optimizer, 
        init_lr=1e-10, 
        peak_lr=1e-4, 
        warmup_steps=warmup_steps,
        patience=1,
        factor=0.3,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...  # forward pass, loss computation, backward pass, optimizer.step()
            ...
            global_step = epoch * steps_in_epoch + timestep
            if global_step < warmup_steps:
                scheduler.step()

        val_loss = validate()  # placeholder for your validation routine
        scheduler.step(val_loss)
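
All of the schedulers in this repository are driven the same way: wrap a torch.optim optimizer and call step() every batch (or step(val_loss) once per epoch for the plateau-based ones). To verify that a schedule behaves as expected, you can read the learning rate back from the optimizer after each step. The snippet below is a minimal sketch using only standard PyTorch; it assumes the scheduler writes the new value into optimizer.param_groups in place, which is the usual PyTorch convention.

import torch

if __name__ == '__main__':
    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    # ... build any of the schedulers shown in this README and call step() here ...

    # all examples in this README use a single parameter group, so index 0 is enough
    current_lr = optimizer.param_groups[0]['lr']
    print(f'current learning rate: {current_lr:.2e}')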

TransformerLRScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.transformer_lr_scheduler import TransformerLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = TransformerLRScheduler(
        optimizer=optimizer, 
        init_lr=1e-10, 
        peak_lr=0.1,
        final_lr=1e-4, 
        final_lr_scale=0.05,
        warmup_steps=3000, 
        decay_steps=17000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            scheduler.step()

TriStageLRScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.tri_stage_lr_scheduler import TriStageLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = TriStageLRScheduler(
        optimizer, 
        init_lr=1e-10, 
        peak_lr=1e-4, 
        final_lr=1e-7, 
        init_lr_scale=0.01, 
        final_lr_scale=0.05,
        warmup_steps=30000, 
        hold_steps=70000, 
        decay_steps=100000,
        total_steps=200000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            scheduler.step()
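
TriStageLRScheduler follows a tri-stage policy of the kind used in speech recognition work such as SpecAugment: a warmup phase up to peak_lr, a hold phase at peak_lr, and a decay phase down towards final_lr. The function below is an illustrative sketch of that shape using only the standard library; it assumes a linear warmup and an exponential decay and ignores init_lr_scale/final_lr_scale, so the library's exact curve may differ.

import math


def tri_stage_lr(step, init_lr=1e-10, peak_lr=1e-4, final_lr=1e-7,
                 warmup_steps=30000, hold_steps=70000, decay_steps=100000):
    """Illustrative tri-stage curve: linear warmup, hold, exponential decay."""
    if step < warmup_steps:
        # stage 1: linear warmup from init_lr to peak_lr
        return init_lr + (peak_lr - init_lr) * step / warmup_steps
    if step < warmup_steps + hold_steps:
        # stage 2: hold at peak_lr
        return peak_lr
    if step < warmup_steps + hold_steps + decay_steps:
        # stage 3: exponential decay from peak_lr towards final_lr
        decay_factor = -math.log(final_lr / peak_lr) / decay_steps
        steps_in_stage = step - (warmup_steps + hold_steps)
        return peak_lr * math.exp(-decay_factor * steps_in_stage)
    # after all three stages: stay at final_lr
    return final_lr


if __name__ == '__main__':
    for step in (0, 15000, 30000, 80000, 150000, 250000):
        print(step, tri_stage_lr(step))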

ReduceLROnPlateauScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.reduce_lr_on_plateau_lr_scheduler import ReduceLROnPlateauScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-4)

    scheduler = ReduceLROnPlateauScheduler(optimizer, patience=1, factor=0.3)

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
        
    val_loss = validate()  # placeholder for your validation routine
        scheduler.step(val_loss)
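
ReduceLROnPlateauScheduler is stepped once per epoch with the validation loss. With patience=1 and factor=0.3, the intended behavior is presumably analogous to PyTorch's built-in ReduceLROnPlateau: each time the validation loss fails to improve for one epoch, the learning rate is multiplied by 0.3. A purely illustrative check of that arithmetic:

# illustrative only: assumes each reduction multiplies the lr by `factor`
lr, factor = 1e-4, 0.3
for num_reductions in range(4):
    print(num_reductions, lr * factor ** num_reductions)
# roughly 1e-4, 3e-5, 9e-6, 2.7e-6 (up to float rounding)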

WarmupLRScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.warmup_lr_scheduler import WarmupLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = WarmupLRScheduler(
        optimizer, 
        init_lr=1e-10, 
        peak_lr=1e-4, 
        warmup_steps=4000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            scheduler.step()

Troubleshooting and Contributing

If you have any questions, bug reports, or feature requests, please open an issue on GitHub.

I appreciate any kind of feedback or contribution. Feel free to go ahead with small issues such as bug fixes or documentation improvements. For major contributions and new features, please discuss them with the collaborators in the corresponding issues first.

Code Style

I follow PEP 8 for code style. The docstring style is especially important, since the documentation is generated from the docstrings.

License

This project is licensed under the MIT License. See the LICENSE.md file for details.

Owner
Soohwan Kim
Toward human-like A.I.