Overview

pytorch-lr-scheduler

PyTorch implementations of some learning rate schedulers for deep learning researchers.

Usage

WarmupReduceLROnPlateauScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.warmup_reduce_lr_on_plateau_scheduler import WarmupReduceLROnPlateauScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000
    warmup_steps = 30000  # warm up over the first 30,000 training steps

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = WarmupReduceLROnPlateauScheduler(
        optimizer, 
        init_lr=1e-10, 
        peak_lr=1e-4, 
        warmup_steps=warmup_steps,
        patience=1,
        factor=0.3,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...  # forward pass, loss.backward(), optimizer.step()
            step = epoch * steps_in_epoch + timestep
            if step < warmup_steps:  # advance the warmup schedule once per training step
                scheduler.step()

        val_loss = validate()  # user-defined routine returning the validation loss
        scheduler.step(val_loss)
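
Note that this scheduler is stepped in two ways in the example above: calling scheduler.step() with no argument once per training step drives the warmup phase, while calling scheduler.step(val_loss) once per epoch drives the reduce-on-plateau phase from the validation loss.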

TransformerLRScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.transformer_lr_scheduler import TransformerLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = TransformerLRScheduler(
        optimizer=optimizer, 
        init_lr=1e-10, 
        peak_lr=0.1,
        final_lr=1e-4, 
        final_lr_scale=0.05,
        warmup_steps=3000, 
        decay_steps=17000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...  # forward pass, loss.backward(), optimizer.step()
            scheduler.step()
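
Judging from the constructor arguments, the learning rate is increased from init_lr to peak_lr over the first warmup_steps updates and then decayed toward final_lr over the following decay_steps updates, so scheduler.step() is called once per training step rather than once per epoch.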

TriStageLRScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.tri_stage_lr_scheduler import TriStageLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = TriStageLRScheduler(
        optimizer, 
        init_lr=1e-10, 
        peak_lr=1e-4, 
        final_lr=1e-7, 
        init_lr_scale=0.01, 
        final_lr_scale=0.05,
        warmup_steps=30000, 
        hold_steps=70000, 
        decay_steps=100000,
        total_steps=200000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...  # forward pass, loss.backward(), optimizer.step()
            scheduler.step()
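
In this configuration the three stages cover the whole run: the 30,000 warmup steps, 70,000 hold steps, and 100,000 decay steps add up to the 200,000 total_steps.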

ReduceLROnPlateauScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.reduce_lr_on_plateau_lr_scheduler import ReduceLROnPlateauScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-4)

    scheduler = ReduceLROnPlateauScheduler(optimizer, patience=1, factor=0.3)

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...  # forward pass, loss.backward(), optimizer.step()

        val_loss = validate()  # user-defined routine returning the validation loss
        scheduler.step(val_loss)
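
The parameter names suggest the usual reduce-on-plateau behaviour: scheduler.step(val_loss) is called once per epoch, and when the validation loss has stopped improving for patience epochs the learning rate is multiplied by factor (here, 0.3 after one stagnant epoch). If the exact criterion matters for your setup, check lr_scheduler/reduce_lr_on_plateau_lr_scheduler.py.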

WarmupLRScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.warmup_lr_scheduler import WarmupLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = WarmupLRScheduler(
        optimizer, 
        init_lr=1e-10, 
        peak_lr=1e-4, 
        warmup_steps=4000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...  # forward pass, loss.backward(), optimizer.step()
            scheduler.step()
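
To reproduce the curves referenced by the Visualize bullets above, one option is to read the learning rate back from the optimizer after every update. The sketch below continues the example above and is not part of this library's API: it assumes the scheduler writes the new value into the standard optimizer.param_groups, and the lrs list is just an illustrative container for plotting.

    lrs = []
    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...  # forward pass, loss.backward(), optimizer.step()
            scheduler.step()
            # read back the value the scheduler just assigned to the optimizer
            lrs.append(optimizer.param_groups[0]['lr'])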

Troubleshooting and Contributing

If you have any questions, bug reports, or feature requests, please open an issue on GitHub.

I appreciate any kind of feedback or contribution. Feel free to proceed with small issues like bug fixes or documentation improvements. For major contributions and new features, please discuss them with the collaborators in the corresponding issues.

Code Style

I follow PEP 8 for code style. The docstring style is especially important, because the docstrings are used to generate the documentation.
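
As an illustration only (this class and its docstring are hypothetical, and the exact convention used in the repository may differ), a Google-style docstring for a scheduler could look like this:

class ExampleScheduler:
    """
    One-line summary of what the scheduler does to the learning rate.

    Args:
        optimizer (torch.optim.Optimizer): wrapped optimizer whose learning rate is updated.
        init_lr (float): learning rate used at the first step.
        peak_lr (float): maximum learning rate reached after warmup.
        warmup_steps (int): number of steps over which the learning rate is increased.
    """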

License

This project is licensed under the MIT License - see the LICENSE.md file for details.

Owner

Soohwan Kim