Latest Python Libraries
2 Repositories
MADGRAD Optimization Method
A Momentumized, Adaptive, Dual Averaged Gradient Method for Stochastic Optimization. Install with pip install madgrad and try it out! A best-of-both-worlds optimizer with the generalization performance of SGD and at least as fast convergence as that of Adam, often faster in practice.
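A minimal usage sketch, assuming the madgrad package exposes a MADGRAD class that acts as a drop-in replacement for a torch.optim optimizer; the toy model, data, and hyperparameters below are illustrative placeholders rather than recommendations.

```python
import torch
from madgrad import MADGRAD  # assumes: pip install madgrad

# Toy model and data purely for illustration.
model = torch.nn.Linear(10, 1)
x = torch.randn(32, 10)
y = torch.randn(32, 1)

# MADGRAD is used like any other PyTorch optimizer.
optimizer = MADGRAD(model.parameters(), lr=1e-2, momentum=0.9, weight_decay=0.0)

for _ in range(100):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
```

Because it follows the standard optimizer interface, swapping it in typically only requires changing the line that constructs the optimizer.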
Bunch of optimizer implementations in PyTorch