


Implementation of the LAMB optimizer (https://arxiv.org/abs/1904.00962) for large-batch, large-learning-rate training.

The paper doesn't specify clamp values for φ, so I use 10.
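Where the clamp enters the update can be pictured with a simplified single-tensor sketch of one LAMB step (bias correction omitted; `lamb_step` and its defaults are illustrative, not this repo's API):

```python
import numpy as np

def lamb_step(w, grad, m, v, lr=0.02, beta1=0.9, beta2=0.999,
              eps=1e-6, wd=0.01, clamp=10.0):
    # Adam-style moment estimates (bias correction omitted for brevity).
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Adam direction plus decoupled weight decay, as in the LAMB paper.
    update = m / (np.sqrt(v) + eps) + wd * w
    # phi(||w||): the weight-norm scaling function, clamped at 10 as above.
    w_norm = min(np.linalg.norm(w), clamp)
    u_norm = np.linalg.norm(update)
    # Layer-wise trust ratio; fall back to 1 when either norm is zero.
    trust_ratio = w_norm / u_norm if w_norm > 0 and u_norm > 0 else 1.0
    return w - lr * trust_ratio * update, m, v
```

The trust ratio rescales each layer's step by ‖w‖/‖update‖, which is what lets LAMB tolerate the large learning rates shown in the sample runs below; the clamp keeps that ratio from blowing up for layers with very large weight norms.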

Bonus: TensorboardX logging (example below).

Try the sample

git clone git@github.com:cybertronai/pytorch-lamb.git
cd pytorch-lamb
pip install -e .
python test_lamb.py
tensorboard --logdir=runs

Sample results

At --lr=.02, the Adam optimizer is unable to train.

Red: python test_lamb.py --batch-size=512 --lr=.02 --wd=.01 --log-interval=30 --optimizer=adam

Blue: python test_lamb.py --batch-size=512 --lr=.02 --wd=.01 --log-interval=30 --optimizer=lamb