mechanic-pytorch

black box tuning of optimizers

Installation

To install this package, run:

Conda
$ conda install flvincen::mechanic-pytorch

Version

0.0.1

Description

Based on the paper: https://arxiv.org/abs/2306.00144. Note that all experiments reported in the paper were run with the JAX version of Mechanic, which is available in Optax as optax.contrib.mechanize.

Mechanic aims to remove the need for tuning a learning-rate scalar (i.e. the maximum learning rate in a schedule). It works with any PyTorch optimizer and schedule. Simply replace:

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

with:

from mechanic_pytorch import mechanize
optimizer = mechanize(torch.optim.SGD)(model.parameters(), lr=1.0)

You can set lr to any value here, although excessively small values may cause numerical precision issues: Mechanic's learned scale factor is multiplied by the base optimizer's learning rate. That's it! The new optimizer should no longer require tuning of the learning-rate scale; that is, it should be robust even to heavily mis-specified values of lr.

About

Summary

black box tuning of optimizers

Last Updated

Jul 9, 2024 at 13:32

License

MIT

Total Downloads

7

Version Downloads

7

Supported Platforms

linux-64