ml4chem.optim package

Submodules

ml4chem.optim.LBFGS module

ml4chem.optim.handler module

ml4chem.optim.handler.get_lr(optimizer)[source]

Get current learning rate

Parameters

optimizer (obj) – An optimizer object.

Returns

lr – Current learning rate.

Return type

float
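
A minimal usage sketch (the model below is a hypothetical placeholder; any PyTorch optimizer with a parameter group works):

>>> import torch
>>> from ml4chem.optim.handler import get_lr
>>> model = torch.nn.Linear(10, 1)  # toy model for illustration only
>>> optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
>>> current_lr = get_lr(optimizer)  # 0.01 for this setup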

ml4chem.optim.handler.get_lr_scheduler(optimizer, lr_scheduler)[source]

Get a learning rate scheduler

With a learning rate scheduler, it is possible to train with an adaptive learning rate.

Parameters
  • optimizer (obj) – An optimizer object.

  • lr_scheduler (tuple) –

    Tuple with the following structure: the scheduler’s name and a dictionary of keyword arguments.

    >>> scheduler = ('ReduceLROnPlateau', {'mode': 'min', 'patience': 10})
    

Returns

scheduler – A learning rate scheduler object that can be used to train models.

Return type

obj

Notes

For a list of schedulers and their respective keyword arguments, please refer to https://pytorch.org/docs/stable/_modules/torch/optim/lr_scheduler.html
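
As a usage sketch (the toy model and optimizer below are illustrative assumptions; any PyTorch optimizer can be passed):

>>> import torch
>>> from ml4chem.optim.handler import get_lr_scheduler
>>> model = torch.nn.Linear(10, 1)  # toy model for illustration only
>>> optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
>>> lr_scheduler = ('ReduceLROnPlateau', {'mode': 'min', 'patience': 10})
>>> scheduler = get_lr_scheduler(optimizer, lr_scheduler)
>>> # For ReduceLROnPlateau, step once per epoch with the monitored metric:
>>> # scheduler.step(validation_loss)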

ml4chem.optim.handler.get_optimizer(optimizer, params)[source]

Get an optimizer to train PyTorch models

PyTorch provides several optimizers, each taking different parameters. This function takes an optimizer tuple with the following structure:

>>> optimizer = ('adam', {'lr': 1e-2, 'weight_decay': 1e-6})

and returns an optimizer object.

Parameters
  • optimizer (tuple) – Tuple with the optimizer’s name and its keyword arguments, as shown above.

  • params (list) – Parameters obtained from the model.parameters() method.

Returns

optimizer – An optimizer object.

Return type

obj

Notes

For a list of all supported optimizers, please check:

https://pytorch.org/docs/stable/optim.html
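
As a usage sketch (the toy model is an illustrative assumption; the tuple follows the structure documented above, and the return value is the optimizer object described in the docstring):

>>> import torch
>>> from ml4chem.optim.handler import get_optimizer
>>> model = torch.nn.Linear(10, 1)  # toy model for illustration only
>>> optimizer_tuple = ('adam', {'lr': 1e-2, 'weight_decay': 1e-6})
>>> optimizer = get_optimizer(optimizer_tuple, list(model.parameters()))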

Module contents