ml4chem.optim package

Submodules

ml4chem.optim.LBFGS module

Checks that a tensor contains no NaN or Inf values.

Inputs:

v (tensor): tensor to be checked
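A validity check of this kind can be sketched in a few lines; `is_finite` is an illustrative name (the ml4chem function's name is not shown above), and plain Python floats stand in for a torch tensor so the snippet runs on its own:

```python
import math

def is_finite(values):
    """Return True when every element is neither NaN nor Inf."""
    return all(math.isfinite(v) for v in values)

print(is_finite([1.0, 2.0]))          # → True
print(is_finite([float("nan"), 0.0])) # → False
```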

ml4chem.optim.LBFGS.polyinterp(points, x_min_bound=None, x_max_bound=None, plot=False)[source]

Gives the minimizer and minimum of the interpolating polynomial over given points based on function and derivative information. Defaults to bisection if no critical points are valid.

Based on polyinterp.m Matlab function in minFunc by Mark Schmidt with some slight modifications.

Implemented by: Hao-Jun Michael Shi and Dheevatsa Mudigere Last edited 12/6/18.

Inputs:

  • points (nparray): two-dimensional array with each point of the form [x f g]

  • x_min_bound (float): minimum value that brackets the minimum (default: minimum of points)

  • x_max_bound (float): maximum value that brackets the minimum (default: maximum of points)

  • plot (bool): plot the interpolating polynomial

Outputs:

  • x_sol (float): minimizer of the interpolating polynomial

  • F_min (float): minimum of the interpolating polynomial

Note

Set f or g to np.nan if they are unknown.
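For the common two-point case, the minimizer of the cubic matching the function values and derivatives at both points has a closed form (the classic formula used in line-search interpolation). This simplified stand-in is not the full polyinterp routine, but it illustrates how [x f g] information pins down the interpolant:

```python
import math

def cubic_interp_min(x1, f1, g1, x2, f2, g2):
    """Minimizer of the cubic matching f and f' at two points x1 < x2."""
    d1 = g1 + g2 - 3 * (f1 - f2) / (x1 - x2)
    d2 = math.sqrt(d1 ** 2 - g1 * g2)  # assumes a minimum is bracketed
    x_min = x2 - (x2 - x1) * ((g2 + d2 - d1) / (g2 - g1 + 2 * d2))
    # Clip the result to the bracketing interval
    return min(max(x_min, x1), x2)

# f(x) = (x - 2)**2 has its minimum at x = 2; the interpolant recovers it exactly
print(cubic_interp_min(0.0, 4.0, -4.0, 5.0, 9.0, 6.0))  # → 2.0
```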

ml4chem.optim.handler module

ml4chem.optim.handler.get_lr(optimizer)[source]

Get current learning rate

Parameters

optimizer (obj) – An optimizer object.

Returns

lr – Current learning rate.

Return type

float
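PyTorch optimizers expose their hyperparameters through `optimizer.param_groups`, a list of dictionaries, so a lookup like this one plausibly reads the `'lr'` key of a group. A minimal sketch, using a tiny stand-in class so the snippet runs without torch:

```python
class FakeOptimizer:
    """Stand-in mimicking torch.optim.Optimizer's param_groups attribute."""
    def __init__(self, lr):
        self.param_groups = [{"lr": lr}]

def get_lr(optimizer):
    """Read the current learning rate from the first parameter group."""
    return optimizer.param_groups[0]["lr"]

print(get_lr(FakeOptimizer(1e-2)))  # → 0.01
```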

ml4chem.optim.handler.get_lr_scheduler(optimizer, lr_scheduler)[source]

Get a learning rate scheduler

With a learning rate scheduler it is possible to perform training with an adaptive learning rate.

Parameters
  • optimizer (obj) – An optimizer object.

  • lr_scheduler (tuple) –

    Tuple with structure: scheduler’s name and a dictionary with keyword arguments.

    >>> scheduler = ('ReduceLROnPlateau', {'mode': 'min', 'patience': 10})
    

Returns

scheduler – A learning rate scheduler object that can be used to train models.

Return type

obj

Notes

For a list of schedulers and respective keyword arguments, please refer to https://pytorch.org/docs/stable/_modules/torch/optim/lr_scheduler.html
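The (name, kwargs) tuple shown above suggests a dispatch-by-name pattern. The sketch below uses a toy registry and a toy scheduler class standing in for `torch.optim.lr_scheduler.ReduceLROnPlateau`; the actual handler presumably resolves the name against torch's scheduler module instead:

```python
class ReduceLROnPlateau:
    """Toy stand-in for torch.optim.lr_scheduler.ReduceLROnPlateau."""
    def __init__(self, optimizer, mode="min", patience=10):
        self.optimizer, self.mode, self.patience = optimizer, mode, patience

SCHEDULERS = {"ReduceLROnPlateau": ReduceLROnPlateau}

def get_lr_scheduler(optimizer, lr_scheduler):
    """Build a scheduler from a (name, kwargs) tuple, as in the handler API."""
    name, kwargs = lr_scheduler
    return SCHEDULERS[name](optimizer, **kwargs)

scheduler = get_lr_scheduler(None, ("ReduceLROnPlateau", {"mode": "min", "patience": 10}))
print(scheduler.patience)  # → 10
```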

ml4chem.optim.handler.get_optimizer(optimizer, params)[source]

Get optimizer to train pytorch models

There are several optimizers available in pytorch, and all of them take different parameters. This function takes as arguments an optimizer tuple with the following structure:

>>> optimizer = ('adam', {'lr': 1e-2, 'weight_decay': 1e-6})

and returns an optimizer object.

Parameters
  • optimizer (tuple) – Tuple with name of optimizer and keyword arguments of optimizer as shown above.

  • params (list) – Parameters obtained from model.parameters() method.

Returns

optimizer – An optimizer object.

Return type

obj

Notes

For a list of all supported optimizers please check:

https://pytorch.org/docs/stable/optim.html
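Since the example above spells the optimizer name in lowercase ('adam') while torch's class is `Adam`, the handler likely normalizes the name before dispatching. A sketch of that pattern with a toy registry (a stand-in for `torch.optim`, so the snippet is self-contained):

```python
class Adam:
    """Toy stand-in for torch.optim.Adam."""
    def __init__(self, params, lr=1e-3, weight_decay=0.0):
        self.params = list(params)
        self.lr, self.weight_decay = lr, weight_decay

OPTIMIZERS = {"adam": Adam}

def get_optimizer(optimizer, params):
    """Build an optimizer from a (name, kwargs) tuple, as in the handler API."""
    name, kwargs = optimizer
    return OPTIMIZERS[name.lower()](params, **kwargs)

opt = get_optimizer(("adam", {"lr": 1e-2, "weight_decay": 1e-6}), [])
print(opt.lr)  # → 0.01
```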

Module contents