Gradient descent (fixed learning rate)

class pints.GradientDescent(x0, sigma0=0.1, boundaries=None)

Gradient-descent method with a fixed learning rate.
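The update performed by this method is plain gradient descent with a constant step size eta: each iteration moves the current point against the gradient, x ← x − eta · ∇f(x). A minimal standalone sketch of that rule (plain Python, not the PINTS implementation; the quadratic objective and starting point are illustrative):

```python
def gradient_descent(grad, x0, eta=0.1, n_steps=100):
    """Fixed-learning-rate gradient descent: x <- x - eta * grad(x)."""
    x = list(x0)
    for _ in range(n_steps):
        g = grad(x)
        x = [xi - eta * gi for xi, gi in zip(x, g)]
    return x

# Illustrative objective: f(x) = sum(x_i ** 2), with gradient 2 * x.
grad = lambda x: [2.0 * xi for xi in x]
x = gradient_descent(grad, [3.0, -2.0], eta=0.1, n_steps=100)
```

Because the learning rate is fixed, convergence depends entirely on choosing eta well for the problem's scale: too large and the iterates diverge, too small and progress is slow.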

ask()

See Optimiser.ask().

fbest()

See Optimiser.fbest().

learning_rate()

Returns this optimiser’s learning rate.

n_hyper_parameters()

See pints.TunableMethod.n_hyper_parameters().

name()

See Optimiser.name().

needs_sensitivities()

See Optimiser.needs_sensitivities().

running()

See Optimiser.running().

set_hyper_parameters(x)

See pints.TunableMethod.set_hyper_parameters().

The hyper-parameter vector is [learning_rate].

set_learning_rate(eta)

Sets the learning rate for this optimiser.

Parameters: eta (float) – The learning rate; must be greater than zero.

stop()

Checks if this method has run into trouble and should terminate. Returns False if everything’s fine, or a short message (e.g. “Ill-conditioned matrix.”) if the method should terminate.

tell(reply)

See Optimiser.tell().

xbest()

See Optimiser.xbest().
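Like other PINTS optimisers, this class is driven through an ask/tell loop: ask() proposes points to evaluate, and because needs_sensitivities() returns True here, tell() expects each evaluation to include the gradient alongside the function value. A hedged sketch of that loop with a stand-in optimiser (the FixedStepGD class and quadratic objective below are illustrative, not the PINTS implementation):

```python
class FixedStepGD:
    """Illustrative ask/tell optimiser with a fixed learning rate.

    Mimics the interface described above; not the actual PINTS class.
    """

    def __init__(self, x0, eta=0.1):
        self._x = list(x0)
        self._eta = eta

    def ask(self):
        # Return a list of points to evaluate (here, just the current point).
        return [list(self._x)]

    def tell(self, reply):
        # reply: one (f, gradient) pair per point returned by ask().
        f, g = reply[0]
        self._x = [xi - self._eta * gi for xi, gi in zip(self._x, g)]

    def needs_sensitivities(self):
        return True

    def xbest(self):
        return list(self._x)


def f_and_grad(x):
    """Illustrative objective: f(x) = sum(x_i ** 2), gradient 2 * x."""
    return sum(xi * xi for xi in x), [2.0 * xi for xi in x]

opt = FixedStepGD([3.0, -2.0], eta=0.1)
for _ in range(200):
    points = opt.ask()
    opt.tell([f_and_grad(p) for p in points])
```

In normal use this loop is run for you by an optimisation controller, which also handles stopping criteria and logging; the sketch only shows the shape of the data exchanged between ask() and tell().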