Gradient descent (fixed learning rate)

class pints.GradientDescent(x0, sigma0=0.1, boundaries=None)

    Gradient-descent method with a fixed learning rate.
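The fixed-rate update this optimiser performs can be sketched as follows. This is a standalone illustration of the method, not the PINTS implementation; the quadratic objective `f` and its gradient `grad_f` are invented for the example.

```python
# Minimal sketch of gradient descent with a fixed learning rate eta.
# The objective f and its gradient grad_f are invented for illustration.

def f(x):
    # Simple convex quadratic with its minimum at x = 3.
    return (x - 3.0) ** 2

def grad_f(x):
    return 2.0 * (x - 3.0)

def gradient_descent(x0, eta=0.1, n_steps=100):
    x = x0
    for _ in range(n_steps):
        # Fixed-rate update: x <- x - eta * grad f(x)
        x = x - eta * grad_f(x)
    return x

x_min = gradient_descent(x0=0.0)
```

On this objective the iterates contract towards the minimum at `x = 3` by a constant factor per step, which is the characteristic behaviour of a fixed (non-adaptive) learning rate.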
ask()
    See Optimiser.ask().
fbest()
    See Optimiser.fbest().
name()
    See Optimiser.name().
running()
    See Optimiser.running().
set_hyper_parameters(x)
    See pints.TunableMethod.set_hyper_parameters(). The hyper-parameter
    vector is [learning_rate].
set_learning_rate(eta)
    Sets the learning rate for this optimiser.

    Parameters: eta (float) – The learning rate, as a float greater than zero.
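Because the rate is fixed, the choice of `eta` matters: too large a value can make the iteration diverge rather than converge. A small sketch on an invented quadratic objective, `f(x) = (x - 3)^2`, makes this concrete: the update `x <- x - eta * 2 * (x - 3)` contracts only when `|1 - 2 * eta| < 1`, i.e. for `0 < eta < 1` on this particular objective.

```python
# Illustration (invented objective, not PINTS code) of how the fixed
# learning rate eta controls stability on f(x) = (x - 3)^2.

def step(x, eta):
    # One fixed-rate update: x <- x - eta * f'(x), with f'(x) = 2 * (x - 3).
    return x - eta * 2.0 * (x - 3.0)

def run(eta, n_steps=50, x0=0.0):
    x = x0
    for _ in range(n_steps):
        x = step(x, eta)
    return x

stable = run(eta=0.1)    # contraction factor 0.8: converges towards 3
unstable = run(eta=1.5)  # |1 - 2 * eta| = 2: the iterates blow up
```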
stop()
    Checks if this method has run into trouble and should terminate.
    Returns False if everything is fine, or a short message (e.g.
    "Ill-conditioned matrix.") if the method should terminate.
tell(reply)
    See Optimiser.tell().
xbest()
    See Optimiser.xbest().