Nelder-Mead

class pints.NelderMead(x0, sigma0=None, boundaries=None)[source]

Nelder-Mead downhill simplex method.

Implementation of the classical algorithm by [1], following the presentation in Algorithm 8.1 of [2].

This is a deterministic local optimiser. In most update steps it performs either 1 evaluation or 2 sequential evaluations, so it will not typically benefit from parallelisation.
Generates a “simplex” of n + 1 samples around a given starting point, and evaluates their scores. Next, each iteration consists of a sequence of operations, in which typically the worst sample y_worst is replaced with a new point:

    y_new = mu + delta * (mu - y_worst)
    mu = (1 / n) * sum(y), y != y_worst

where delta has one of four values, depending on the type of operation:

- Reflection (delta = 1)
- Expansion (delta = 2)
- Inside contraction (delta = -0.5)
- Outside contraction (delta = 0.5)

Note that the delta values here are common choices, but not the only valid choices.

A fifth type of iteration called a “shrink” is occasionally performed, in which all samples except the best sample y_best are replaced:

    y_i_new = y_best + ys * (y_i - y_best)

where ys is a parameter (typically ys = 0.5).
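The update formulas above can be sketched in plain Python. This is an illustrative sketch of the equations only, not PINTS's implementation; the helper names (centroid, propose, shrink) are hypothetical:

```python
def centroid(simplex, worst):
    """mu = (1 / n) * sum(y), over all samples y except y_worst."""
    n = len(simplex) - 1  # the simplex holds n + 1 points
    others = [p for i, p in enumerate(simplex) if i != worst]
    return [sum(coords) / n for coords in zip(*others)]

def propose(simplex, worst, delta):
    """y_new = mu + delta * (mu - y_worst).

    delta = 1 (reflection), 2 (expansion),
    -0.5 (inside contraction), 0.5 (outside contraction).
    """
    mu = centroid(simplex, worst)
    return [m + delta * (m - w) for m, w in zip(mu, simplex[worst])]

def shrink(simplex, best, ys=0.5):
    """y_i_new = y_best + ys * (y_i - y_best), keeping the best point."""
    y_best = simplex[best]
    out = []
    for i, y in enumerate(simplex):
        if i == best:
            out.append(list(y))
        else:
            out.append([b + ys * (yi - b) for b, yi in zip(y_best, y)])
    return out
```

For example, for the simplex [[0, 0], [1, 0], [0, 1]] with worst point [0, 1], the centroid of the remaining points is [0.5, 0], so reflection (delta = 1) proposes [1.0, -1.0].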
The initialisation of the simplex was copied from [3].
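One common way to build such an initial simplex, in the style of SciPy's optimiser, is to perturb one coordinate of the starting point per extra vertex: a relative step for non-zero entries and a small absolute step for zeros. The step sizes below are assumed illustrative defaults, not necessarily the values PINTS uses:

```python
def initial_simplex(x0, nonzdelt=0.05, zdelt=0.00025):
    # Perturb one coordinate of x0 per extra vertex: a 5% relative
    # step for non-zero entries, a small absolute step for zeros.
    # (Step sizes are assumed defaults, in the style of SciPy.)
    simplex = [list(x0)]
    for i in range(len(x0)):
        vertex = list(x0)
        vertex[i] = vertex[i] * (1 + nonzdelt) if vertex[i] != 0 else zdelt
        simplex.append(vertex)
    return simplex
```

This yields n + 1 affinely independent points around x0, as the algorithm description above requires.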
References

[1] A simplex method for function minimization. Nelder, Mead, 1965, Computer Journal. https://doi.org/10.1093/comjnl/7.4.308

[2] Introduction to derivative-free optimization. Andrew R. Conn, Katya Scheinberg, Luis N. Vicente, 2009, first edition. ISBN 978-0-898716-68-9. https://doi.org/10.1137/1.9780898718768

[3] SciPy on GitHub. https://github.com/scipy/scipy/
ask()[source]
    See: pints.Optimiser.ask().

fbest()[source]
    See: pints.Optimiser.fbest().

n_hyper_parameters()
    Returns the number of hyper-parameters for this method (see TunableMethod).

name()[source]
    See: pints.Optimiser.name().
needs_sensitivities()
    Returns True if this method needs sensitivities to be passed in to tell() along with the evaluated error.
set_hyper_parameters(x)
    Sets the hyper-parameters for the method with the given vector of values (see TunableMethod).

    Parameters: x – An array of length n_hyper_parameters used to set the hyper-parameters.
stop()[source]
    See: pints.Optimiser.stop().

tell(fx)[source]
    See: pints.Optimiser.tell().

xbest()[source]
    See: pints.Optimiser.xbest().
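The methods above follow PINTS's ask/tell pattern: ask() returns points to evaluate, the caller computes their errors, tell() passes the results back, and stop() signals when to terminate. A minimal sketch of that control loop, using a hypothetical stub in place of the real optimiser so the example is self-contained:

```python
class StubOptimiser:
    """Hypothetical stand-in exposing the same ask/tell surface as
    pints.Optimiser; it simply halves its guess each iteration."""

    def __init__(self, x0):
        self._x = list(x0)
        self._fx = float('inf')

    def ask(self):
        # Return a list of points for the caller to evaluate
        return [self._x]

    def tell(self, fxs):
        # Receive the evaluated errors, then update the guess
        self._fx = fxs[0]
        self._x = [0.5 * v for v in self._x]

    def xbest(self):
        return self._x

    def stop(self):
        return self._fx < 1e-6

def error(x):
    # Simple quadratic error to minimise
    return sum(v * v for v in x)

opt = StubOptimiser([1.0, 2.0])
while not opt.stop():
    points = opt.ask()
    opt.tell([error(p) for p in points])
```

In practice the evaluation step is where parallelisation would happen; as noted above, Nelder-Mead asks for at most a couple of points per iteration, so it gains little from it.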