Nelder-Mead¶
- class pints.NelderMead(x0, sigma0=None, boundaries=None)[source]¶
Nelder-Mead downhill simplex method.
Implementation of the classical algorithm by [1], following the presentation in Algorithm 8.1 of [2].
This is a deterministic local optimiser. In most update steps it performs either one or two sequential evaluations, so it will not typically benefit from parallelisation.
Generates a “simplex” of n + 1 samples around a given starting point, and evaluates their scores. Next, each iteration consists of a sequence of operations, in which typically the worst sample y_worst is replaced with a new point:

y_new = mu + delta * (mu - y_worst)
mu = (1 / n) * sum(y), y != y_worst

where delta has one of four values, depending on the type of operation:

- Reflection (delta = 1)
- Expansion (delta = 2)
- Inside contraction (delta = -0.5)
- Outside contraction (delta = 0.5)

Note that the delta values here are common choices, but not the only valid choices.

A fifth type of iteration called a “shrink” is occasionally performed, in which all samples except the best sample y_best are replaced:

y_i_new = y_best + ys * (y_i - y_best)

where ys is a parameter (typically ys = 0.5).
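The update rules above can be sketched in plain Python. This is an illustrative implementation of the scheme described here (reflection, expansion, the two contractions, and the shrink step), not the pints source; the function name `nelder_mead` and its arguments are invented for the example.

```python
def nelder_mead(f, x0, sigma0=0.1, max_iters=200):
    """Minimise f via the simplex updates described above (illustrative sketch)."""
    n = len(x0)
    # Initial simplex: x0 plus n points, each offset by sigma0 along one axis
    simplex = [list(x0)] + [
        [x0[j] + (sigma0 if j == i else 0.0) for j in range(n)]
        for i in range(n)
    ]
    scores = [f(x) for x in simplex]
    for _ in range(max_iters):
        # Sort simplex from best (lowest score) to worst
        order = sorted(range(n + 1), key=lambda i: scores[i])
        simplex = [simplex[i] for i in order]
        scores = [scores[i] for i in order]
        worst = simplex[-1]
        # Centroid mu of all points except the worst
        mu = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]

        def point(delta):
            # y_new = mu + delta * (mu - y_worst)
            return [mu[j] + delta * (mu[j] - worst[j]) for j in range(n)]

        xr, fr = point(1.0), f(point(1.0))  # reflection (delta = 1)
        if fr < scores[0]:
            xe = point(2.0)                 # expansion (delta = 2)
            fe = f(xe)
            if fe < fr:
                simplex[-1], scores[-1] = xe, fe
            else:
                simplex[-1], scores[-1] = xr, fr
        elif fr < scores[-2]:
            simplex[-1], scores[-1] = xr, fr
        else:
            # Outside contraction (delta = 0.5) if the reflection beat the
            # worst point, otherwise inside contraction (delta = -0.5)
            xc = point(0.5 if fr < scores[-1] else -0.5)
            fc = f(xc)
            if fc < min(fr, scores[-1]):
                simplex[-1], scores[-1] = xc, fc
            else:
                # Shrink: y_i_new = y_best + ys * (y_i - y_best), ys = 0.5
                best, ys = simplex[0], 0.5
                simplex = [best] + [
                    [best[j] + ys * (p[j] - best[j]) for j in range(n)]
                    for p in simplex[1:]
                ]
                scores = [scores[0]] + [f(p) for p in simplex[1:]]
    return simplex[0], scores[0]
```

For example, minimising `(x0 - 1)**2 + (x1 + 2)**2` from `[0.0, 0.0]` converges to a point close to `[1.0, -2.0]`.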
The initialisation of the simplex was copied from [3].
References
- ask()[source]¶
See: pints.Optimiser.ask().
- f_best()[source]¶
See: pints.Optimiser.f_best().
- f_guessed()¶
For optimisers in which the best guess of the optimum (see x_guessed()) differs from the best-seen point (see x_best()), this method returns an estimate of the objective function value at x_guessed.
Notes:
- For many optimisers the best guess is simply the best point seen during the optimisation, so that this method is equivalent to f_best().
- Because x_guessed is not required to be a point that the optimiser has visited, the value f(x_guessed) may be unknown. In these cases, an approximation of f(x_guessed) may be returned.
- n_hyper_parameters()¶
Returns the number of hyper-parameters for this method (see TunableMethod).
- name()[source]¶
See: pints.Optimiser.name().
- needs_sensitivities()¶
Returns True if this method needs sensitivities to be passed in to tell() along with the evaluated error.
- set_hyper_parameters(x)¶
Sets the hyper-parameters for the method with the given vector of values (see TunableMethod).
- Parameters:
x – An array of length n_hyper_parameters used to set the hyper-parameters.
- stop()[source]¶
See: pints.Optimiser.stop().
- tell(fx)[source]¶
See: pints.Optimiser.tell().
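The ask() and tell() methods documented on this page form pints' ask-and-tell interface: ask() proposes candidate points, the caller evaluates them, and tell() passes the scores back. The calling convention can be illustrated with a minimal self-contained stand-in; the class TinyAskTell and its pattern-search update rule are invented for this sketch and are not pints code.

```python
class TinyAskTell:
    """Illustrative stand-in for an ask/tell optimiser (not pints code)."""

    def __init__(self, x0, step=0.5):
        self._x = list(x0)      # current best point
        self._fx = None         # current best score
        self._step = step
        self._pending = None    # points proposed by the last ask()

    def ask(self):
        # Propose the current point plus nudges along each axis
        self._pending = [list(self._x)]
        for i in range(len(self._x)):
            for s in (-self._step, self._step):
                p = list(self._x)
                p[i] += s
                self._pending.append(p)
        return self._pending

    def tell(self, fxs):
        # Receive one score per proposed point; keep the best seen
        best = min(range(len(fxs)), key=lambda i: fxs[i])
        if self._fx is None or fxs[best] < self._fx:
            self._x, self._fx = self._pending[best], fxs[best]
        else:
            self._step *= 0.5  # no improvement: shrink the step

    def x_best(self):
        return self._x

    def f_best(self):
        return self._fx


# The ask/tell loop itself follows the same shape as with a pints optimiser:
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2
opt = TinyAskTell([0.0, 0.0])
for _ in range(60):
    xs = opt.ask()              # get candidate points
    opt.tell([f(x) for x in xs])  # report their scores
```

After the loop, opt.x_best() and opt.f_best() return the best point and score seen, mirroring the x_best()/f_best() accessors described on this page.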
- x_best()[source]¶
See: pints.Optimiser.x_best().
- x_guessed()¶
Returns the optimiser’s current best estimate of where the optimum is.
For many optimisers, this will simply be the point for which the minimal error or maximum likelihood was observed, so that x_guessed = x_best. However, optimisers like pints.CMAES and its derivatives maintain a separate “best guess” value that does not necessarily correspond to any of the points evaluated during the optimisation.