Optimiser base classes

class pints.Optimiser(x0, sigma0=None, boundaries=None)[source]

Base class for optimisers implementing an ask-and-tell interface.

This interface provides fine-grained control. Users seeking to simply run an optimisation may wish to use the OptimisationController instead.

Optimisation using “ask-and-tell” proceeds by the user repeatedly “asking” the optimiser for points, and then “telling” it the function evaluations at those points. This allows a user to have fine-grained control over an optimisation, and to implement custom parallelisation, logging, stopping criteria, etc. Users who don’t need this functionality can use optimisers via the OptimisationController class instead.

All PINTS optimisers are minimisers. To maximise a function, simply pass the negative of its evaluations to tell() (this is handled automatically by the OptimisationController).
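
For example, to maximise a pints.LogPDF with this interface, each evaluation could be negated before being passed to tell(). A minimal sketch, assuming log_pdf and optimiser have already been constructed:

xs = optimiser.ask()
fs = [-log_pdf(x) for x in xs]  # negate: the optimiser minimises
optimiser.tell(fs)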

All optimisers implement the pints.Loggable and pints.TunableMethod interfaces.

Parameters:
  • x0 – A starting point for searches in the parameter space. This value may be used directly (for example as the initial position of a particle in PSO) or indirectly (for example as the center of a distribution in XNES).

  • sigma0 – An optional initial standard deviation around x0. Can be specified either as a scalar value (one standard deviation for all coordinates) or as an array with one entry per dimension. Not all methods will use this information.

  • boundaries – An optional set of boundaries on the parameter space.
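
For illustration, a concrete subclass could be constructed as follows (a sketch using pints.XNES and pints.RectangularBoundaries as example classes; Optimiser itself is abstract and is not instantiated directly):

import pints

x0 = [1.0, 1.0]  # starting point in a 2-d parameter space
boundaries = pints.RectangularBoundaries([0, 0], [10, 10])
optimiser = pints.XNES(x0, sigma0=0.5, boundaries=boundaries)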

Example

An optimisation with ask-and-tell proceeds roughly as follows:

optimiser = MyOptimiser(x0)
running = True
while running:
    # Ask for points to evaluate
    xs = optimiser.ask()

    # Evaluate the score function or pdf at these points
    # At this point, code to parallelise evaluation can be added in
    fs = [f(x) for x in xs]

    # Tell the optimiser the evaluations, allowing it to update its
    # internal state.
    optimiser.tell(fs)

    # Check stopping criteria
    # At this point, custom stopping criteria can be added in
    if optimiser.f_best() < threshold:
        running = False

    # Check for optimiser issues
    if optimiser.stop():
        running = False

    # At this point, code to visualise or benchmark optimiser behaviour
    # could be added in, for example by plotting `xs` in the parameter
    # space.
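
Put together, a complete run might look like the sketch below, which minimises a simple two-parameter quadratic with pints.XNES; the objective f, the tolerance, and the iteration cap are illustrative choices, not part of the interface:

import numpy as np
import pints

def f(x):
    # Illustrative quadratic error with minimum 0 at (3, 5)
    return float(np.sum((np.asarray(x) - [3.0, 5.0]) ** 2))

optimiser = pints.XNES(x0=[0.0, 0.0], sigma0=1.0)

for i in range(500):  # hard cap on the number of iterations
    xs = optimiser.ask()
    fs = [f(x) for x in xs]
    optimiser.tell(fs)

    # Custom stopping criterion
    if optimiser.f_best() < 1e-9:
        break

    # Check for optimiser-reported problems
    error = optimiser.stop()
    if error:
        print('Optimiser halted: ' + str(error))
        break

print(optimiser.x_best(), optimiser.f_best())
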
ask()[source]

Returns a list of positions in the search space to evaluate.

f_best()[source]

Returns the best objective function evaluation seen by this optimiser, such that f_best = f(x_best).

f_guessed()[source]

For optimisers in which the best guess of the optimum (see x_guessed()) differs from the best-seen point (see x_best()), this method returns an estimate of the objective function value at x_guessed.

Notes:

  1. For many optimisers the best guess is simply the best point seen during the optimisation, so that this method is equivalent to f_best().

  2. Because x_guessed is not required to be a point that the optimiser has visited, the value f(x_guessed) may be unknown. In these cases, an approximation of f(x_guessed) may be returned.

fbest()[source]

Deprecated alias of f_best().

n_hyper_parameters()

Returns the number of hyper-parameters for this method (see TunableMethod).

name()[source]

Returns this method’s full name.

needs_sensitivities()[source]

Returns True if this method needs sensitivities to be passed in to tell() along with the evaluated error.

running()[source]

Returns True if an optimisation is in progress.

set_hyper_parameters(x)

Sets the hyper-parameters for the method with the given vector of values (see TunableMethod).

Parameters:

x – An array of length n_hyper_parameters used to set the hyper-parameters.

stop()[source]

Checks if this method has run into trouble and should terminate. Returns False if everything’s fine, or a short message (e.g. “Ill-conditioned matrix.”) if the method should terminate.

tell(fx)[source]

Performs an iteration of the optimiser algorithm, using the evaluations fx of the points x previously specified by ask().

For methods that require sensitivities (see needs_sensitivities()), fx should be a tuple (objective, sensitivities), containing the values returned by pints.ErrorMeasure.evaluateS1().
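
For example (a sketch, assuming error is a pints.ErrorMeasure and the optimiser’s needs_sensitivities() returns True):

xs = optimiser.ask()
fs = [error.evaluateS1(x) for x in xs]  # each entry is (value, gradient)
optimiser.tell(fs)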

x_best()[source]

Returns the best position seen during an optimisation, i.e. the point for which the minimal error or maximum likelihood was observed.

x_guessed()[source]

Returns the optimiser’s current best estimate of where the optimum is.

For many optimisers, this will simply be the point for which the minimal error or maximum likelihood was observed, so that x_guessed = x_best. However, optimisers like pints.CMAES and its derivatives maintain a separate “best guess” value that does not necessarily correspond to any of the points evaluated during the optimisation.

xbest()[source]

Deprecated alias of x_best().

class pints.PopulationBasedOptimiser(x0, sigma0=None, boundaries=None)[source]

Base class for optimisers that work by moving multiple points through the search space.

Extends Optimiser.

ask()

Returns a list of positions in the search space to evaluate.

f_best()

Returns the best objective function evaluation seen by this optimiser, such that f_best = f(x_best).

f_guessed()

For optimisers in which the best guess of the optimum (see x_guessed()) differs from the best-seen point (see x_best()), this method returns an estimate of the objective function value at x_guessed.

Notes:

  1. For many optimisers the best guess is simply the best point seen during the optimisation, so that this method is equivalent to f_best().

  2. Because x_guessed is not required to be a point that the optimiser has visited, the value f(x_guessed) may be unknown. In these cases, an approximation of f(x_guessed) may be returned.

fbest()

Deprecated alias of f_best().

n_hyper_parameters()[source]

See TunableMethod.n_hyper_parameters().

name()

Returns this method’s full name.

needs_sensitivities()

Returns True if this method needs sensitivities to be passed in to tell() along with the evaluated error.

population_size()[source]

Returns this optimiser’s population size.

If no explicit population size has been set, None may be returned. Once running, the correct value will always be returned.

running()

Returns True if an optimisation is in progress.

set_hyper_parameters(x)[source]

The hyper-parameter vector is [population_size].

See TunableMethod.set_hyper_parameters().
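
For example, the following sketch is equivalent to calling set_population_size(20):

optimiser.set_hyper_parameters([20])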

set_population_size(population_size=None)[source]

Sets a population size to use in this optimisation.

If population_size is set to None, the population size will be set using the heuristic suggested_population_size().

stop()

Checks if this method has run into trouble and should terminate. Returns False if everything’s fine, or a short message (e.g. “Ill-conditioned matrix.”) if the method should terminate.

suggested_population_size(round_up_to_multiple_of=None)[source]

Returns a suggested population size for this method, based on the dimension of the search space (e.g. the parameter space).

If the optional argument round_up_to_multiple_of is set to an integer greater than 1, the method will round up the estimate to a multiple of that number. This can be useful to obtain a population size based on e.g. the number of worker processes used to perform objective function evaluations.
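
For example, the population size could be rounded up to fill a pool of parallel workers (a sketch; n_workers is an illustrative value):

n_workers = 4  # hypothetical number of worker processes
size = optimiser.suggested_population_size(round_up_to_multiple_of=n_workers)
optimiser.set_population_size(size)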

tell(fx)

Performs an iteration of the optimiser algorithm, using the evaluations fx of the points x previously specified by ask().

For methods that require sensitivities (see needs_sensitivities()), fx should be a tuple (objective, sensitivities), containing the values returned by pints.ErrorMeasure.evaluateS1().

x_best()

Returns the best position seen during an optimisation, i.e. the point for which the minimal error or maximum likelihood was observed.

x_guessed()

Returns the optimiser’s current best estimate of where the optimum is.

For many optimisers, this will simply be the point for which the minimal error or maximum likelihood was observed, so that x_guessed = x_best. However, optimisers like pints.CMAES and its derivatives maintain a separate “best guess” value that does not necessarily correspond to any of the points evaluated during the optimisation.

xbest()

Deprecated alias of x_best().