PSO

class pints.PSO(x0, sigma0=None, boundaries=None)[source]

Finds the best parameters using the PSO method described in [1].

Particle Swarm Optimisation (PSO) is a global search method (so refinement with a local optimiser is advised!) that works well for problems in high dimensions and with many local minima. Because it treats each parameter independently, it does not require preconditioning of the search space.

In particle swarm optimisation, the parameter space is explored by n independent particles. The particles perform a pseudo-random walk through the parameter space, guided by their own personal best position and the best position found by the swarm so far.
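
As a minimal usage sketch, PSO can be run through pints.OptimisationController and then refined with a local optimiser, as advised above. The SphereError toy measure below is invented for illustration; any pints.ErrorMeasure works:

    import numpy as np
    import pints

    # Illustrative toy error measure (not part of PINTS): squared
    # distance from the origin in three dimensions.
    class SphereError(pints.ErrorMeasure):
        def n_parameters(self):
            return 3

        def __call__(self, x):
            return float(np.sum(np.asarray(x) ** 2))

    error = SphereError()
    x0 = np.array([2.0, -1.5, 3.0])   # initial guess
    sigma0 = 1.0                      # initial spread of the swarm

    # Global search with PSO
    controller = pints.OptimisationController(
        error, x0, sigma0=sigma0, method=pints.PSO)
    controller.set_max_iterations(500)
    controller.set_log_to_screen(False)
    x_found, f_found = controller.run()

    # Refine the global search result with a local optimiser
    refiner = pints.OptimisationController(
        error, x_found, method=pints.NelderMead)
    refiner.set_log_to_screen(False)
    x_refined, f_refined = refiner.run()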

The method starts by creating a swarm of n particles and assigning each an initial position and initial velocity (derived from the arguments x0 and sigma0). Each particle's score is calculated, and its position is stored as the particle's personal best pl. The best-scoring position across all particles is stored as the global best pg.

Next, an iterative procedure is run that updates each particle’s velocity v and position x using:

v[k] = v[k-1] + al * (pl - x[k-1]) + ag * (pg - x[k-1])
x[k] = x[k-1] + v[k]

Here, x[k] is the particle's current position and v[k] its current velocity. The values al and ag are scalars sampled uniformly at random: al from [0, r * 4.1] and ag from [0, (1 - r) * 4.1]. Thus a swarm with r = 1 uses only local information, while a swarm with r = 0 uses only global information. The de facto standard is r = 0.5. The sampling is repeated each time al and ag are used: at every time step, each particle draws fresh values of al and ag for each of the m dimensions of the search space.

Pseudo-code algorithm:

almax = r * 4.1
agmax = 4.1 - almax
while stopping criterion not met:
    for i in [1, 2, .., n]:
        if f(x[i]) < f(p[i]):
            p[i] = x[i]
        pg = p[k], for k = argmin_i f(p[i])
        for j in [1, 2, .., m]:
            al = uniform(0, almax)
            ag = uniform(0, agmax)
            v[i,j] += al * (p[i,j] - x[i,j]) + ag * (pg[j] - x[i,j])
            x[i,j] += v[i,j]
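
The same loop can be written directly in NumPy. The pso_sketch function below is a plain transcription of the pseudo-code for illustration only; it is not the PINTS implementation, which adds boundary handling and the ask/tell interface described below:

    import numpy as np

    def pso_sketch(f, x, v, r=0.5, iterations=100):
        # x, v: (n, m) arrays of particle positions and velocities
        n, m = x.shape
        almax = r * 4.1
        agmax = 4.1 - almax
        p = x.copy()                          # personal best positions
        fp = np.array([f(xi) for xi in x])    # personal best scores
        for _ in range(iterations):
            for i in range(n):
                fx = f(x[i])
                if fx < fp[i]:                # update personal best
                    p[i], fp[i] = x[i].copy(), fx
                pg = p[np.argmin(fp)]         # global best position
                al = np.random.uniform(0, almax, m)  # fresh draw per dimension
                ag = np.random.uniform(0, agmax, m)
                v[i] += al * (p[i] - x[i]) + ag * (pg - x[i])
                x[i] += v[i]
        return p[np.argmin(fp)], fp.min()

    # Example: 20 particles in 2 dimensions, quadratic objective
    x_best, f_best = pso_sketch(
        lambda z: float(np.sum(z ** 2)),
        np.random.randn(20, 2), np.zeros((20, 2)))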

Extends PopulationBasedOptimiser.

References

[1] Kennedy, Eberhart (1995) "Particle swarm optimization." Proceedings of ICNN'95, IEEE International Conference on Neural Networks. https://doi.org/10.1109/ICNN.1995.488968

ask()[source]

See Optimiser.ask().

f_best()[source]

See Optimiser.f_best().

f_guessed()

For optimisers in which the best guess of the optimum (see x_guessed()) differs from the best-seen point (see x_best()), this method returns an estimate of the objective function value at x_guessed.

Notes:

  1. For many optimisers the best guess is simply the best point seen during the optimisation, so that this method is equivalent to f_best().

  2. Because x_guessed is not required to be a point that the optimiser has visited, the value f(x_guessed) may be unknown. In these cases, an approximation of f(x_guessed) may be returned.

fbest()

Deprecated alias of f_best().

n_hyper_parameters()[source]

See TunableMethod.n_hyper_parameters().

name()[source]

See Optimiser.name().

needs_sensitivities()

Returns True if this method needs sensitivities to be passed in to tell along with the evaluated error.

population_size()

Returns this optimiser’s population size.

If no explicit population size has been set, None may be returned. Once running, the correct value will always be returned.

running()[source]

See Optimiser.running().

set_hyper_parameters(x)[source]

The hyper-parameter vector is [population_size, local_global_balance].

See TunableMethod.set_hyper_parameters().

set_local_global_balance(r=0.5)[source]

Set the balance between local and global exploration for each particle, using a parameter r such that r = 1 is a fully local search and r = 0 is a fully global search.
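
For example (a sketch; x0 and the population size are illustrative):

    import numpy as np
    import pints

    pso = pints.PSO(x0=np.zeros(2), sigma0=0.5)
    pso.set_local_global_balance(0.9)    # r = 0.9: mostly local search

    # Equivalently, via the hyper-parameter vector described above:
    pso.set_hyper_parameters([40, 0.9])  # [population_size, local_global_balance]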

set_population_size(population_size=None)

Sets a population size to use in this optimisation.

If population_size is set to None, the population size will be set using the heuristic suggested_population_size().

stop()

Checks if this method has run into trouble and should terminate. Returns False if everything’s fine, or a short message (e.g. “Ill-conditioned matrix.”) if the method should terminate.

suggested_population_size(round_up_to_multiple_of=None)

Returns a suggested population size for this method, based on the dimension of the search space (i.e. the parameter space).

If the optional argument round_up_to_multiple_of is set to an integer greater than 1, the method will round up the estimate to a multiple of that number. This can be useful to obtain a population size based on e.g. the number of worker processes used to perform objective function evaluations.
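
For example, a sketch that rounds the suggested size up to a multiple of four worker processes (the four, x0, and sigma0 are illustrative):

    import numpy as np
    import pints

    pso = pints.PSO(x0=np.zeros(5), sigma0=1.0)
    size = pso.suggested_population_size(round_up_to_multiple_of=4)
    pso.set_population_size(size)
    assert pso.population_size() == size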

tell(fx)[source]

See Optimiser.tell().
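
Together with ask(), this supports the ask-and-tell pattern, in which the user controls the objective function evaluations. A minimal sketch (the quadratic objective and x0 are illustrative):

    import numpy as np
    import pints

    pso = pints.PSO(x0=np.zeros(2), sigma0=0.5)
    for _ in range(100):
        xs = pso.ask()                        # positions of all particles
        fxs = [float(np.sum(np.asarray(x) ** 2)) for x in xs]
        pso.tell(fxs)                         # report the scores back
    print(pso.x_best(), pso.f_best())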

x_best()[source]

See Optimiser.x_best().

x_guessed()

Returns the optimiser’s current best estimate of where the optimum is.

For many optimisers, this will simply be the point for which the minimal error or maximum likelihood was observed, so that x_guessed = x_best. However, optimisers such as pints.CMAES and its derivatives maintain a separate "best guess" value that does not necessarily correspond to any of the points evaluated during the optimisation.

xbest()

Deprecated alias of x_best().