Nested ellipsoid sampler

class pints.NestedEllipsoidSampler(log_prior)[source]

Creates a nested sampler that estimates the marginal likelihood and generates samples from the posterior.

This is the form of nested sampler described in [1], in which an ellipsoid is drawn around the surviving particles (typically with an enlargement factor, to avoid missing prior mass), and random samples are then drawn uniformly from within the bounds of that ellipsoid. By restricting proposals to the region occupied by the surviving particles, this algorithm aims to be more efficient than simple rejection sampling. The algorithm has the following steps:

Initialise:

Z_0 = 0
X_0 = 1

Draw samples from prior:

for i in 1:n_active_points:
    theta_i ~ p(theta), i.e. sample from the prior
    L_i = p(X|theta_i), i.e. the likelihood of theta_i
endfor
L_min = min(L)
indexmin = min_index(L)

Run rejection sampling for n_rejection_samples iterations to generate an initial set of samples, along with updated values of L_min and indexmin.
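As a rough illustration of this set-up phase, the following minimal numpy sketch draws the initial active points and records the current threshold. Here sample_prior and log_likelihood are hypothetical placeholders rather than PINTS API, and log-likelihoods are used throughout, as in the sampler's methods below.

import numpy as np

def initialise_active_points(sample_prior, log_likelihood, n_active_points):
    # theta_i ~ p(theta): sample_prior(n) is assumed to return an (n, d) array
    active_points = sample_prior(n_active_points)
    # Log-likelihood of each active point
    log_l = np.array([log_likelihood(t) for t in active_points])
    index_min = int(np.argmin(log_l))   # worst (lowest log-likelihood) point
    l_min = log_l[index_min]            # current threshold, L_min
    return active_points, log_l, l_min, index_min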

Fit the active points with a minimum volume bounding ellipsoid. In our approach, we do this with the following procedure (termed minimum_volume_ellipsoid in what follows), which returns a positive definite matrix A and centre c that define the ellipsoid by \((x - c)^t A (x - c) = 1\):

cov = covariance(transpose(active_points))
cov_inv = inv(cov)
c = mean(active_points)
for i in 1:n_active_points:
    dist[i] = (active_points[i] - c) * cov_inv * (active_points[i] - c)
endfor
enlargement_factor = max(dist)
A = (1.0 / enlargement_factor) * cov_inv
return A, c
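The procedure above translates almost line for line into numpy; the sketch below is such a translation, given as an illustration of the pseudocode rather than the PINTS source.

import numpy as np

def minimum_volume_ellipsoid(active_points):
    # active_points: (n_active_points, d) array
    c = np.mean(active_points, axis=0)     # ellipsoid centre
    cov = np.cov(active_points.T)          # sample covariance
    cov_inv = np.linalg.inv(cov)
    diff = active_points - c
    # Squared Mahalanobis-style distance of each point from the centre
    dist = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    # Scale so the furthest point lies exactly on the ellipsoid boundary
    A = cov_inv / np.max(dist)
    return A, c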

From then on, in each iteration (t), the following occurs:

if mod(t, ellipsoid_update_gap) == 0:
    A, c = minimum_volume_ellipsoid(active_points)
else:
    if dynamic_enlargement_factor:
        enlargement_factor *= (
            exp(-(t + 1) / n_active_points)**alpha
        )
    endif
endif
theta* = ellipsoid_sample(enlargement_factor, A, c)
while p(X|theta*) < L_min:
    theta* = ellipsoid_sample(enlargement_factor, A, c)
endwhile
theta_indexmin = theta*
L_indexmin = p(X|theta*)

If the parameter dynamic_enlargement_factor is true, the enlargement factor is shrunk as the sampler runs, to avoid inefficiencies in later iterations. By default, the enlargement factor begins at 1.1.

In ellipsoid_sample, a point is drawn uniformly from within the minimum volume ellipsoid, whose volume is increased by a factor enlargement_factor.
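A minimal sketch of such a draw is shown below: the unit ball is mapped onto the ellipsoid defined by A and c, and the linear axes are then stretched by enlargement_factor**(1/d) so that the volume grows by enlargement_factor. This axis scaling is one reasonable reading of the volume statement above, and is an assumption rather than necessarily the exact convention used inside PINTS.

import numpy as np

def ellipsoid_sample(enlargement_factor, A, c, rng=np.random.default_rng()):
    d = len(c)
    # B maps the unit ball onto the ellipsoid: B B^T = A^{-1}
    B = np.linalg.cholesky(np.linalg.inv(A))
    # Uniform draw from the unit d-ball: random direction times radius
    z = rng.standard_normal(d)
    z /= np.linalg.norm(z)
    r = rng.uniform() ** (1.0 / d)
    # Stretch the axes so the ellipsoid's volume grows by enlargement_factor
    scale = enlargement_factor ** (1.0 / d)
    return c + scale * r * (B @ z)

In the iteration loop above, this draw is repeated until a proposal exceeds the current threshold L_min, at which point it replaces the worst active point.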

At the end of iterations, there is a final Z increment:

Z = Z + (1 / n_active_points) * (L_1 + L_2 + ... + L_n_active_points)

The posterior samples are generated as described in [2] on page 849 by weighting each dropped sample in proportion to the volume of the posterior region it was sampled from. That is, the probability for drawing a given sample j is given by:

p_j = L_j * w_j / Z

where j = 1, …, n_iterations, and w_j is the prior volume (weight) associated with dropped sample j.
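A minimal sketch of this weighting step is shown below, assuming the log-likelihoods log_l and prior-volume weights w of the dropped samples have been stored (all names are placeholders); working in log space keeps the normalisation numerically stable.

import numpy as np

def draw_posterior_samples(dropped_points, log_l, w, n_samples,
                           rng=np.random.default_rng()):
    log_weights = log_l + np.log(w)       # log(L_j * w_j)
    log_weights -= np.max(log_weights)    # stabilise before exponentiating
    p = np.exp(log_weights)
    p /= np.sum(p)                        # p_j = L_j * w_j / Z
    idx = rng.choice(len(dropped_points), size=n_samples, p=p)
    return dropped_points[idx]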

Extends NestedSampler.

References

[1] "A Nested Sampling Algorithm for Cosmological Model Selection", Pia Mukherjee, David Parkinson and Andrew R. Liddle, 2006, The Astrophysical Journal, 638, L51-L54.

[2] "Nested Sampling for General Bayesian Computation", John Skilling, 2006, Bayesian Analysis, 1(4), 833-859.

active_points()

Returns the active points from the nested sampling run.

alpha()[source]

Returns alpha, which controls the rate of decline of the enlargement factor with iteration (when dynamic_enlargement_factor is true).

ask(n_points)[source]

If in the initial phase, rejection sampling is used. Afterwards, points are drawn from within an ellipsoid (the sampler needs to be in the uniform sampling regime).

dynamic_enlargement_factor()[source]

Returns whether a dynamic enlargement factor is used (see set_dynamic_enlargement_factor()).

ellipsoid_update_gap()[source]

Returns the ellipsoid update gap used in the algorithm (see set_ellipsoid_update_gap()).

enlargement_factor()[source]

Returns the enlargement factor used in the algorithm (see set_enlargement_factor()).

in_initial_phase()[source]

See pints.NestedSampler.in_initial_phase().

min_index()

Returns the index of the sample with the lowest log-likelihood.

n_active_points()

Returns the number of active points that will be used in the next run.

n_hyper_parameters()[source]

See TunableMethod.n_hyper_parameters().

n_rejection_samples()[source]

Returns the number of rejection samples used in the algorithm (see set_n_rejection_samples()).

name()[source]

See pints.NestedSampler.name().

needs_initial_phase()[source]

See pints.NestedSampler.needs_initial_phase().

needs_sensitivities()

Determines whether the sampler uses sensitivities of the solution.

running_log_likelihood()

Returns the current value of the log-likelihood threshold.

set_alpha(alpha)[source]

Sets alpha, which controls the rate of decline of the enlargement factor with iteration (when dynamic_enlargement_factor is true).

set_dynamic_enlargement_factor(dynamic_enlargement_factor)[source]

Sets whether a dynamic enlargement factor is used.

set_ellipsoid_update_gap(ellipsoid_update_gap=100)[source]

Sets the frequency with which the minimum volume ellipsoid is re-estimated as part of the nested rejection sampling algorithm.

Updating the ellipsoid more often means each sample is produced more efficiently, but re-computing the ellipsoid is itself costly, so it is often better not to update it every iteration; instead, updates are spaced ellipsoid_update_gap iterations apart. By default, the ellipsoid is updated every 100 iterations.

set_enlargement_factor(enlargement_factor=1.1)[source]

Sets the factor (>1) by which to enlarge the minimum volume ellipsoid used when drawing samples.

A higher value means it is less likely that areas of high probability mass will be missed. A low value means that rejection sampling is more efficient.

set_hyper_parameters(x)[source]

The hyper-parameter vector is [# active points, # rejection samples, enlargement factor, ellipsoid update gap, dynamic enlargement factor, alpha].

See TunableMethod.set_hyper_parameters().
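For example, the sketch below configures a sampler through the setters documented on this page; the numerical values are purely illustrative, and log_prior is assumed to be a suitable pints.LogPrior.

import pints

sampler = pints.NestedEllipsoidSampler(log_prior)
sampler.set_n_active_points(400)
sampler.set_n_rejection_samples(500)
sampler.set_enlargement_factor(1.5)
sampler.set_ellipsoid_update_gap(100)
sampler.set_dynamic_enlargement_factor(True)
sampler.set_alpha(0.5)

# Equivalently, using the hyper-parameter vector in the order listed above:
sampler.set_hyper_parameters([400, 500, 1.5, 100, True, 0.5])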

set_initial_phase(in_initial_phase)[source]

See pints.NestedSampler.set_initial_phase().

set_n_active_points(active_points)

Sets the number of active points for the next run.

set_n_rejection_samples(rejection_samples=200)[source]

Sets the number of rejection samples to take, which will be assigned weights and ultimately produce a set of posterior samples.

tell(fx)

If a single evaluation is provided as argument, a single point is accepted and returned if its likelihood exceeds the current threshold; otherwise None is returned.

If multiple evaluations are provided as arguments (for example, if running the algorithm in parallel), None is returned if no points have a likelihood exceeding the threshold; if a single point passes the threshold, it is returned; if multiple points pass, one is selected uniformly at random and returned, and the others are stored for later use.

In all cases, two objects are returned: the proposed point (which may be None) and an array of other points that also pass the threshold (which is empty for single evaluation mode but may be non-empty for multiple evaluation mode).
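As an illustration, the sketch below runs a bare ask/tell loop in single-evaluation mode; log_likelihood stands in for the user's log-likelihood callable, sampler is a configured NestedEllipsoidSampler (as in the earlier sketch), and the fixed iteration count is purely illustrative.

for i in range(2000):
    xs = sampler.ask(1)                   # propose a single point
    fx = log_likelihood(xs[0])            # evaluate its log-likelihood
    proposed, other = sampler.tell(fx)    # accepted point (or None) and extras
    if proposed is None:
        continue                          # below the threshold; propose again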