Twisted Gaussian distribution

class pints.toy.TwistedGaussianLogPDF(dimension=10, b=0.1, V=100)[source]

Twisted multivariate Gaussian ‘banana’ with un-normalised density [1]:

\[p(x_1, x_2, x_3, ..., x_n) \propto \pi(\phi(x_1, x_2, x_3, ..., x_n))\]

where \(\pi\) is the multivariate Gaussian density with covariance matrix \(\Sigma=\text{diag}(100, 1, 1, ..., 1)\) and

\[\phi(x_1, x_2, x_3, ..., x_n) = (x_1, x_2 + b x_1^2 - V b, x_3, ..., x_n).\]

Extends pints.toy.ToyLogPDF.

Parameters:
  • dimension (int) – Problem dimension (n), must be 2 or greater.

  • b (float) – “Bananicity”: b = 0.01 gives a mildly non-linear target density, while b = 0.1 gives a strongly non-linear one. Must be greater than or equal to zero.

  • V (float) – Offset (see equation).
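The equations above can be sketched directly in NumPy. This is an illustration of the maths, not the PINTS implementation; the function names are hypothetical, and it assumes the first-coordinate variance is fixed at 100, matching the \(\Sigma\) given above:

```python
import numpy as np

def phi(x, b=0.1, V=100.0):
    # Twisting transform from the definition above:
    # (x1, x2 + b*x1^2 - V*b, x3, ..., xn)
    y = np.array(x, dtype=float)
    y[1] = y[1] + b * y[0] ** 2 - V * b
    return y

def log_pdf(x, b=0.1, V=100.0):
    # Un-normalised log-density: Gaussian log-density of phi(x)
    # with covariance diag(100, 1, ..., 1), as in the Sigma above.
    y = phi(x, b, V)
    var = np.ones_like(y)
    var[0] = 100.0
    return -0.5 * np.sum(y ** 2 / var)
```

With the default b = 0.1 and V = 100, the mode of the un-normalised log-density sits at \((0, Vb, 0, ..., 0) = (0, 10, 0, ..., 0)\), where phi maps the point to the Gaussian's mean at the origin.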

References
[1] Adaptive proposal distribution for random walk Metropolis algorithm. Haario, Saksman, Tamminen (1999) Computational Statistics.

distance(samples)[source]

Returns the approximate Kullback-Leibler divergence of samples from the underlying distribution.

See pints.toy.ToyLogPDF.distance().

evaluateS1(x)[source]

See LogPDF.evaluateS1().
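evaluateS1() returns the log-pdf together with its partial derivatives. Under the definitions above, the gradient follows from the chain rule through the twisting transform. A sketch with a hypothetical function name (not the PINTS code), again fixing the first-coordinate variance at 100 per the \(\Sigma\) above:

```python
import numpy as np

def log_pdf_s1(x, b=0.1, V=100.0):
    # Log-density and gradient, by the chain rule through
    # y = (x1, x2 + b*x1^2 - V*b, x3, ..., xn).
    y = np.array(x, dtype=float)
    y[1] = y[1] + b * y[0] ** 2 - V * b
    var = np.ones_like(y)
    var[0] = 100.0                   # Sigma = diag(100, 1, ..., 1)
    L = -0.5 * np.sum(y ** 2 / var)
    g = -y / var                     # gradient with respect to y
    dx = g.copy()
    dx[0] += g[1] * 2.0 * b * x[0]   # y2 also depends on x1
    return L, dx
```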

kl_divergence(samples)[source]

Calculates the approximate Kullback-Leibler divergence between a given list of samples and the distribution underlying this LogPDF.

The returned value is (near) zero for perfect sampling, and then increases as the error gets larger.

See: https://en.wikipedia.org/wiki/Kullback-Leibler_divergence
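The documentation does not spell out the estimator. One common moment-matching approach is to fit a Gaussian to the (untwisted) samples and use the closed-form KL divergence between Gaussians; the sketch below shows that closed form as an assumption about the approach, not the library's code:

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    # Closed-form KL( N(mu0, cov0) || N(mu1, cov1) ).
    n = len(mu0)
    inv1 = np.linalg.inv(cov1)
    d = np.asarray(mu1, float) - np.asarray(mu0, float)
    _, ld0 = np.linalg.slogdet(cov0)
    _, ld1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(inv1 @ cov0) + d @ inv1 @ d - n + ld1 - ld0)
```

The divergence is zero when both moments match exactly (perfect sampling) and grows as the fitted moments drift from the true ones.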

n_parameters()[source]

See pints.LogPDF.n_parameters().

sample(n)[source]

See pints.toy.ToyLogPDF.sample().
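Because the density is defined as a twisted Gaussian, direct sampling is possible: draw from the underlying Gaussian \(\pi\) and apply the inverse of \(\phi\). A sketch (hypothetical function name, first-coordinate variance fixed at 100 per the \(\Sigma\) above):

```python
import numpy as np

def sample_twisted(n, dim=10, b=0.1, V=100.0, seed=None):
    # Draw y from the underlying Gaussian pi, then invert the twist:
    # x2 = y2 - b*y1^2 + V*b  (phi leaves x1 = y1 unchanged).
    rng = np.random.default_rng(seed)
    y = rng.standard_normal((n, dim))
    y[:, 0] *= 10.0                  # first coordinate has variance 100
    x = y.copy()
    x[:, 1] = y[:, 1] - b * y[:, 0] ** 2 + V * b
    return x
```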

suggested_bounds()[source]

See pints.toy.ToyLogPDF.suggested_bounds().

untwist(samples)[source]

De-transforms (or “untwists”) a list of samples from the twisted distribution; the result should again be samples from a simple multivariate Gaussian.
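Per the definition above, untwisting amounts to applying \(\phi\) to each sample: points on the banana land back on the underlying Gaussian. A minimal sketch (hypothetical function name, not the PINTS code):

```python
import numpy as np

def untwist(samples, b=0.1, V=100.0):
    # Apply the twisting map phi row-wise:
    # y2 = x2 + b*x1^2 - V*b, all other coordinates unchanged.
    out = np.array(samples, dtype=float)
    out[:, 1] += b * out[:, 0] ** 2 - V * b
    return out
```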