Twisted Gaussian distribution
class pints.toy.TwistedGaussianLogPDF(dimension=10, b=0.1, V=100)

Twisted multivariate Gaussian 'banana' with un-normalised density [1]:
\[p(x_1, x_2, x_3, \ldots, x_n) \propto \pi(\phi(x_1, x_2, x_3, \ldots, x_n))\]

where \(\pi\) is the multivariate Gaussian density with covariance matrix \(\Sigma = \text{diag}(100, 1, 1, \ldots, 1)\) and

\[\phi(x_1, x_2, x_3, \ldots, x_n) = (x_1, x_2 + b x_1^2 - V b, x_3, \ldots, x_n).\]

Extends pints.toy.ToyLogPDF.

Parameters:

- dimension (int) – Problem dimension (n), must be 2 or greater.
- b (float) – "Bananicity": b = 0.01 induces mild non-linearity in the target density, while b = 0.1 induces strong non-linearity. Must be greater than or equal to zero.
- V (float) – Offset (see equation).
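To make the definition concrete, the un-normalised log-density above can be sketched in plain Python. This is a minimal illustration, not the library's implementation: the function name `twisted_gaussian_logpdf` is hypothetical, and it assumes the first diagonal variance equals V (which matches the defaults, where V = 100 and \(\Sigma = \text{diag}(100, 1, \ldots, 1)\)); the real pints.toy.TwistedGaussianLogPDF class additionally provides sampling and KL-based error estimates.

```python
import math

def twisted_gaussian_logpdf(x, b=0.1, V=100.0):
    """Un-normalised log-density of the twisted Gaussian 'banana'.

    A pure-Python sketch of the density defined above. Assumes the
    first variance of the underlying Gaussian equals V, matching the
    default Sigma = diag(100, 1, ..., 1) when V = 100.
    """
    if len(x) < 2:
        raise ValueError('Problem dimension must be 2 or greater.')
    # Apply the twist phi: only the second coordinate changes.
    y = list(x)
    y[1] = x[1] + b * x[0] ** 2 - V * b
    # Log of the Gaussian density with covariance diag(V, 1, ..., 1),
    # dropping the normalising constant (the density is un-normalised).
    variances = [V] + [1.0] * (len(x) - 1)
    return -0.5 * sum(yi ** 2 / vi for yi, vi in zip(y, variances))
```

With the defaults, the mode sits at \((0, Vb, 0, \ldots, 0) = (0, 10, 0, \ldots)\), where the twist \(\phi\) maps the point to the origin.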
References

[1] Haario, Saksman, Tamminen (1999). "Adaptive proposal distribution for random walk Metropolis algorithm". Computational Statistics. https://doi.org/10.1007/s001800050022
distance(samples)

Returns the approximate Kullback-Leibler divergence of samples from the underlying distribution.
kl_divergence(samples)

Calculates the approximate Kullback-Leibler divergence between a given list of samples and the distribution underlying this LogPDF.

The returned value is (near) zero for perfect sampling, and then increases as the error gets larger.

See: https://en.wikipedia.org/wiki/Kullback-Leibler_divergence