Welcome to the pints documentation¶
Pints is hosted on GitHub, where you can find downloads and installation instructions.
Detailed examples can also be found there.
This page provides the API, or developer, documentation for Pints.
Contents¶
- Boundaries
- Core classes and methods
- Diagnosing MCMC results
- Diagnostic plots
- Error measures
- Function evaluation
- I/O Helper classes
- Log-likelihoods
- Log-PDFs
- Log-priors
- MCMC Samplers
  - Running an MCMC routine
  - MCMC Sampler base classes
  - Adaptive Covariance MC
  - Differential Evolution MCMC
  - Dram ACMC
  - DreamMCMC
  - Dual Averaging
  - EmceeHammerMCMC
  - Haario ACMC
  - Haario Bardenet ACMC
  - Hamiltonian MCMC
  - Metropolis-Adjusted Langevin Algorithm (MALA) MCMC
  - Metropolis Random Walk MCMC
  - Monomial-Gamma Hamiltonian MCMC
  - No-U-Turn MCMC Sampler
  - Population MCMC
  - Rao-Blackwell ACMC
  - Relativistic MCMC
  - Slice Sampling - Doubling MCMC
  - Slice Sampling - Rank Shrinking MCMC
  - Slice Sampling - Stepout MCMC
- MCMC Summary
- Nested samplers
- Noise generators
- Optimisers
- Noise model diagnostics
- Toy problems
  - Toy base classes
  - Annulus Distribution
  - Beeler-Reuter Action Potential Model
  - Cone Distribution
  - Constant Model
  - Eight Schools distribution
  - Fitzhugh-Nagumo Model
  - Gaussian distribution
  - German Credit Hierarchical Logistic Distribution
  - German Credit Logistic Distribution
  - Goodwin oscillator model
  - HES1 Michaelis-Menten Model
  - High dimensional Gaussian distribution
  - Hodgkin-Huxley IK Experiment Model
  - Logistic model
  - Lotka-Volterra model
  - Multimodal Gaussian distribution
  - Neal’s Funnel Distribution
  - Parabolic error
  - Repressilator model
  - Rosenbrock function
  - Simple Egg Box Distribution
  - Simple Harmonic Oscillator model
  - SIR Epidemiology model
  - Stochastic degradation model
  - Stochastic Logistic Model
  - Twisted Gaussian distribution
- Transformations
- Utilities
Hierarchy of methods¶
Pints contains different types of methods that can be roughly arranged into a hierarchy, as follows.
Sampling¶
- MCMC without gradients
  - MetropolisRandomWalkMCMC, works on any LogPDF.
  - Metropolis-Hastings
  - Adaptive methods
    - AdaptiveCovarianceMC, works on any LogPDF.
  - PopulationMCMC, works on any LogPDF.
  - Differential evolution methods
    - DifferentialEvolutionMCMC, works on any LogPDF.
    - DreamMCMC, works on any LogPDF.
    - EmceeHammerMCMC, works on any LogPDF.
- Nested sampling
  - NestedEllipsoidSampler, requires a LogPDF and a LogPrior that can be sampled from.
  - NestedRejectionSampler, requires a LogPDF and a LogPrior that can be sampled from.
- Particle based samplers
  - SMC
- Likelihood free sampling (needs a distance measure between data and states, e.g. least squares)
  - ABC-MCMC
  - ABC-SMC
- 1st order sensitivity MCMC samplers (need derivatives of the LogPDF)
  - Metropolis-Adjusted Langevin Algorithm (MALA), works on any LogPDF that provides 1st order sensitivities.
  - Hamiltonian Monte Carlo, works on any LogPDF that provides 1st order sensitivities.
  - NUTS
- Differential geometric methods (need the Hessian of the LogPDF)
  - smMALA
  - RMHMC
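For example, any of the gradient-free MCMC methods above can be run on a LogPDF via pints.MCMCController. A minimal sketch, sampling from a two-dimensional toy Gaussian with Haario-Bardenet adaptive covariance MCMC; the toy distribution, chain count, and iteration budget are illustrative choices, not prescribed by Pints:

```python
import numpy as np
import pints
import pints.toy

# A LogPDF to sample from: a 2-d toy Gaussian (illustrative choice)
log_pdf = pints.toy.GaussianLogPDF([2, 4], [[1, 0], [0, 3]])

# Three chains, each with its own random starting point
xs = [np.random.uniform([0, 0], [10, 10]) for _ in range(3)]

# Any gradient-free sampler above can be swapped in via `method`
mcmc = pints.MCMCController(log_pdf, 3, xs, method=pints.HaarioBardenetACMC)
mcmc.set_max_iterations(4000)
chains = mcmc.run()  # array of shape (chains, iterations, parameters)
```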
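Nested sampling additionally requires a LogPrior that can be sampled from. A sketch under the same assumptions, reusing `log_pdf` from above as the log-likelihood and assuming the NestedController interface:

```python
# Nested sampling needs a prior that supports sampling (illustrative bounds)
log_prior = pints.UniformLogPrior([-10, -10], [10, 10])

# `log_pdf` is reused from the previous sketch as the log-likelihood
nested = pints.NestedController(log_pdf, log_prior, method=pints.NestedEllipsoidSampler)
samples = nested.run()
```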
Problems in Pints¶
Pints defines single and multi-output problem classes that wrap around models and data, and over which error measures or log-likelihoods can be defined.
To find the appropriate type of Problem to use, see the overview below:
- Systems with a single observable output
  - Single data set: Use a SingleOutputProblem and any of the appropriate error measures or log-likelihoods.
  - Multiple, independent data sets: Define multiple SingleOutputProblems and an error measure / log-likelihood on each, then combine them using e.g. SumOfErrors or SumOfIndependentLogPDFs (see the sketch after this list).
- Systems with multiple observable outputs
  - Single data set: Use a MultiOutputProblem and any of the appropriate error measures or log-likelihoods.
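As a minimal sketch, the following wraps a toy logistic model and synthetic data in a SingleOutputProblem, then defines both an error measure (for optimisation) and a log-likelihood (for sampling) on it. The model choice, time grid, and noise level are illustrative assumptions:

```python
import numpy as np
import pints
import pints.toy

# Forward model and synthetic data (illustrative toy set-up)
model = pints.toy.LogisticModel()
true_parameters = [0.015, 500]  # growth rate, carrying capacity
times = np.linspace(0, 1000, 100)
values = model.simulate(true_parameters, times)
values += np.random.normal(0, 10, values.shape)  # add measurement noise

# Wrap model and data in a problem
problem = pints.SingleOutputProblem(model, times, values)

# An error measure, e.g. for optimisation...
error = pints.SumOfSquaresError(problem)

# ...or a log-likelihood, e.g. for MCMC (adds one noise parameter, sigma)
log_likelihood = pints.GaussianLogLikelihood(problem)

# Multiple independent data sets combine via e.g.
# pints.SumOfErrors([error_1, error_2]) or
# pints.SumOfIndependentLogPDFs([log_likelihood_1, log_likelihood_2])
```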