BayesianProblem#
- class cuqi.problem.BayesianProblem(*densities, **data)#
Representation of a Bayesian inverse problem defined by any number of densities (distributions and likelihoods), e.g.
\[\begin{align*} \mathrm{density}_1 &\sim \pi_1 \newline \mathrm{density}_2 &\sim \pi_2 \newline &\vdots \end{align*}\]

The main goal of this class is to provide fully automatic methods for computing samples or point estimates of the Bayesian problem.
This class uses JointDistribution to model the Bayesian problem and to condition on observed data. We term the resulting distribution the target distribution.

- Parameters:
*densities (Density) – The densities that define the Bayesian problem, passed as comma-separated arguments. Each density can be a Distribution, a Likelihood, etc.
**data (ndarray, Optional) – Any observed data. The data should be passed as keyword arguments, where the keyword is the name of the density that the data is associated with. Data can alternatively be set using the set_data() method after the problem has been initialized; both options are sketched below.
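A minimal sketch of the two ways of attaching data, assuming distributions x and y and an observed array y_data as in the examples further down:

# Sketch only: x, y are assumed to be cuqi distributions and y_data an ndarray (defined elsewhere)
import cuqi
BP = cuqi.problem.BayesianProblem(y, x, y=y_data)           # data passed at construction
BP = cuqi.problem.BayesianProblem(y, x).set_data(y=y_data)  # or attached after initialization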
Examples
Basic syntax
Given distributions for x, y and z, we can define a Bayesian problem and set the observed data y=y_data as follows:

BP = BayesianProblem(x, y, z).set_data(y=y_data)
Complete example
Consider a Bayesian inverse problem with a Gaussian prior and a Gaussian likelihood. Assume that the forward model is a linear model with a known matrix \(\mathbf{A}: \mathbf{x} \mapsto \mathbf{y}\) and that we have observed data \(\mathbf{y}=\mathbf{y}^\mathrm{obs}\). The Bayesian model can be summarized as
\[\begin{align*} \mathbf{x} &\sim \mathcal{N}(\mathbf{0}, 0.1^2 \mathbf{I}) \newline \mathbf{y} &\sim \mathcal{N}(\mathbf{A}\mathbf{x}, 0.05^2 \mathbf{I}). \end{align*}\]

Using the BayesianProblem class, we can define the problem, set the data, and compute samples from the posterior distribution associated with the problem, as well as point estimates such as the Maximum Likelihood (ML) and Maximum A Posteriori (MAP) estimates.

Note
In this case, we use a forward model from the testproblem module, but any custom forward model can be added via the model module.

# Import modules
import cuqi
import numpy as np
import matplotlib.pyplot as plt

# Deterministic forward model and data (1D convolution)
A, y_data, probInfo = cuqi.testproblem.Deconvolution1D().get_components()

# Bayesian model
x = cuqi.distribution.Gaussian(np.zeros(A.domain_dim), 0.1)
y = cuqi.distribution.Gaussian(A@x, 0.05)

# Define Bayesian problem and set data
BP = cuqi.problem.BayesianProblem(y, x).set_data(y=y_data)

# Compute MAP estimate
x_MAP = BP.MAP()

# Compute samples from posterior
x_samples = BP.sample_posterior(1000)

# Plot results
x_samples.plot_ci(exact=probInfo.exactSolution)
plt.show()

# Plot difference between MAP and sample mean
(x_MAP - x_samples.mean()).plot()
plt.title("MAP estimate - sample mean")
plt.show()
Notes
In the simplest form the Bayesian problem represents a posterior distribution defined by two densities, i.e.,
\[\pi_\mathrm{posterior}(\boldsymbol{\theta} \mid \mathbf{y}) \propto \pi_1(\mathbf{y} \mid \boldsymbol{\theta}) \pi_2(\boldsymbol{\theta}),\]

where \(\pi_1(\mathbf{y} \mid \boldsymbol{\theta})\) is a Likelihood function and \(\pi_2(\boldsymbol{\theta})\) is a Distribution. In this two-density case, the joint distribution reduces to a Posterior distribution.

Most functionality is currently only implemented for this simple case.
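To make the two-density case concrete, a small hedged sketch reusing BP from the complete example above (attribute names as listed under Attributes below):

# Sketch: in the two-density case the target reduces to a Posterior object
post = BP.posterior   # posterior built from the likelihood of y and the prior on x
prior = BP.prior      # the prior distribution on x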
- __init__(*densities, **data)#
Methods
MAP([disp, x0]) – Compute the Maximum A Posteriori (MAP) estimate of the posterior.
ML([disp, x0]) – Compute the Maximum Likelihood (ML) estimate.
UQ([Ns, Nb, percent, exact, experimental]) – Run an Uncertainty Quantification (UQ) analysis on the Bayesian problem and provide a summary of the results.
__init__(*densities, **data)
get_components – Method that returns the model, the data and additional information to be used in formulating the Bayesian problem.
sample_posterior(Ns[, Nb, callback, ...]) – Sample the posterior.
sample_prior(Ns[, callback]) – Sample the prior distribution (see the sketch after this table).
set_data(**kwargs) – Set the data of the problem.
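A brief hedged sketch of drawing and inspecting prior samples with sample_prior (reusing BP from the complete example above; the sample count is illustrative):

# Sketch: draw and visualize samples from the prior
prior_samples = BP.sample_prior(100)   # 100 samples from the prior distribution
prior_samples.plot_ci()                # plot mean and credibility interval of the samples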
Attributes
data – Extract the observed data from the likelihood.
likelihood – The likelihood function.
model – Extract the cuqi model from the likelihood.
posterior – Create the posterior distribution from the likelihood and prior.
prior – The prior distribution.