.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "user/_auto_howtos/defining_posterior.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_user__auto_howtos_defining_posterior.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_user__auto_howtos_defining_posterior.py:


How to define a Posterior distribution
======================================

The recommended way to define a posterior distribution in CUQIpy is to use the
:class:`~cuqi.distribution.JointDistribution` class to define the joint distribution
of the parameters and the data, and then condition on observed data to obtain the
posterior distribution, as shown in the examples below.

.. GENERATED FROM PYTHON SOURCE LINES 12-15

.. code-block:: Python

    import cuqi
    import numpy as np

.. GENERATED FROM PYTHON SOURCE LINES 16-25

A simple Bayesian inverse problem
---------------------------------

Consider a deconvolution inverse problem given by

.. math::

   \mathbf{y} = \mathbf{A}\mathbf{x}.

See :class:`~cuqi.testproblem.Deconvolution1D` for more details.

.. GENERATED FROM PYTHON SOURCE LINES 25-28

.. code-block:: Python

    A, y_obs, _ = cuqi.testproblem.Deconvolution1D().get_components()

.. GENERATED FROM PYTHON SOURCE LINES 29-38

Then consider the following Bayesian model

.. math::

   \begin{align*}
   \mathbf{x} &\sim \mathcal{N}(\mathbf{0}, 0.1\,\mathbf{I})\\
   \mathbf{y} &\sim \mathcal{N}(\mathbf{A}\mathbf{x}, 0.05^2\,\mathbf{I})
   \end{align*}

which can be written in CUQIpy as

.. GENERATED FROM PYTHON SOURCE LINES 38-42

.. code-block:: Python

    x = cuqi.distribution.Gaussian(np.zeros(A.domain_dim), 0.1)
    y = cuqi.distribution.Gaussian(A(x), 0.05**2)

.. GENERATED FROM PYTHON SOURCE LINES 43-44

The joint distribution :math:`p(\mathbf{x}, \mathbf{y})` is then obtained by

.. GENERATED FROM PYTHON SOURCE LINES 44-48

.. code-block:: Python

    joint = cuqi.distribution.JointDistribution(x, y)
    print(joint)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    JointDistribution(
        Equation:
            p(x,y) = p(x)p(y|x)
        Densities:
            x ~ CUQI Gaussian.
            y ~ CUQI Gaussian. Conditioning variables ['x'].
    )

.. GENERATED FROM PYTHON SOURCE LINES 49-51

The posterior :math:`p(\mathbf{x}|\mathbf{y}=\mathbf{y}^\mathrm{obs})` is obtained
by conditioning on the observed data as follows.

.. GENERATED FROM PYTHON SOURCE LINES 51-55

.. code-block:: Python

    posterior = joint(y=y_obs)
    print(posterior)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Posterior(
        Equation:
            p(x|y) ∝ L(x|y)p(x)
        Densities:
            y ~ CUQI Gaussian Likelihood function. Parameters ['x'].
            x ~ CUQI Gaussian.
    )

.. GENERATED FROM PYTHON SOURCE LINES 56-57

Evaluating the posterior log density is then as simple as

.. GENERATED FROM PYTHON SOURCE LINES 57-60

.. code-block:: Python

    posterior.logd(np.ones(A.domain_dim))

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    array([-21631.13432224])
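The posterior object is what one would typically hand to a sampler as the next step.
The following is a minimal sketch, assuming the ``cuqi.sampler`` module and its
Linear RTO sampler (suited to linear forward models with Gaussian prior and noise);
the sampler names and exact call signatures may differ between CUQIpy versions.

.. code-block:: Python

    # Sketch: draw samples from the posterior defined above.
    # LinearRTO targets linear-Gaussian posteriors such as this one;
    # interface details may vary between CUQIpy versions.
    sampler = cuqi.sampler.LinearRTO(posterior)
    samples = sampler.sample(200)   # 200 posterior samples
    samples.plot_ci(95)             # posterior mean and 95% credibility interval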
.. GENERATED FROM PYTHON SOURCE LINES 61-71

Posterior with two forward models
---------------------------------

Suppose we had two forward models :math:`\mathbf{A}` and :math:`\mathbf{B}`:

.. math::

   \begin{align*}
   \mathbf{y} &= \mathbf{A}\mathbf{x}\\
   \mathbf{d} &= \mathbf{B}\mathbf{x}
   \end{align*}

.. GENERATED FROM PYTHON SOURCE LINES 71-76

.. code-block:: Python

    # Both observations come from the same unknown x
    A, y_obs, _ = cuqi.testproblem.Deconvolution1D().get_components()
    B, d_obs, _ = cuqi.testproblem.Deconvolution1D(PSF="Defocus", noise_std=0.02).get_components()

.. GENERATED FROM PYTHON SOURCE LINES 77-85

Then consider the following Bayesian model

.. math::

   \begin{align*}
   \mathbf{x} &\sim \mathcal{N}(\mathbf{0}, 0.1\,\mathbf{I})\\
   \mathbf{y} &\sim \mathcal{N}(\mathbf{A}\mathbf{x}, 0.05^2\,\mathbf{I})\\
   \mathbf{d} &\sim \mathcal{N}(\mathbf{B}\mathbf{x}, 0.01^2\,\mathbf{I})
   \end{align*}

.. GENERATED FROM PYTHON SOURCE LINES 85-90

.. code-block:: Python

    x = cuqi.distribution.Gaussian(np.zeros(A.domain_dim), 0.1)
    y = cuqi.distribution.Gaussian(A(x), 0.05**2)
    d = cuqi.distribution.Gaussian(B(x), 0.01**2)

.. GENERATED FROM PYTHON SOURCE LINES 91-93

The joint distribution :math:`p(\mathbf{x}, \mathbf{y}, \mathbf{d})` is then obtained by

.. GENERATED FROM PYTHON SOURCE LINES 93-97

.. code-block:: Python

    joint2 = cuqi.distribution.JointDistribution(x, y, d)
    print(joint2)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    JointDistribution(
        Equation:
            p(x,y,d) = p(x)p(y|x)p(d|x)
        Densities:
            x ~ CUQI Gaussian.
            y ~ CUQI Gaussian. Conditioning variables ['x'].
            d ~ CUQI Gaussian. Conditioning variables ['x'].
    )

.. GENERATED FROM PYTHON SOURCE LINES 98-100

The posterior
:math:`p(\mathbf{x}|\mathbf{y}=\mathbf{y}^\mathrm{obs},\mathbf{d}=\mathbf{d}^\mathrm{obs})`
is obtained by conditioning on the observed data as follows.

.. GENERATED FROM PYTHON SOURCE LINES 100-104

.. code-block:: Python

    posterior2 = joint2(y=y_obs, d=d_obs)
    print(posterior2)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    MultipleLikelihoodPosterior(
        Equation:
            p(x|y,d) ∝ p(x)L(x|y)L(x|d)
        Densities:
            x ~ CUQI Gaussian.
            y ~ CUQI Gaussian Likelihood function. Parameters ['x'].
            d ~ CUQI Gaussian Likelihood function. Parameters ['x'].
    )

.. GENERATED FROM PYTHON SOURCE LINES 105-106

Evaluating the posterior log density is then as simple as

.. GENERATED FROM PYTHON SOURCE LINES 106-109

.. code-block:: Python

    posterior2.logd(np.ones(A.domain_dim))

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    array([-564946.18474828])
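Conditioning does not have to happen in a single step. The following is a minimal
sketch, assuming that :class:`~cuqi.distribution.JointDistribution` also accepts
conditioning on a subset of its variables (the intermediate object and its type may
differ between CUQIpy versions); conditioning on the two data sets one at a time
should then yield the same posterior as above.

.. code-block:: Python

    # Sketch: condition on the observations one at a time.
    # Conditioning on y alone leaves a distribution that still involves d;
    # conditioning that result on d should give the same two-likelihood posterior.
    partial = joint2(y=y_obs)
    posterior2_alt = partial(d=d_obs)
    posterior2_alt.logd(np.ones(A.domain_dim))  # should match posterior2.logd(...)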
.. GENERATED FROM PYTHON SOURCE LINES 110-125

Arbitrarily complex posterior distributions
-------------------------------------------

The :class:`~cuqi.distribution.JointDistribution` class can be used to construct
arbitrarily complex posterior distributions. For example, suppose we have the
following three forward models

.. math::

   \begin{align*}
   \mathbf{y} &= \mathbf{A}\mathbf{x}\\
   \mathbf{d} &= \mathbf{B}\mathbf{x}\\
   \mathbf{b} &= C(\mathbf{x})
   \end{align*}

where :math:`C` is a nonlinear function.

.. GENERATED FROM PYTHON SOURCE LINES 125-132

.. code-block:: Python

    # Same x for all 3 observations
    A, y_obs, _ = cuqi.testproblem.Deconvolution1D().get_components()
    B, d_obs, _ = cuqi.testproblem.Deconvolution1D(PSF="Defocus", noise_std=0.02).get_components()
    C = cuqi.model.Model(lambda x: np.linalg.norm(x)**2, 1, A.domain_dim)
    b_obs = 16

.. GENERATED FROM PYTHON SOURCE LINES 133-145

Then consider the following Bayesian model

.. math::

   \begin{align*}
   q &\sim \mathcal{U}(0.1, 10)\\
   l &\sim \mathrm{Gamma}(1, 1)\\
   s &\sim \mathrm{Gamma}(1, 10^{-2})\\
   \mathbf{x} &\sim \mathcal{N}(\mathbf{0}, l^{-1}\mathbf{I})\\
   \mathbf{y} &\sim \mathcal{N}(\mathbf{A}\mathbf{x}, s^{-1}\mathbf{I})\\
   \mathbf{d} &\sim \mathcal{N}(\mathbf{B}\mathbf{x}, 0.01^2\,\mathbf{I})\\
   \mathbf{b} &\sim \mathcal{L}(C(\mathbf{x}), q)
   \end{align*}

.. GENERATED FROM PYTHON SOURCE LINES 145-154

.. code-block:: Python

    q = cuqi.distribution.Uniform(0.1, 10)
    l = cuqi.distribution.Gamma(1, 1)
    s = cuqi.distribution.Gamma(1, 1e-2)
    x = cuqi.distribution.Gaussian(np.zeros(A.domain_dim), lambda l: 1/l)
    y = cuqi.distribution.Gaussian(A(x), lambda s: 1/s)
    d = cuqi.distribution.Gaussian(B(x), 0.01**2)
    b = cuqi.distribution.Laplace(C(x), lambda q: q)

.. GENERATED FROM PYTHON SOURCE LINES 155-157

The joint distribution :math:`p(q, l, s, \mathbf{x}, \mathbf{y}, \mathbf{d}, \mathbf{b})`
is then obtained by

.. GENERATED FROM PYTHON SOURCE LINES 157-161

.. code-block:: Python

    joint3 = cuqi.distribution.JointDistribution(q, l, s, x, y, d, b)
    print(joint3)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    JointDistribution(
        Equation:
            p(q,l,s,x,y,d,b) = p(q)p(l)p(s)p(x|l)p(y|x,s)p(d|x)p(b|x,q)
        Densities:
            q ~ CUQI Uniform.
            l ~ CUQI Gamma.
            s ~ CUQI Gamma.
            x ~ CUQI Gaussian. Conditioning variables ['l'].
            y ~ CUQI Gaussian. Conditioning variables ['x', 's'].
            d ~ CUQI Gaussian. Conditioning variables ['x'].
            b ~ CUQI Laplace. Conditioning variables ['x', 'q'].
    )

.. GENERATED FROM PYTHON SOURCE LINES 162-164

The posterior
:math:`p(q, l, s, \mathbf{x}|\mathbf{y}=\mathbf{y}^\mathrm{obs},\mathbf{d}=\mathbf{d}^\mathrm{obs},\mathbf{b}=\mathbf{b}^\mathrm{obs})`
is obtained by conditioning on the observed data as follows.

.. GENERATED FROM PYTHON SOURCE LINES 164-168

.. code-block:: Python

    posterior3 = joint3(y=y_obs, d=d_obs, b=b_obs)
    print(posterior3)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    JointDistribution(
        Equation:
            p(q,l,s,x|y,d,b) ∝ p(q)p(l)p(s)p(x|l)L(x,s|y)L(x|d)L(x,q|b)
        Densities:
            q ~ CUQI Uniform.
            l ~ CUQI Gamma.
            s ~ CUQI Gamma.
            x ~ CUQI Gaussian. Conditioning variables ['l'].
            y ~ CUQI Gaussian Likelihood function. Parameters ['x', 's'].
            d ~ CUQI Gaussian Likelihood function. Parameters ['x'].
            b ~ CUQI Laplace Likelihood function. Parameters ['x', 'q'].
    )

.. GENERATED FROM PYTHON SOURCE LINES 169-171

Evaluating the posterior log density jointly over :math:`q`, :math:`l`, :math:`s`,
and :math:`\mathbf{x}` is then as simple as

.. GENERATED FROM PYTHON SOURCE LINES 171-174

.. code-block:: Python

    posterior3.logd(q=1, l=1, s=1, x=np.ones(A.domain_dim))

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    array([-540376.7348594])


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 0.029 seconds)


.. _sphx_glr_download_user__auto_howtos_defining_posterior.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: defining_posterior.ipynb <defining_posterior.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: defining_posterior.py <defining_posterior.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: defining_posterior.zip <defining_posterior.zip>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_