.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "user/_auto_tutorials/gibbs.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_user__auto_tutorials_gibbs.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_user__auto_tutorials_gibbs.py:

Gibbs sampling
==============

This tutorial shows how to use CUQIpy to perform Gibbs sampling.
Gibbs sampling is a Markov chain Monte Carlo (MCMC) method for sampling
from a joint probability distribution. Instead of sampling all variables
of the distribution simultaneously, Gibbs sampling draws them sequentially,
one variable at a time. When a variable represents a random vector, the
whole vector is sampled in a single step.

Each variable is sampled from its conditional distribution given (fixed,
previously sampled) values of the other variables. This is often a very
efficient way of sampling from a joint distribution if the conditional
distributions are easy to sample from, and it is one way to exploit the
structure of the joint distribution. On the other hand, if the variables
are highly correlated and/or the conditional distributions are difficult
to sample from, then Gibbs sampling can be very inefficient. For these
reasons, Gibbs sampling is a double-edged sword that needs to be used in
the right context.

.. GENERATED FROM PYTHON SOURCE LINES 30-33

Setup
-----
We start by importing the necessary modules.

.. GENERATED FROM PYTHON SOURCE LINES 33-42

.. code-block:: Python

    import numpy as np
    import matplotlib.pyplot as plt
    from cuqi.testproblem import Deconvolution1D
    from cuqi.distribution import Gaussian, Gamma, JointDistribution, GMRF, LMRF
    from cuqi.legacy.sampler import Gibbs, LinearRTO, Conjugate, UGLA, ConjugateApprox

    np.random.seed(0)
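Before turning to CUQIpy, the conditional-sweep idea can be illustrated with a minimal, self-contained Gibbs sampler for a toy target: a zero-mean bivariate Gaussian with correlation ``rho``. Both full conditionals are one-dimensional Gaussians, so each sweep alternates two easy univariate draws. (The function and the toy target are illustrative only; they are not part of CUQIpy.)

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampler for a zero-mean bivariate Gaussian with correlation rho.

    Both full conditionals are one-dimensional Gaussians:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    so each sweep alternates two easy univariate draws.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho**2)
    x = y = 0.0
    chain = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)  # draw y from p(y | x)
        chain.append((x, y))
    return chain

chain = gibbs_bivariate_normal(rho=0.8, n_samples=5000)
mean_x = sum(x for x, _ in chain) / len(chain)
mean_y = sum(y for _, y in chain) / len(chain)
```

Because each step only ever requires sampling a univariate Gaussian, the sweep is cheap; the price is that successive states are correlated, which is the inefficiency the tutorial warns about when variables are strongly coupled.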
.. GENERATED FROM PYTHON SOURCE LINES 43-56

Forward model and data
----------------------

We define the forward model and data. Here we use a 1D deconvolution
problem, so the forward model is linear, that is:

.. math::

    \mathbf{y} = \mathbf{A} \mathbf{x}

where :math:`\mathbf{A}` is the convolution matrix and :math:`\mathbf{x}`
is the input signal.

We load this example from the testproblem library of CUQIpy and visualize
the true solution (sharp signal) and data (convolved signal).

.. GENERATED FROM PYTHON SOURCE LINES 56-72

.. code-block:: Python

    # Model and data
    A, y_obs, probinfo = Deconvolution1D(phantom='square').get_components()

    # Get dimension of signal
    n = A.domain_dim

    # Plot exact solution and observed data
    plt.subplot(121)
    probinfo.exactSolution.plot()
    plt.title('exact solution')

    plt.subplot(122)
    y_obs.plot()
    plt.title("Observed data")

.. image-sg:: /user/_auto_tutorials/images/sphx_glr_gibbs_001.png
    :alt: exact solution, Observed data
    :srcset: /user/_auto_tutorials/images/sphx_glr_gibbs_001.png
    :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Text(0.5, 1.0, 'Observed data')

.. GENERATED FROM PYTHON SOURCE LINES 73-98

Hierarchical Bayesian model
---------------------------

We define the following hierarchical model:

.. math::

    \begin{align}
    d &\sim \mathrm{Gamma}(1, 10^{-4}) \\
    l &\sim \mathrm{Gamma}(1, 10^{-4}) \\
    \mathbf{x} &\sim \mathrm{GMRF}(\mathbf{0}, d) \\
    \mathbf{y} &\sim \mathcal{N}(\mathbf{A} \mathbf{x}, l^{-1} \mathbf{I}_m)
    \end{align}

where :math:`\mathbf{y}` is the observed data and :math:`\mathbf{x}` is the
unknown signal. The hyperparameters :math:`d` and :math:`l` are the
precisions of the prior distribution of :math:`\mathbf{x}` and of the noise,
respectively.

The prior distribution of :math:`\mathbf{x}` is a Gaussian Markov random
field (GMRF) with zero mean and precision :math:`d`. It can be viewed as a
Gaussian prior on the differences between neighboring elements of
:math:`\mathbf{x}`.
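Concretely, a Gaussian prior on neighbor differences corresponds (up to boundary conditions and normalization) to a Gaussian with precision matrix :math:`d\,\mathbf{D}^\top \mathbf{D}`, where :math:`\mathbf{D}` is a first-order difference matrix. The following NumPy sketch shows this structure; it is an illustration only, not CUQIpy's ``GMRF`` implementation, which also handles boundary choices and normalization.

```python
import numpy as np

def gmrf_precision(n, d):
    """Precision matrix d * D^T D of a first-order GMRF prior, where D is
    the (n-1)-by-n forward-difference matrix."""
    D = np.diff(np.eye(n), axis=0)  # row i computes x[i+1] - x[i]
    return d * (D.T @ D)

def gmrf_logpdf_unnormalized(x, d):
    # -(d/2) * sum_i (x[i+1] - x[i])**2, written via the precision matrix
    return -0.5 * x @ gmrf_precision(len(x), d) @ x

x_smooth = np.linspace(0.0, 1.0, 50)                    # tiny neighbor differences
x_rough = np.random.default_rng(0).standard_normal(50)  # large neighbor differences
```

A smooth signal like ``x_smooth`` has a much higher prior log-density than ``x_rough``, which is exactly the smoothing behavior the GMRF prior imposes on :math:`\mathbf{x}`.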
In CUQIpy the model can be defined as follows:

.. GENERATED FROM PYTHON SOURCE LINES 98-111

.. code-block:: Python

    # Define distributions
    d = Gamma(1, 1e-4)
    l = Gamma(1, 1e-4)
    x = GMRF(np.zeros(n), lambda d: d)
    y = Gaussian(A, lambda l: 1/l)

    # Combine into a joint distribution
    joint = JointDistribution(d, l, x, y)

    # View the joint distribution
    print(joint)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    JointDistribution(
        Equation:
            p(d,l,x,y) = p(d)p(l)p(x|d)p(y|x,l)
        Densities:
            d ~ CUQI Gamma.
            l ~ CUQI Gamma.
            x ~ CUQI GMRF. Conditioning variables ['d'].
            y ~ CUQI Gaussian. Conditioning variables ['x', 'l'].
    )

.. GENERATED FROM PYTHON SOURCE LINES 112-115

Notice that the joint distribution prints a mathematical expression for the
density functions that make up :math:`p(d,l,\mathbf{x},\mathbf{y})`. In this
case they are all distributions, but this need not be the case.

.. GENERATED FROM PYTHON SOURCE LINES 117-124

Defining the posterior distribution
-----------------------------------

Now we define the posterior distribution, which is the joint distribution
conditioned on the observed data, that is,
:math:`p(d, l, \mathbf{x} \mid \mathbf{y}=\mathbf{y}_\mathrm{obs})`.
This is done in the following way:

.. GENERATED FROM PYTHON SOURCE LINES 124-131

.. code-block:: Python

    # Define posterior by conditioning on the data
    posterior = joint(y=y_obs)

    # View the structure of the posterior
    print(posterior)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    JointDistribution(
        Equation:
            p(d,l,x|y) ∝ p(d)p(l)p(x|d)L(x,l|y)
        Densities:
            d ~ CUQI Gamma.
            l ~ CUQI Gamma.
            x ~ CUQI GMRF. Conditioning variables ['d'].
            y ~ CUQI Gaussian Likelihood function. Parameters ['x', 'l'].
    )

.. GENERATED FROM PYTHON SOURCE LINES 132-135

Notice that after conditioning on the data, the distribution associated with
:math:`\mathbf{y}` became a likelihood function and that the posterior is now
a joint distribution of the variables :math:`d`, :math:`l`, :math:`\mathbf{x}`.
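As a worked sketch of the conjugacy that the sampler below exploits (assuming the shape-rate parametrization of the Gamma distribution, with :math:`\alpha = 1` and :math:`\beta = 10^{-4}` as above), fix :math:`\mathbf{x}` and consider the noise precision :math:`l`:

.. math::

    p(l \mid \mathbf{x}, \mathbf{y})
    \;\propto\;
    \underbrace{l^{m/2} \exp\!\left(-\frac{l}{2}\,\|\mathbf{A}\mathbf{x}-\mathbf{y}\|_2^2\right)}_{\text{Gaussian likelihood}}
    \,
    \underbrace{l^{\alpha-1} e^{-\beta l}}_{\text{Gamma prior}}
    \;\propto\;
    l^{\alpha + m/2 - 1} \exp\!\left(-\Big(\beta + \tfrac{1}{2}\|\mathbf{A}\mathbf{x}-\mathbf{y}\|_2^2\Big)\, l\right),

which is again a Gamma density,
:math:`\mathrm{Gamma}\!\left(\alpha + \frac{m}{2},\; \beta + \frac{1}{2}\|\mathbf{A}\mathbf{x}-\mathbf{y}\|_2^2\right)`.
An analogous closed-form update holds for :math:`d` given :math:`\mathbf{x}`, with the GMRF quadratic form in place of :math:`\|\mathbf{A}\mathbf{x}-\mathbf{y}\|_2^2`. These closed forms are what allow the hyperparameters to be resampled exactly and cheaply within each Gibbs sweep.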
.. GENERATED FROM PYTHON SOURCE LINES 137-158

Gibbs Sampler
-------------

The hierarchical model above has some important properties that we can
exploit to make the sampling more efficient.

First, note that the Gamma distributions are conjugate priors for the
precisions of the Gaussian distributions. This means that we can
efficiently sample from :math:`d` and :math:`l` conditional on the other
variables.

Second, note that the prior distribution of :math:`\mathbf{x}` is a
Gaussian Markov random field (GMRF) and that the distribution for
:math:`\mathbf{y}` is also Gaussian, with a linear operator acting on
:math:`\mathbf{x}` as the mean variable. This means that we can efficiently
sample from :math:`\mathbf{x}` conditional on the other variables using the
``LinearRTO`` sampler.

Taking these two facts into account, we can define a Gibbs sampler that
uses the ``Conjugate`` sampler for :math:`d` and :math:`l` and the
``LinearRTO`` sampler for :math:`\mathbf{x}`.

This is done in CUQIpy as follows:

.. GENERATED FROM PYTHON SOURCE LINES 158-172

.. code-block:: Python

    # Define sampling strategy
    sampling_strategy = {
        'x': LinearRTO,
        'd': Conjugate,
        'l': Conjugate
    }

    # Define Gibbs sampler
    sampler = Gibbs(posterior, sampling_strategy)

    # Run sampler
    samples = sampler.sample(Ns=1000, Nb=200)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    /home/runner/work/CUQIpy/CUQIpy/demos/tutorials/gibbs.py:167: UserWarning: You are using the legacy sampler 'Gibbs'. This will be removed in a future release of CUQIpy. Please consider using the new samplers in the 'cuqi.sampler' module.
      sampler = Gibbs(posterior, sampling_strategy)
    /home/runner/work/CUQIpy/CUQIpy/cuqi/legacy/sampler/_rto.py:77: UserWarning: You are using the legacy sampler 'LinearRTO'. This will be removed in a future release of CUQIpy. Please consider using the new samplers in the 'cuqi.sampler' module.
      super().__init__(target, x0=x0, **kwargs)
    Warmup 200 / 200
    Sample 1000 / 1000

.. GENERATED FROM PYTHON SOURCE LINES 173-180

Analyze results
---------------

After sampling we can inspect the results. The samples are stored as a
dictionary with the variable names as keys. The samples for each variable
are stored as a CUQIpy ``Samples`` object, which contains many convenience
methods for diagnostics and plotting of MCMC samples.

.. GENERATED FROM PYTHON SOURCE LINES 180-184
.. code-block:: Python

    # Plot credible intervals for the signal
    samples['x'].plot_ci(exact=probinfo.exactSolution)

.. image-sg:: /user/_auto_tutorials/images/sphx_glr_gibbs_002.png
    :alt: gibbs
    :srcset: /user/_auto_tutorials/images/sphx_glr_gibbs_002.png
    :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 185-186

Trace plot for :math:`d`:

.. GENERATED FROM PYTHON SOURCE LINES 186-188

.. code-block:: Python

    samples['d'].plot_trace(figsize=(8,2))

.. image-sg:: /user/_auto_tutorials/images/sphx_glr_gibbs_003.png
    :alt: d, d
    :srcset: /user/_auto_tutorials/images/sphx_glr_gibbs_003.png
    :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 189-190

Trace plot for :math:`l`:

.. GENERATED FROM PYTHON SOURCE LINES 190-192

.. code-block:: Python

    samples['l'].plot_trace(figsize=(8,2))

.. image-sg:: /user/_auto_tutorials/images/sphx_glr_gibbs_004.png
    :alt: l, l
    :srcset: /user/_auto_tutorials/images/sphx_glr_gibbs_004.png
    :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 193-217

Switching to a piecewise constant prior
---------------------------------------

Notice that while the sampling went well in the previous example, the
posterior distribution did not match the characteristics of the exact
solution. We can improve this result by switching to a prior that better
matches the exact solution :math:`\mathbf{x}`.

One choice is the Laplace difference prior, which assumes a Laplace
distribution for the differences between neighboring elements of
:math:`\mathbf{x}`. That is,

.. math::

    \mathbf{x} \sim \mathrm{LMRF}(d^{-1}),

which means that :math:`x_i - x_{i-1} \sim \mathrm{Laplace}(0, d^{-1})`.
This prior is implemented in CUQIpy as the ``LMRF`` distribution.
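The difference between the two priors can be seen by drawing prior-like sample paths directly, independent of CUQIpy: cumulative sums of Gaussian increments wander smoothly, while cumulative sums of Laplace increments concentrate their variation in a few large jumps, which favors piecewise-constant signals. This is a minimal NumPy sketch, not the ``GMRF``/``LMRF`` implementations themselves (which also encode boundary conditions and normalization):

```python
import numpy as np

rng = np.random.default_rng(0)
n, scale = 128, 0.1

# A draw from each prior is a path whose increments follow the chosen law:
#   x[i] - x[i-1] ~ Normal(0, scale)   (GMRF-like, favors smooth paths)
#   x[i] - x[i-1] ~ Laplace(0, scale)  (LMRF-like, favors a few big jumps)
gauss_path = np.cumsum(rng.normal(0.0, scale, n))
laplace_path = np.cumsum(rng.laplace(0.0, scale, n))

# The Laplace increments have heavier tails, so the largest jump tends to be
# large relative to the typical jump, producing step-like paths.
gauss_jumps = np.abs(np.diff(gauss_path))
laplace_jumps = np.abs(np.diff(laplace_path))
```

Plotting the two paths (e.g. with ``plt.plot``) makes the contrast visible: the Laplace-increment path looks much closer to the square phantom used in this tutorial.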
To update our model we simply need to replace the ``GMRF`` distribution
with the ``LMRF`` distribution. Note that the Laplace distribution is
defined via a scale parameter, so we invert the parameter :math:`d`.

This Laplace distribution and the new posterior can be defined as follows:

.. GENERATED FROM PYTHON SOURCE LINES 217-230

.. code-block:: Python

    # Define new distribution for x
    x = LMRF(0, lambda d: 1/d, geometry=n)

    # Define new joint distribution with piecewise constant prior
    joint_Ld = JointDistribution(d, l, x, y)

    # Define new posterior by conditioning on the data
    posterior_Ld = joint_Ld(y=y_obs)

    print(posterior_Ld)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    JointDistribution(
        Equation:
            p(d,l,x|y) ∝ p(d)p(l)p(x|d)L(x,l|y)
        Densities:
            d ~ CUQI Gamma.
            l ~ CUQI Gamma.
            x ~ CUQI LMRF. Conditioning variables ['d'].
            y ~ CUQI Gaussian Likelihood function. Parameters ['x', 'l'].
    )

.. GENERATED FROM PYTHON SOURCE LINES 231-245

Gibbs Sampler (with Laplace prior)
----------------------------------

Using the same approach as earlier, we can define a Gibbs sampler for this
new hierarchical model. The only difference is that we now need a different
sampler for :math:`\mathbf{x}`, because the ``LinearRTO`` sampler only
works for Gaussian distributions. In this case we use the UGLA (Unadjusted
Gaussian Laplace Approximation) sampler for :math:`\mathbf{x}`. We also use
the ``ConjugateApprox`` sampler for :math:`d`, which efficiently draws
approximate samples from the distribution of :math:`d` conditional on the
other variables. For more details see e.g. this paper.

.. GENERATED FROM PYTHON SOURCE LINES 245-259

.. code-block:: Python

    # Define sampling strategy
    sampling_strategy = {
        'x': UGLA,
        'd': ConjugateApprox,
        'l': Conjugate
    }

    # Define Gibbs sampler
    sampler_Ld = Gibbs(posterior_Ld, sampling_strategy)

    # Run sampler
    samples_Ld = sampler_Ld.sample(Ns=1000, Nb=200)

.. rst-class:: sphx-glr-script-out
.. code-block:: none

    /home/runner/work/CUQIpy/CUQIpy/demos/tutorials/gibbs.py:254: UserWarning: You are using the legacy sampler 'Gibbs'. This will be removed in a future release of CUQIpy. Please consider using the new samplers in the 'cuqi.sampler' module.
      sampler_Ld = Gibbs(posterior_Ld, sampling_strategy)
    /home/runner/work/CUQIpy/CUQIpy/cuqi/legacy/sampler/_laplace_approximation.py:57: UserWarning: You are using the legacy sampler 'UGLA'. This will be removed in a future release of CUQIpy. Please consider using the new samplers in the 'cuqi.sampler' module.
      super().__init__(target, x0=x0, **kwargs)
    Warmup 200 / 200
    Sample 1000 / 1000

.. GENERATED FROM PYTHON SOURCE LINES 260-265

Analyze results
---------------

Again we can inspect the results. Here we notice that the posterior
distribution matches the exact solution much better.

.. GENERATED FROM PYTHON SOURCE LINES 265-268

.. code-block:: Python

    # Plot credible intervals for the signal
    samples_Ld['x'].plot_ci(exact=probinfo.exactSolution)

.. image-sg:: /user/_auto_tutorials/images/sphx_glr_gibbs_005.png
    :alt: gibbs
    :srcset: /user/_auto_tutorials/images/sphx_glr_gibbs_005.png
    :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 269-270

.. code-block:: Python

    samples_Ld['d'].plot_trace(figsize=(8,2))

.. image-sg:: /user/_auto_tutorials/images/sphx_glr_gibbs_006.png
    :alt: d, d
    :srcset: /user/_auto_tutorials/images/sphx_glr_gibbs_006.png
    :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 271-272

.. code-block:: Python

    samples_Ld['l'].plot_trace(figsize=(8,2))

.. image-sg:: /user/_auto_tutorials/images/sphx_glr_gibbs_007.png
    :alt: l, l
    :srcset: /user/_auto_tutorials/images/sphx_glr_gibbs_007.png
    :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 27.372 seconds)

.. _sphx_glr_download_user__auto_tutorials_gibbs.py:

.. only:: html

    .. container:: sphx-glr-footer sphx-glr-footer-example

        .. container:: sphx-glr-download sphx-glr-download-jupyter

            :download:`Download Jupyter notebook: gibbs.ipynb <gibbs.ipynb>`

        .. container:: sphx-glr-download sphx-glr-download-python

            :download:`Download Python source code: gibbs.py <gibbs.py>`

        .. container:: sphx-glr-download sphx-glr-download-zip

            :download:`Download zipped: gibbs.zip <gibbs.zip>`

.. only:: html
    .. rst-class:: sphx-glr-signature

        Gallery generated by Sphinx-Gallery