MCMC and Likelihood-free Methods
Christian P. Robert
Université Paris-Dauphine & CREST
http://www.ceremade.dauphine.fr/~xian
November 2, 2010
Outline
Computational issues in Bayesian statistics
The Metropolis-Hastings Algorithm
The Gibbs Sampler
Population Monte Carlo
Approximate Bayesian computation
ABC for model choice
Computational issues in Bayesian statistics

Motivation and leading example
Latent variables
Latent structures make life harder!
Even simple models may lead to computational complications, as in latent variable models
f(x|θ) = ∫ f★(x, x★|θ) dx★
If (x, x★) observed, fine!
If only x observed, trouble!
Example (Mixture models)
Models of mixtures of distributions:
X ∼ fj with probability pj,
for j = 1, 2, . . . , k, with overall density
X ∼ p1 f1(x) + · · · + pk fk(x) .
For a sample of independent random variables (X1, . . . , Xn), the sample density is
∏_{i=1}^n {p1 f1(xi) + · · · + pk fk(xi)} .
Expanding this product of sums into a sum of products involves k^n elementary terms: prohibitive to compute for large samples.
Simple mixture (1)
[Figure: likelihood surface in the (µ1, µ2)-plane]
Case of the 0.3 N(µ1, 1) + 0.7 N(µ2, 1) likelihood
Simple mixture (2)
For the mixture of two normal distributions,
0.3 N(µ1, 1) + 0.7 N(µ2, 1) ,
the likelihood is proportional to
∏_{i=1}^n [0.3 ϕ(xi − µ1) + 0.7 ϕ(xi − µ2)]
containing 2^n terms once expanded.
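Note that the 2^n terms only arise when the product is expanded; the likelihood itself is evaluated in O(n) operations on the log scale. A minimal R sketch (the sample y and the component means are placeholders of our choosing):
# Log-likelihood of the mixture 0.3 N(mu1,1) + 0.7 N(mu2,1),
# evaluated directly in O(n): no expansion into 2^n elementary terms
mixloglike = function(mu, y) {
  sum(log(0.3 * dnorm(y, mu[1], 1) + 0.7 * dnorm(y, mu[2], 1)))
}
y = rnorm(100, mean = sample(c(0, 2.5), 100, replace = TRUE, prob = c(0.3, 0.7)))
mixloglike(c(0, 2.5), y)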
Complex maximisation
Standard maximization techniques often fail to find the global maximum because of multimodality or undesirable behavior (usually at the frontier of the domain) of the likelihood function.
Example
In the special case
f(x|µ, σ) = (1 − ε) exp{(−1/2)x²} + (ε/σ) exp{(−1/2σ²)(x − µ)²}
with ε > 0 known, whatever n, the likelihood is unbounded:
lim_{σ→0} L(µ = x1, σ|x1, . . . , xn) = ∞
Unbounded likelihood
[Figure: likelihood surfaces in the (µ, σ)-plane for sample sizes n = 3, 6, 12, 24, 48, 96]
Case of the 0.3 N(0, 1) + 0.7 N(µ, σ) likelihood
Example (Mixture once again)
Observations from
x1, . . . , xn ∼ f(x|θ) = p ϕ(x; µ1, σ1) + (1 − p) ϕ(x; µ2, σ2)
Prior
µi|σi ∼ N(ξi, σ²i/ni), σ²i ∼ IG(νi/2, s²i/2), p ∼ Be(α, β)
Posterior
π(θ|x1, . . . , xn) ∝ ∏_{j=1}^n {p ϕ(xj; µ1, σ1) + (1 − p) ϕ(xj; µ2, σ2)} π(θ)
= Σ_{ℓ=0}^n Σ_{(kt)} ω(kt) π(θ|(kt))
where the inner sum runs over the allocations (kt) assigning ℓ of the n observations to the first component.
Example (Mixture once again (cont’d))
For a given allocation (kt), the conditional posterior distribution is
π(θ|(kt)) = N(ξ1(kt), σ²1/(n1 + ℓ)) × IG((ν1 + ℓ)/2, s1(kt)/2)
× N(ξ2(kt), σ²2/(n2 + n − ℓ)) × IG((ν2 + n − ℓ)/2, s2(kt)/2)
× Be(α + ℓ, β + n − ℓ)
where
x̄1(kt) = (1/ℓ) Σ_{t=1}^ℓ x_{kt} ,  ŝ²1(kt) = Σ_{t=1}^ℓ (x_{kt} − x̄1(kt))² ,
x̄2(kt) = (1/(n − ℓ)) Σ_{t=ℓ+1}^n x_{kt} ,  ŝ²2(kt) = Σ_{t=ℓ+1}^n (x_{kt} − x̄2(kt))²
and
ξ1(kt) = (n1 ξ1 + ℓ x̄1(kt))/(n1 + ℓ) ,  ξ2(kt) = (n2 ξ2 + (n − ℓ) x̄2(kt))/(n2 + n − ℓ) ,
s1(kt) = s²1 + ŝ²1(kt) + (n1 ℓ/(n1 + ℓ))(ξ1 − x̄1(kt))² ,
s2(kt) = s²2 + ŝ²2(kt) + (n2 (n − ℓ)/(n2 + n − ℓ))(ξ2 − x̄2(kt))² ,
the posterior updates of the hyperparameters.
Example (Mixture once again)
Bayes estimator of θ:
δ^π(x1, . . . , xn) = Σ_{ℓ=0}^n Σ_{(kt)} ω(kt) E^π[θ|x, (kt)]
Too costly: 2^n terms
The AR(p) model
Auto-regressive representation of a time series,
xt|xt−1, . . . ∼ N(µ + Σ_{i=1}^p ϱi (xt−i − µ), σ²)
Generalisation of AR(1)
Among the most commonly used models in dynamic settings
More challenging than the static models (stationarity constraints)
Different models depending on the processing of the starting value x0
Unknown stationarity constraints
Practical difficulty: for complex models, stationarity constraints get quite involved, to the point of being unknown in some cases
Example (AR(1))
Case of linear Markovian dependence on the last value,
xt = µ + ϱ(xt−1 − µ) + εt ,  εt i.i.d. ∼ N(0, σ²)
If |ϱ| < 1, (xt)t∈Z can be written as
xt = µ + Σ_{j=0}^∞ ϱ^j εt−j
and this is a stationary representation.
Stationary but...
If |ϱ| > 1, there is an alternative stationary representation
xt = µ − Σ_{j=1}^∞ ϱ^{−j} εt+j .
This stationary solution is criticized as artificial because xt is correlated with future white noises (εs)_{s>t}, unlike the case when |ϱ| < 1.
Non-causal representation...
Stationarity+causality
Stationarity constraints enter the prior as a restriction on the values of θ.
Theorem
The AR(p) model is second-order stationary and causal iff the roots of the polynomial
P(x) = 1 − Σ_{i=1}^p ϱi x^i
are all outside the unit circle.
Computational issues in Bayesian statistics
The AR(p) model
Stationarity constraints
Under stationarity constraints, complex parameter space: each
value of needs to be checked for roots of corresponding
polynomial with modulus less than 1
MCMC and Likelihood-free Methods
Computational issues in Bayesian statistics
The AR(p) model
Stationarity constraints
Under stationarity constraints, complex parameter space: each
value of needs to be checked for roots of corresponding
polynomial with modulus less than 1
E.g., for an AR(2) process with
autoregressive polynomial
P(u) = 1 − 1u − 2u2, constraint is
1 + 2 < 1, 1 − 2 < 1
and | 2| < 1
q
−2 −1 0 1 2
−1.0−0.50.00.51.0 θ1
θ2
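In practice the constraint can be checked numerically, by computing the roots of P and verifying that all lie outside the unit circle. A minimal R sketch (the helper name is ours):
# Second-order stationarity/causality check for an AR(p) model:
# all roots of P(x) = 1 - rho_1 x - ... - rho_p x^p outside the unit circle
ar_stationary = function(rho) {
  all(Mod(polyroot(c(1, -rho))) > 1)
}
ar_stationary(c(0.5, 0.3))   # TRUE: inside the AR(2) triangle
ar_stationary(c(0.8, 0.5))   # FALSE: 0.8 + 0.5 > 1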
The MA(q) model
Alternative type of time series,
xt = µ + εt − Σ_{j=1}^q ϑj εt−j ,  εt ∼ N(0, σ²)
Stationary but, for identifiability considerations, the polynomial
Q(x) = 1 − Σ_{j=1}^q ϑj x^j
must have all its roots outside the unit circle.
Identifiability
Example
For the MA(1) model, xt = µ + εt − ϑ1 εt−1,
var(xt) = (1 + ϑ²1) σ²
and the model can also be written
xt = µ + ε̃t−1 − (1/ϑ1) ε̃t ,  ε̃ ∼ N(0, ϑ²1 σ²) .
Both pairs (ϑ1, σ) & (1/ϑ1, ϑ1 σ) lead to alternative representations of the same model.
Properties of MA models
Non-Markovian model (but a special case of hidden Markov model)
The autocovariance γx(s) is null for |s| > q
Representations
x1:T is a normal random vector with constant mean µ and covariance matrix Σ, the banded Toeplitz matrix with entries Σ_{ts} = γ_{|t−s|}, where γs = 0 for |s| > q and (|s| ≤ q)
γs = σ² Σ_{i=0}^{q−|s|} ϑi ϑ_{i+|s|}
(with the convention ϑ0 = −1).
Not manageable in practice [for large T’s]
Representations (contd.)
Conditional on the past (ε0, . . . , ε−q+1),
L(µ, ϑ1, . . . , ϑq, σ|x1:T , ε0, . . . , ε−q+1) ∝ σ^{−T} ∏_{t=1}^T exp{ −(xt − µ + Σ_{j=1}^q ϑj ε̂t−j)² / 2σ² } ,
where (t > 0)
ε̂t = xt − µ + Σ_{j=1}^q ϑj ε̂t−j ,  ε̂0 = ε0, . . . , ε̂1−q = ε1−q
Recursive definition of the likelihood, still costly O(T × q)
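The recursion translates directly into code. A minimal R sketch of this conditional log-likelihood, assuming the q pre-sample innovations eps0 = (ε0, . . . , ε1−q) are supplied:
# Conditional MA(q) log-likelihood: rebuild the innovations recursively
ma_cond_loglik = function(x, mu, theta, sigma, eps0) {
  q = length(theta); T = length(x)
  eps = c(rev(eps0), rep(0, T))        # slot q + t will hold eps-hat_t
  for (t in 1:T)
    eps[q + t] = x[t] - mu + sum(theta * eps[(q + t - 1):t])
  sum(dnorm(eps[q + 1:T], 0, sigma, log = TRUE))
}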
Representations (contd.)
Encompassing approach for general time series models: the state-space representation
xt = G yt + εt , (1)
yt+1 = F yt + ξt , (2)
where (1) is the observation equation and (2) is the state equation.
Note
This is a special case of hidden Markov model
MA(q) state-space representation
For the MA(q) model, take
yt = (εt−q, . . . , εt−1, εt)′
and then
yt+1 = F yt + εt+1 (0, . . . , 0, 1)′ ,
where F is the (q+1)×(q+1) shift matrix with ones on the superdiagonal and zeros elsewhere, and
xt = µ − (ϑq, ϑq−1, . . . , ϑ1, −1) yt .
MA(q) state-space representation (cont’d)
Example
For the MA(1) model, the observation equation is
xt = (1 0) yt
with yt = (y1t, y2t)′, directed by the state equation
yt+1 = ( 0 1 ; 0 0 ) yt + εt+1 (1, ϑ1)′ .
A typology of Bayes computational problems
(i). latent variable models in general;
(ii). use of a complex parameter space, as for instance in constrained parameter sets like those resulting from imposing stationarity constraints in dynamic models;
(iii). use of a complex sampling model with an intractable likelihood, as for instance in some graphical models;
(iv). use of a huge dataset;
(v). use of a complex prior distribution (which may be the posterior distribution associated with an earlier sample);
(vi). use of a particular inferential procedure, as for instance Bayes factors
B^π_01(x) = [P(θ ∈ Θ0 | x) / P(θ ∈ Θ1 | x)] / [π(θ ∈ Θ0) / π(θ ∈ Θ1)] .
The Metropolis-Hastings Algorithm
Monte Carlo basics
Importance Sampling
Monte Carlo Methods based on Markov Chains
The Metropolis–Hastings algorithm
Random-walk Metropolis-Hastings algorithms
Extensions
Monte Carlo basics
General purpose
Given a density π known up to a normalizing constant, and an integrable function h, compute
Π(h) = ∫ h(x) π(x) µ(dx) = ∫ h(x) π̃(x) µ(dx) / ∫ π̃(x) µ(dx)
when ∫ h(x) π̃(x) µ(dx) is intractable.
Monte Carlo 101
Generate an iid sample x1, . . . , xN from π and estimate Π(h) by
Π̂^MC_N(h) = N^{−1} Σ_{i=1}^N h(xi) .
LLN: Π̂^MC_N(h) → Π(h) a.s.
If Π(h²) = ∫ h²(x) π(x) µ(dx) < ∞,
CLT: √N (Π̂^MC_N(h) − Π(h)) ⇝ N(0, Π{[h − Π(h)]²}) .
Caveat announcing MCMC
Often impossible or inefficient to simulate directly from Π
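As a toy illustration, a minimal R sketch of the Monte Carlo estimator with its CLT-based standard error, for Π = N(0, 1) and h(x) = exp(x) (our choice of example; true value exp(1/2)):
# Basic Monte Carlo: estimate E[h(X)] for X ~ N(0,1), h(x) = exp(x)
N = 1e5
h = exp(rnorm(N))
c(estimate = mean(h), std.error = sd(h) / sqrt(N))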
Importance Sampling
For Q a proposal distribution such that Q(dx) = q(x) µ(dx), alternative representation
Π(h) = ∫ h(x) {π/q}(x) q(x) µ(dx) .
Principle of importance
Generate an iid sample x1, . . . , xN ∼ Q and estimate Π(h) by
Π̂^IS_{Q,N}(h) = N^{−1} Σ_{i=1}^N h(xi) {π/q}(xi) .
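A minimal R sketch, estimating the N(0, 1) tail probability P(X > 4) (so h is an indicator) with a unit exponential proposal shifted to 4, a standard choice for this toy problem:
# Importance sampling for P(X > 4), X ~ N(0,1), proposal q(x) = exp(-(x-4))
N = 1e5
x = 4 + rexp(N)
w = dnorm(x) / dexp(x - 4)               # importance weights pi/q
c(IS = mean(w), exact = pnorm(4, lower.tail = FALSE))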
Properties of importance
Then
LLN: Π̂^IS_{Q,N}(h) → Π(h) a.s. and, if Q((hπ/q)²) < ∞,
CLT: √N (Π̂^IS_{Q,N}(h) − Π(h)) ⇝ N(0, Q{(hπ/q − Π(h))²}) .
Caveat
If the normalizing constant of π is unknown, impossible to use Π̂^IS_{Q,N}
Generic problem in Bayesian Statistics: π(θ|x) ∝ f(x|θ) π(θ).
Self-Normalised Importance Sampling
Self-normalised version
Π̂^SNIS_{Q,N}(h) = ( Σ_{i=1}^N {π/q}(xi) )^{−1} Σ_{i=1}^N h(xi) {π/q}(xi) .
LLN: Π̂^SNIS_{Q,N}(h) → Π(h) a.s.
and, if Π((1 + h²)(π/q)) < ∞,
CLT: √N (Π̂^SNIS_{Q,N}(h) − Π(h)) ⇝ N(0, Π{(π/q)(h − Π(h))²}) .
The quality of the SNIS approximation depends on the choice of Q
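A minimal R sketch of SNIS with an unnormalised target, here π̃(x) = exp(−x²/2) (the N(0, 1) kernel without its constant) and a Cauchy proposal, estimating Π(h) for h(x) = x²:
# Self-normalised IS: the unknown constant of pi cancels in the weight ratio
N = 1e5
x = rcauchy(N)
w = exp(-x^2 / 2) / dcauchy(x)   # unnormalised weights pi~/q
sum(w * x^2) / sum(w)            # estimates E[X^2] = 1 under N(0,1)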
Monte Carlo Methods based on Markov Chains
Running Monte Carlo via Markov Chains (MCMC)
It is not necessary to use a sample from the distribution f to approximate the integral
I = ∫ h(x) f(x) dx :
we can obtain X1, . . . , Xn ∼ f (approximately) without directly simulating from f, using an ergodic Markov chain with stationary distribution f.

Running Monte Carlo via Markov Chains (2)
Idea
For an arbitrary starting value x(0), an ergodic chain (X(t)) is generated using a transition kernel with stationary distribution f
This ensures the convergence in distribution of (X(t)) to a random variable from f.
For a “large enough” T0, X(T0) can be considered as distributed from f
Produces a dependent sample X(T0), X(T0+1), . . ., which is generated from f, sufficient for most approximation purposes.
Problem: how can one build a Markov chain with a given stationary distribution?
The Metropolis-Hastings Algorithm
The Metropolis–Hastings algorithm
The Metropolis–Hastings algorithm
Basics
The algorithm uses the objective (target) density
f
and a conditional density
q(y|x)
called the instrumental (or proposal) distribution
The MH algorithm
Algorithm (Metropolis–Hastings)
Given x(t),
1. Generate Yt ∼ q(y|x(t)).
2. Take X(t+1) = Yt with probability ρ(x(t), Yt), and X(t+1) = x(t) with probability 1 − ρ(x(t), Yt), where
ρ(x, y) = min{ [f(y)/f(x)] [q(x|y)/q(y|x)] , 1 } .
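A minimal generic implementation in R, where the target (known up to a constant) and the proposal sampler/density are placeholders supplied by the user:
# Generic Metropolis-Hastings: target f up to a constant, proposal density
# q(y|x) = dprop(y, x) with sampler rprop(x)
mh = function(T, x0, target, rprop, dprop) {
  x = numeric(T); x[1] = x0
  for (t in 1:(T - 1)) {
    y = rprop(x[t])
    rho = min(1, target(y) * dprop(x[t], y) /
                 (target(x[t]) * dprop(y, x[t])))
    x[t + 1] = if (runif(1) < rho) y else x[t]
  }
  x
}
# e.g. an independent N(0, 2) proposal targeting the unnormalised N(0,1):
out = mh(1e4, 0, function(x) exp(-x^2 / 2),
         function(x) rnorm(1, 0, 2), function(y, x) dnorm(y, 0, 2))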
Features
Independent of normalizing constants for both f and q(·|x) (i.e., of those constants independent of x)
Never moves to values with f(y) = 0
The chain (x(t))t may take the same value several times in a row, even though f is a density wrt Lebesgue measure
The sequence (yt)t is usually not a Markov chain
Convergence properties
1. The M-H Markov chain is reversible, with invariant/stationary density f, since it satisfies the detailed balance condition
f(y) K(y, x) = f(x) K(x, y)
2. As f is a probability measure, the chain is positive recurrent
3. If
Pr[ f(Yt) q(X(t)|Yt) / f(X(t)) q(Yt|X(t)) ≥ 1 ] < 1 , (1)
that is, if the event {X(t+1) = X(t)} is possible, then the chain is aperiodic
Convergence properties (2)
4. If
q(y|x) > 0 for every (x, y), (2)
the chain is irreducible
5. For M-H, f-irreducibility implies Harris recurrence
6. Thus, for M-H satisfying (1) and (2),
(i) for h with Ef|h(X)| < ∞,
lim_{T→∞} (1/T) Σ_{t=1}^T h(X(t)) = ∫ h(x) df(x)  a.e. f ;
(ii) and
lim_{n→∞} ‖ ∫ K^n(x, ·) µ(dx) − f ‖_TV = 0
for every initial distribution µ, where K^n(x, ·) denotes the kernel for n transitions.
Random-walk Metropolis-Hastings algorithms
Random walk Metropolis–Hastings
Use of a local perturbation as proposal,
Yt = X(t) + εt ,
where εt ∼ g, independent of X(t).
The instrumental density is of the form g(y − x), and the Markov chain is a random walk if we take g to be symmetric: g(x) = g(−x)
Algorithm (Random walk Metropolis)
Given x(t),
1. Generate Yt ∼ g(y − x(t))
2. Take X(t+1) = Yt with probability min{1, f(Yt)/f(x(t))}, and X(t+1) = x(t) otherwise.
Example (Random walk and normal target)
Generate N(0, 1) based on the uniform proposal [−δ, δ]
[Hastings (1970)]
The probability of acceptance is then
ρ(x(t), yt) = exp{(x(t)² − y²t)/2} ∧ 1 .
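A minimal R sketch of this historical example, with δ left as a parameter:
# Random walk MH for the N(0,1) target with uniform proposal on [-delta, delta]
rwnorm = function(T, delta, x0 = 0) {
  x = numeric(T); x[1] = x0
  for (t in 1:(T - 1)) {
    y = x[t] + runif(1, -delta, delta)
    x[t + 1] = if (runif(1) < exp((x[t]^2 - y^2) / 2)) y else x[t]
  }
  x
}
sapply(c(0.1, 0.5, 1.0), function(d) var(rwnorm(15000, d)))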
Example (Random walk & normal (2))
Sample statistics:
δ         0.1     0.5     1.0
mean      0.399  −0.111   0.10
variance  0.698   1.11    1.06
As δ increases, we get better histograms and a faster exploration of the support of f.
[Figure: three samples based on U[−δ, δ] with (a) δ = 0.1, (b) δ = 0.5 and (c) δ = 1.0, superimposed with the convergence of the means (15,000 simulations)]
Example (Mixture models)
π(θ|x) ∝ ∏_{j=1}^n Σ_{ℓ=1}^k pℓ f(xj|µℓ, σℓ) π(θ)
Metropolis-Hastings proposal:
θ(t+1) = θ(t) + ωε(t) if u(t) < ρ(t), and θ(t+1) = θ(t) otherwise,
where
ρ(t) = π(θ(t) + ωε(t)|x) / π(θ(t)|x) ∧ 1
and ω is scaled for a good acceptance rate.
The Metropolis-Hastings Algorithm
Random-walk Metropolis-Hastings algorithms
p
theta
0.0 0.2 0.4 0.6 0.8 1.0
-1012
tau
theta
0.2 0.4 0.6 0.8 1.0 1.2
-1012
p
tau
0.0 0.2 0.4 0.6 0.8 1.0
0.20.40.60.81.01.2
-1 0 1 2
0.01.02.0
theta
0.2 0.4 0.6 0.8
024
tau
0.0 0.2 0.4 0.6 0.8 1.0
0123456
p
Random walk sampling (50000 iterations)
General case of a 3 component normal mixture
[Celeux & al., 2000]
[Figure: random walk MCMC output in the (µ1, µ2)-plane for .7N(µ1, 1) + .3N(µ2, 1)]
Convergence properties
Uniform ergodicity is prohibited by the random walk structure.
At best, geometric ergodicity:
Theorem (Sufficient ergodicity)
For a symmetric density f, log-concave in the tails, and a positive and symmetric density g, the chain (X(t)) is geometrically ergodic.
[Mengersen & Tweedie, 1996]
Example (Comparison of tail effects)
Random-walk Metropolis–Hastings algorithms based on a N(0, 1) instrumental for the generation of (a) a N(0, 1) distribution and (b) a distribution with density ψ(x) ∝ (1 + |x|)^{−3}, the latter with acceptance probability
[(1 + |ξ|)/(1 + |ξ′|)]³ ∧ 1 .
[Figure: 90% confidence envelopes of the means for cases (a) and (b), derived from 500 parallel independent chains]
Further convergence properties
Under the assumptions
(A1) f is super-exponential, i.e. it is positive with positive continuous first derivative such that lim_{|x|→∞} n(x)·∇log f(x) = −∞, where n(x) := x/|x|.
In words: exponential decay of f in every direction, with rate tending to ∞
(A2) lim sup_{|x|→∞} n(x)·m(x) < 0, where m(x) = ∇f(x)/|∇f(x)|.
In words: non-degeneracy of the contour manifold Cf(y) = {y : f(y) = f(x)}
the chain is geometrically ergodic, and V(x) ∝ f(x)^{−1/2} verifies the drift condition
[Jarner & Hansen, 2000]
Further [further] convergence properties
If P is ψ-irreducible and aperiodic, for r = (r(n))_{n∈N} a real-valued non-decreasing sequence such that, for all n, m ∈ N,
r(n + m) ≤ r(n) r(m)
and r(0) = 1, for C a small set, τC = inf{n ≥ 1, Xn ∈ C}, and h ≥ 1, assume
sup_{x∈C} E_x[ Σ_{k=0}^{τC−1} r(k) h(Xk) ] < ∞ ;
then
S(f, C, r) := { x ∈ X : E_x[ Σ_{k=0}^{τC−1} r(k) h(Xk) ] < ∞ }
is full and absorbing, and for x ∈ S(f, C, r),
lim_{n→∞} r(n) ‖P^n(x, ·) − f‖_h = 0 .
[Tuominen & Tweedie, 1994]
Comments
[CLT, Rosenthal’s inequality...] h-ergodicity implies a CLT for additive (possibly unbounded) functionals of the chain, Rosenthal’s inequality, and so on...
[Control of the moments of the return time] The condition implies (because h ≥ 1) that
sup_{x∈C} E_x[r0(τC)] ≤ sup_{x∈C} E_x[ Σ_{k=0}^{τC−1} r(k) h(Xk) ] < ∞ ,
where r0(n) = Σ_{l=0}^n r(l). This can be used to derive bounds for the coupling time, an essential step to determine computable bounds, using coupling inequalities
[Roberts & Tweedie, 1998; Fort & Moulines, 2000]
Alternative conditions
The condition is not really easy to work with... Possible alternative conditions:
(a) [Tuominen & Tweedie, 1994] There exists a sequence (Vn)_{n∈N}, Vn ≥ r(n)h, such that
(i) sup_C V0 < ∞,
(ii) {V0 = ∞} ⊂ {V1 = ∞} and
(iii) P Vn+1 ≤ Vn − r(n)h + b r(n) I_C .
(b) [Fort, 2000] There exist V ≥ f ≥ 1 and b < ∞ such that sup_C V < ∞ and
P V(x) + E_x[ Σ_{k=0}^{σC} ∆r(k) f(Xk) ] ≤ V(x) + b I_C(x) ,
where σC is the hitting time on C, ∆r(k) = r(k) − r(k − 1) for k ≥ 1, and ∆r(0) = r(0).
Result: (a) ⇔ (b) ⇔ sup_{x∈C} E_x[ Σ_{k=0}^{τC−1} r(k) f(Xk) ] < ∞.
Extensions
Langevin Algorithms
Proposal based on the Langevin diffusion: Lt is defined by the stochastic differential equation
dLt = dBt + (1/2) ∇log f(Lt) dt ,
where Bt is the standard Brownian motion.
Theorem
The Langevin diffusion is the only non-explosive diffusion which is reversible with respect to f.
Discretization
Instead, consider the sequence
x(t+1) = x(t) + (σ²/2) ∇log f(x(t)) + σ εt ,  εt ∼ Np(0, Ip) ,
where σ² corresponds to the discretization step.
Unfortunately, the discretized chain may be transient, for instance when
lim_{x→±∞} |σ² ∇log f(x)| |x|^{−1} > 1 .
MH correction
Accept the new value Yt with probability
[f(Yt)/f(x(t))] · exp{−‖x(t) − Yt − (σ²/2) ∇log f(Yt)‖²/2σ²} / exp{−‖Yt − x(t) − (σ²/2) ∇log f(x(t))‖²/2σ²} ∧ 1 .
Choice of the scaling factor σ
Should lead to an acceptance rate of 0.574 to achieve optimal convergence rates (when the components of x are uncorrelated)
[Roberts & Rosenthal, 1998; Girolami & Calderhead, 2011]
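A minimal R sketch of one such Metropolis-adjusted Langevin (MALA) step for a generic differentiable log-target, with logf and its gradient gradlf supplied by the user:
# One MALA step: Langevin proposal plus the MH correction above
mala_step = function(x, logf, gradlf, sigma) {
  y = x + sigma^2 / 2 * gradlf(x) + sigma * rnorm(length(x))
  lq = function(a, b)                      # log q(a | b)
    -sum((a - b - sigma^2 / 2 * gradlf(b))^2) / (2 * sigma^2)
  logrho = logf(y) - logf(x) + lq(x, y) - lq(y, x)
  if (log(runif(1)) < logrho) y else x
}
# e.g. targeting N(0,1): logf = function(x) -x^2/2 ; gradlf = function(x) -x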
Optimizing the Acceptance Rate
Problem of the choice of the transition kernel from a practical point of view.
Most common alternatives:
(a) a fully automated algorithm like ARMS;
(b) an instrumental density g which approximates f, such that f/g is bounded for uniform ergodicity to apply;
(c) a random walk.
In both cases (b) and (c), the choice of g is critical.
The Metropolis-Hastings Algorithm
Extensions
Case of the random walk
Different approach to acceptance rates
A high acceptance rate does not indicate that the algorithm is
moving correctly since it indicates that the random walk is moving
too slowly on the surface of f.
MCMC and Likelihood-free Methods
The Metropolis-Hastings Algorithm
Extensions
Case of the random walk
Different approach to acceptance rates
A high acceptance rate does not indicate that the algorithm is
moving correctly since it indicates that the random walk is moving
too slowly on the surface of f.
If x(t) and yt are close, i.e. f(x(t)) f(yt) y is accepted with
probability
min
f(yt)
f(x(t))
, 1 1 .
For multimodal densities with well separated modes, the negative
effect of limited moves on the surface of f clearly shows.
Case of the random walk (2)
If the average acceptance rate is low, the successive values of f(yt) tend to be small compared with f(x(t)), which means that the random walk moves quickly on the surface of f, since it often reaches the “borders” of the support of f.
Rule of thumb
In small dimensions, aim at an average acceptance rate of 50%. In large dimensions, aim at an average acceptance rate of 25%.
[Gelman, Gilks and Roberts, 1995]
This rule is to be taken with a pinch of salt!
Example (Noisy AR(1))
Hidden Markov chain from a regular AR(1) model,
xt+1 = ϕ xt + εt+1 ,  εt ∼ N(0, τ²) ,
and observables
yt|xt ∼ N(x²t, σ²) .
The distribution of xt given xt−1, xt+1 and yt is proportional to
exp{ −(1/2τ²) [ (xt − ϕxt−1)² + (xt+1 − ϕxt)² + (τ²/σ²)(yt − x²t)² ] } .
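A minimal R sketch of a random walk proposal on this full conditional (all arguments supplied by the caller):
# Log full conditional of x_t given its neighbours and the observation y_t
logcond = function(xt, xm, xp, yt, phi, tau2, sigma2)
  -((xt - phi * xm)^2 + (xp - phi * xt)^2 +
    tau2 / sigma2 * (yt - xt^2)^2) / (2 * tau2)
# One random walk MH update of x_t with scale omega
rw_update = function(xt, xm, xp, yt, phi, tau2, sigma2, omega) {
  prop = xt + omega * rnorm(1)
  logr = logcond(prop, xm, xp, yt, phi, tau2, sigma2) -
         logcond(xt,  xm, xp, yt, phi, tau2, sigma2)
  if (log(runif(1)) < logr) prop else xt
}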
Example (Noisy AR(1) continued)
For a Gaussian random walk with a scale ω small enough, the random walk never jumps to the other mode. But if the scale ω is sufficiently large, the Markov chain explores both modes and gives a satisfactory approximation of the target distribution.
[Figure: Markov chain based on a random walk with scale ω = .1]
[Figure: Markov chain based on a random walk with scale ω = .5]
MA(2)
Since the constraints on (ϑ1, ϑ2) are well-defined, use a flat prior over the stationarity triangle.
Simple representation of the likelihood (assuming µ = 0, σ² = 1, and the observed series y in the workspace):
library(mnormt)
ma2like=function(theta){
  # Gaussian MA(2) log-likelihood: y is N(0, Sigma) with banded Toeplitz
  # covariance gamma_0 = 1 + theta1^2 + theta2^2,
  # gamma_1 = theta1 (1 + theta2), gamma_2 = theta2
  n=length(y)
  sigma = toeplitz(c(1 + theta[1]^2 + theta[2]^2,
    theta[1] + theta[1] * theta[2], theta[2], rep(0, n - 3)))
  dmnorm(y, rep(0, n), sigma, log=TRUE)
}
Basic RWHM for MA(2)
Algorithm 1: RW-HM-MA(2) sampler
set ω and ϑ(1)
for i = 2 to T do
  generate ϑ̃j ∼ U(ϑj(i−1) − ω, ϑj(i−1) + ω) for j = 1, 2
  set p = 0 and ϑ(i) = ϑ(i−1)
  if ϑ̃ lies within the triangle then
    p = exp(ma2like(ϑ̃) − ma2like(ϑ(i−1)))
  end if
  generate U ∼ U(0, 1)
  if U < p then
    ϑ(i) = ϑ̃
  end if
end for
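A direct R transcription of Algorithm 1, reusing ma2like (and the global y) above; a sketch only, with ϑ(1) assumed to start inside the triangle:
# RW-HM sampler for (theta1, theta2) over the MA(2) stationarity triangle,
# flat prior, uniform [-omega, omega] moves on each component
rwhm_ma2 = function(T, omega, theta1 = c(0, 0)) {
  theta = matrix(theta1, T, 2, byrow = TRUE)
  for (i in 2:T) {
    prop = theta[i - 1, ] + runif(2, -omega, omega)
    theta[i, ] = theta[i - 1, ]
    intri = (prop[1] + prop[2] < 1) && (prop[2] - prop[1] < 1) &&
            (abs(prop[2]) < 1)
    if (intri && runif(1) < exp(ma2like(prop) - ma2like(theta[i - 1, ])))
      theta[i, ] = prop
  }
  theta
}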
Outcome
Result with a simulated sample of 100 points and ϑ1 = 0.6, ϑ2 = 0.2, and scale ω = 0.2
[Figure: scatterplot of the simulated (θ1, θ2) chain within the stationarity triangle]
Outcome
Result with a simulated sample of 100 points and ϑ1 = 0.6, ϑ2 = 0.2, and scale ω = 0.5
[Figure: scatterplot of the simulated (θ1, θ2) chain within the stationarity triangle]
qqqqq
qqqqqqq
qqqqqqqqqqqqqqqqqqqqqqqqqqq
qqqqqqqqqq
qqqqqqqqqqq
qqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqqq
qqqqqq
qq
qqqqqq
qqqqqqqqqqqqqqqqq
q
qqqqq
qqqqqqqqqqqqqqqqqq
qqqqqq
qqqqqqqqqqqqqqqqqqqqq
qqqqqqq
q
qqqqqq
qqqqqq
q
qqqq
qqqqqqqqqqq
qqqqqqqqqq
qqq
qqqqqqqqqq
qqqqq
qq
qqq
qqq
qqqqqqq
q
MCMC and Likelihood-free Methods
The Metropolis-Hastings Algorithm
Extensions
Outcome
Result with a simulated sample of 100 points and ϑ1 = 0.6,
ϑ2 = 0.2 and scale ω = 2.0
[Figure: simulated sample in the (θ1, θ2) plane, θ1 ∈ (−2, 2), θ2 ∈ (−1, 1)]
MCMC and Likelihood-free Methods
The Gibbs Sampler
The Gibbs Sampler
skip to population Monte Carlo
The Gibbs Sampler
General Principles
Completion
Convergence
The Hammersley-Clifford theorem
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
General Principles
A very specific simulation algorithm based on the target
distribution f:
1. Uses the conditional densities f1, . . . , fp from f
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
General Principles
A very specific simulation algorithm based on the target
distribution f:
1. Uses the conditional densities f1, . . . , fp from f
2. Start with the random variable X = (X1, . . . , Xp)
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
General Principles
A very specific simulation algorithm based on the target
distribution f:
1. Uses the conditional densities f1, . . . , fp from f
2. Start with the random variable X = (X1, . . . , Xp)
3. Simulate from the conditional densities,
Xi|x1, x2, . . . , xi−1, xi+1, . . . , xp
∼ fi(xi|x1, x2, . . . , xi−1, xi+1, . . . , xp)
for i = 1, 2, . . . , p.
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
Algorithm (Gibbs sampler)
Given x(t) = (x1^(t), . . . , xp^(t)), generate
1. X1^(t+1) ∼ f1(x1|x2^(t), . . . , xp^(t));
2. X2^(t+1) ∼ f2(x2|x1^(t+1), x3^(t), . . . , xp^(t)),
. . .
p. Xp^(t+1) ∼ fp(xp|x1^(t+1), . . . , xp−1^(t+1))
X(t+1) → X ∼ f
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
Properties
The full conditional densities f1, . . . , fp are the only densities used
for simulation. Thus, even in a high-dimensional problem, all of
the simulations may be univariate
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
Properties
The full conditional densities f1, . . . , fp are the only densities used
for simulation. Thus, even in a high-dimensional problem, all of
the simulations may be univariate
The Gibbs sampler is not reversible with respect to f. However,
each of its p components is. Besides, it can be turned into a
reversible sampler, either by using the Random Scan Gibbs sampler
or by running instead the (double) sequence
f1 · · · fp−1 fp fp−1 · · · f1
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
Example (Bivariate Gibbs sampler)
(X, Y ) ∼ f(x, y)
Generate a sequence of observations by
Set X0 = x0
For t = 1, 2, . . . , generate
Yt ∼ fY |X(·|xt−1)
Xt ∼ fX|Y (·|yt)
where fY |X and fX|Y are the conditional distributions
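A minimal R sketch, assuming for illustration a bivariate normal target with
unit variances and correlation ρ, so that both conditionals are
N(ρ · conditioning value, 1 − ρ2):
rho = 0.8; niter = 1000
X = Y = numeric(niter)                              # X[1] = 0 is the starting value x0
for (t in 2:niter) {
  Y[t] = rnorm(1, rho*X[t-1], sqrt(1 - rho^2))      # Y_t ~ f_{Y|X}(.|x_{t-1})
  X[t] = rnorm(1, rho*Y[t], sqrt(1 - rho^2))        # X_t ~ f_{X|Y}(.|y_t)
}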
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
A Very Simple Example: Independent N(µ, σ2) Observations
When Y1, . . . , Yn iid∼ N(y|µ, σ2) with both µ and σ unknown, the
posterior on (µ, σ2) is conjugate but outside a standard family
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
A Very Simple Example: Independent N(µ, σ2) Observations
When Y1, . . . , Yn iid∼ N(y|µ, σ2) with both µ and σ unknown, the
posterior on (µ, σ2) is conjugate but outside a standard family
But...
µ|Y1:n, σ2 ∼ N( Σ_{i=1}^n Yi / n , σ2/n )
σ2|Y1:n, µ ∼ IG( n/2 − 1 , Σ_{i=1}^n (Yi − µ)2 / 2 )
assuming constant (improper) priors on both µ and σ2
Hence we may use the Gibbs sampler for simulating from the
posterior of (µ, σ2)
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
R Gibbs Sampler for Gaussian posterior
n = length(Y)
S = sum(Y)
mu = S/n                                # start from the sample mean
for (i in 1:500) {
  S2 = sum((Y - mu)^2)
  sigma2 = 1/rgamma(1, n/2 - 1, S2/2)   # sigma2 | mu, Y ~ IG(n/2 - 1, S2/2)
  mu = S/n + sqrt(sigma2/n)*rnorm(1)    # mu | sigma2, Y ~ N(ybar, sigma2/n)
}
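The loop above keeps only the last value of (mu, sigma2); a minimal variant
storing the whole chain, with Y simulated here purely for illustration:
Y = rnorm(10)                           # n = 10 illustrative N(0,1) observations
n = length(Y); S = sum(Y)
mu = S/n
draws = matrix(NA, 500, 2)              # one row of (mu, sigma2) per iteration
for (i in 1:500) {
  S2 = sum((Y - mu)^2)
  sigma2 = 1/rgamma(1, n/2 - 1, S2/2)
  mu = S/n + sqrt(sigma2/n)*rnorm(1)
  draws[i,] = c(mu, sigma2)
}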
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
Example of results with n = 10 observations from the
N(0, 1) distribution
[Figure: evolution of the Gibbs output after 1, 2, 3, 4, 5, 10, 25, 50, 100, and 500 iterations]
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
Limitations of the Gibbs sampler
Formally, a special case of a sequence of 1-D M-H kernels, all with
acceptance rate uniformly equal to 1.
The Gibbs sampler
1. limits the choice of instrumental distributions
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
Limitations of the Gibbs sampler
Formally, a special case of a sequence of 1-D M-H kernels, all with
acceptance rate uniformly equal to 1.
The Gibbs sampler
1. limits the choice of instrumental distributions
2. requires some knowledge of f
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
Limitations of the Gibbs sampler
Formally, a special case of a sequence of 1-D M-H kernels, all with
acceptance rate uniformly equal to 1.
The Gibbs sampler
1. limits the choice of instrumental distributions
2. requires some knowledge of f
3. is, by construction, multidimensional
MCMC and Likelihood-free Methods
The Gibbs Sampler
General Principles
Limitations of the Gibbs sampler
Formally, a special case of a sequence of 1-D M-H kernels, all with
acceptance rate uniformly equal to 1.
The Gibbs sampler
1. limits the choice of instrumental distributions
2. requires some knowledge of f
3. is, by construction, multidimensional
4. does not apply to problems where the number of parameters
varies as the resulting chain is not irreducible.
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Latent variables are back
The Gibbs sampler can be used in much wider generality
A density g is a completion of f if
∫_Z g(x, z) dz = f(x)
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Latent variables are back
The Gibbs sampler can be used in much wider generality
A density g is a completion of f if
∫_Z g(x, z) dz = f(x)
Note
The variable z may be meaningless for the problem
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Purpose
g should have full conditionals that are easy to simulate for a
Gibbs sampler to be implemented with g rather than f
For p > 1, write y = (x, z) and denote the conditional densities of
g(y) = g(y1, . . . , yp) by
Y1|y2, . . . , yp ∼ g1(y1|y2, . . . , yp),
Y2|y1, y3, . . . , yp ∼ g2(y2|y1, y3, . . . , yp),
. . . ,
Yp|y1, . . . , yp−1 ∼ gp(yp|y1, . . . , yp−1).
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
The move from Y (t) to Y (t+1) is defined as follows:
Algorithm (Completion Gibbs sampler)
Given (y1^(t), . . . , yp^(t)), simulate
1. Y1^(t+1) ∼ g1(y1|y2^(t), . . . , yp^(t)),
2. Y2^(t+1) ∼ g2(y2|y1^(t+1), y3^(t), . . . , yp^(t)),
. . .
p. Yp^(t+1) ∼ gp(yp|y1^(t+1), . . . , yp−1^(t+1)).
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Example (Mixtures all over again)
Hierarchical missing data structure:
If
X1, . . . , Xn ∼ Σ_{i=1}^k pi f(x|θi),
then
X|Z ∼ f(x|θZ), Z ∼ p1I(z = 1) + . . . + pkI(z = k),
Z is the component indicator associated with observation x
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Example (Mixtures (2))
Conditionally on (Z1, . . . , Zn) = (z1, . . . , zn) :
π(p1, . . . , pk, θ1, . . . , θk|x1, . . . , xn, z1, . . . , zn)
∝ p1^(α1+n1−1) · · · pk^(αk+nk−1)
× π(θ1|y1 + n1 x̄1, λ1 + n1) · · · π(θk|yk + nk x̄k, λk + nk),
with
ni = Σ_j I(zj = i) and x̄i = Σ_{j; zj=i} xj / ni.
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Algorithm (Mixture Gibbs sampler)
1. Simulate
θi ∼ π(θi|yi + ni¯xi, λi + ni) (i = 1, . . . , k)
(p1, . . . , pk) ∼ D(α1 + n1, . . . , αk + nk)
2. Simulate (j = 1, . . . , n)
Zj|xj, p1, . . . , pk, θ1, . . . , θk ∼ Σ_{i=1}^k pij I(zj = i)
with (i = 1, . . . , k)
pij ∝ pif(xj|θi)
and update ni and ¯xi (i = 1, . . . , k).
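A minimal R sketch of this sampler for k = 2 unit-variance normal components,
assuming illustrative N(0, 1) conjugate priors on the θi's and α1 = α2 = 1:
x = c(rnorm(70, 2.5), rnorm(30, 0)); n = length(x)   # simulated data
mu = c(0, 1); p = 0.5
for (t in 1:1000) {
  # 2. simulate the allocations Z_j
  w1 = p*dnorm(x, mu[1]); w2 = (1 - p)*dnorm(x, mu[2])
  z = 1 + (runif(n) > w1/(w1 + w2))                  # z_j = 1 or 2
  n1 = sum(z == 1); n2 = n - n1
  # 1. simulate the parameters given the allocations
  mu[1] = rnorm(1, sum(x[z == 1])/(n1 + 1), sqrt(1/(n1 + 1)))
  mu[2] = rnorm(1, sum(x[z == 2])/(n2 + 1), sqrt(1/(n2 + 1)))
  p = rbeta(1, 1 + n1, 1 + n2)                       # Dirichlet = Beta when k = 2
}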
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
A wee problem
[Figure: Gibbs sample in the (µ1, µ2) plane]
Gibbs started at random
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
A wee problem
[Figure: Gibbs sample in the (µ1, µ2) plane]
Gibbs started at random
Gibbs stuck at the wrong mode
[Figure: Gibbs sample trapped at the wrong mode in the (µ1, µ2) plane]
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Slice sampler as generic Gibbs
If f(θ) can be written as a product
Π_{i=1}^k fi(θ),
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Slice sampler as generic Gibbs
If f(θ) can be written as a product
Π_{i=1}^k fi(θ),
it can be completed as
Π_{i=1}^k I(0 ≤ ωi ≤ fi(θ)),
leading to the following Gibbs algorithm:
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Algorithm (Slice sampler)
Simulate
1. ω1^(t+1) ∼ U[0, f1(θ(t))];
. . .
k. ωk^(t+1) ∼ U[0, fk(θ(t))];
k+1. θ(t+1) ∼ U_{A(t+1)}, with
A(t+1) = {y; fi(y) ≥ ωi^(t+1), i = 1, . . . , k}.
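A minimal R sketch of the k = 1 case for f(θ) ∝ exp(−θ2/2), an illustrative
target for which the slice is the explicit interval [−√(−2 log ω), √(−2 log ω)]:
niter = 5000; theta = numeric(niter)
for (t in 2:niter) {
  omega = runif(1, 0, exp(-theta[t-1]^2/2))   # 1. omega ~ U[0, f(theta(t))]
  bound = sqrt(-2*log(omega))                 # slice A(t+1) = [-bound, bound]
  theta[t] = runif(1, -bound, bound)          # 2. theta(t+1) ~ U on A(t+1)
}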
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Example of results with a truncated N(−3, 1) distribution
[Figure: slice sampler output on (0, 1) after 2, 3, 4, 5, 10, 50, and 100 iterations]
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Good slices
The slice sampler usually enjoys good theoretical properties (like
geometric ergodicity, and even uniform ergodicity under bounded f
and bounded X).
As k increases, the determination of the set A(t+1) may get
increasingly complex.
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Example (Stochastic volatility core distribution)
Difficult part of the stochastic volatility model
π(x) ∝ exp{ −[ σ2 (x − µ)2 + β2 exp(−x) y2 + x ] / 2 } ,
simplified into exp{ −[ x2 + α exp(−x) ] / 2 }
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
Example (Stochastic volatility core distribution)
Difficult part of the stochastic volatility model
π(x) ∝ exp{ −[ σ2 (x − µ)2 + β2 exp(−x) y2 + x ] / 2 } ,
simplified into exp{ −[ x2 + α exp(−x) ] / 2 }
Slice sampling means simulation from a uniform distribution on
A = { x; exp( −[ x2 + α exp(−x) ] / 2 ) ≥ u } = { x; x2 + α exp(−x) ≤ ω }
if we set ω = −2 log u.
Note Inversion of x2 + α exp(−x) = ω needs to be done by
trial-and-error.
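One numerical alternative to trial-and-error in R: whenever ω exceeds the
minimum of x2 + α exp(−x), the slice is an interval whose endpoints can be
found with uniroot (α, ω, and the root brackets below are illustrative):
alpha = 1; omega = 4
g = function(x) x^2 + alpha*exp(-x) - omega
lo = uniroot(g, c(-10, 0))$root    # left endpoint of the slice
hi = uniroot(g, c(0, 10))$root     # right endpoint of the slice
x = runif(1, lo, hi)               # uniform draw on {x; x^2 + alpha*exp(-x) <= omega}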
MCMC and Likelihood-free Methods
The Gibbs Sampler
Completion
[Figure: autocorrelation function up to lag 100 and histogram of the chain]
Histogram of a Markov chain produced by a slice sampler
and target distribution in overlay.
MCMC and Likelihood-free Methods
The Gibbs Sampler
Convergence
Properties of the Gibbs sampler
Theorem (Convergence)
For
(Y1, Y2, · · · , Yp) ∼ g(y1, . . . , yp),
if either
[Positivity condition]
(i) g^(i)(yi) > 0 for every i = 1, · · · , p, implies that
g(y1, . . . , yp) > 0, where g^(i) denotes the marginal distribution
of Yi, or
(ii) the transition kernel is absolutely continuous with respect to g,
then the chain is irreducible and positive Harris recurrent.
MCMC and Likelihood-free Methods
The Gibbs Sampler
Convergence
Properties of the Gibbs sampler (2)
Consequences
(i) If ∫ h(y) g(y) dy < ∞, then
lim_{T→∞} (1/T) Σ_{t=1}^T h(Y^(t)) = ∫ h(y) g(y) dy a.e. g.
(ii) If, in addition, (Y^(t)) is aperiodic, then
lim_{n→∞} || ∫ K^n(y, ·) µ(dy) − f ||_TV = 0
for every initial distribution µ.
MCMC and Likelihood-free Methods
The Gibbs Sampler
Convergence
Slice sampler
fast on that slice
For convergence, the properties of Xt and of f(Xt) are identical
Theorem (Uniform ergodicity)
If f is bounded and supp f is bounded, the simple slice sampler is
uniformly ergodic.
[Mira & Tierney, 1997]
MCMC and Likelihood-free Methods
The Gibbs Sampler
Convergence
A small set for a slice sampler
no slice detail
For ε′ > ε,
C = {x ∈ X; ε < f(x) < ε′}
is a small set:
Pr(x, ·) ≥ ε µ(·)
where
µ(A) = ∫_0^1 [ λ(A ∩ L(εω)) / λ(L(εω)) ] dω
if L(ε) = {x ∈ X; f(x) > ε}
[Roberts & Rosenthal, 1998]
MCMC and Likelihood-free Methods
The Gibbs Sampler
Convergence
Slice sampler: drift
Under differentiability and monotonicity conditions, the slice
sampler also verifies a drift condition with V(x) = f(x)^(−β), is
geometrically ergodic, and there even exist explicit bounds on the
total variation distance
[Roberts & Rosenthal, 1998]
MCMC and Likelihood-free Methods
The Gibbs Sampler
Convergence
Slice sampler: drift
Under differentiability and monotonicity conditions, the slice
sampler also verifies a drift condition with V(x) = f(x)^(−β), is
geometrically ergodic, and there even exist explicit bounds on the
total variation distance
[Roberts & Rosenthal, 1998]
Example (Exponential Exp(1))
For n > 23,
||K^n(x, ·) − f(·)||_TV ≤ .054865 (0.985015)^n (n − 15.7043)
MCMC and Likelihood-free Methods
The Gibbs Sampler
Convergence
Slice sampler: convergence
no more slice detail
Theorem
For any density such that
(∂/∂ε) λ({x ∈ X; f(x) > ε}) is non-increasing,
then
||K^523(x, ·) − f(·)||_TV ≤ .0095
[Roberts & Rosenthal, 1998]
MCMC and Likelihood-free Methods
The Gibbs Sampler
Convergence
A poor slice sampler
Example
Consider
f(x) = exp{−||x||}, x ∈ R^d
Slice sampler equivalent to a
one-dimensional slice sampler on
π(z) = z^(d−1) e^(−z), z > 0,
or on
π(u) = e^(−u^(1/d)), u > 0
Poor performance when d large
(heavy tails)
[Figure: sample runs of log(u) and ACFs for log(u) in dimensions 1, 10, 20, and 100]
[Roberts & Rosenthal, 1998]
MCMC and Likelihood-free Methods
The Gibbs Sampler
The Hammersley-Clifford theorem
Hammersley-Clifford theorem
An illustration that conditionals determine the joint distribution
Theorem
If the joint density g(y1, y2) has conditional distributions
g1(y1|y2) and g2(y2|y1), then
g(y1, y2) = g2(y2|y1) / ∫ [ g2(v|y1) / g1(y1|v) ] dv .
[Hammersley & Clifford, circa 1970]
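This identity can be checked numerically; a sketch in R for a standard
bivariate normal with correlation ρ, whose conditionals are both N(ρ·, 1 − ρ2):
rho = 0.6; y1 = 0.3; y2 = -0.5
g1 = function(u, v) dnorm(u, rho*v, sqrt(1 - rho^2))   # g1(y1|y2)
g2 = function(v, u) dnorm(v, rho*u, sqrt(1 - rho^2))   # g2(y2|y1)
den = integrate(function(v) g2(v, y1)/g1(y1, v), -Inf, Inf)$value
g2(y2, y1)/den                                         # recovers g(y1, y2)
# direct evaluation of the bivariate normal density, for comparison:
exp(-(y1^2 - 2*rho*y1*y2 + y2^2)/(2*(1 - rho^2)))/(2*pi*sqrt(1 - rho^2))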
MCMC and Likelihood-free Methods
The Gibbs Sampler
The Hammersley-Clifford theorem
General HC decomposition
Under the positivity condition, the joint distribution g satisfies
g(y1, . . . , yp) ∝ Π_{j=1}^p [ g_{ℓj}(y_{ℓj} | y_{ℓ1}, . . . , y_{ℓ(j−1)}, y′_{ℓ(j+1)}, . . . , y′_{ℓp})
/ g_{ℓj}(y′_{ℓj} | y_{ℓ1}, . . . , y_{ℓ(j−1)}, y′_{ℓ(j+1)}, . . . , y′_{ℓp}) ]
for every permutation ℓ on {1, 2, . . . , p} and every y′ ∈ Y.
MCMC and Likelihood-free Methods
Population Monte Carlo
Sequential importance sampling
Computational issues in Bayesian statistics
The Metropolis-Hastings Algorithm
The Gibbs Sampler
Population Monte Carlo
Approximate Bayesian computation
ABC for model choice
MCMC and Likelihood-free Methods
Population Monte Carlo
Importance sampling (revisited)
basic importance
Approximation of integrals
I = ∫ h(x) π(x) dx
by unbiased estimators
Î = (1/n) Σ_{i=1}^n ωi h(xi)
when
x1, . . . , xn iid∼ q(x) and ωi := π(xi) / q(xi)
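A minimal R sketch with π = N(0, 1), h(x) = x2, and a Cauchy instrumental q,
all illustrative choices for which the true value is I = 1:
n = 10^4
x = rcauchy(n)              # x_1, ..., x_n iid ~ q
w = dnorm(x)/dcauchy(x)     # omega_i = pi(x_i)/q(x_i)
Ihat = mean(w*x^2)          # unbiased estimate of I = E_pi[X^2] = 1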
MCMC and Likelihood-free Methods
Population Monte Carlo
Iterated importance sampling
As in Markov Chain Monte Carlo (MCMC) algorithms,
introduction of a temporal dimension:
xi^(t) ∼ qt(x|xi^(t−1)), i = 1, . . . , n, t = 1, . . .
and
Ît = (1/n) Σ_{i=1}^n ωi^(t) h(xi^(t))
is still unbiased for
ωi^(t) = πt(xi^(t)) / qt(xi^(t)|xi^(t−1)), i = 1, . . . , n
MCMC and Likelihood-free Methods
Population Monte Carlo
Fundamental importance equality
Preservation of unbiasedness
E[ h(X^(t)) π(X^(t)) / qt(X^(t)|X^(t−1)) ]
= ∫∫ h(x) [ π(x) / qt(x|y) ] qt(x|y) g(y) dx dy
= ∫ h(x) π(x) dx
for any distribution g on X^(t−1)
MCMC and Likelihood-free Methods
Population Monte Carlo
Sequential variance decomposition
Furthermore,
var(Ît) = (1/n^2) Σ_{i=1}^n var( ωi^(t) h(xi^(t)) ),
if var(ωi^(t)) exists, because the xi^(t)'s are conditionally uncorrelated
Note
This decomposition is still valid for correlated [in i] xi^(t)'s when
incorporating the importance weights ωi^(t)
MCMC and Likelihood-free Methods
Population Monte Carlo
Simulation of a population
The importance distribution of the sample (a.k.a. particles) x^(t)
qt(x^(t)|x^(t−1))
can depend on the previous sample x^(t−1) in any possible way as
long as the marginal distributions
qit(x) = ∫ qt(x^(t)) dx^(t)_{−i}
can be expressed to build the importance weights
ωit = π(xi^(t)) / qit(xi^(t))
MCMC and Likelihood-free Methods
Population Monte Carlo
Special case of the product proposal
If
qt(x^(t)|x^(t−1)) = Π_{i=1}^n qit(xi^(t)|x^(t−1))
[Independent proposals]
then
var(Ît) = (1/n^2) Σ_{i=1}^n var( ωi^(t) h(xi^(t)) )
MCMC and Likelihood-free Methods
Population Monte Carlo
Validation
skip validation
E[ ωi^(t) h(Xi^(t)) · ωj^(t) h(Xj^(t)) ]
= ∫∫∫ h(xi) [ π(xi) / qit(xi|x^(t−1)) ] [ π(xj) / qjt(xj|x^(t−1)) ] h(xj)
× qit(xi|x^(t−1)) qjt(xj|x^(t−1)) dxi dxj g(x^(t−1)) dx^(t−1)
= Eπ[h(X)]^2
whatever the distribution g on x^(t−1)
MCMC and Likelihood-free Methods
Population Monte Carlo
Self-normalised version
In general, π is unscaled and the weight
ωi^(t) ∝ π(xi^(t)) / qit(xi^(t)), i = 1, . . . , n ,
is scaled so that
Σ_i ωi^(t) = 1
MCMC and Likelihood-free Methods
Population Monte Carlo
Self-normalised version properties
Loss of the unbiasedness property and the variance
decomposition
Normalising constant can be estimated by
ĉt = (1/tn) Σ_{τ=1}^t Σ_{i=1}^n π(xi^(τ)) / qiτ(xi^(τ))
Variance decomposition (approximately) recovered if ĉt−1 is
used instead
MCMC and Likelihood-free Methods
Population Monte Carlo
Sampling importance resampling
Importance sampling from g can also produce samples from the
target π
[Rubin, 1987]
MCMC and Likelihood-free Methods
Population Monte Carlo
Sampling importance resampling
Importance sampling from g can also produce samples from the
target π
[Rubin, 1987]
Theorem (Bootstrapped importance sampling)
If a sample (xi*)1≤i≤m is derived from the weighted sample
(xi, ωi)1≤i≤n by multinomial sampling with weights ωi, then
xi* ∼ π(x)
MCMC and Likelihood-free Methods
Population Monte Carlo
Sampling importance resampling
Importance sampling from g can also produce samples from the
target π
[Rubin, 1987]
Theorem (Bootstrapped importance sampling)
If a sample (xi*)1≤i≤m is derived from the weighted sample
(xi, ωi)1≤i≤n by multinomial sampling with weights ωi, then
xi* ∼ π(x)
Note
Obviously, the xi*'s are not iid
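In R, the multinomial resampling step is a single call to sample(); a sketch
reusing the normal target/Cauchy instrumental pair from above (illustrative):
n = 10^4; m = 10^3
x = rcauchy(n)                                  # weighted sample from g
w = dnorm(x)/dcauchy(x)                         # importance weights for pi = N(0,1)
xstar = sample(x, m, replace = TRUE, prob = w)  # x*_i approximately ~ pi (not iid)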
MCMC and Likelihood-free Methods
Population Monte Carlo
Iterated sampling importance resampling
This principle can be extended to iterated importance sampling:
After each iteration, resampling produces a sample from π
[Again, not iid!]
MCMC and Likelihood-free Methods
Population Monte Carlo
Iterated sampling importance resampling
This principle can be extended to iterated importance sampling:
After each iteration, resampling produces a sample from π
[Again, not iid!]
Incentive
Use previous sample(s) to learn about π and q
MCMC and Likelihood-free Methods
Population Monte Carlo
Generic Population Monte Carlo
Algorithm (Population Monte Carlo Algorithm)
For t = 1, . . . , T
For i = 1, . . . , n,
1. Select the generating distribution qit(·)
2. Generate x̃i^(t) ∼ qit(x)
3. Compute ωi^(t) = π(x̃i^(t)) / qit(x̃i^(t))
Normalise the ωi^(t)'s into ω̄i^(t)'s
Generate Ji,t ∼ M((ω̄i^(t))1≤i≤N) and set xi,t = x̃_{Ji,t}^(t)
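A minimal R sketch of a few PMC iterations, taking every qit to be a normal
random-walk kernel with fixed scale τ centred at the previous particle; the
target and all settings are illustrative:
target = function(x) 0.3*dnorm(x, -2) + 0.7*dnorm(x, 2)   # unnormalised pi
n = 1000; tau = 1
x = rnorm(n, 0, 5)                        # initial particle cloud
for (t in 1:10) {
  xt = rnorm(n, x, tau)                   # 2. xtilde_i ~ q_it = N(x_i, tau^2)
  w = target(xt)/dnorm(xt, x, tau)        # 3. omega_i = pi/q_it
  x = sample(xt, n, replace = TRUE, prob = w/sum(w))   # resampling step
}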
MCMC and Likelihood-free Methods
Population Monte Carlo
D-kernels in competition
A general adaptive construction:
Construct qi,t as a mixture of D different transition kernels
depending on xi^(t−1)
qi,t = Σ_{ℓ=1}^D pt,ℓ Kℓ(xi^(t−1), x), Σ_{ℓ=1}^D pt,ℓ = 1 ,
and adapt the weights pt,ℓ.
MCMC and Likelihood-free Methods
Population Monte Carlo
D-kernels in competition
A general adaptive construction:
Construct qi,t as a mixture of D different transition kernels
depending on xi^(t−1)
qi,t = Σ_{ℓ=1}^D pt,ℓ Kℓ(xi^(t−1), x), Σ_{ℓ=1}^D pt,ℓ = 1 ,
and adapt the weights pt,ℓ.
Darwinian example
Take pt,ℓ proportional to the survival rate of the points
(a.k.a. particles) xi^(t) generated from Kℓ
MCMC and Likelihood-free Methods
Population Monte Carlo
Implementation
Algorithm (D-kernel PMC)
For t = 1, . . . , T
generate (Ki,t)1≤i≤N ∼ M((pt,k)1≤k≤D)
for 1 ≤ i ≤ N, generate
x̃i,t ∼ K_{Ki,t}(x)
compute and renormalize the importance weights ωi,t
generate (Ji,t)1≤i≤N ∼ M((ωi,t)1≤i≤N)
take xi,t = x̃_{Ji,t,t} and pt+1,d = Σ_{i=1}^N ω̄i,t Id(Ki,t)
MCMC and Likelihood-free Methods
Population Monte Carlo
Links with particle filters
Sequential setting where π = πt changes with t: Population
Monte Carlo also adapts to this case
Can be traced back all the way to Hammersley and Morton
(1954) and the self-avoiding random walk problem
Gilks and Berzuini (2001) produce iterated samples with (SIR)
resampling steps, and add an MCMC step: this step must use
a πt invariant kernel
Chopin (2001) uses iterated importance sampling to handle
large datasets: this is a special case of PMC where the qit’s
are the posterior distributions associated with a portion kt of
the observed dataset
ENGLISH6-Q4-W3.pptxqurter our high choomENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomnelietumpap1
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfJemuel Francisco
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfSpandanaRallapalli
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxHumphrey A Beña
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Seán Kennedy
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfphamnguyenenglishnb
 
FILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipinoFILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipinojohnmickonozaleda
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfMr Bounab Samir
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management systemChristalin Nelson
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatYousafMalik24
 

Recently uploaded (20)

ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
 
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptxYOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
YOUVE_GOT_EMAIL_PRELIMS_EL_DORADO_2024.pptx
 
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptxLEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
ENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choomENGLISH6-Q4-W3.pptxqurter our high choom
ENGLISH6-Q4-W3.pptxqurter our high choom
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdf
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
 
FILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipinoFILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipino
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
 
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptxFINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
 

MCMC and likelihood-free methods

  • 13. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics Latent variables Complex maximisation Standard maximization techniques often fail to find the global maximum because of multimodality or undesirable behavior (usually at the frontier of the domain) of the likelihood function. Example In the special case f(x|µ, σ) = (1 − ε) exp{(−1/2)x²} + (ε/σ) exp{(−1/2σ²)(x − µ)²} with ε > 0 known, whatever n, the likelihood is unbounded: lim_{σ→0} L(x1, . . . , xn|µ = x1, σ) = ∞
  • 14. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics Latent variables Unbounded likelihood [Figure: likelihood surfaces in (µ, σ) for sample sizes n = 3, 6, 12, 24, 48, 96] Case of the 0.3N(0, 1) + 0.7N(µ, σ) likelihood
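A quick numerical check of this degeneracy, as an editor's sketch in R (the language of the deck's own code example below), with an arbitrary simulated sample: fixing µ at the first observation and letting σ shrink sends the log-likelihood to +∞.

# mixture log-likelihood evaluated at mu = x[1]; the component density
# eps * dnorm(x[1], mu, sigma) blows up like eps/sigma as sigma -> 0
set.seed(42)
x <- rnorm(25)                      # arbitrary sample, any data works
eps <- 0.3                          # known weight eps > 0
loglik <- function(mu, sigma)
  sum(log((1 - eps) * dnorm(x) + eps * dnorm(x, mu, sigma)))
for (s in 10^-(1:6)) cat("sigma =", s, " loglik =", loglik(x[1], s), "\n")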
  • 15. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics Latent variables Example (Mixture once again) press for MA Observations from x1, . . . , xn ∼ f(x|θ) = pϕ(x; µ1, σ1) + (1 − p)ϕ(x; µ2, σ2)
  • 16. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics Latent variables Example (Mixture once again) press for MA Observations from x1, . . . , xn ∼ f(x|θ) = pϕ(x; µ1, σ1) + (1 − p)ϕ(x; µ2, σ2) Prior µi|σi ∼ N(ξi, σi²/ni), σi² ∼ IG(νi/2, si²/2), p ∼ Be(α, β)
  • 17. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics Latent variables Example (Mixture once again) press for MA Observations from x1, . . . , xn ∼ f(x|θ) = pϕ(x; µ1, σ1) + (1 − p)ϕ(x; µ2, σ2) Prior µi|σi ∼ N(ξi, σi²/ni), σi² ∼ IG(νi/2, si²/2), p ∼ Be(α, β) Posterior π(θ|x1, . . . , xn) ∝ ∏_{j=1}^n {pϕ(xj; µ1, σ1) + (1 − p)ϕ(xj; µ2, σ2)} π(θ) = Σ_{ℓ=0}^n Σ_{(kt)} ω(kt) π(θ|(kt))
  • 18. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics Latent variables Example (Mixture once again (cont'd)) For a given permutation (kt), conditional posterior distribution π(θ|(kt)) = N(ξ1(kt), σ1²/(n1 + ℓ)) × IG((ν1 + ℓ)/2, s1(kt)/2) × N(ξ2(kt), σ2²/(n2 + n − ℓ)) × IG((ν2 + n − ℓ)/2, s2(kt)/2) × Be(α + ℓ, β + n − ℓ)
  • 19. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics Latent variables Example (Mixture once again (cont'd)) where x̄1(kt) = (1/ℓ) Σ_{t=1}^ℓ x_{kt}, ŝ1²(kt) = Σ_{t=1}^ℓ (x_{kt} − x̄1(kt))², x̄2(kt) = (1/(n − ℓ)) Σ_{t=ℓ+1}^n x_{kt}, ŝ2²(kt) = Σ_{t=ℓ+1}^n (x_{kt} − x̄2(kt))² and ξ1(kt) = (n1ξ1 + ℓ x̄1(kt))/(n1 + ℓ), ξ2(kt) = (n2ξ2 + (n − ℓ) x̄2(kt))/(n2 + n − ℓ), s1(kt) = s1² + ŝ1²(kt) + [n1ℓ/(n1 + ℓ)] (ξ1 − x̄1(kt))², s2(kt) = s2² + ŝ2²(kt) + [n2(n − ℓ)/(n2 + n − ℓ)] (ξ2 − x̄2(kt))², posterior updates of the hyperparameters
  • 20. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics Latent variables Example (Mixture once again) Bayes estimator of θ: δπ(x1, . . . , xn) = Σ_{ℓ=0}^n Σ_{(kt)} ω(kt) Eπ[θ|x, (kt)] Too costly: 2ⁿ terms
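To make the 2ⁿ blow-up concrete, here is a brute-force enumeration in R under a simplified setup of the editor's choosing (known means, unit variances, and only a Be(1, 1) prior on p, rather than the full conjugate prior above): even n = 12 already requires 4096 terms.

# exact posterior mean of p by enumerating all 2^n allocation vectors
mu <- c(0, 2.5); a0 <- b0 <- 1          # known means, Be(1, 1) prior on p
set.seed(1)
n <- 12                                  # 2^12 = 4096 elementary terms
x <- c(rnorm(8, mu[1]), rnorm(4, mu[2]))
post <- evid <- 0
for (code in 0:(2^n - 1)) {
  z <- as.integer(intToBits(code))[1:n] + 1          # allocation in {1, 2}^n
  n1 <- sum(z == 1)
  w <- prod(dnorm(x, mu[z])) * beta(a0 + n1, b0 + n - n1)   # weight of this term
  post <- post + w * (a0 + n1) / (a0 + b0 + n)       # E[p | x, z], weighted
  evid <- evid + w
}
post / evid                                          # exact E[p | x]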
  • 21. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The AR(p) model AR(p) model Auto-regressive representation of a time series, xt|xt−1, . . . ∼ N(µ + Σ_{i=1}^p ϱi (xt−i − µ), σ²)
  • 22. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The AR(p) model AR(p) model Auto-regressive representation of a time series, xt|xt−1, . . . ∼ N(µ + Σ_{i=1}^p ϱi (xt−i − µ), σ²) Generalisation of AR(1) Among the most commonly used models in dynamic settings More challenging than the static models (stationarity constraints) Different models depending on the processing of the starting value x0
  • 23. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The AR(p) model Unknown stationarity constraints Practical difficulty: for complex models, stationarity constraints get quite involved, to the point of being unknown in some cases Example (AR(1)) Case of linear Markovian dependence on the last value xt = µ + ϱ(xt−1 − µ) + εt, εt i.i.d. ∼ N(0, σ²) If |ϱ| < 1, (xt)t∈Z can be written as xt = µ + Σ_{j=0}^∞ ϱ^j εt−j and this is a stationary representation.
  • 24. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The AR(p) model Stationary but... If |ϱ| > 1, alternative stationary representation xt = µ − Σ_{j=1}^∞ ϱ^{−j} εt+j.
  • 25. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The AR(p) model Stationary but... If |ϱ| > 1, alternative stationary representation xt = µ − Σ_{j=1}^∞ ϱ^{−j} εt+j. This stationary solution is criticized as artificial because xt is correlated with future white noises (εs)s>t, unlike the case when |ϱ| < 1. Non-causal representation...
  • 26. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The AR(p) model Stationarity+causality Stationarity constraints in the prior as a restriction on the values of θ. Theorem AR(p) model second-order stationary and causal iff the roots of the polynomial P(x) = 1 − Σ_{i=1}^p ϱi x^i are all outside the unit circle
  • 27. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The AR(p) model Stationarity constraints Under stationarity constraints, complex parameter space: each value of ϱ needs to be checked for roots of the corresponding polynomial with modulus less than 1
  • 28. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The AR(p) model Stationarity constraints Under stationarity constraints, complex parameter space: each value of ϱ needs to be checked for roots of the corresponding polynomial with modulus less than 1 E.g., for an AR(2) process with autoregressive polynomial P(u) = 1 − ϱ1 u − ϱ2 u², the constraint is ϱ1 + ϱ2 < 1, ϱ2 − ϱ1 < 1 and |ϱ2| < 1 [Figure: the corresponding triangular region in the (θ1, θ2) plane]
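The root condition is easy to check numerically with base R's polyroot, as in this illustrative sketch:

# stationarity/causality check: all roots of P(x) = 1 - rho_1 x - ... - rho_p x^p
# must have modulus larger than 1
ar_stationary <- function(rho) all(Mod(polyroot(c(1, -rho))) > 1)
ar_stationary(c(0.5, 0.3))   # TRUE: inside the AR(2) triangle
ar_stationary(c(0.5, 0.6))   # FALSE: rho1 + rho2 > 1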
  • 29. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model The MA(q) model Alternative type of time series xt = µ + εt − Σ_{j=1}^q ϑj εt−j, εt ∼ N(0, σ²) Stationary but, for identifiability considerations, the polynomial Q(x) = 1 − Σ_{j=1}^q ϑj x^j must have all its roots outside the unit circle.
  • 30. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model Identifiability Example For the MA(1) model, xt = µ + εt − ϑ1 εt−1, var(xt) = (1 + ϑ1²)σ², which can also be written xt = µ + ε̃t−1 − (1/ϑ1) ε̃t, ε̃ ∼ N(0, ϑ1²σ²). Both pairs (ϑ1, σ) & (1/ϑ1, ϑ1σ) lead to alternative representations of the same model.
  • 31. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model Properties of MA models Non-Markovian model (but special case of hidden Markov) Autocovariance γx(s) is null for |s| > q
  • 32. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model Representations x1:T is a normal random vector with constant mean µ and banded Toeplitz covariance matrix Σ = [ σ̃² γ1 γ2 . . . γq 0 . . . 0 0 ; γ1 σ̃² γ1 . . . γq−1 γq . . . 0 0 ; . . . ; 0 0 0 . . . 0 0 . . . γ1 σ̃² ], with (|s| ≤ q) γs = σ² Σ_{i=0}^{q−|s|} ϑi ϑi+|s| (taking ϑ0 = −1, so that σ̃² = γ0) Not manageable in practice [large T's]
  • 33. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model Representations (contd.) Conditional on past innovations (ε0, . . . , ε−q+1), L(µ, ϑ1, . . . , ϑq, σ|x1:T, ε0, . . . , ε−q+1) ∝ σ^{−T} ∏_{t=1}^T exp{ −(xt − µ + Σ_{j=1}^q ϑj ε̂t−j)² / 2σ² }, where (t > 0) ε̂t = xt − µ + Σ_{j=1}^q ϑj ε̂t−j, ε̂0 = ε0, . . . , ε̂1−q = ε1−q Recursive definition of the likelihood, still costly O(T × q)
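The recursion translates directly into R; a sketch, assuming the conditioning innovations ε̂0, . . . , ε̂1−q are set to zero (one convenient choice):

# conditional log-likelihood of an MA(q) model via the hat(eps) recursion,
# with hat(eps)_0 = ... = hat(eps)_{1-q} = 0; cost is O(T x q)
ma_cloglik <- function(x, mu, theta, sigma) {
  q <- length(theta)
  eps <- rep(0, q)                        # buffer of the q latest hat(eps)
  ll <- 0
  for (t in seq_along(x)) {
    e <- x[t] - mu + sum(theta * eps)     # hat(eps)_t
    ll <- ll + dnorm(e, 0, sigma, log = TRUE)
    eps <- c(e, eps)[1:q]                 # shift the buffer
  }
  ll
}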
  • 34. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model Representations (contd.) Encompassing approach for general time series models State-space representation xt = Gyt + εt , (1) yt+1 = Fyt + ξt , (2) (1) is the observation equation and (2) is the state equation
  • 35. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model Representations (contd.) Encompassing approach for general time series models State-space representation xt = Gyt + εt , (1) yt+1 = Fyt + ξt , (2) (1) is the observation equation and (2) is the state equation Note This is a special case of hidden Markov model
  • 36. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model MA(q) state-space representation For the MA(q) model, take yt = (εt−q, . . . , εt−1, εt)′ and then yt+1 = F yt + εt+1 (0, . . . , 0, 1)′, where F is the (q + 1) × (q + 1) shift matrix with ones on the superdiagonal and zeroes elsewhere, and xt = µ − (ϑq ϑq−1 . . . ϑ1 −1) yt.
  • 37. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model MA(q) state-space representation (cont'd) Example For the MA(1) model, observation equation xt = (1 0) yt with yt = (y1t y2t)′ directed by the state equation yt+1 = [0 1; 0 0] yt + εt+1 (1 ϑ1)′.
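A short simulation sketch of this representation (the mean µ is added back and the sign of ϑ1 is flipped in the state shock so the output matches xt = µ + εt − ϑ1 εt−1; these adjustments are the editor's):

# simulate an MA(1) through its state-space form; Fmat = [0 1; 0 0]
ma1_ss <- function(T, mu, theta1, sigma) {
  Fmat <- matrix(c(0, 0, 1, 0), 2, 2)
  y <- c(0, 0); x <- numeric(T)
  for (t in 1:T) {
    x[t] <- mu + y[1]                               # x_t = mu + (1 0) y_t
    y <- as.vector(Fmat %*% y) + rnorm(1, 0, sigma) * c(1, -theta1)
  }
  x
}
# autocorrelation cuts off after lag 1, as an MA(1) requires
acf(ma1_ss(10^4, 0, 0.6, 1), lag.max = 5, plot = FALSE)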
  • 38. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model c A typology of Bayes computational problems (i). latent variable models in general
  • 39. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model c A typology of Bayes computational problems (i). latent variable models in general (ii). use of a complex parameter space, as for instance in constrained parameter sets like those resulting from imposing stationarity constraints in dynamic models;
  • 40. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model c A typology of Bayes computational problems (i). latent variable models in general (ii). use of a complex parameter space, as for instance in constrained parameter sets like those resulting from imposing stationarity constraints in dynamic models; (iii). use of a complex sampling model with an intractable likelihood, as for instance in some graphical models;
  • 41. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model c A typology of Bayes computational problems (i). latent variable models in general (ii). use of a complex parameter space, as for instance in constrained parameter sets like those resulting from imposing stationarity constraints in dynamic models; (iii). use of a complex sampling model with an intractable likelihood, as for instance in some graphical models; (iv). use of a huge dataset;
  • 42. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model c A typology of Bayes computational problems (i). latent variable models in general (ii). use of a complex parameter space, as for instance in constrained parameter sets like those resulting from imposing stationarity constraints in dynamic models; (iii). use of a complex sampling model with an intractable likelihood, as for instance in some graphical models; (iv). use of a huge dataset; (v). use of a complex prior distribution (which may be the posterior distribution associated with an earlier sample);
  • 43. MCMC and Likelihood-free Methods Computational issues in Bayesian statistics The MA(q) model A typology of Bayes computational problems (i). latent variable models in general (ii). use of a complex parameter space, as for instance in constrained parameter sets like those resulting from imposing stationarity constraints in dynamic models; (iii). use of a complex sampling model with an intractable likelihood, as for instance in some graphical models; (iv). use of a huge dataset; (v). use of a complex prior distribution (which may be the posterior distribution associated with an earlier sample); (vi). use of a particular inferential procedure, as for instance Bayes factors B^π_01(x) = [P(θ ∈ Θ0 | x) / P(θ ∈ Θ1 | x)] / [π(θ ∈ Θ0) / π(θ ∈ Θ1)].
  • 44. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis-Hastings Algorithm Computational issues in Bayesian statistics The Metropolis-Hastings Algorithm Monte Carlo basics Importance Sampling Monte Carlo Methods based on Markov Chains The Metropolis–Hastings algorithm Random-walk Metropolis-Hastings algorithms Extensions The Gibbs Sampler Population Monte Carlo
  • 45. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Monte Carlo basics General purpose Given a density π known up to a normalizing constant, and an integrable function h, compute Π(h) = ∫ h(x)π(x)µ(dx) = ∫ h(x)π̃(x)µ(dx) / ∫ π̃(x)µ(dx) when ∫ h(x)π̃(x)µ(dx) is intractable.
  • 46. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Monte Carlo basics Monte Carlo 101 Generate an iid sample x1, . . . , xN from π and estimate Π(h) by ˆΠ^MC_N(h) = N⁻¹ Σ_{i=1}^N h(xi). LLN: ˆΠ^MC_N(h) → Π(h) a.s. If Π(h²) = ∫ h²(x)π(x)µ(dx) < ∞, CLT: √N (ˆΠ^MC_N(h) − Π(h)) →_L N(0, Π{[h − Π(h)]²}).
  • 47. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Monte Carlo basics Monte Carlo 101 Generate an iid sample x1, . . . , xN from π and estimate Π(h) by ˆΠ^MC_N(h) = N⁻¹ Σ_{i=1}^N h(xi). LLN: ˆΠ^MC_N(h) → Π(h) a.s. If Π(h²) = ∫ h²(x)π(x)µ(dx) < ∞, CLT: √N (ˆΠ^MC_N(h) − Π(h)) →_L N(0, Π{[h − Π(h)]²}). Caveat announcing MCMC Often impossible or inefficient to simulate directly from Π
  • 48. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Importance Sampling Importance Sampling For Q proposal distribution such that Q(dx) = q(x)µ(dx), alternative representation Π(h) = ∫ h(x){π/q}(x) q(x)µ(dx).
  • 49. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Importance Sampling Importance Sampling For Q proposal distribution such that Q(dx) = q(x)µ(dx), alternative representation Π(h) = ∫ h(x){π/q}(x) q(x)µ(dx). Principle of importance Generate an iid sample x1, . . . , xN ∼ Q and estimate Π(h) by ˆΠ^IS_{Q,N}(h) = N⁻¹ Σ_{i=1}^N h(xi){π/q}(xi). return to pMC
  • 50. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Importance Sampling Properties of importance Then LLN: ˆΠ^IS_{Q,N}(h) → Π(h) a.s. and if Q((hπ/q)²) < ∞, CLT: √N (ˆΠ^IS_{Q,N}(h) − Π(h)) →_L N(0, Q{(hπ/q − Π(h))²}).
  • 51. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Importance Sampling Properties of importance Then LLN: ˆΠ^IS_{Q,N}(h) → Π(h) a.s. and if Q((hπ/q)²) < ∞, CLT: √N (ˆΠ^IS_{Q,N}(h) − Π(h)) →_L N(0, Q{(hπ/q − Π(h))²}). Caveat If the normalizing constant of π is unknown, impossible to use ˆΠ^IS_{Q,N} Generic problem in Bayesian Statistics: π(θ|x) ∝ f(x|θ)π(θ).
  • 52. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Importance Sampling Self-Normalised Importance Sampling Self-normalized version ˆΠ^SNIS_{Q,N}(h) = [Σ_{i=1}^N {π/q}(xi)]⁻¹ Σ_{i=1}^N h(xi){π/q}(xi).
  • 53. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Importance Sampling Self-Normalised Importance Sampling Self-normalized version ˆΠ^SNIS_{Q,N}(h) = [Σ_{i=1}^N {π/q}(xi)]⁻¹ Σ_{i=1}^N h(xi){π/q}(xi). LLN: ˆΠ^SNIS_{Q,N}(h) → Π(h) a.s. and if Π((1 + h²)(π/q)) < ∞, CLT: √N (ˆΠ^SNIS_{Q,N}(h) − Π(h)) →_L N(0, Π{(π/q)(h − Π(h))²}).
  • 54. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Importance Sampling Self-Normalised Importance Sampling Self-normalized version ˆΠ^SNIS_{Q,N}(h) = [Σ_{i=1}^N {π/q}(xi)]⁻¹ Σ_{i=1}^N h(xi){π/q}(xi). LLN: ˆΠ^SNIS_{Q,N}(h) → Π(h) a.s. and if Π((1 + h²)(π/q)) < ∞, CLT: √N (ˆΠ^SNIS_{Q,N}(h) − Π(h)) →_L N(0, Π{(π/q)(h − Π(h))²}). The quality of the SNIS approximation depends on the choice of Q
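A concrete SNIS illustration (editor's sketch; the unnormalised target and the Cauchy proposal, picked for its heavy tails, are arbitrary choices):

# SNIS with an unnormalised target pi_tilde(x) = exp(-x^2/2) (2 + sin(3x))
set.seed(7)
N <- 10^5
xs <- rcauchy(N)                                        # proposal Q = Cauchy(0, 1)
w <- exp(-xs^2 / 2) * (2 + sin(3 * xs)) / dcauchy(xs)   # pi_tilde / q weights
sum(w * xs^2) / sum(w)                                  # SNIS estimate of Pi(h), h(x) = x^2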
  • 55. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Monte Carlo Methods based on Markov Chains Running Monte Carlo via Markov Chains (MCMC) It is not necessary to use a sample from the distribution f to approximate the integral I = h(x)f(x)dx ,
  • 56. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Monte Carlo Methods based on Markov Chains Running Monte Carlo via Markov Chains (MCMC) It is not necessary to use a sample from the distribution f to approximate the integral I = h(x)f(x)dx , We can obtain X1, . . . , Xn ∼ f (approx) without directly simulating from f, using an ergodic Markov chain with stationary distribution f
  • 57. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Monte Carlo Methods based on Markov Chains Running Monte Carlo via Markov Chains (2) Idea For an arbitrary starting value x(0), an ergodic chain (X(t)) is generated using a transition kernel with stationary distribution f
  • 58. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Monte Carlo Methods based on Markov Chains Running Monte Carlo via Markov Chains (2) Idea For an arbitrary starting value x(0), an ergodic chain (X(t)) is generated using a transition kernel with stationary distribution f Ensures the convergence in distribution of (X(t)) to a random variable from f. For a “large enough” T0, X(T0) can be considered as distributed from f Produce a dependent sample X(T0), X(T0+1), . . ., which is generated from f, sufficient for most approximation purposes.
  • 59. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Monte Carlo Methods based on Markov Chains Running Monte Carlo via Markov Chains (2) Idea For an arbitrary starting value x(0), an ergodic chain (X(t)) is generated using a transition kernel with stationary distribution f Ensures the convergence in distribution of (X(t)) to a random variable from f. For a “large enough” T0, X(T0) can be considered as distributed from f Produce a dependent sample X(T0), X(T0+1), . . ., which is generated from f, sufficient for most approximation purposes. Problem: How can one build a Markov chain with a given stationary distribution?
  • 60. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis–Hastings algorithm The Metropolis–Hastings algorithm Basics The algorithm uses the objective (target) density f and a conditional density q(y|x) called the instrumental (or proposal) distribution
  • 61. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis–Hastings algorithm The MH algorithm Algorithm (Metropolis–Hastings) Given x(t), 1. Generate Yt ∼ q(y|x(t)). 2. Take X(t+1) = Yt with prob. ρ(x(t), Yt), and X(t+1) = x(t) with prob. 1 − ρ(x(t), Yt), where ρ(x, y) = min{ [f(y) q(x|y)] / [f(x) q(y|x)], 1 }.
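The algorithm is a few lines of R (a generic sketch; rq(x) draws from q(·|x) and dq(a, b) evaluates q(a|b), both supplied by the user):

# generic Metropolis-Hastings sampler; f may be known only up to a constant
mh <- function(T, x0, f, rq, dq) {
  x <- numeric(T); x[1] <- x0
  for (t in 1:(T - 1)) {
    y <- rq(x[t])
    # acceptance probability rho(x, y) = min{ f(y) q(x|y) / [f(x) q(y|x)], 1 }
    rho <- min(1, f(y) * dq(x[t], y) / (f(x[t]) * dq(y, x[t])))
    x[t + 1] <- if (runif(1) < rho) y else x[t]
  }
  x
}
# example: N(0, 1) target with an independent Cauchy proposal
chain <- mh(10^4, 0, dnorm, function(x) rcauchy(1), function(a, b) dcauchy(a))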
  • 62. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis–Hastings algorithm Features Independent of normalizing constants for both f and q(·|x) (i.e., those constants independent of x) Never move to values with f(y) = 0 The chain (x(t))t may take the same value several times in a row, even though f is a density wrt Lebesgue measure The sequence (yt)t is usually not a Markov chain
  • 63. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis–Hastings algorithm Convergence properties 1. The M-H Markov chain is reversible, with invariant/stationary density f since it satisfies the detailed balance condition f(y) K(y, x) = f(x) K(x, y)
  • 64. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis–Hastings algorithm Convergence properties 1. The M-H Markov chain is reversible, with invariant/stationary density f since it satisfies the detailed balance condition f(y) K(y, x) = f(x) K(x, y) 2. As f is a probability measure, the chain is positive recurrent
  • 65. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis–Hastings algorithm Convergence properties 1. The M-H Markov chain is reversible, with invariant/stationary density f since it satisfies the detailed balance condition f(y) K(y, x) = f(x) K(x, y) 2. As f is a probability measure, the chain is positive recurrent 3. If Pr[ f(Yt) q(X(t)|Yt) ≥ f(X(t)) q(Yt|X(t)) ] < 1, (1) that is, the event {X(t+1) = X(t)} is possible, then the chain is aperiodic
  • 66. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis–Hastings algorithm Convergence properties (2) 4. If q(y|x) > 0 for every (x, y), (2) the chain is irreducible
  • 67. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis–Hastings algorithm Convergence properties (2) 4. If q(y|x) > 0 for every (x, y), (2) the chain is irreducible 5. For M-H, f-irreducibility implies Harris recurrence
  • 68. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm The Metropolis–Hastings algorithm Convergence properties (2) 4. If q(y|x) > 0 for every (x, y), (2) the chain is irreducible 5. For M-H, f-irreducibility implies Harris recurrence 6. Thus, for M-H satisfying (1) and (2): (i) For h with Ef|h(X)| < ∞, lim_{T→∞} (1/T) Σ_{t=1}^T h(X(t)) = ∫ h(x) df(x) a.e. f. (ii) and lim_{n→∞} ‖ ∫ K^n(x, ·)µ(dx) − f ‖_TV = 0 for every initial distribution µ, where K^n(x, ·) denotes the kernel for n transitions.
  • 69. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Random walk Metropolis–Hastings Use of a local perturbation as proposal Yt = X(t) + εt, where εt ∼ g, independent of X(t). The instrumental density is of the form g(y − x) and the Markov chain is a random walk if we take g to be symmetric g(x) = g(−x)
  • 70. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Algorithm (Random walk Metropolis) Given x(t) 1. Generate Yt ∼ g(y − x(t)) 2. Take X(t+1) = Yt with prob. min{1, f(Yt)/f(x(t))}, and X(t+1) = x(t) otherwise.
  • 71. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Example (Random walk and normal target) forget History! Generate N(0, 1) based on the uniform proposal [−δ, δ] [Hastings (1970)] The probability of acceptance is then ρ(x(t), yt) = exp{(x(t)² − yt²)/2} ∧ 1.
  • 72. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Example (Random walk & normal (2)) Sample statistics: δ = 0.1: mean 0.399, variance 0.698; δ = 0.5: mean −0.111, variance 1.11; δ = 1.0: mean 0.10, variance 1.06. As δ ↑, we get better histograms and a faster exploration of the support of f.
  • 73. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms [Figure: three samples based on U[−δ, δ] with (a) δ = 0.1, (b) δ = 0.5 and (c) δ = 1.0, superimposed with the convergence of the means (15,000 simulations)]
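The experiment behind these statistics is easy to replicate (an illustrative sketch using the acceptance probability above):

# random-walk Metropolis with uniform U[-delta, delta] increments, target N(0, 1)
rw_norm <- function(T, delta) {
  x <- numeric(T)
  for (t in 1:(T - 1)) {
    y <- x[t] + runif(1, -delta, delta)
    # log acceptance probability: (x^2 - y^2) / 2
    x[t + 1] <- if (log(runif(1)) < (x[t]^2 - y^2) / 2) y else x[t]
  }
  x
}
set.seed(3)
sapply(c(0.1, 0.5, 1.0), function(d) {
  out <- rw_norm(15000, d)
  c(delta = d, mean = mean(out), variance = var(out))
})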
  • 74. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Example (Mixture models) π(θ|x) ∝ ∏_{j=1}^n Σ_{ℓ=1}^k pℓ f(xj|µℓ, σℓ) π(θ)
  • 75. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Example (Mixture models) π(θ|x) ∝ ∏_{j=1}^n Σ_{ℓ=1}^k pℓ f(xj|µℓ, σℓ) π(θ) Metropolis-Hastings proposal: θ(t+1) = θ(t) + ωε(t) if u(t) < ρ(t), θ(t) otherwise, where ρ(t) = π(θ(t) + ωε(t)|x) / π(θ(t)|x) ∧ 1 and ω scaled for a good acceptance rate
  • 76. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms [Figure: pairwise plots and marginal histograms of (p, θ, τ) from random walk sampling (50,000 iterations)] General case of a 3 component normal mixture [Celeux & al., 2000]
  • 77. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms [Figure: random walk MCMC output for .7N(µ1, 1) + .3N(µ2, 1), sample path in the (µ1, µ2) plane]
  • 78. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Convergence properties Uniform ergodicity prohibited by random walk structure
  • 79. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Convergence properties Uniform ergodicity prohibited by random walk structure At best, geometric ergodicity: Theorem (Sufficient ergodicity) For a symmetric density f, log-concave in the tails, and a positive and symmetric density g, the chain (X(t)) is geometrically ergodic. [Mengersen & Tweedie, 1996] no tail effect
  • 80. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Example (Comparison of tail effects) Random-walk Metropolis–Hastings algorithms based on a N(0, 1) instrumental for the generation of (a) a N(0, 1) distribution and (b) a distribution with density ψ(x) ∝ (1 + |x|)⁻³ [Figure: 90% confidence envelopes of the means, derived from 500 parallel independent chains, for cases (a) and (b)] (1 + ξ²)/(1 + (ξ′)²) ∧ 1
  • 81. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Further convergence properties Under assumptions skip detailed convergence (A1) f is super-exponential, i.e. it is positive with positive continuous first derivative such that lim_{|x|→∞} n(x)·∇log f(x) = −∞ where n(x) := x/|x|. In words: exponential decay of f in every direction with rate tending to ∞ (A2) lim sup_{|x|→∞} n(x)·m(x) < 0, where m(x) = ∇f(x)/|∇f(x)|. In words: non-degeneracy of the contour manifold Cf(y) = {x : f(x) = f(y)} Q is geometrically ergodic, and V(x) ∝ f(x)^{−1/2} verifies the drift condition [Jarner & Hansen, 2000]
  • 82. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Further [further] convergence properties skip hyperdetailed convergence If P ψ-irreducible and aperiodic, for r = (r(n))n∈N a real-valued non-decreasing sequence such that, for all n, m ∈ N, r(n + m) ≤ r(n)r(m) and r(0) = 1, for C a small set, τC = inf{n ≥ 1, Xn ∈ C}, and h ≥ 1, assume sup_{x∈C} Ex[ Σ_{k=0}^{τC−1} r(k)h(Xk) ] < ∞,
  • 83. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms then S(f, C, r) := { x ∈ X : Ex[ Σ_{k=0}^{τC−1} r(k)h(Xk) ] < ∞ } is full and absorbing and for x ∈ S(f, C, r), lim_{n→∞} r(n) ‖P^n(x, ·) − f‖_h = 0. [Tuominen & Tweedie, 1994]
  • 84. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Comments [CLT, Rosenthal's inequality...] h-ergodicity implies CLT for additive (possibly unbounded) functionals of the chain, Rosenthal's inequality and so on... [Control of the moments of the return-time] The condition implies (because h ≥ 1) that sup_{x∈C} Ex[r0(τC)] ≤ sup_{x∈C} Ex[ Σ_{k=0}^{τC−1} r(k)h(Xk) ] < ∞, where r0(n) = Σ_{l=0}^n r(l) Can be used to derive bounds for the coupling time, an essential step to determine computable bounds, using coupling inequalities [Roberts & Tweedie, 1998; Fort & Moulines, 2000]
  • 85. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms Alternative conditions The condition is not really easy to work with... [Possible alternative conditions] (a) [Tuominen, Tweedie, 1994] There exists a sequence (Vn)n∈N, Vn ≥ r(n)h, such that (i) sup_C V0 < ∞, (ii) {V0 = ∞} ⊂ {V1 = ∞} and (iii) PVn+1 ≤ Vn − r(n)h + b r(n) IC.
  • 86. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Random-walk Metropolis-Hastings algorithms (b) [Fort, 2000] ∃ V ≥ f ≥ 1 and b < ∞ such that sup_C V < ∞ and PV(x) + Ex[ Σ_{k=0}^{σC} Δr(k)f(Xk) ] ≤ V(x) + b IC(x), where σC is the hitting time on C and Δr(k) = r(k) − r(k − 1) for k ≥ 1, Δr(0) = r(0). Result (a) ⇔ (b) ⇔ sup_{x∈C} Ex[ Σ_{k=0}^{τC−1} r(k)f(Xk) ] < ∞.
  • 87. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Langevin Algorithms Proposal based on the Langevin diffusion Lt, defined by the stochastic differential equation dLt = dBt + (1/2) ∇log f(Lt) dt, where Bt is the standard Brownian motion Theorem The Langevin diffusion is the only non-explosive diffusion which is reversible with respect to f.
  • 88. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Discretization Instead, consider the sequence x(t+1) = x(t) + (σ²/2) ∇log f(x(t)) + σεt, εt ∼ Np(0, Ip), where σ² corresponds to the discretization step
  • 89. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Discretization Instead, consider the sequence x(t+1) = x(t) + (σ²/2) ∇log f(x(t)) + σεt, εt ∼ Np(0, Ip), where σ² corresponds to the discretization step Unfortunately, the discretized chain may be transient, for instance when lim_{x→±∞} σ² |∇log f(x)| |x|⁻¹ > 1
  • 90. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions MH correction Accept the new value Yt with probability f(Yt) exp{ −‖x(t) − Yt − (σ²/2)∇log f(Yt)‖² / 2σ² } / [ f(x(t)) exp{ −‖Yt − x(t) − (σ²/2)∇log f(x(t))‖² / 2σ² } ] ∧ 1. Choice of the scaling factor σ Should lead to an acceptance rate of 0.574 to achieve optimal convergence rates (when the components of x are uncorrelated) [Roberts & Rosenthal, 1998; Girolami & Calderhead, 2011]
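A one-dimensional sketch of this Metropolis-adjusted Langevin algorithm in R (illustrative; the user supplies log f and its derivative):

# MALA: Langevin proposal plus MH correction, one-dimensional sketch
mala <- function(T, x0, logf, dlogf, sigma) {
  x <- numeric(T); x[1] <- x0
  m <- function(z) z + sigma^2 / 2 * dlogf(z)     # drift of the proposal
  for (t in 1:(T - 1)) {
    y <- rnorm(1, m(x[t]), sigma)
    lrho <- logf(y) - logf(x[t]) +
      dnorm(x[t], m(y), sigma, log = TRUE) - dnorm(y, m(x[t]), sigma, log = TRUE)
    x[t + 1] <- if (log(runif(1)) < lrho) y else x[t]
  }
  x
}
# example: N(0, 1) target; tune sigma toward the 0.574 acceptance guideline
chain <- mala(10^4, 0, function(x) -x^2 / 2, function(x) -x, sigma = 1.5)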
  • 91. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Optimizing the Acceptance Rate Problem of choice of the transition kernel from a practical point of view Most common alternatives: (a) a fully automated algorithm like ARMS; (b) an instrumental density g which approximates f, such that f/g is bounded for uniform ergodicity to apply; (c) a random walk In cases (b) and (c), the choice of g is critical.
  • 92. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Case of the random walk Different approach to acceptance rates A high acceptance rate does not necessarily indicate that the algorithm is moving correctly: it may simply mean that the random walk is moving too slowly on the surface of f.
  • 93. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Case of the random walk Different approach to acceptance rates A high acceptance rate does not necessarily indicate that the algorithm is moving correctly: it may simply mean that the random walk is moving too slowly on the surface of f. If x(t) and yt are close, i.e. f(x(t)) ≃ f(yt), y is accepted with probability min( f(yt)/f(x(t)), 1 ) ≃ 1. For multimodal densities with well separated modes, the negative effect of limited moves on the surface of f clearly shows.
  • 94. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Case of the random walk (2) If the average acceptance rate is low, the successive values of f(yt) tend to be small compared with f(x(t)), which means that the random walk moves quickly on the surface of f since it often reaches the “borders” of the support of f
  • 95. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Rule of thumb In small dimensions, aim at an average acceptance rate of 50%. In large dimensions, at an average acceptance rate of 25%. [Gelman,Gilks and Roberts, 1995]
  • 96. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Rule of thumb In small dimensions, aim at an average acceptance rate of 50%. In large dimensions, at an average acceptance rate of 25%. [Gelman,Gilks and Roberts, 1995] This rule is to be taken with a pinch of salt!
  • 97. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Example (Noisy AR(1)) Hidden Markov chain from a regular AR(1) model, xt+1 = ϕxt + εt+1, εt ∼ N(0, τ²), and observables yt|xt ∼ N(xt², σ²)
  • 98. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Example (Noisy AR(1)) Hidden Markov chain from a regular AR(1) model, xt+1 = ϕxt + εt+1, εt ∼ N(0, τ²), and observables yt|xt ∼ N(xt², σ²) The distribution of xt given xt−1, xt+1 and yt is proportional to exp{ −(1/2τ²) [ (xt − ϕxt−1)² + (xt+1 − ϕxt)² + (τ²/σ²)(yt − xt²)² ] }.
  • 99. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Example (Noisy AR(1) continued) For a Gaussian random walk with scale ω small enough, the random walk never jumps to the other mode. But if the scale ω is sufficiently large, the Markov chain explores both modes and gives a satisfactory approximation of the target distribution.
  • 100. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Markov chain based on a random walk with scale ω = .1.
  • 101. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Markov chain based on a random walk with scale ω = .5.
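The two regimes can be reproduced with a few lines of R (editor's sketch; the values of ϕ, τ, σ, the neighbouring states and yt are arbitrary choices producing two well-separated modes near ±0.5):

# random-walk Metropolis on the bimodal conditional of x_t in the noisy AR(1)
phi <- 0.9; tau <- 0.5; sigma <- 0.05
xm <- 0; xp <- 0; yt <- 0.25             # x_{t-1}, x_{t+1} and observation y_t
ltarget <- function(x)
  -((x - phi * xm)^2 + (xp - phi * x)^2 + (tau^2 / sigma^2) * (yt - x^2)^2) / (2 * tau^2)
rwchain <- function(T, omega, x0 = 0.5) {
  x <- numeric(T); x[1] <- x0
  for (t in 1:(T - 1)) {
    z <- x[t] + omega * rnorm(1)
    x[t + 1] <- if (log(runif(1)) < ltarget(z) - ltarget(x[t])) z else x[t]
  }
  x
}
set.seed(11)
range(rwchain(10^4, 0.1))   # small omega: stuck in the mode near +0.5
range(rwchain(10^4, 0.5))   # larger omega: visits both modes near -0.5 and +0.5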
  • 102. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions MA(2) Since the constraints on (ϑ1, ϑ2) are well-defined, use of a flat prior over the triangle. Simple representation of the likelihood:
library(mnormt)
# log-likelihood of an MA(2) model with mu = 0 and sigma^2 = 1, based on the
# banded Toeplitz covariance of x_{1:n}; the data vector y is taken from the
# enclosing environment
ma2like=function(theta){
  n=length(y)
  # autocovariances gamma_0, gamma_1, gamma_2, then zeroes beyond lag 2
  sigma=toeplitz(c(1+theta[1]^2+theta[2]^2,
    theta[1]+theta[1]*theta[2],theta[2],rep(0,n-3)))
  dmnorm(y,rep(0,n),sigma,log=TRUE)
}
  • 103. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Basic RWHM for MA(2) Algorithm 1 RW-HM-MA(2) sampler
set ω and ϑ(1)
for i = 2 to T do
  generate ˜ϑj ∼ U(ϑj(i−1) − ω, ϑj(i−1) + ω), j = 1, 2
  set p = 0 and ϑ(i) = ϑ(i−1)
  if ˜ϑ within the triangle then
    p = exp(ma2like(˜ϑ) − ma2like(ϑ(i−1)))
  end if
  if U < p then
    ϑ(i) = ˜ϑ
  end if
end for
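A runnable R transcription of Algorithm 1 (a sketch reusing ma2like above, hence assuming the data vector y is in scope; the triangle test reuses the root criterion):

# runnable version of the RW-HM-MA(2) sampler
inside <- function(theta) all(Mod(polyroot(c(1, -theta))) > 1)  # triangle test
rwhm_ma2 <- function(T, omega, theta0 = c(0, 0)) {
  theta <- matrix(theta0, T, 2, byrow = TRUE)
  cur <- ma2like(theta0)
  for (i in 2:T) {
    prop <- theta[i - 1, ] + runif(2, -omega, omega)
    theta[i, ] <- theta[i - 1, ]                 # default: stay put
    if (inside(prop)) {
      lprop <- ma2like(prop)
      if (log(runif(1)) < lprop - cur) { theta[i, ] <- prop; cur <- lprop }
    }
  }
  theta
}
# draws <- rwhm_ma2(10^4, omega = 0.2)   # compare with omega = 0.5 below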
  • 104. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Outcome Result with a simulated sample of 100 points and ϑ1 = 0.6, ϑ2 = 0.2 and scale ω = 0.2 [Figure: MCMC sample of (θ1, θ2) within the stationarity triangle]
  • 105. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Outcome Result with a simulated sample of 100 points and ϑ1 = 0.6, ϑ2 = 0.2 and scale ω = 0.5 [scatter plot of the simulated (θ1, θ2) sample]
  • 106. MCMC and Likelihood-free Methods The Metropolis-Hastings Algorithm Extensions Outcome Result with a simulated sample of 100 points and ϑ1 = 0.6, ϑ2 = 0.2 and scale ω = 2.0 [scatter plot of the simulated (θ1, θ2) sample]
  • 107. MCMC and Likelihood-free Methods The Gibbs Sampler The Gibbs Sampler General Principles Completion Convergence The Hammersley-Clifford theorem
  • 108. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles General Principles A very specific simulation algorithm based on the target distribution f: 1. Uses the conditional densities f1, . . . , fp from f
  • 109. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles General Principles A very specific simulation algorithm based on the target distribution f: 1. Uses the conditional densities f1, . . . , fp from f 2. Start with the random variable X = (X1, . . . , Xp)
  • 110. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles General Principles A very specific simulation algorithm based on the target distribution f: 1. Uses the conditional densities f1, . . . , fp from f 2. Start with the random variable X = (X1, . . . , Xp) 3. Simulate from the conditional densities, Xi|x1, x2, . . . , xi−1, xi+1, . . . , xp ∼ fi(xi|x1, x2, . . . , xi−1, xi+1, . . . , xp) for i = 1, 2, . . . , p.
  • 111. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles Algorithm (Gibbs sampler) Given x^(t) = (x1^(t), . . . , xp^(t)), generate 1. X1^(t+1) ∼ f1(x1 | x2^(t), . . . , xp^(t)); 2. X2^(t+1) ∼ f2(x2 | x1^(t+1), x3^(t), . . . , xp^(t)), . . . p. Xp^(t+1) ∼ fp(xp | x1^(t+1), . . . , xp−1^(t+1)) Then X^(t+1) → X ∼ f
  • 112. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles Properties The full conditional densities f1, . . . , fp are the only densities used for simulation. Thus, even in a high dimensional problem, all of the simulations may be univariate
  • 113. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles Properties The full conditional densities f1, . . . , fp are the only densities used for simulation. Thus, even in a high dimensional problem, all of the simulations may be univariate The Gibbs sampler is not reversible with respect to f. However, each of its p components is. Besides, it can be turned into a reversible sampler, either using the Random Scan Gibbs sampler or running instead the (double) sequence f1 · · · fp−1 fp fp−1 · · · f1
  • 114. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles Example (Bivariate Gibbs sampler) (X, Y) ∼ f(x, y) Generate a sequence of observations by: Set X0 = x0; for t = 1, 2, . . . , generate Yt ∼ fY|X(·|xt−1) and Xt ∼ fX|Y(·|yt), where fY|X and fX|Y are the conditional distributions
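A minimal R sketch of this bivariate scheme (an illustrative addition, not from the slides): take the target to be a standard bivariate normal with correlation rho, so that both conditionals are univariate normals; rho and niter are arbitrary choices.
    rho <- 0.8; niter <- 1000
    x <- y <- numeric(niter)
    x[1] <- 0                                          # Set X0 = x0
    for (t in 2:niter) {
      y[t] <- rnorm(1, rho * x[t-1], sqrt(1 - rho^2))  # Yt ~ fY|X(.|x_{t-1})
      x[t] <- rnorm(1, rho * y[t],   sqrt(1 - rho^2))  # Xt ~ fX|Y(.|y_t)
    }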
  • 115. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles A Very Simple Example: Independent N(µ, σ2) Observations When Y1, . . . , Yn iid ∼ N(y|µ, σ2) with both µ and σ unknown, the joint posterior on (µ, σ2) does not belong to a standard family
  • 116. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles A Very Simple Example: Independent N(µ, σ2) Observations When Y1, . . . , Yn iid ∼ N(y|µ, σ2) with both µ and σ unknown, the joint posterior on (µ, σ2) does not belong to a standard family But... µ | Y1:n, σ2 ∼ N((1/n) ∑_{i=1}^n Yi, σ2/n) and σ2 | Y1:n, µ ∼ IG(n/2 − 1, (1/2) ∑_{i=1}^n (Yi − µ)2), assuming constant (improper) priors on both µ and σ2 Hence we may use the Gibbs sampler for simulating from the posterior of (µ, σ2)
  • 117. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles R Gibbs Sampler for Gaussian posterior
    n = length(Y); S = sum(Y); mu = S/n
    for (i in 1:500) {                       # braces added: all three updates belong to one sweep
      S2 = sum((Y-mu)^2)
      sigma2 = 1/rgamma(1, n/2-1, S2/2)      # sigma2 | mu, Y ~ IG(n/2 - 1, S2/2)
      mu = S/n + sqrt(sigma2/n)*rnorm(1)     # mu | sigma2, Y ~ N(S/n, sigma2/n)
    }
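A hypothetical driver for the sketch above (names illustrative), matching the simulated setting of the slides that follow:
    Y <- rnorm(10)              # n = 10 observations from the N(0, 1) distribution
    keep <- matrix(0, 500, 2)   # add keep[i, ] <- c(mu, sigma2) inside the loop above
    # hist(keep[, 1]) then approximates the marginal posterior of mu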
  • 118.–127. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles Example of results with n = 10 observations from the N(0, 1) distribution [successive plots of the Gibbs output after] Number of Iterations 1, 2, 3, 4, 5, 10, 25, 50, 100, 500
  • 128. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles Limitations of the Gibbs sampler Formally, a special case of a sequence of 1-D M-H kernels, all with acceptance rate uniformly equal to 1. The Gibbs sampler 1. limits the choice of instrumental distributions
  • 129. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles Limitations of the Gibbs sampler Formally, a special case of a sequence of 1-D M-H kernels, all with acceptance rate uniformly equal to 1. The Gibbs sampler 1. limits the choice of instrumental distributions 2. requires some knowledge of f
  • 130. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles Limitations of the Gibbs sampler Formally, a special case of a sequence of 1-D M-H kernels, all with acceptance rate uniformly equal to 1. The Gibbs sampler 1. limits the choice of instrumental distributions 2. requires some knowledge of f 3. is, by construction, multidimensional
  • 131. MCMC and Likelihood-free Methods The Gibbs Sampler General Principles Limitations of the Gibbs sampler Formally, a special case of a sequence of 1-D M-H kernels, all with acceptance rate uniformly equal to 1. The Gibbs sampler 1. limits the choice of instrumental distributions 2. requires some knowledge of f 3. is, by construction, multidimensional 4. does not apply to problems where the number of parameters varies as the resulting chain is not irreducible.
  • 132. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Latent variables are back The Gibbs sampler can be generalized much further: a density g is a completion of f if ∫_Z g(x, z) dz = f(x)
  • 133. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Latent variables are back The Gibbs sampler can be generalized much further: a density g is a completion of f if ∫_Z g(x, z) dz = f(x) Note The variable z may be meaningless for the problem
  • 134. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Purpose g should have full conditionals that are easy to simulate for a Gibbs sampler to be implemented with g rather than f For p > 1, write y = (x, z) and denote the conditional densities of g(y) = g(y1, . . . , yp) by Y1|y2, . . . , yp ∼ g1(y1|y2, . . . , yp), Y2|y1, y3, . . . , yp ∼ g2(y2|y1, y3, . . . , yp), . . . , Yp|y1, . . . , yp−1 ∼ gp(yp|y1, . . . , yp−1).
  • 135. MCMC and Likelihood-free Methods The Gibbs Sampler Completion The move from Y^(t) to Y^(t+1) is defined as follows: Algorithm (Completion Gibbs sampler) Given (y1^(t), . . . , yp^(t)), simulate 1. Y1^(t+1) ∼ g1(y1 | y2^(t), . . . , yp^(t)), 2. Y2^(t+1) ∼ g2(y2 | y1^(t+1), y3^(t), . . . , yp^(t)), . . . p. Yp^(t+1) ∼ gp(yp | y1^(t+1), . . . , yp−1^(t+1)).
  • 136. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Example (Mixtures all over again) Hierarchical missing data structure: If X1, . . . , Xn ∼ ∑_{i=1}^k pi f(x|θi), then X|Z ∼ f(x|θZ), Z ∼ p1 I(z = 1) + . . . + pk I(z = k), where Z is the component indicator associated with observation x
  • 137. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Example (Mixtures (2)) Conditionally on (Z1, . . . , Zn) = (z1, . . . , zn): π(p1, . . . , pk, θ1, . . . , θk | x1, . . . , xn, z1, . . . , zn) ∝ p1^(α1+n1−1) · · · pk^(αk+nk−1) × π(θ1 | y1 + n1 x̄1, λ1 + n1) · · · π(θk | yk + nk x̄k, λk + nk), with ni = ∑_j I(zj = i) and x̄i = ∑_{j: zj=i} xj / ni.
  • 138. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Algorithm (Mixture Gibbs sampler) 1. Simulate θi ∼ π(θi | yi + ni x̄i, λi + ni) (i = 1, . . . , k) and (p1, . . . , pk) ∼ D(α1 + n1, . . . , αk + nk) 2. Simulate (j = 1, . . . , n) Zj | xj, p1, . . . , pk, θ1, . . . , θk ∼ ∑_{i=1}^k pij I(zj = i) with pij ∝ pi f(xj | θi) (i = 1, . . . , k), and update ni and x̄i (i = 1, . . . , k).
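A minimal R sketch of this scheme for a two-component normal mixture with known weights (0.3, 0.7), unit variances, and N(0, 1/λ) priors on the means, so step 1 reduces to the mean updates; the data, λ, and the number of sweeps are illustrative assumptions:
    p <- c(0.3, 0.7); lambda <- 1
    x <- c(rnorm(30, 0), rnorm(70, 2.5))      # fake data from a two-component mixture
    mu <- c(-1, 1)
    for (t in 1:500) {
      # step 2: allocations, P(Zj = i) = pij ∝ pi f(xj | θi)
      w1 <- p[1] * dnorm(x, mu[1]); w2 <- p[2] * dnorm(x, mu[2])
      z <- 1 + (runif(length(x)) < w2 / (w1 + w2))
      # step 1: conditional posteriors of the means given the allocations
      for (i in 1:2) {
        ni <- sum(z == i); si <- sum(x[z == i])
        mu[i] <- rnorm(1, si / (lambda + ni), sqrt(1 / (lambda + ni)))
      }
    }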
  • 139.–140. MCMC and Likelihood-free Methods The Gibbs Sampler Completion A wee problem [two (µ1, µ2) sample plots:] Gibbs started at random / Gibbs stuck at the wrong mode
  • 141. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Slice sampler as generic Gibbs If f(θ) can be written as a product ∏_{i=1}^k fi(θ),
  • 142. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Slice sampler as generic Gibbs If f(θ) can be written as a product ∏_{i=1}^k fi(θ), it can be completed as ∏_{i=1}^k I(0 ≤ ωi ≤ fi(θ)), leading to the following Gibbs algorithm:
  • 143. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Algorithm (Slice sampler) Simulate 1. ω1^(t+1) ∼ U[0, f1(θ^(t))]; . . . k. ωk^(t+1) ∼ U[0, fk(θ^(t))]; k+1. θ^(t+1) ∼ U_A^(t+1), with A^(t+1) = {y; fi(y) ≥ ωi^(t+1), i = 1, . . . , k}.
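A one-dimensional R sketch (k = 1), assuming the target of the next slides is N(−3, 1) truncated to [0, 1] as the plots suggest, i.e. f(x) ∝ exp{−(x + 3)^2/2} on [0, 1]; f is decreasing there, so the slice A^(t+1) is an interval [0, upper] available in closed form:
    f <- function(x) exp(-(x + 3)^2 / 2)
    niter <- 1000; x <- numeric(niter); x[1] <- 0.5
    for (t in 2:niter) {
      omega <- runif(1, 0, f(x[t-1]))             # vertical step: omega ~ U[0, f(x)]
      upper <- min(1, sqrt(-2 * log(omega)) - 3)  # boundary of {x: f(x) >= omega}
      x[t] <- runif(1, 0, upper)                  # horizontal step: x ~ U(A)
    }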
  • 144.–150. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Example of results with a truncated N(−3, 1) distribution [successive density plots on (0, 1) after] Number of Iterations 2, 3, 4, 5, 10, 50, 100
  • 151. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Good slices The slice sampler usually enjoys good theoretical properties (like geometric ergodicity and even uniform ergodicity under bounded f and bounded X). As k increases, the determination of the set A^(t+1) may get increasingly complex.
  • 152. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Example (Stochastic volatility core distribution) Difficult part of the stochastic volatility model: π(x) ∝ exp{−[σ2(x − µ)2 + β2 exp(−x)y2 + x]/2}, simplified into exp{−[x2 + α exp(−x)]/2}
  • 153. MCMC and Likelihood-free Methods The Gibbs Sampler Completion Example (Stochastic volatility core distribution) Difficult part of the stochastic volatility model: π(x) ∝ exp{−[σ2(x − µ)2 + β2 exp(−x)y2 + x]/2}, simplified into exp{−[x2 + α exp(−x)]/2} Slice sampling means simulation from a uniform distribution on A = {x; exp(−[x2 + α exp(−x)]/2) ≥ u} = {x; x2 + α exp(−x) ≤ ω} if we set ω = −2 log u. Note Inversion of x2 + α exp(−x) = ω needs to be done by trial-and-error.
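One way to implement that trial-and-error numerically in R (a sketch; α and the current point x0 are illustrative): bracket the two endpoints of the slice and refine them with uniroot. Convexity of g(x) = x^2 + α exp(−x) guarantees the slice is an interval.
    alpha <- 1; x0 <- 0.5
    g <- function(x) x^2 + alpha * exp(-x)
    omega <- g(x0) - 2 * log(runif(1))            # slice level, so g(x0) <= omega
    L <- x0; while (g(L) <= omega) L <- L - 1     # bracket the left endpoint
    R <- x0; while (g(R) <= omega) R <- R + 1     # bracket the right endpoint
    lo <- uniroot(function(x) g(x) - omega, c(L, x0))$root
    hi <- uniroot(function(x) g(x) - omega, c(x0, R))$root
    x1 <- runif(1, lo, hi)                        # uniform draw on the slice A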
  • 154. MCMC and Likelihood-free Methods The Gibbs Sampler Completion [autocorrelation plot and histogram of a Markov chain produced by a slice sampler, with the target distribution in overlay]
  • 155. MCMC and Likelihood-free Methods The Gibbs Sampler Convergence Properties of the Gibbs sampler Theorem (Convergence) For (Y1, Y2, · · · , Yp) ∼ g(y1, . . . , yp), if either [positivity condition] (i) g^(i)(yi) > 0 for every i = 1, · · · , p implies that g(y1, . . . , yp) > 0, where g^(i) denotes the marginal distribution of Yi, or (ii) the transition kernel is absolutely continuous with respect to g, then the chain is irreducible and positive Harris recurrent.
  • 156. MCMC and Likelihood-free Methods The Gibbs Sampler Convergence Properties of the Gibbs sampler (2) Consequences (i) If ∫ |h(y)| g(y) dy < ∞, then lim_{T→∞} (1/T) ∑_{t=1}^T h(Y^(t)) = ∫ h(y) g(y) dy a.e. g. (ii) If, in addition, (Y^(t)) is aperiodic, then lim_{n→∞} ‖∫ K^n(y, ·) µ(dy) − f‖_TV = 0 for every initial distribution µ.
  • 157. MCMC and Likelihood-free Methods The Gibbs Sampler Convergence Slice sampler fast on that slice For convergence, the properties of Xt and of f(Xt) are identical Theorem (Uniform ergodicity) If f is bounded and supp f is bounded, the simple slice sampler is uniformly ergodic. [Mira & Tierney, 1997]
  • 158. MCMC and Likelihood-free Methods The Gibbs Sampler Convergence A small set for a slice sampler For ε < ε⋆, C = {x ∈ X; ε < f(x) < ε⋆} is a small set: P(x, ·) ≥ (ε/ε⋆) µ(·), where µ(A) = (1/ε) ∫_0^ε λ(A ∩ L(ω))/λ(L(ω)) dω with L(ω) = {x ∈ X; f(x) > ω} [Roberts & Rosenthal, 1998]
  • 159. MCMC and Likelihood-free Methods The Gibbs Sampler Convergence Slice sampler: drift Under differentiability and monotonicity conditions, the slice sampler also verifies a drift condition with V(x) = f(x)^−β, is geometrically ergodic, and there even exist explicit bounds on the total variation distance [Roberts & Rosenthal, 1998]
  • 160. MCMC and Likelihood-free Methods The Gibbs Sampler Convergence Slice sampler: drift Under differentiability and monotonicity conditions, the slice sampler also verifies a drift condition with V(x) = f(x)^−β, is geometrically ergodic, and there even exist explicit bounds on the total variation distance [Roberts & Rosenthal, 1998] Example (Exponential Exp(1)) For n > 23, ‖K^n(x, ·) − f(·)‖_TV ≤ .054865 (0.985015)^n (n − 15.7043)
  • 161. MCMC and Likelihood-free Methods The Gibbs Sampler Convergence Slice sampler: convergence Theorem For any density f such that ∂/∂ε λ({x ∈ X; f(x) > ε}) is non-increasing, ‖K^523(x, ·) − f(·)‖_TV ≤ .0095 [Roberts & Rosenthal, 1998]
  • 162. MCMC and Likelihood-free Methods The Gibbs Sampler Convergence A poor slice sampler Example Consider f(x) = exp{−‖x‖}, x ∈ R^d. The slice sampler is equivalent to a one-dimensional slice sampler on π(z) = z^(d−1) e^−z, z > 0, or on π(u) = e^(−u^(1/d)), u > 0. Poor performances when d large (heavy tails) [sample runs of log(u) and ACFs for log(u) for d = 1, 10, 20, 100 (Roberts & Rosenthal, 1998)]
  • 163. MCMC and Likelihood-free Methods The Gibbs Sampler The Hammersley-Clifford theorem Hammersley-Clifford theorem An illustration that conditionals determine the joint distribution Theorem If the joint density g(y1, y2) has conditional distributions g1(y1 | y2) and g2(y2 | y1), then g(y1, y2) = g2(y2 | y1) / ∫ [g2(v | y1)/g1(y1 | v)] dv. [Hammersley & Clifford, circa 1970]
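A quick numerical check of the theorem in R (an illustrative addition): take a standard bivariate normal with correlation rho, whose conditionals are known univariate normals, and recover the joint density at a point from the conditionals alone:
    rho <- 0.6
    g1 <- function(y1, y2) dnorm(y1, rho * y2, sqrt(1 - rho^2))  # g1(y1|y2)
    g2 <- function(y2, y1) dnorm(y2, rho * y1, sqrt(1 - rho^2))  # g2(y2|y1)
    y1 <- 0.3; y2 <- -1.2
    denom <- integrate(function(v) g2(v, y1) / g1(y1, v), -Inf, Inf)$value
    g2(y2, y1) / denom        # matches the joint density, i.e. dnorm(y1) * g2(y2, y1)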
  • 164. MCMC and Likelihood-free Methods The Gibbs Sampler The Hammersley-Clifford theorem General HC decomposition Under the positivity condition, the joint distribution g satisfies g(y1, . . . , yp) ∝ ∏_{j=1}^p g_{ℓj}(y_{ℓj} | y_{ℓ1}, . . . , y_{ℓ(j−1)}, y′_{ℓ(j+1)}, . . . , y′_{ℓp}) / g_{ℓj}(y′_{ℓj} | y_{ℓ1}, . . . , y_{ℓ(j−1)}, y′_{ℓ(j+1)}, . . . , y′_{ℓp}) for every permutation ℓ on {1, 2, . . . , p} and every y′ ∈ Y.
  • 165. MCMC and Likelihood-free Methods Population Monte Carlo Sequential importance sampling Computational issues in Bayesian statistics The Metropolis-Hastings Algorithm The Gibbs Sampler Population Monte Carlo Approximate Bayesian computation ABC for model choice
  • 166. MCMC and Likelihood-free Methods Population Monte Carlo Importance sampling (revisited) Approximation of integrals I = ∫ h(x) π(x) dx by unbiased estimators Î = (1/n) ∑_{i=1}^n ϖi h(xi) when x1, . . . , xn iid ∼ q(x) and ϖi := π(xi)/q(xi)
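A minimal R illustration of this estimator, with π = N(0, 1), h(x) = x^2, and a Cauchy instrumental distribution q (all choices are illustrative assumptions):
    n <- 1e4
    x <- rcauchy(n)                   # x_i ~iid q
    w <- dnorm(x) / dcauchy(x)        # importance weights pi(x_i)/q(x_i)
    mean(w * x^2)                     # unbiased estimate of E[X^2] = 1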
  • 167. MCMC and Likelihood-free Methods Population Monte Carlo Iterated importance sampling As in Markov chain Monte Carlo (MCMC) algorithms, introduction of a temporal dimension: xi^(t) ∼ qt(x | xi^(t−1)), i = 1, . . . , n, t = 1, . . . and Ît = (1/n) ∑_{i=1}^n ϖi^(t) h(xi^(t)) is still unbiased for ϖi^(t) = π(xi^(t)) / qt(xi^(t) | xi^(t−1)), i = 1, . . . , n
  • 168. MCMC and Likelihood-free Methods Population Monte Carlo Fundamental importance equality Preservation of unbiasedness: E[h(X^(t)) π(X^(t)) / qt(X^(t) | X^(t−1))] = ∫∫ h(x) [π(x)/qt(x | y)] qt(x | y) g(y) dx dy = ∫ h(x) π(x) dx for any distribution g on X^(t−1)
  • 169. MCMC and Likelihood-free Methods Population Monte Carlo Sequential variance decomposition Furthermore, var(Ît) = (1/n^2) ∑_{i=1}^n var(ϖi^(t) h(xi^(t))), if these variances exist, because the xi^(t)'s are conditionally uncorrelated Note This decomposition is still valid for correlated [in i] xi^(t)'s when incorporating the weights ϖi^(t)
  • 170. MCMC and Likelihood-free Methods Population Monte Carlo Simulation of a population The importance distribution of the sample (a.k.a. particles) x^(t), qt(x^(t) | x^(t−1)), can depend on the previous sample x^(t−1) in any possible way, as long as the marginal distributions qit(x) = ∫ qt(x^(t)) dx^(t)_−i can be expressed, to build the importance weights ϖit = π(xi^(t)) / qit(xi^(t))
  • 171. MCMC and Likelihood-free Methods Population Monte Carlo Special case of the product proposal If qt(x^(t) | x^(t−1)) = ∏_{i=1}^n qit(xi^(t) | x^(t−1)) [independent proposals], then var(Ît) = (1/n^2) ∑_{i=1}^n var(ϖi^(t) h(xi^(t)))
  • 172. MCMC and Likelihood-free Methods Population Monte Carlo Validation E[ϖi^(t) h(Xi^(t)) ϖj^(t) h(Xj^(t))] = ∫ { ∫∫ h(xi) [π(xi)/qit(xi | x^(t−1))] [π(xj)/qjt(xj | x^(t−1))] h(xj) qit(xi | x^(t−1)) qjt(xj | x^(t−1)) dxi dxj } g(x^(t−1)) dx^(t−1) = Eπ[h(X)]^2 whatever the distribution g on x^(t−1)
  • 173. MCMC and Likelihood-free Methods Population Monte Carlo Self-normalised version In general, π is unscaled and the weight ϖi^(t) ∝ π(xi^(t))/qit(xi^(t)), i = 1, . . . , n, is scaled so that ∑_i ϖi^(t) = 1
  • 174. MCMC and Likelihood-free Methods Population Monte Carlo Self-normalised version properties Loss of the unbiasedness property and of the variance decomposition The normalising constant can be estimated by ϖ̄t = (1/tn) ∑_{τ=1}^t ∑_{i=1}^n π(xi^(τ))/qiτ(xi^(τ)) The variance decomposition is (approximately) recovered if ϖ̄(t−1) is used instead
  • 175. MCMC and Likelihood-free Methods Population Monte Carlo Sampling importance resampling Importance sampling from g can also produce samples from the target π [Rubin, 1987]
  • 176. MCMC and Likelihood-free Methods Population Monte Carlo Sampling importance resampling Importance sampling from g can also produce samples from the target π [Rubin, 1987] Theorem (Bootstrapped importance sampling) If a sample (xi⋆)1≤i≤m is derived from the weighted sample (xi, ϖi)1≤i≤n by multinomial sampling with weights ϖi, then xi⋆ ∼ π(x)
  • 177. MCMC and Likelihood-free Methods Population Monte Carlo Sampling importance resampling Importance sampling from g can also produce samples from the target π [Rubin, 1987] Theorem (Bootstrapped importance sampling) If a sample (xi⋆)1≤i≤m is derived from the weighted sample (xi, ϖi)1≤i≤n by multinomial sampling with weights ϖi, then xi⋆ ∼ π(x) Note Obviously, the xi⋆'s are not iid
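A one-line R illustration of this multinomial resampling, reusing the toy Cauchy-to-normal sample (x, w) from the importance sampling sketch above (sample() renormalises the weights internally):
    xstar <- sample(x, size = length(x), replace = TRUE, prob = w)
    # hist(xstar) now resembles the N(0, 1) target; the xstar_i are not iid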
  • 178. MCMC and Likelihood-free Methods Population Monte Carlo Iterated sampling importance resampling This principle can be extended to iterated importance sampling: After each iteration, resampling produces a sample from π [Again, not iid!]
  • 179. MCMC and Likelihood-free Methods Population Monte Carlo Iterated sampling importance resampling This principle can be extended to iterated importance sampling: After each iteration, resampling produces a sample from π [Again, not iid!] Incentive Use previous sample(s) to learn about π and q
  • 180. MCMC and Likelihood-free Methods Population Monte Carlo Generic Population Monte Carlo Algorithm (Population Monte Carlo Algorithm) For t = 1, . . . , T: For i = 1, . . . , n, 1. Select the generating distribution qit(·) 2. Generate x̃i^(t) ∼ qit(x) 3. Compute ϖi^(t) = π(x̃i^(t))/qit(x̃i^(t)) Normalise the ϖi^(t)'s into ϖ̄i^(t)'s Generate Ji,t ∼ M((ϖ̄i^(t))1≤i≤N) and set xi,t = x̃_{Ji,t}^(t)
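A compact R sketch of the algorithm, under toy assumptions (target π = N(0, 1), Gaussian random-walk proposals qit centred at the previous particles, fixed scale; all settings illustrative):
    n <- 500; niter <- 10; s <- 2
    x <- rnorm(n, 5, 3)                                  # arbitrary initial particles
    for (t in 1:niter) {
      xt <- rnorm(n, mean = x, sd = s)                   # step 2: x~_i ~ qit
      w  <- dnorm(xt) / dnorm(xt, mean = x, sd = s)      # step 3: pi/qit
      wbar <- w / sum(w)                                 # normalisation
      x <- sample(xt, n, replace = TRUE, prob = wbar)    # J_{i,t} ~ M(wbar), resample
    }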
  • 181. MCMC and Likelihood-free Methods Population Monte Carlo D-kernels in competition A general adaptive construction: Construct qi,t as a mixture of D different transition kernels depending on xi^(t−1): qi,t = ∑_{ℓ=1}^D pt,ℓ Kℓ(xi^(t−1), x), ∑_{ℓ=1}^D pt,ℓ = 1, and adapt the weights pt,ℓ.
  • 182. MCMC and Likelihood-free Methods Population Monte Carlo D-kernels in competition A general adaptive construction: Construct qi,t as a mixture of D different transition kernels depending on xi^(t−1): qi,t = ∑_{ℓ=1}^D pt,ℓ Kℓ(xi^(t−1), x), ∑_{ℓ=1}^D pt,ℓ = 1, and adapt the weights pt,ℓ. Darwinian example Take pt,ℓ proportional to the survival rate of the points (a.k.a. particles) xi^(t) generated from Kℓ
  • 183. MCMC and Likelihood-free Methods Population Monte Carlo Implementation Algorithm (D-kernel PMC) For t = 1, . . . , T: generate (Ki,t)1≤i≤N ∼ M((pt,k)1≤k≤D); for 1 ≤ i ≤ N, generate x̃i,t ∼ K_{Ki,t}(x); compute and renormalise the importance weights ωi,t; generate (Ji,t)1≤i≤N ∼ M((ω̄i,t)1≤i≤N); take xi,t = x̃_{Ji,t},t and pt+1,d = ∑_{i=1}^N ω̄i,t Id(Ki,t)
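A hedged R sketch with D = 2 Gaussian random-walk kernels of different scales; the weights below use the mixture density as proposal, one common (Rao-Blackwellised) variant, and the target and all settings are illustrative:
    n <- 500; scales <- c(0.5, 5); p <- c(0.5, 0.5)
    x <- rnorm(n, 5, 3)
    for (t in 1:10) {
      K  <- sample(1:2, n, replace = TRUE, prob = p)     # K_{i,t} ~ M(p_t)
      xt <- rnorm(n, mean = x, sd = scales[K])           # x~_{i,t} ~ K_{K_{i,t}}
      q  <- p[1] * dnorm(xt, x, scales[1]) + p[2] * dnorm(xt, x, scales[2])
      wbar <- dnorm(xt) / q; wbar <- wbar / sum(wbar)    # renormalised weights
      J  <- sample(1:n, n, replace = TRUE, prob = wbar)  # J_{i,t} ~ M(wbar)
      p  <- c(sum(wbar[K == 1]), sum(wbar[K == 2]))      # p_{t+1,d} = sum wbar_i I_d(K_{i,t})
      x  <- xt[J]
    }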
  • 184. MCMC and Likelihood-free Methods Population Monte Carlo Links with particle filters Sequential setting where π = πt changes with t: Population Monte Carlo also adapts to this case. It can be traced back all the way to Hammersley and Morton (1954) and the self-avoiding random walk problem. Gilks and Berzuini (2001) produce iterated samples with (SIR) resampling steps, and add an MCMC step: this step must use a πt-invariant kernel. Chopin (2001) uses iterated importance sampling to handle large datasets: this is a special case of PMC where the qit's are the posterior distributions associated with a portion kt of the observed dataset.