MCMC sampling introduction

MCMC is one of the sampling algorithms commonly used in Bayesian inference; its main instances include the Metropolis–Hastings (MH) algorithm and Gibbs sampling. MCMC is based on Markov chains that converge to a stationary distribution. This post mainly introduces the basic properties of Markov chains and their stationary distributions.

1 Motivation

1.1 Monte Carlo (MC)

The Monte Carlo method approximates a posterior expectation by a sample average,

$$\mathbb{E}[h(\theta)\mid x] \approx \frac{1}{N}\sum_{i=1}^{N} h\big(\theta^{(i)}\big),$$

where $\theta^{(1)},\dots,\theta^{(N)}$ is an iid sample from the posterior distribution $\pi(\theta\mid x)$.

However, obtaining an iid sample is often difficult, and plain Monte Carlo requires the complete form of the posterior distribution to be known.
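As an illustration of the plain Monte Carlo estimate above (the target $N(0,1)$ and the test function $h(\theta)=\theta^2$ are chosen here purely for the example, not taken from any particular model):

```python
import random

def mc_estimate(h, sampler, n=100_000):
    """Plain Monte Carlo: average h over iid draws from the target."""
    return sum(h(sampler()) for _ in range(n)) / n

random.seed(0)
# E[theta^2] under a standard normal "posterior" equals 1.
est = mc_estimate(lambda t: t * t, lambda: random.gauss(0.0, 1.0))
print(round(est, 2))
```

With $N = 10^5$ iid draws, the estimate lands within a few hundredths of the true value 1.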

1.2 MCMC (Markov Chain MC)

MCMC instead generates a random sequence satisfying the Markov property, which is ergodic and whose limiting distribution is $\pi(\theta\mid x)$. Based on the Markov chain, one obtains "not independent" samples from $\pi(\theta\mid x)$ that have the same effect as an iid sample.
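A minimal sketch of one standard such construction, random-walk Metropolis (the target density, step size, and chain length here are illustrative assumptions, not from the original post; only the unnormalized log density is needed):

```python
import math
import random

def rw_metropolis(log_target, x0, step=1.0, n=50_000, seed=1):
    """Random-walk Metropolis: dependent samples whose limiting law is the target."""
    rng = random.Random(seed)
    x, out = x0, []
    for _ in range(n):
        y = x + rng.gauss(0.0, step)                   # symmetric proposal
        log_alpha = min(0.0, log_target(y) - log_target(x))
        if rng.random() < math.exp(log_alpha):         # accept with prob. alpha
            x = y
        out.append(x)                                  # on reject, repeat state x
    return out

# Target: standard normal, specified only up to the normalizing constant.
chain = rw_metropolis(lambda t: -0.5 * t * t, x0=0.0)
mean = sum(chain) / len(chain)
var = sum((v - mean) ** 2 for v in chain) / len(chain)
print(round(mean, 2), round(var, 2))
```

The chain's values are correlated, yet their empirical mean and variance approach the target's 0 and 1, which is exactly the "same effect as iid samples" claimed above.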

2 Basic properties

A Markov chain suitable for MCMC satisfies: it has a stationary distribution and is ergodic (i.e., irreducible and aperiodic).

  • Markov property: $P(X_{n+1}\in A\mid X_0,\dots,X_n) = P(X_{n+1}\in A\mid X_n)$.

  • Transition probability (kernel): $K(x,A) = P(X_{n+1}\in A\mid X_n = x)$. Homogeneous: the transition kernel is the same for all $n$.

  • Marginal distribution: $\pi_n(A) = P(X_n\in A)$.

3. Stationary (Invariant) Chains

A $\sigma$-finite measure $\pi$ is invariant for the transition kernel $K$ (and for the associated chain) if

$$\pi(B) = \int K(x,B)\,\pi(dx) \quad \text{for every measurable } B.$$

The invariant distribution is also referred to as stationary if $\pi$ is a probability measure, since $X_0\sim\pi$ implies that $X_n\sim\pi$ for every $n$; thus, the chain is stationary in distribution.
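On a finite state space the invariance condition reduces to $\pi P = \pi$, which can be checked directly. A small sketch, assuming a hypothetical 2-state kernel $P$ whose stationary distribution is $\pi=(2/3,\,1/3)$:

```python
# A 2-state chain: the stationary distribution solves pi P = pi.
P = [[0.9, 0.1],
     [0.2, 0.8]]
pi = [2 / 3, 1 / 3]

# Compute pi P componentwise and compare with pi.
piP = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
print([round(v, 6) for v in piP])  # matches pi up to rounding
```

Starting the chain from $\pi$ therefore leaves every marginal distribution equal to $\pi$.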

4. Detailed balance condition

Detailed balance condition

A Markov chain with transition kernel $K$ satisfies the detailed balance condition (is reversible) if there exists a function $\pi$ satisfying

$$\pi(x)K(x,y) = \pi(y)K(y,x)$$

for every $(x,y)$.

Proof. Integrating both sides of the detailed balance condition over $x$ gives

$$\int \pi(x)K(x,y)\,dx = \int \pi(y)K(y,x)\,dx = \pi(y)\int K(y,x)\,dx = \pi(y),$$

so $\pi$ is invariant for $K$.

Remark
The detailed balance condition provides a sufficient but not necessary condition for $\pi$ to be a stationary measure associated with the transition kernel $K$.

Theorem

Suppose that a Markov chain with transition function $K$ satisfies the detailed balance condition with $\pi$ a probability density function. Then:
(1) The density is the invariant density of the chain.
(2) The chain is reversible.

Reversible: if $X_n\sim\pi$, then $(X_n, X_{n+1})$ and $(X_{n+1}, X_n)$ have the same distribution.
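Detailed balance can also be verified numerically by comparing probability flows in both directions. A sketch, reusing the same hypothetical 2-state kernel as above (any 2-state chain satisfies detailed balance with its stationary distribution):

```python
# Detailed balance: pi(x) K(x, y) == pi(y) K(y, x) for every pair (x, y).
P = [[0.9, 0.1],
     [0.2, 0.8]]
pi = [2 / 3, 1 / 3]

flows = [(pi[i] * P[i][j], pi[j] * P[j][i]) for i in range(2) for j in range(2)]
ok = all(abs(a - b) < 1e-12 for a, b in flows)
print(ok)
```

Here the flow $0\to 1$ is $(2/3)(0.1) = 1/15$ and the flow $1\to 0$ is $(1/3)(0.2) = 1/15$, so the chain is reversible.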

5. Ergodicity

To prove convergence (independence of the initial condition), the following properties are needed:

  • positive recurrent
    State $i$ is positive recurrent if $\mathbb{E}[\tau_i] < \infty$, where $\tau_i = \min\{n\ge 1 : X_n = i\}$ is the first time the chain returns to state $i$.

  • aperiodic
    A state $i$ has period

$$d(i) = \gcd\{n\ge 1 : P^n(i,i) > 0\},$$

where gcd is the greatest common divisor. Aperiodic means every state has period 1.

6. Irreducible

For any states $i, j$, the probability of the chain going from state $i$ to state $j$ is positive. Namely, for some $n\ge 1$, $P^n(i,j) > 0$.
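On a finite state space this is a concrete computation: each pair $(i,j)$ must be connected by some power $P^n$. A sketch, with two hypothetical kernels chosen to contrast the irreducible and reducible cases:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def irreducible(P, max_n=None):
    """For each pair (i, j), some power P^n must have P^n[i][j] > 0."""
    n = len(P)
    max_n = max_n or n
    hit = [[P[i][j] > 0 for j in range(n)] for i in range(n)]
    Q = P
    for _ in range(max_n - 1):
        Q = mat_mul(Q, P)
        hit = [[hit[i][j] or Q[i][j] > 0 for j in range(n)] for i in range(n)]
    return all(all(row) for row in hit)

cycle = [[0.0, 1.0], [1.0, 0.0]]      # irreducible (though periodic, with period 2)
absorbing = [[1.0, 0.0], [0.5, 0.5]]  # state 1 is unreachable from state 0
print(irreducible(cycle), irreducible(absorbing))
```

The cycle example also shows why irreducibility alone is not enough for convergence: it is irreducible yet periodic, so aperiodicity is a separate requirement.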

The law of large numbers for Markov chains

Suppose $\{X_n\}$ is a Markov chain with countable state space $S$ and transition probability matrix $P$, and suppose that it is irreducible and has stationary distribution $\pi$. Then for any bounded function $f : S \to \mathbb{R}$ and any initial distribution, we have

$$\frac{1}{n}\sum_{k=1}^{n} f(X_k) \xrightarrow{\text{a.s.}} \sum_{i\in S} f(i)\pi(i).$$

Remark.

  • A given Markov chain may have more than one invariant distribution.

  • Stationary + ergodicity ⇒ equilibrium distribution (unique).

  • Stationary + irreducible + aperiodic ⇒ unique. (An ergodic Markov chain is irreducible.)
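The law of large numbers above can be checked by simulation. A sketch, reusing the same hypothetical 2-state kernel with $\pi=(2/3,\,1/3)$ and $f$ the indicator of state 0, so the limit is $\pi(0)=2/3$:

```python
import random

# Simulate the 2-state chain and compare the time average of f with the
# stationary average sum_i f(i) pi(i), where f = indicator of state 0.
P = [[0.9, 0.1],
     [0.2, 0.8]]
rng = random.Random(42)

x, visits, n = 0, 0, 200_000
for _ in range(n):
    visits += (x == 0)
    x = 0 if rng.random() < P[x][0] else 1  # one transition step
time_avg = visits / n
print(round(time_avg, 2))
```

The fraction of time spent in state 0 converges to $2/3$ regardless of the starting state, exactly as the theorem predicts.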
