Common Theorems in Probability Theory


Central Limit Theorem

Central Limit Theorem for Independent, Identically Distributed Variables

Definition

It states that the distribution of a suitably normalized sum of a sequence of random variables converges asymptotically to the normal (Gaussian) distribution.

Let the random variables $X_1, X_2, \ldots, X_n$ be independent and identically distributed (i.i.d.), with $\mathbb{E}(X_i) = \mu$ and $\mathbb{D}(X_i) = \sigma^2$ for $i = 1, 2, \ldots, n$. For any $x$, the distribution function
$$F_n(x) = P\left\{ \frac{\sum_{i=1}^{n} X_i - n\mu}{\sigma\sqrt{n}} \leq x \right\}$$
satisfies
$$\lim_{n \to \infty} F_n(x) = \lim_{n \to \infty} P\left\{ \frac{\sum_{i=1}^{n} X_i - n\mu}{\sqrt{n}\,\sigma} \leq x \right\} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-\frac{t^2}{2}}\,\mathrm{d}t = \Phi(x).$$
This theorem gives the following fact: for sufficiently large $n$, the random variable $Y_n = \frac{\sum_{i=1}^{n} X_i - n\mu}{\sqrt{n}\,\sigma}$ is approximately standard normal, i.e., $Y_n \sim N(0, 1)$. Thus, for sufficiently large $n$, $\sum_{i=1}^{n} X_i = \sqrt{n}\,\sigma Y_n + n\mu$ is approximately normal with distribution $N(n\mu, n\sigma^2)$.
Note that for sufficiently large $n$, the sum of $n$ i.i.d. random variables can be treated as a normal random variable.
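The i.i.d. statement can be checked numerically. The sketch below (using NumPy; the Exponential(1) distribution, with $\mu = \sigma = 1$, is an arbitrary choice assumed only for illustration) draws many sums of $n$ i.i.d. variables and standardizes them as $Y_n$:

```python
import numpy as np

# Empirical check of the i.i.d. CLT with Exponential(1) summands,
# which have mu = 1 and sigma = 1.
rng = np.random.default_rng(0)
n, trials = 500, 5000
samples = rng.exponential(scale=1.0, size=(trials, n))

# Y_n = (sum X_i - n*mu) / (sqrt(n)*sigma), one value per trial.
y = (samples.sum(axis=1) - n * 1.0) / (np.sqrt(n) * 1.0)

# If the theorem holds, y behaves like N(0, 1):
print(y.mean())           # approximately 0
print(y.std())            # approximately 1
print((y <= 0).mean())    # approximately Phi(0) = 0.5
```

The empirical mean, standard deviation, and $P(Y_n \leq 0)$ should be close to $0$, $1$, and $\Phi(0) = 0.5$ respectively, even though each summand is far from normal.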

Central Limit Theorem for Independent but Not Identically Distributed Variables

Let $X_1, X_2, \ldots, X_n$ be independent random variables with probability density functions $f_{X_k}(x)$ for $k = 1, \ldots, n$, and with the following means and variances:
$$\mathbb{E}(X_k) = \mu_k, \quad \mathbb{D}(X_k) = \sigma_k^2.$$
Then, we define $B_n^2 = \sum_{k=1}^{n} \sigma_k^2$ and
$$Y_n = \frac{\sum_{k=1}^{n} X_k - \sum_{k=1}^{n} \mu_k}{B_n}.$$

Suppose that for any positive number $\tau$,
$$\lim_{n \to \infty} \frac{1}{B_n^2} \sum_{k=1}^{n} \int_{|x - \mu_k| > \tau B_n} (x - \mu_k)^2 f_{X_k}(x)\,\mathrm{d}x = 0.$$
(This is the Lindeberg condition: intuitively, no single summand contributes a non-negligible share of the total variance $B_n^2$.)

Then, for any $x$, the distribution function $F_n(x)$ of the random variable $Y_n$ satisfies
$$\lim_{n \to \infty} F_n(x) = \lim_{n \to \infty} P\left\{ \frac{\sum_{k=1}^{n} X_k - \sum_{k=1}^{n} \mu_k}{B_n} \leq x \right\} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-\frac{t^2}{2}}\,\mathrm{d}t = \Phi(x).$$

This theorem shows that the sum of a large number of independent random variables is approximately normally/Gaussian distributed even when the summands are not identically distributed, provided no single summand dominates the total variance.
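The non-identical case can also be checked numerically. In the sketch below (assumed distributions for illustration only), half of the summands are Uniform(0, 1) and half are Exponential with mean $1/2$, so the $\mu_k$ and $\sigma_k^2$ differ across $k$, and the sum is standardized by $B_n$ as in the theorem:

```python
import numpy as np

# Independent but non-identically distributed summands:
#   Uniform(0, 1):        mu = 1/2, var = 1/12
#   Exponential(mean 1/2): mu = 1/2, var = 1/4
rng = np.random.default_rng(1)
trials, half = 5000, 500
u = rng.uniform(0.0, 1.0, size=(trials, half))
e = rng.exponential(scale=0.5, size=(trials, half))

total = u.sum(axis=1) + e.sum(axis=1)
mu_sum = half * 0.5 + half * 0.5            # sum of mu_k
B_n = np.sqrt(half * (1 / 12) + half * 0.25)  # B_n^2 = sum of sigma_k^2

# Y_n = (sum X_k - sum mu_k) / B_n should be approximately N(0, 1).
y = (total - mu_sum) / B_n
print(y.mean())  # approximately 0
print(y.std())   # approximately 1
```

Both families here have light tails and bounded individual variances while $B_n^2$ grows linearly in $n$, so the Lindeberg condition holds and the standardized sum is approximately standard normal.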

