Moment Generating Function (矩母函数, MGF)
Definition (mgf), from *Mathematical Statistics*:

For a r.v. $X$, if $E(e^{tX})$ exists for every $t\in(-\epsilon,+\epsilon)$ for some $\epsilon>0$, then

$$M_X(t)=E(e^{tX})$$

is called the moment generating function (mgf) of $X$.
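To make the definition concrete, here is a small sketch (my addition, not from the notes) that computes the mgf of an Exponential($\lambda$) distribution directly from the defining integral with sympy. The substitution $u=\lambda-t$, introduced as a positive symbol, encodes the convergence requirement $t<\lambda$:

```python
import sympy as sp

# M_X(t) = E(e^{tX}) for X ~ Exponential(lambda), from the definition.
# The integrand is e^{tx} * lam * e^{-lam x} = lam * e^{-(lam - t) x};
# the integral converges only when lam - t > 0, so set u = lam - t > 0.
x, lam, u = sp.symbols('x lambda u', positive=True)

M = sp.integrate(lam * sp.exp(-u * x), (x, 0, sp.oo))  # = lambda / u

t = sp.symbols('t')
M_X = M.subs(u, lam - t)   # M_X(t) = lambda / (lambda - t), valid for t < lambda
print(M_X)
```

The closed form $\lambda/(\lambda-t)$ is the standard mgf of the exponential distribution, so the integral check agrees with the textbook value.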
Some properties of mgf:
- $M_{a+bX}(t)=e^{at}M_X(bt)$, where $a$ and $b$ are two constants:

  $$M_{a+bX}(t)=E(e^{t(a+bX)})=E(e^{at}e^{tbX})=e^{at}E(e^{tbX})=e^{at}M_X(bt)$$

  (Recall the definition of expectation — Discrete: $E(X)=\sum_{k=1}^{+\infty}x_k p_k$; Continuous: $E(X)=\int_{-\infty}^{+\infty}x f(x)\,dx$.)
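The affine property above can be sanity-checked symbolically. This sketch (my addition) uses the standard fact that $X\sim N(0,1)$ has mgf $e^{t^2/2}$ and that $a+bX\sim N(a,b^2)$ has mgf $e^{at+b^2t^2/2}$:

```python
import sympy as sp

t, a, b = sp.symbols('t a b', real=True)

M_X = sp.exp(t**2 / 2)                     # mgf of X ~ N(0, 1), a standard fact
M_affine = sp.exp(a*t + b**2 * t**2 / 2)   # mgf of a + bX ~ N(a, b^2)

# property: M_{a+bX}(t) = e^{at} * M_X(bt)
assert sp.simplify(M_affine - sp.exp(a*t) * M_X.subs(t, b*t)) == 0
```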
- If $X$ and $Y$ are independent, then $M_{X+Y}(t)=M_X(t)M_Y(t)$, because when $X$ and $Y$ are independent:

  $$E(e^{t(X+Y)})=E(e^{tX}e^{tY})=E(e^{tX})E(e^{tY})=M_X(t)M_Y(t)$$
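A Monte Carlo check of this factorization (my addition, with an arbitrary choice of $t=0.3$, $X\sim N(0,1)$, and $Y\sim\mathrm{Exp}(1)$ drawn independently):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
t = 0.3

# two independently drawn samples
X = rng.normal(0.0, 1.0, n)       # X ~ N(0, 1)
Y = rng.exponential(1.0, n)       # Y ~ Exp(1), independent of X

lhs = np.mean(np.exp(t * (X + Y)))                     # estimate of M_{X+Y}(t)
rhs = np.mean(np.exp(t * X)) * np.mean(np.exp(t * Y))  # M_X(t) * M_Y(t)

print(lhs, rhs)  # agree up to Monte Carlo error
```

The two estimates coincide up to sampling noise; with dependent $X$ and $Y$ the factorization would generally fail.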
- The mgf exists only for certain random variables, not necessarily all: $E(e^{tX})$ may diverge for every $t\neq0$ (the Cauchy distribution is a standard example).
- $M(0)=E(e^{0\cdot X})=E(1)=1$
- $M^{(k)}(0)=E(X^k)$, i.e. the $k$-th derivative of the mgf at $0$ is the $k$-th moment. For $k=1$ in the continuous case:

  $$M(t)=E(e^{tX})=\int e^{tx}f(x)\,dx,\qquad M'(t)=\int xe^{tx}f(x)\,dx,\qquad M'(0)=\int xf(x)\,dx=E(X)$$
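The moment-extraction property (and $M(0)=1$) can be verified on the exponential mgf $\lambda/(\lambda-t)$; this sketch (my addition) recovers $E(X)=1/\lambda$ and $E(X^2)=2/\lambda^2$:

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)

M = lam / (lam - t)            # mgf of Exponential(lambda)

assert M.subs(t, 0) == 1       # M(0) = E[e^0] = 1

EX  = sp.diff(M, t).subs(t, 0)     # M'(0)  = E[X]
EX2 = sp.diff(M, t, 2).subs(t, 0)  # M''(0) = E[X^2]

assert sp.simplify(EX - 1 / lam) == 0       # E[X]   = 1/lambda
assert sp.simplify(EX2 - 2 / lam**2) == 0   # E[X^2] = 2/lambda^2
```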
- If $X_1,\dots,X_n$ are independent random variables, $M_i(t)$ are their mgfs, and $Y=\sum_{i=1}^{n}X_i$, then the mgf of $Y$ is $\prod_{i=1}^{n}M_i(t)$.

  Proof:

  $$M_Y(t)=E(e^{tY})=E(e^{t(X_1+\cdots+X_n)})=E(e^{tX_1}\cdots e^{tX_n})$$

  Because $X_1,\dots,X_n$ are independent:

  $$E(e^{tX_1}\cdots e^{tX_n})=\prod_{i=1}^{n}E(e^{tX_i})=\prod_{i=1}^{n}M_i(t)$$
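As an application of this product rule (my addition, with a concrete $n=3$): the mgf of Poisson($\mu$) is $e^{\mu(e^t-1)}$, a standard fact, so the mgf of a sum of $n$ i.i.d. Poisson($\mu$) variables is the $n$-fold product, which is exactly the mgf of Poisson($n\mu$):

```python
import sympy as sp

t, mu = sp.symbols('t mu', positive=True)
n = 3   # concrete n for the check

M_i = sp.exp(mu * (sp.exp(t) - 1))   # mgf of Poisson(mu), a standard fact

M_Y = M_i ** n                       # product of n identical factors

# Y = X_1 + ... + X_n has the mgf of Poisson(n*mu): the sum is again Poisson
assert sp.simplify(M_Y - sp.exp(n * mu * (sp.exp(t) - 1))) == 0
```

Since the mgf determines the distribution (where it exists), this identifies the law of the sum without any convolution computation.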