[Study Notes] Mathematical Statistics Problem Set 12

Q1: Consider the multiple linear regression model
$$\boldsymbol{Y} = \boldsymbol{X}\boldsymbol{\beta} + \boldsymbol{\epsilon},$$
where $\boldsymbol Y=(y_1,\dots,y_n)^\top$, $\boldsymbol\beta=(\beta_0,\dots,\beta_{p-1})^\top$, $\boldsymbol X$ is the $n\times p$ design matrix, and $\boldsymbol\epsilon=(\epsilon_1,\dots,\epsilon_n)^\top$. Assume that $\mathrm{rank}(\boldsymbol X)=p<n$, $E[\boldsymbol\epsilon]=\boldsymbol 0$, and $\mathrm{Var}[\boldsymbol\epsilon]=\sigma^2 I_n$ with $\sigma>0$.

(a). Show that the covariance matrix of the least squares estimates is diagonal if and only if the columns of $\boldsymbol X$, namely $\boldsymbol X_1,\dots,\boldsymbol X_p$, are orthogonal, that is, $\boldsymbol X_i^\top\boldsymbol X_j=0$ for $i\neq j$.

(b). Let $\hat y_i$ and $\hat\epsilon_i$ be the fitted values and the residuals, respectively. Show that $n\sigma^2=\sum_{i=1}^n\mathrm{Var}[\hat y_i]+\sum_{i=1}^n\mathrm{Var}[\hat\epsilon_i]$.

(c). Suppose further that $\boldsymbol\epsilon\sim N(\boldsymbol 0,\sigma^2 I_n)$, and you use an F test for the hypothesis
$$H_0:\beta_1=\beta_2=\dots=\beta_{p-1}=0\quad\text{vs.}\quad H_1:\sum_{i=1}^{p-1}\beta_i^2\neq 0.$$
If the coefficient of determination is $R^2=0.58$, $p=5$, and $n=15$, is the null rejected at the significance level $\alpha=0.05$?
($F_{0.95}(4,10)=3.48$, $F_{0.95}(5,10)=3.33$, $t_{0.95}(10)=1.81$)

Solution:
  (a) The least squares estimate is $\hat{\boldsymbol\beta}=(X^\top X)^{-1}X^\top\boldsymbol Y$, so $\mathrm{Var}[\hat{\boldsymbol\beta}]=\sigma^2(X^\top X)^{-1}$. Since $X^\top X$ is invertible, its inverse is diagonal if and only if $X^\top X$ itself is diagonal. The $(i,j)$ entry of $X^\top X$ is $\boldsymbol X_i^\top\boldsymbol X_j$, so $X^\top X$ is diagonal exactly when the columns $\boldsymbol X_1,\dots,\boldsymbol X_p$ are pairwise orthogonal. Hence $\mathrm{Var}[\hat{\boldsymbol\beta}]$ is diagonal if and only if the columns of $\boldsymbol X$ are orthogonal.
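The equivalence in (a) is easy to check numerically; the sketch below (with arbitrary illustrative designs and $\sigma^2$) compares a design whose two columns are orthogonal against one whose columns are correlated:

```python
import numpy as np

sigma2 = 2.0  # illustrative error variance

# Design with orthogonal columns: intercept plus a centered +/-1 contrast.
X_orth = np.column_stack([np.ones(4), [1.0, 1.0, -1.0, -1.0]])
# Design with non-orthogonal columns: intercept plus a raw trend.
X_corr = np.column_stack([np.ones(4), [1.0, 2.0, 3.0, 4.0]])

def cov_beta_hat(X, sigma2):
    """Covariance of the OLS estimator: sigma^2 (X^T X)^{-1}."""
    return sigma2 * np.linalg.inv(X.T @ X)

def is_diagonal(M, tol=1e-10):
    """True if all off-diagonal entries of M are (numerically) zero."""
    return bool(np.all(np.abs(M - np.diag(np.diag(M))) < tol))

print(is_diagonal(cov_beta_hat(X_orth, sigma2)))  # True
print(is_diagonal(cov_beta_hat(X_corr, sigma2)))  # False
```

The orthogonal design gives a diagonal $X^\top X$, hence a diagonal covariance; the trend column overlaps the intercept ($\boldsymbol X_1^\top\boldsymbol X_2=10\neq 0$), so the estimates are correlated.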
  (b) The residual vector is $\hat{\boldsymbol\epsilon}=\boldsymbol Y-X\hat{\boldsymbol\beta}=\boldsymbol Y-X(X^\top X)^{-1}X^\top\boldsymbol Y=(I_n-P)\boldsymbol Y$, where $P=X(X^\top X)^{-1}X^\top$ is the hat matrix, and the fitted vector is $\hat{\boldsymbol Y}=P\boldsymbol Y$. Since $P$ is symmetric and idempotent, $\mathrm{Var}[\hat{\boldsymbol Y}]=\sigma^2 P$ and $\mathrm{Var}[\hat{\boldsymbol\epsilon}]=\sigma^2(I_n-P)$. Summing the diagonal entries,
$$\sum_{i=1}^n\mathrm{Var}[\hat y_i]+\sum_{i=1}^n\mathrm{Var}[\hat\epsilon_i]=\sigma^2\,\mathrm{tr}(P)+\sigma^2\,\mathrm{tr}(I_n-P)=\sigma^2\,\mathrm{tr}(I_n)=n\sigma^2,$$
where in fact $\mathrm{tr}(P)=\mathrm{tr}\big((X^\top X)^{-1}X^\top X\big)=\mathrm{tr}(I_p)=p$, so the two sums are $p\sigma^2$ and $(n-p)\sigma^2$.
  (c) Under $H_0$, the F statistic can be expressed through $R^2$:
$$F=\frac{R^2/(p-1)}{(1-R^2)/(n-p)}\sim F(p-1,\,n-p).$$
With $R^2=0.58$, $p=5$, $n=15$,
$$F=\frac{0.58/4}{0.42/10}=\frac{0.145}{0.042}\approx 3.45<3.48=F_{0.95}(4,10),$$
so $H_0$ is not rejected at the significance level $\alpha=0.05$.
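Both the trace identity in (b) and the F statistic in (c) can be verified numerically; a minimal sketch, where the random design matrix and $\sigma^2$ are arbitrary illustrative choices (only the dimensions $n=15$, $p=5$ match part (c)):

```python
import numpy as np

rng = np.random.default_rng(42)

# (b) Check n*sigma^2 = sum Var[yhat_i] + sum Var[epshat_i] via the hat matrix.
n, p, sigma2 = 15, 5, 1.7           # illustrative sizes and error variance
X = rng.standard_normal((n, p))     # random full-rank design (a.s.)
P = X @ np.linalg.inv(X.T @ X) @ X.T        # hat matrix P = X (X^T X)^{-1} X^T
sum_var_fitted = sigma2 * np.trace(P)           # = sigma^2 * p
sum_var_resid = sigma2 * np.trace(np.eye(n) - P)  # = sigma^2 * (n - p)
print(np.isclose(sum_var_fitted + sum_var_resid, n * sigma2))  # True

# (c) F statistic from R^2: F = (R^2/(p-1)) / ((1-R^2)/(n-p)).
R2 = 0.58
F = (R2 / (p - 1)) / ((1 - R2) / (n - p))
print(round(F, 3))          # 3.452, just below F_0.95(4,10) = 3.48
print(F > 3.48)             # False -> do not reject H0
```

Note how close the observed statistic is to the critical value; the conclusion hinges on the third decimal place.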
