NOTES
1 Multiple regression model
1.1 multiple regression equation
y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_3 x_3 + \beta_4 x_4 + \dots + \beta_k x_k + \varepsilon
Assumptions:
A1: E(\varepsilon) = 0
A2: The variance of \varepsilon is constant: \mathrm{var}(\varepsilon) = \sigma^2
A3: The \varepsilon's are independent.
A4: The values of x_i are not random and are not exact linear functions of the other explanatory variables.
A5: The \varepsilon's are normally distributed.
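A minimal sketch of simulating data that satisfies this model: the errors are drawn independent, mean-zero, and normal with constant variance (A1, A2, A3, A5), and the x values are held fixed once generated (A4). All coefficient values and sizes here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 500, 3
beta = np.array([2.0, 0.5, -1.0, 3.0])  # hypothetical β0, β1, β2, β3
sigma = 1.0                             # constant error variance σ² = 1 (A2)

X = rng.uniform(-1, 1, size=(n, k))     # explanatory variables, fixed after drawing (A4)
eps = rng.normal(0, sigma, size=n)      # independent N(0, σ²) errors (A1, A3, A5)

# y = β0 + β1 x1 + ... + βk xk + ε
y = beta[0] + X @ beta[1:] + eps
```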
1.2 estimated multiple regression equation
\hat y = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2 + \hat\beta_3 x_3 + \hat\beta_4 x_4 + \dots + \hat\beta_k x_k
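Once the coefficients are estimated, a prediction from this equation is just the intercept plus a dot product. A small sketch, with made-up fitted values:

```python
import numpy as np

beta_hat = np.array([2.1, 0.48, -0.97, 2.95])  # hypothetical fitted β̂0, β̂1, β̂2, β̂3
x_new = np.array([0.5, -0.2, 1.0])             # one new observation (x1, x2, x3)

# ŷ = β̂0 + β̂1 x1 + ... + β̂k xk
y_hat = beta_hat[0] + x_new @ beta_hat[1:]
print(round(y_hat, 3))  # → 5.484
```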
1.3 method of least squares
Q = \sum_{i=1}^{n} (y_i - \hat y_i)^2 = \sum_{i=1}^{n} \left( y_i - \hat\beta_0 - \hat\beta_1 x_{i1} - \dots - \hat\beta_k x_{ik} \right)^2
minimize:
\begin{cases}
\left.\dfrac{\partial Q}{\partial \beta_0}\right|_{\beta_0=\hat\beta_0}=0\\
\left.\dfrac{\partial Q}{\partial \beta_i}\right|_{\beta_i=\hat\beta_i}=0,\quad i=1,2,\dots,k
\end{cases}
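This system of k+1 equations is rarely solved by hand; numerically, the same least-squares minimizer of Q can be obtained with NumPy's solver after appending a column of ones for the intercept. A sketch, with hypothetical true coefficients and noise level:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 2
beta = np.array([1.0, 2.0, -3.0])  # hypothetical true β0, β1, β2

X = rng.normal(size=(n, k))
y = beta[0] + X @ beta[1:] + rng.normal(0, 0.1, size=n)

# Augment X with a column of ones so β̂0 is estimated along with β̂1..β̂k,
# then minimize Q = Σ(y_i - ŷ_i)² via least squares.
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.round(beta_hat, 2))  # close to the true coefficients
```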