1. Simple Linear Regression
1.1 Simple Linear Regression Model
1.1.1 Representation 1
$$\left\{\begin{array}{l}
\eta = a + b x + \varepsilon \\
\varepsilon \sim N\left(0, \sigma^{2}\right) \\
\eta \sim N\left(a + b x, \sigma^{2}\right)
\end{array}\right.$$
1.1.2 Representation 2
$$\left\{\begin{array}{l}
\eta_{1} = a + b x_{1} + \varepsilon_{1} \\
\eta_{2} = a + b x_{2} + \varepsilon_{2} \\
\qquad \cdots\cdots \\
\eta_{n} = a + b x_{n} + \varepsilon_{n} \\
\varepsilon_{1}, \varepsilon_{2}, \ldots, \varepsilon_{n} \overset{iid}{\sim} N\left(0, \sigma^{2}\right) \\
\eta_{i} \sim N\left(a + b x_{i}, \sigma^{2}\right) \quad (i = 1, 2, \cdots, n)
\end{array}\right.$$
1.1.3 Representation 3
$$\left\{\begin{array}{l}
\eta = X \beta + \varepsilon \\
\varepsilon \sim N\left(0, \sigma^{2} E_{n}\right)
\end{array}\right.$$
$$X = \left[\begin{array}{cc} 1 & x_{1} \\ \vdots & \vdots \\ 1 & x_{n} \end{array}\right], \quad \eta = \left[\begin{array}{c} \eta_{1} \\ \vdots \\ \eta_{n} \end{array}\right]$$
$$\varepsilon = \left[\begin{array}{ccc} \varepsilon_{1} & \cdots & \varepsilon_{n} \end{array}\right]^{T}, \quad \beta = \left[\begin{array}{cc} a & b \end{array}\right]^{T}$$
1.2 Simple Linear Regression Function
$$\tilde{y} = E\eta = a + b x$$
1.3 Simple Linear Regression Equation
The simple linear regression equation is also known as the empirical regression equation, the regression equation, or the empirical formula; its graph is called the empirical regression line, or simply the regression line:
$$\hat{y} = \hat{a} + \hat{b} x$$
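As a quick numerical illustration (a minimal sketch, not part of the original text), the Python code below simulates data from the simple linear regression model and computes the least-squares estimates $\hat{a}$ and $\hat{b}$; the parameter values, sample size, and the use of numpy's `lstsq` solver are all assumptions made for the demo.

```python
# Minimal sketch (assumed setup): simulate eta = a + b*x + eps with
# eps ~ N(0, sigma^2), then estimate a-hat, b-hat by least squares.
import numpy as np

rng = np.random.default_rng(0)

a, b, sigma = 2.0, 0.5, 0.3                   # true parameters, chosen arbitrarily
n = 100
x = rng.uniform(0.0, 10.0, n)
eta = a + b * x + rng.normal(0.0, sigma, n)   # eta_i = a + b*x_i + eps_i

# Design matrix X = [[1, x_1], ..., [1, x_n]], as in representation 3.
X = np.column_stack([np.ones(n), x])
a_hat, b_hat = np.linalg.lstsq(X, eta, rcond=None)[0]

# The fitted regression equation: y-hat = a_hat + b_hat * x
print(f"a_hat = {a_hat:.3f}, b_hat = {b_hat:.3f}")
```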
2. Multiple Linear Regression
2.1 Multiple Linear Regression Model
2.1.1 Representation 1
$$\left\{\begin{array}{l}
\eta = b_{0} + b_{1} x_{1} + b_{2} x_{2} + \cdots + b_{m} x_{m} + \varepsilon \\
\varepsilon \sim N\left(0, \sigma^{2}\right) \\
\eta \sim N\left(b_{0} + b_{1} x_{1} + b_{2} x_{2} + \cdots + b_{m} x_{m}, \sigma^{2}\right)
\end{array}\right.$$
2.1.2 Representation 2
$$\left\{\begin{array}{l}
\eta_{1} = b_{0} + b_{1} x_{11} + b_{2} x_{12} + \cdots + b_{m} x_{1m} + \varepsilon_{1} \\
\eta_{2} = b_{0} + b_{1} x_{21} + b_{2} x_{22} + \cdots + b_{m} x_{2m} + \varepsilon_{2} \\
\qquad \cdots\cdots \\
\eta_{n} = b_{0} + b_{1} x_{n1} + b_{2} x_{n2} + \cdots + b_{m} x_{nm} + \varepsilon_{n} \\
\varepsilon_{1}, \varepsilon_{2}, \ldots, \varepsilon_{n} \overset{iid}{\sim} N\left(0, \sigma^{2}\right) \\
\eta_{i} \sim N\left(b_{0} + b_{1} x_{i1} + b_{2} x_{i2} + \cdots + b_{m} x_{im}, \sigma^{2}\right) \quad (i = 1, 2, \cdots, n)
\end{array}\right.$$
2.1.3 Representation 3
$$\left\{\begin{array}{l}
\eta = X \beta + \varepsilon \\
\varepsilon \sim N\left(0, \sigma^{2} E_{n}\right)
\end{array}\right.$$
$$X = \left[\begin{array}{cccc} 1 & x_{11} & \cdots & x_{1m} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & x_{n1} & \cdots & x_{nm} \end{array}\right], \quad \eta = \left[\begin{array}{c} \eta_{1} \\ \vdots \\ \eta_{n} \end{array}\right]$$
$$\varepsilon = \left[\begin{array}{ccc} \varepsilon_{1} & \cdots & \varepsilon_{n} \end{array}\right]^{T}, \quad \beta = \left[\begin{array}{cccc} b_{0} & b_{1} & \cdots & b_{m} \end{array}\right]^{T}$$
2.2 Multiple Linear Regression Function
$$\tilde{y} = E\eta = b_{0} + b_{1} x_{1} + b_{2} x_{2} + \cdots + b_{m} x_{m}$$
2.3 Multiple Linear Regression Equation
$$\hat{y} = \hat{b}_{0} + \hat{b}_{1} x_{1} + \hat{b}_{2} x_{2} + \cdots + \hat{b}_{m} x_{m}$$
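Analogously to the simple case, here is a minimal sketch (an assumed setup, not from the original text) that builds the design matrix $X$ of representation 3 and estimates $\hat{b}_{0}, \ldots, \hat{b}_{m}$ by least squares; the dimensions and parameter values are arbitrary.

```python
# Minimal sketch (assumed setup): multiple linear regression in matrix form,
# eta = X*beta + eps, estimated with numpy's least-squares solver.
import numpy as np

rng = np.random.default_rng(1)

n, m = 200, 3
b_true = np.array([1.0, 0.5, -2.0, 0.8])    # [b0, b1, b2, b3], chosen arbitrarily
x = rng.normal(size=(n, m))                 # row i holds (x_i1, ..., x_im)

# X = [[1, x_11, ..., x_1m], ..., [1, x_n1, ..., x_nm]]
X = np.column_stack([np.ones(n), x])
eta = X @ b_true + rng.normal(0.0, 0.5, n)  # eta = X*beta + eps

b_hat = np.linalg.lstsq(X, eta, rcond=None)[0]
print("b_hat =", np.round(b_hat, 3))        # estimates of (b0, ..., bm)
```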
3. Comparison
From the formulas listed above, we can see that the regression model, the regression function, and the regression equation differ from one another:
- The regression model establishes a preliminary relationship between the independent and dependent variables. The relationship consists of two parts: a linear function of the independent variables, and a residual error term. The linear part captures how the dependent variable changes as the independent variables change, while the error term represents the influence on η of random factors beyond the linear relationship between x and η, i.e. the variability that this linear relationship cannot explain.
- The regression function describes how the mean, i.e. the expectation, of the random variable η depends on the independent variable x.
- The regression equation is an estimate of the regression function and is its more concrete expression. One first uses the sample observations $\left(x_{i1}, x_{i2}, \cdots, x_{im} ; y_{i}\right) \quad (i = 1, 2, \cdots, n)$ to obtain estimators of the regression constant and regression coefficients, with corresponding estimates $\hat{b}_{0}, \hat{b}_{1}, \ldots, \hat{b}_{m}$, and then substitutes these into the regression function to obtain the final expression (see the sketch after this list).
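To make the distinction concrete, here is a minimal sketch (an illustration under assumed parameter values, not from the original text): the regression function $E\eta = a + bx$ is a fixed property of the model, while the regression equation $\hat{y} = \hat{a} + \hat{b}x$ is its sample-based estimate.

```python
# Minimal sketch (assumed setup): regression function vs. regression equation.
import numpy as np

rng = np.random.default_rng(2)

a, b, sigma = 1.0, 2.0, 1.0                   # model parameters, assumed for the demo
x = rng.uniform(0.0, 5.0, 50)
eta = a + b * x + rng.normal(0.0, sigma, 50)  # the model: linear part + error term

X = np.column_stack([np.ones_like(x), x])
a_hat, b_hat = np.linalg.lstsq(X, eta, rcond=None)[0]

x0 = 2.5
print("regression function E[eta] at x0:", a + b * x0)         # exact model mean
print("regression equation y_hat at x0:", a_hat + b_hat * x0)  # sample estimate
```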