Multivariate Linear Regression

Gradient Descent for Multiple Variables

$$\begin{align*}
& \text{repeat until convergence:} \; \lbrace \\
& \quad \theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \big(h_\theta(x^{(i)}) - y^{(i)}\big) \cdot x_0^{(i)} \\
& \quad \theta_1 := \theta_1 - \alpha \frac{1}{m} \sum_{i=1}^{m} \big(h_\theta(x^{(i)}) - y^{(i)}\big) \cdot x_1^{(i)} \\
& \quad \theta_2 := \theta_2 - \alpha \frac{1}{m} \sum_{i=1}^{m} \big(h_\theta(x^{(i)}) - y^{(i)}\big) \cdot x_2^{(i)} \\
& \quad \cdots \\
& \rbrace
\end{align*}$$
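The update rule above can be vectorized so that all parameters are updated simultaneously. A minimal NumPy sketch (the function name and defaults are my own, not from the course):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent for linear regression.

    X: (m, n+1) design matrix whose first column is all ones (x_0 = 1).
    y: (m,) target vector.
    alpha: learning rate.
    """
    m = len(y)
    theta = np.zeros(X.shape[1])
    for _ in range(num_iters):
        # Simultaneous update of every theta_j:
        # theta_j := theta_j - alpha * (1/m) * sum_i (h(x^(i)) - y^(i)) * x_j^(i)
        gradient = (X.T @ (X @ theta - y)) / m
        theta = theta - alpha * gradient
    return theta
```

`X.T @ (X @ theta - y)` computes all $n+1$ sums at once, which is exactly the per-parameter updates written out above.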

Feature Scaling and Mean Normalization

$x_i := \frac{x_i - \mu_i}{s_i}$, where $\mu_i$ is the mean of feature $i$ and $s_i$ is its range (or standard deviation). This keeps all features on a similar scale so gradient descent converges faster.
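As a sketch, with $s_i$ taken to be the range $\max - \min$ (the function name is my own):

```python
import numpy as np

def mean_normalize(X):
    """Mean-normalize each column: x := (x - mu) / s,
    where mu is the column mean and s its range (max - min).
    Returns the scaled matrix plus mu and s, which are needed
    later to scale new inputs the same way."""
    mu = X.mean(axis=0)
    s = X.max(axis=0) - X.min(axis=0)
    return (X - mu) / s, mu, s
```

Note that the intercept column $x_0 = 1$ must not be scaled (its range is zero), so apply this only to the actual feature columns.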

The Normal Equation

$\frac{\mathrm{\partial }J\left(\theta \right)}{\mathrm{\partial }{\theta }_{j}}=0$$\frac{\partial J(\theta)}{\partial \theta_j}=0$，for $j=0,1,\cdots ,n$$j=0,1,\cdots,n$

$\theta = (X^T X)^{-1}X^T y$

Derivation (assuming $X^TX$ is invertible):

$Y=X\theta$
$X^TY=X^TX\theta$
$(X^TX)^{-1}X^TY=\theta$
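The closed-form solution can be computed directly. A sketch (in practice, solving the linear system $X^TX\theta = X^Ty$ is preferred over forming the inverse explicitly, for numerical stability):

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form theta = (X^T X)^{-1} X^T y.
    np.linalg.solve solves the system X^T X theta = X^T y
    without explicitly inverting X^T X."""
    return np.linalg.solve(X.T @ X, X.T @ y)
```

Unlike gradient descent, this needs no learning rate and no iteration, but the $(X^TX)^{-1}$ step costs roughly $O(n^3)$, so it becomes slow when the number of features $n$ is very large.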

When $X^TX$ Is Non-invertible

1. Redundant features, e.g., two features that are linearly related, so one can be expressed in terms of the other
2. Too many features ($m \le n$, i.e., no more training examples than features); delete some features or use regularization
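When $X^TX$ is singular for either reason above, the pseudoinverse still yields a valid least-squares solution. A sketch of the first case, a redundant (linearly dependent) feature:

```python
import numpy as np

# Column 3 is exactly 2 * column 2, so X^T X is singular and
# np.linalg.solve / np.linalg.inv would fail on it.
X = np.c_[np.ones(4), np.arange(4.0), 2 * np.arange(4.0)]
y = np.array([1.0, 3.0, 5.0, 7.0])

# The Moore-Penrose pseudoinverse picks the minimum-norm solution
# among the infinitely many theta that minimize the squared error.
theta = np.linalg.pinv(X.T @ X) @ (X.T @ y)
```

This is why libraries compute $\theta$ with `pinv` (or an equivalent SVD-based solver) rather than a plain matrix inverse.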
