- Suppose $f(x)=\frac{1}{2}\|Ax-b\|_2^2$
- Goal: minimize $f(x)$ with gradient-based optimization
- Obtain the gradient: $\nabla_x f(x)=A^T(Ax-b)=A^TAx-A^Tb$
- Algorithm: gradient descent — repeat $x \leftarrow x - \epsilon\,(A^TAx - A^Tb)$ with step size $\epsilon$ until the gradient norm falls below a tolerance $\delta$
- Alternative: Newton’s method — since $f$ is quadratic, it jumps to the minimum in a single step
- Now add the constraint $x^Tx \leq 1$
- Lagrangian: $L(x,\lambda)=f(x)+\lambda(x^Tx-1)$
- Problem: $\min\limits_x \max\limits_{\lambda,\,\lambda\geq 0} L(x,\lambda)$
- Differentiate with respect to $x$ and set to zero: $A^TAx-A^Tb+2\lambda x=0$
- Solution: $x=(A^TA+2\lambda I)^{-1}A^Tb$
- Choose $\lambda$ to enforce the constraint via gradient ascent on $\frac{\partial}{\partial\lambda}L(x,\lambda)=x^Tx-1$: when $x^Tx>1$ this derivative is positive, so $\lambda$ increases, which shrinks the solution $x$ back toward the feasible region
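The unconstrained gradient-descent step above can be sketched in NumPy as follows; the step size `eps`, tolerance `delta`, and iteration cap are illustrative choices, not values from the notes:

```python
import numpy as np

def least_squares_gd(A, b, eps=0.01, delta=1e-6, max_iter=10000):
    """Gradient descent on f(x) = 0.5 * ||Ax - b||^2 (a sketch)."""
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)          # gradient: A^T A x - A^T b
        if np.linalg.norm(grad) < delta:  # stop when nearly stationary
            break
        x = x - eps * grad                # descent step with step size eps
    return x
```

For a well-conditioned problem the result should agree with the normal-equation solution (e.g. `np.linalg.lstsq(A, b, rcond=None)`); in practice the step size must be small enough relative to the largest eigenvalue of $A^TA$ for the iteration to converge.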
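The constrained procedure — solve for $x(\lambda)=(A^TA+2\lambda I)^{-1}A^Tb$, then adjust $\lambda$ by gradient ascent on $x^Tx-1$ — can be sketched like this; the ascent rate and stopping rule are assumptions for illustration:

```python
import numpy as np

def constrained_least_squares(A, b, ascent_rate=0.1, tol=1e-8, max_iter=100000):
    """Minimize 0.5 * ||Ax - b||^2 subject to x^T x <= 1 by gradient
    ascent on the Lagrange multiplier lambda (a sketch; parameter
    names and values are illustrative)."""
    AtA, Atb = A.T @ A, A.T @ b
    n = A.shape[1]
    lam = 0.0
    for _ in range(max_iter):
        # x that is stationary in x for the current lambda
        x = np.linalg.solve(AtA + 2 * lam * np.eye(n), Atb)
        g = x @ x - 1.0                   # dL/dlambda = x^T x - 1
        if lam == 0.0 and g <= 0:
            break                         # constraint inactive: unconstrained optimum is feasible
        new_lam = max(lam + ascent_rate * g, 0.0)  # ascent step, projected onto lambda >= 0
        if abs(new_lam - lam) < tol:
            break
        lam = new_lam
    return x, lam
```

When the unconstrained solution already satisfies $x^Tx \leq 1$, the loop exits immediately with $\lambda = 0$; otherwise $\lambda$ grows until the solution lands on the boundary $x^Tx = 1$.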
C4eg1-Linear Least Squares