The previous section covered stationary iterative methods. Because they converge slowly, non-stationary iterative methods are more commonly used in practice. We start from the variational principle. Consider the function

$$\Phi\left(\vec{x}\right)=\frac{1}{2}\vec{x}\cdot A\vec{x}-\vec{x}\cdot \vec{b}$$

If $A$ is symmetric, the gradient is

$$\nabla \Phi\left(\vec{x}\right)=A\vec{x}-\vec{b}$$

If $A$ is also positive definite, the solution of

$$A\vec{x}=\vec{b}$$

is the unique minimizer of $\Phi\left(\vec{x}\right)$. The residual

$$\vec{r}=\vec{b}-A\vec{x}$$

is therefore the negative gradient of $\Phi\left(\vec{x}\right)$, i.e. the direction of steepest descent.
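The variational principle is easy to check numerically. The sketch below (in Python/NumPy for convenience, with an arbitrarily chosen small SPD matrix) verifies that the solution of $A\vec{x}=\vec{b}$ minimizes $\Phi$ and that the gradient vanishes there:

```python
import numpy as np

# Small SPD system, chosen arbitrarily for illustration.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

def phi(x):
    """Phi(x) = 1/2 x.Ax - x.b"""
    return 0.5 * x @ A @ x - x @ b

x_star = np.linalg.solve(A, b)   # exact solution of Ax = b

# Phi is strictly larger at any perturbed point than at the solution.
rng = np.random.default_rng(0)
for _ in range(100):
    x = x_star + rng.normal(size=2)
    assert phi(x) > phi(x_star)

# The gradient Ax - b vanishes at the solution.
assert np.allclose(A @ x_star - b, 0.0)
```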
## The one-dimensional optimization problem

Suppose the $(k-1)$-th iterate $\vec{x}^{\left(k-1\right)}$ has been obtained. The $k$-th iteration searches along a direction $\vec{p}_k$:

$$\vec{x}^{\left(k\right)}=\vec{x}^{\left(k-1\right)}+\alpha_k\vec{p}_k$$
Substituting this into the function gives

$$\Phi\left(\vec{x}^{\left(k\right)}\right)=\frac{1}{2}\vec{p}_k\cdot A\vec{p}_k\,\alpha_k^2-\left(\vec{b}-A\vec{x}^{\left(k-1\right)}\right)\cdot\vec{p}_k\,\alpha_k+\frac{1}{2}\vec{x}^{\left(k-1\right)}\cdot A\vec{x}^{\left(k-1\right)}-\vec{x}^{\left(k-1\right)}\cdot \vec{b}$$

This quadratic in $\alpha_k$ attains its minimum at

$$\alpha_k = \frac{\left(\vec{b}-A\vec{x}^{\left(k-1\right)}\right)\cdot\vec{p}_k}{\vec{p}_k\cdot A\vec{p}_k}=\frac{\vec{r}^{\left(k-1\right)}\cdot\vec{p}_k}{\vec{p}_k\cdot A\vec{p}_k}$$

Once the new iterate is obtained, the new residual is

$$\vec{r}^{\left(k\right)}=\vec{r}^{\left(k-1\right)}-\alpha_k A\vec{p}_k$$
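Both identities above can be checked on a random SPD system. A quick sketch (Python/NumPy, random data for illustration only): the computed $\alpha_k$ does minimize $\Phi$ along the line, and the cheap residual update agrees with recomputing $\vec{b}-A\vec{x}^{(k)}$:

```python
import numpy as np

# Random SPD system for a numerical sanity check.
rng = np.random.default_rng(1)
n = 5
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)          # SPD by construction
b = rng.normal(size=n)

def phi(x):
    return 0.5 * x @ A @ x - x @ b

x_prev = rng.normal(size=n)          # current iterate x^(k-1)
p = rng.normal(size=n)               # arbitrary search direction p_k
r_prev = b - A @ x_prev              # residual r^(k-1)

alpha = (r_prev @ p) / (p @ A @ p)   # optimal step length

# alpha minimizes Phi along the line x_prev + t * p
for t in [alpha - 0.1, alpha + 0.1]:
    assert phi(x_prev + alpha * p) <= phi(x_prev + t * p)

# The cheap residual update equals recomputing b - A x
x_new = x_prev + alpha * p
r_update = r_prev - alpha * (A @ p)
assert np.allclose(r_update, b - A @ x_new)
```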
## Steepest descent method

The simplest choice of search direction is the residual itself:

$$\vec{p}_k=\vec{r}^{\left(k-1\right)}$$
```matlab
function [x, iter] = sd_solve(A, b, x0, epsilon, iter_max)
    iter = 0;
    x = x0;
    r = b - A * x0;        % initial residual
    rr = dot(r, r);
    while sqrt(rr) >= epsilon && iter < iter_max
        iter = iter + 1;
        Ar = A * r;        % the single matrix-vector product per iteration
        alpha = rr / dot(r, Ar);
        x = x + alpha * r; % step along the residual direction
        r = r - alpha * Ar;% update residual without recomputing b - A*x
        rr = dot(r, r);
    end
end
```
Note that $\vec{r}\cdot \vec{r}$ and $A\vec{r}$ are stored in the variables `rr` and `Ar` so that each is computed only once. Each iteration then requires one matrix-vector product, two vector inner products, and two vector linear combinations.
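For readers working in Python, a NumPy transcription of the MATLAB routine above might look as follows (the test system is an arbitrary small SPD matrix, chosen for illustration):

```python
import numpy as np

def sd_solve(A, b, x0, epsilon, iter_max):
    """Steepest descent, mirroring the MATLAB routine above."""
    iters = 0
    x = x0.copy()
    r = b - A @ x0          # initial residual
    rr = r @ r
    while np.sqrt(rr) >= epsilon and iters < iter_max:
        iters += 1
        Ar = A @ r          # the single matrix-vector product per iteration
        alpha = rr / (r @ Ar)
        x = x + alpha * r   # step along the residual direction
        r = r - alpha * Ar  # cheap residual update
        rr = r @ r
    return x, iters

# Quick convergence check on a small SPD system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x, iters = sd_solve(A, b, np.zeros(2), 1e-10, 1000)
assert np.allclose(A @ x, b, atol=1e-8)
```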
Note that $\vec{r}^{\left(k\right)}\cdot\vec{r}^{\left(k-1\right)}=0$: since $\alpha_k$ minimizes $\Phi$ along $\vec{p}_k=\vec{r}^{\left(k-1\right)}$, the new residual is orthogonal to the previous one, which is why steepest descent tends to zigzag toward the solution.
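The exact line search forces the new residual to be orthogonal to the search direction. A one-step numerical check of this (Python/NumPy, random SPD data for illustration):

```python
import numpy as np

# One steepest-descent step; check r_new is orthogonal to the old residual.
rng = np.random.default_rng(2)
n = 4
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)        # SPD by construction
b = rng.normal(size=n)

x = rng.normal(size=n)
r = b - A @ x                      # r^(k-1), also the search direction
alpha = (r @ r) / (r @ A @ r)      # exact line search step
r_new = r - alpha * (A @ r)        # r^(k)

assert abs(r_new @ r) < 1e-8       # consecutive residuals are orthogonal
```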