Solving Ax = b with the SSOR Iteration and the Preconditioned Conjugate Gradient Method (with MATLAB Code)

1. The SSOR Iteration Method

The iteration scheme of the SSOR method is

(D-\omega C_L)x_{m-\frac{1}{2}}=\{\omega C_U+(1-\omega)D\}x_{m-1}+\omega b,\\ (D-\omega C_U)x_m=\{\omega C_L+(1-\omega)D\}x_{m-\frac{1}{2}}+\omega b,

where D, C_L, and C_U are the diagonal, strictly lower, and strictly upper parts in the splitting A = D - C_L - C_U, and \omega is the relaxation factor.

The corresponding MATLAB code is

function [x,k,error,time]=myssor(A,b,x,w,tol,max_it)
%% SSOR iteration algorithm
%% To solve the equation Ax=b;
%% Iterative formula : (D-wL)x_(k+1)=((1-w)D+wU)x_k+wb; (D-wU)x_(k+1)=((1-w)D+wL)x_k+wb;
% A : input matrix;
% b : right-hand side vector;
% x : initial guess;
% w : relaxation factor;
% tol : tolerance on the relative residual;
% max_it : maximum number of iterations;
% k : number of iterations at termination;
% error : relative L^2 residual at termination;
% time : CPU time;

if nargin < 6
    max_it = 1000;
end
if nargin < 5
    tol = 1.e-6;
end
if nargin < 4
    w = 0.75;
end

tic;
bnorm2 = norm(b); % calculate the L^2 norm of vector b;
r = b-A*x; % calculate residual vector;
error = norm(r)/bnorm2;
if (error < tol)
    k = 0;
    time = toc;
    return;
end
% matrix splitting: A = D - L - U
D = diag(diag(A)); % diagonal part
L = -tril(A,-1);   % strictly lower triangular part (negated)
U = -triu(A,1);    % strictly upper triangular part (negated)
for k = 1:max_it
    x = (D-w*L)\(((1-w)*D+w*U)*x+w*b);
    x = (D-w*U)\(((1-w)*D+w*L)*x+w*b);
    r = b-A*x;
    error = norm(r)/bnorm2;
    if (error < tol)
        break;
    end
end
time = toc;
%% remark : break exits the for-loop; return exits the whole function.
    

We can observe how the relaxation factor \omega affects the performance of the algorithm.
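As an illustration, the sweep below counts SSOR iterations for several values of \omega. This is a self-contained NumPy sketch of the two half-sweeps above; the function name `ssor_iters` and the 1-D Laplacian test matrix are my own illustrative choices, not from the original post.

```python
import numpy as np

def ssor_iters(A, b, w, tol=1e-6, max_it=5000):
    """Count SSOR iterations until the relative residual drops below tol."""
    D = np.diag(np.diag(A))
    L = -np.tril(A, -1)  # strictly lower part, convention A = D - L - U
    U = -np.triu(A, 1)   # strictly upper part
    x = np.zeros_like(b, dtype=float)
    bnorm = np.linalg.norm(b)
    for k in range(1, max_it + 1):
        # forward half-sweep: (D - wL) x_{m-1/2} = ((1-w)D + wU) x_{m-1} + w b
        x = np.linalg.solve(D - w * L, ((1 - w) * D + w * U) @ x + w * b)
        # backward half-sweep: (D - wU) x_m = ((1-w)D + wL) x_{m-1/2} + w b
        x = np.linalg.solve(D - w * U, ((1 - w) * D + w * L) @ x + w * b)
        if np.linalg.norm(b - A @ x) / bnorm < tol:
            return k
    return max_it

# 1-D Laplacian test matrix (tridiagonal, SPD)
n = 20
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
counts = {w: ssor_iters(A, b, w) for w in (0.5, 0.8, 1.0, 1.2, 1.5)}
```

For this particular test matrix, larger \omega (up to the optimum) needs fewer iterations; the best \omega depends on the spectrum of the matrix at hand.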

2. The Preconditioned Conjugate Gradient Method

A left preconditioning matrix is used.
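Concretely (these are the standard definitions, added here for completeness; they do not appear explicitly in the original post), left preconditioning replaces Ax = b by the equivalent system

M^{-1}Ax = M^{-1}b,

and the SSOR preconditioner associated with the iteration above is

M_{\mathrm{SSOR}} = \frac{1}{\omega(2-\omega)}(D-\omega C_L)\,D^{-1}\,(D-\omega C_U),

which is symmetric positive definite whenever A is, so the conjugate gradient machinery applies.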

The following is the MATLAB code for CG with an SSOR preconditioning matrix (when M is not supplied, the code falls back to a Jacobi preconditioner):

function [x,iter,time,res,resvec]=myJocabi_cg(A,b,x,M,max_it,tol)
%% Preconditioned conjugate gradient algorithm (By X.Feng, Qiao.H, Zhang.L)
%% To solve the equation Ax=b;
% A : input matrix;
% b : right-hand side vector;
% x : initial guess;
% M : preconditioning matrix;
% max_it : maximum number of iterations;
% tol : tolerance on the relative residual;
% time : CPU time;
% iter : number of iterations at termination;
% res : relative residual norm at termination;
% resvec : the relative residual at termination;

if nargin < 6
    max_it = 1000;
end
if nargin < 5
    tol = 1.e-6;
end
if nargin < 4
    M = diag(diag(A)); % default: Jacobi preconditioner;
    % Note: Gauss-Seidel and SOR preconditioners do not work here, since
    % they are not symmetric; an SSOR preconditioner can also be used.
end

tic;
r = b - A * x;
z = M \ r;
p = z;
rho = z' * r;
mr = norm(r); % The L^2 norm of the vector r;
iter = 0;
while (iter < max_it)
    iter = iter + 1; 
    u = A * p;
    alpha = rho / ( p' * u );
    x = x + alpha * p;
    r = r - alpha * u;
    z = M \ r;
    rho1 = z' * r;
    beta = rho1 / rho;
    p = z + beta * p;
    res = norm (r) / mr;
    resvec = res;
    if (res < tol )
        break;
    end
    rho = rho1;
end
time = toc;
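The same algorithm in NumPy, as a hedged sketch: the function name `pcg`, the dense `np.linalg.solve` used to apply M^{-1}, and the tridiagonal test problem are my own illustrative choices, not from the original post.

```python
import numpy as np

def pcg(A, b, x0, M, tol=1e-6, max_it=1000):
    """Preconditioned conjugate gradient sketch mirroring myJocabi_cg."""
    x = x0.astype(float).copy()
    r = b - A @ x
    z = np.linalg.solve(M, r)      # apply the preconditioner: z = M \ r
    p = z.copy()
    rho = z @ r
    r0 = np.linalg.norm(r)         # norm of the initial residual
    for it in range(1, max_it + 1):
        u = A @ p
        alpha = rho / (p @ u)
        x = x + alpha * p
        r = r - alpha * u
        res = np.linalg.norm(r) / r0
        if res < tol:
            break
        z = np.linalg.solve(M, r)
        rho1 = z @ r
        p = z + (rho1 / rho) * p   # new search direction
        rho = rho1
    return x, it, res

# usage: Jacobi preconditioner on a 1-D Laplacian test matrix
n = 20
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
M = np.diag(np.diag(A))
x, it, res = pcg(A, b, np.zeros(n), M)
```

For an SSOR preconditioner, M would instead be assembled from the splitting of A as in the formula of Section 2.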

Similarly, we can observe the effect of \omega on the performance of this algorithm.

As can be seen from the figure, the best efficiency is achieved when \omega is around 0.8.
