
Application and Comparison of Proximal Operator Methods — Optimization Algorithms

I use three methods to solve the problem and compare them:

1. Proximal Gradient Method

2. Alternating Direction Method of Multipliers (ADMM)

3. CVX

 

The problem: minimize 1/2*x'Ax + b'x + c + gamma*norm(x,1) over x in R^n.

(1). Proximal Gradient Method

Introduction: Proximal gradient methods are a generalized form of projection used to solve non-differentiable convex optimization problems. Many interesting problems can be formulated as convex optimization problems of the form minimize f(x) + g(x), where f is smooth and g is convex but possibly non-differentiable; here g(x) = gamma*norm(x,1).
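As a sketch of the idea (not the MATLAB attachment below), the proximal gradient iteration alternates a gradient step on the smooth part with soft-thresholding. In Python with NumPy, assuming a fixed step size small enough for convergence (the function names and the fixed step are my choices, not from the original post):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrink toward zero by t, truncate at zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, gamma, step=0.1, max_iter=400, tol=1e-6):
    # Minimize 0.5*x'Ax + b'x + gamma*||x||_1 (the constant c does not affect x).
    n = A.shape[0]
    As = 0.5 * (A + A.T)          # symmetrize so the gradient is As@x + b
    x = np.zeros(n)
    prev = np.inf
    for _ in range(max_iter):
        grad = As @ x + b                              # gradient step on f
        x = soft_threshold(x - step * grad, step * gamma)  # prox step on g
        val = 0.5 * x @ As @ x + b @ x + gamma * np.linalg.norm(x, 1)
        if abs(prev - val) < tol:                      # stop on small decrement
            break
        prev = val
    return x
```

The MATLAB attachment instead chooses the step by a backtracking line search, which avoids having to guess a fixed step.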


(2). ADMM
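ADMM splits the objective as f(x) + gamma*norm(z,1) with the constraint x = z, then alternates an x-minimization, a z-minimization (soft-thresholding), and a dual update. A minimal Python sketch of these updates, with a fixed penalty rho (the variable names and rho value are my assumptions, not from the original post):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm(A, b, gamma, rho=1.0, max_iter=400, abstol=1e-6):
    # Minimize 0.5*x'Ax + b'x + gamma*||z||_1 subject to x = z.
    n = A.shape[0]
    As = 0.5 * (A + A.T)
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    M = As + rho * np.eye(n)     # x-update matrix (could be factored once)
    for _ in range(max_iter):
        x = np.linalg.solve(M, rho * (z - u) - b)   # x-update: quadratic solve
        z_old = z
        z = soft_threshold(x + u, gamma / rho)      # z-update: prox of l1 term
        u = u + x - z                               # dual (scaled) update
        # stop when primal and dual residuals are small
        if (np.linalg.norm(x - z) < abstol and
                np.linalg.norm(rho * (z - z_old)) < abstol):
            break
    return z
```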


(3). CVX

I use the CVX toolbox for MATLAB to solve the problem as a reference solution. In the plot of the decrement of the objective function w.r.t. iteration, I use the CVX result as the true solution and draw it as a horizontal line.

The algorithm:

 

cvx_begin quiet

        cvx_precision low

        variable x(n)

        minimize(0.5*x.'*A*x + b.'*x + c + gamma*norm(x,1))

cvx_end

 

Result plots:

The 2D contour plot of the objective function and the trajectory of the iterate updates:



Decrement of the objective function w.r.t. iteration:


CVX time elapsed: 0.19 seconds.

Proximal gradient time elapsed: 0.00 seconds.

ADMM time elapsed: 0.02 seconds.

 

 

Attachments:

1. Proximal Gradient Method (proximalGradient.m)

function [x,h] = proximalGradient(A,b,c,gamma)

%% Solve 1/2*X'AX + b'X + c + gamma*norm(X,1)

%% A: n*n   X: n*1

MAX_ITER = 400;

ABSTOL   = 1e-6;

RELTOL   = 1e-2;

 

f = @(x) 0.5*(x'*A*x) + b'*x + c; %% smooth part, used to set the line-search step

lambda = 1;

beta = 0.5;

[~,n]=size(A);

tic;

x = zeros(n,1);

xprev = x;

for k = 1:MAX_ITER

     while 1

        grad_x = 0.5*(A+A')*x+b;

        z = soft_threshold(x - lambda*grad_x, lambda*gamma); %% proximal update of x

        if f(z) <= f(x) + grad_x'*(z - x) + (1/(2*lambda))*(norm(z - x))^2

            break;

        end

        lambda = beta*lambda;

    end

    xprev = x;

    x = z;

    h.x_iter(k,:) = x;

    h.prox_optval(k) = objective(A, b, c, gamma, x, x);

    if k > 1 && abs(h.prox_optval(k) - h.prox_optval(k-1)) < ABSTOL

        break;

    end

end

 

h.x_prox = x;

h.p_prox = h.prox_optval(end);

h.prox_iter = length(h.prox_optval);

h.prox_grad_toc = toc;

 

end

 

2. ADMM (ADMM.m)

function [x,h] = ADMM(A,b,c,gamma)

%% Solve 1/2*X'AX + b'X + c + gamma*norm(X,1)

%% A: n*n   X: n*1

MAX_ITER = 400;

ABSTOL   = 1e-4;

RELTOL   = 1e-2;

 


lambda = 1;

beta = 0.5;

[~,n]=size(A);

tic

x = zeros(n,1);

z = zeros(n,1);

u = zeros(n,1);

rho = 1/lambda;

for k = 1:MAX_ITER

    x = (0.5*(A+A') + rho*eye(n)) \ (rho*(z-u) - b); %% x-update (symmetrize A, as in the gradient)

    z_old = z;

    z = soft_threshold(x+u, lambda*gamma); %% z-update: prox of gamma*norm(.,1)

    u = u + x - z; %% dual update

    h.admm_optval(k) = objective(A, b, c, gamma, x, z);

    h.r_norm(k)   = norm(x - z);

    h.s_norm(k)   = norm(-rho*(z - z_old));

    h.eps_pri(k)  = sqrt(n)*ABSTOL + RELTOL*max(norm(x), norm(-z));

    h.eps_dual(k) = sqrt(n)*ABSTOL + RELTOL*norm(rho*u);

    h.x_iter(k,:) = z;

    if h.r_norm(k) < h.eps_pri(k) && h.s_norm(k) < h.eps_dual(k)

        break;

    end

end

h.x_admm = z;

h.admm_iter = length(h.admm_optval);

h.p_admm = h.admm_optval(end);

h.admm_toc = toc;

end

 

3. CVX (cvx_optimal.m)

function [x,h] = cvx_optimal(A,b,c,gamma)

    [~,n]=size(A);

    tic

    cvx_begin quiet

        cvx_precision low

        variable x(n)

        minimize(0.5*x.'*A*x + b.'*x + c + gamma*norm(x,1))

    cvx_end

    h.x_cvx = x;

    h.p_cvx = cvx_optval;

    %h.p_cvx = objective(A, b, c, gamma, x, x);

    h.cvx_toc = toc;

end

 

4. Objective function (objective.m)

function p = objective(A, b, c, gamma, x, z)

% Objective value: 1/2*x'Ax + b'x + c + gamma*||z||_1

    p = 0.5*(x'*A*x) + b'*x + c + gamma*norm(z,1);

end

 

5. Soft-threshold (soft_threshold.m)

function [X] = soft_threshold(b,lambda)

% Soft-thresholding: the proximal operator of lambda*||.||_1

    X = sign(b).*max(abs(b) - lambda, 0);

end
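As a quick sanity check, the soft-thresholding operator shrinks each entry toward zero by lambda and clips at zero. An equivalent Python one-liner applied to a small illustrative vector (the example values are my own):

```python
import numpy as np

def soft_threshold(b, lam):
    # sign(b) .* max(|b| - lam, 0), elementwise
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

# With lam = 1: 3 shrinks to 2, -0.5 is zeroed out, 1.2 shrinks to 0.2.
result = soft_threshold(np.array([3.0, -0.5, 1.2]), 1.0)
```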

 
