Some nice properties of the Moreau–Yosida regularization

1、Introduction

The Moreau–Yosida regularization is a preconditioner of a convex function $f$.

Moreover, this preconditioner has second-order properties.

Conjugate: the conjugate of a closed convex function $f$ is
$f^{*}(s)=\sup_{y\in R^{n}}\lbrace\langle s,y\rangle-f(y)\rbrace.$
A closed function is the same thing as a lower semi-continuous function.
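For example, for $f(y)=\frac{1}{2}\parallel y\parallel^{2}$ one gets $f^{*}(s)=\sup_{y}\lbrace\langle s,y\rangle-\frac{1}{2}\parallel y\parallel^{2}\rbrace=\frac{1}{2}\parallel s\parallel^{2}$, the supremum being attained at $y=s$.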

Definition: we denote by $F$ the Moreau–Yosida regularization of a given closed convex function $f$, associated with the metric defined by a symmetric positive definite matrix $M$:
$F(x)=\min_{y\in R^{n}}\ f(y)+\frac{1}{2}\langle M(y-x),y-x\rangle.$
Denote by $p(x)$ the unique minimizer in the definition of $F(x)$.
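To make the definition concrete, here is a minimal numerical sketch. The choices $f(y)=|y|$ in one dimension and $M=\frac{1}{\lambda}I$ with $\lambda=0.5$ are purely illustrative (not from the text); the inner minimization is solved with scipy.optimize.minimize_scalar, and the proximal point is compared with the known soft-thresholding formula for this particular $f$.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative choices (not from the text): f(y) = |y| in 1-D and M = (1/lam) * I.
lam = 0.5
f = lambda y: abs(y)

def moreau_yosida(x):
    """Return (F(x), p(x)): value of the regularization and the proximal point."""
    inner = lambda y: f(y) + (1.0 / (2.0 * lam)) * (y - x) ** 2
    res = minimize_scalar(inner)        # the inner problem has a unique minimizer p(x)
    return res.fun, res.x

F_x, p_x = moreau_yosida(2.0)
# For f = |.| the proximal point is soft thresholding: p(x) = sign(x) * max(|x| - lam, 0).
print(F_x, p_x, np.sign(2.0) * max(abs(2.0) - lam, 0.0))
```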

2、Properties of the Moreau–Yosida regularization

Definition 1:
The recession (or asymptotic) function of a closed convex function $\phi$ is defined by
$\phi^{'}_{\infty}(d)=\lim_{t\to+\infty}\frac{\phi(x+td)-\phi(x)}{t}$
(a limit which does not depend on $x\in dom\,\phi$). This function is useful because $\phi$ has a nonempty bounded set of minima if and only if $\phi^{'}_{\infty}(d)>0$ for all $d\not=0$.
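For example, take the 1-D function $\phi(x)=e^{x}$: then $\phi^{'}_{\infty}(1)=\lim_{t\to+\infty}\frac{e^{x+t}-e^{x}}{t}=+\infty$, but $\phi^{'}_{\infty}(-1)=\lim_{t\to+\infty}\frac{e^{x-t}-e^{x}}{t}=0$, which is not $>0$; consistently, $e^{x}$ has no minimizer. For $\phi(x)=x^{2}$ one gets $\phi^{'}_{\infty}(d)=+\infty>0$ for every $d\not=0$, and the set of minima $\lbrace 0\rbrace$ is indeed nonempty and bounded.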
Definition 2 (proximal point): we will extensively use the following notation:
$p_{M}(x)=\operatorname{argmin}_{y}\ f(y)+\frac{1}{2}(y-x)^{T}M(y-x);$
$p_{M}(x)$ is called the proximal point of $x$.

Lemma 1:
The minimization problem defining $F(x)$ has a unique solution, characterized as the unique point $y\in R^{n}$ satisfying
$M(x-y)\in\partial f(y)$, that is, $M(x-p_{M}(x))\in\partial f(p_{M}(x)).$
The proof is in J.-B. Hiriart-Urruty and C. Lemaréchal, Convex Analysis and Minimization Algorithms, Springer-Verlag, Berlin, New York, 1993 (reference [2]), Chapter XV, Lemma 4.1.1.
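The characterization in Lemma 1 can be checked directly in the same illustrative setting ($f(y)=|y|$, $M=1/\lambda$ in one dimension, where $p_{M}$ is soft thresholding); the sketch below verifies that $M(x-p_{M}(x))$ lies in $\partial f(p_{M}(x))$ at a few sample points.

```python
import numpy as np

lam = 0.5  # illustrative: M = 1/lam in 1-D, f(y) = |y|

def prox(x):
    """Proximal point of f = |.| under M = 1/lam: soft thresholding."""
    return np.sign(x) * max(abs(x) - lam, 0.0)

def in_subdiff_abs(g, p, tol=1e-12):
    """Check that g belongs to the subdifferential of |.| at p."""
    if p > 0:
        return abs(g - 1.0) < tol
    if p < 0:
        return abs(g + 1.0) < tol
    return -1.0 - tol <= g <= 1.0 + tol

for x in [-2.0, -0.3, 0.0, 0.2, 1.7]:
    p = prox(x)
    g = (x - p) / lam                 # this is M (x - p_M(x))
    assert in_subdiff_abs(g, p), (x, p, g)
print("Lemma 1 characterization holds at all sampled points.")
```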

Theorem 1:
The recession functions of $f$ and $F$ are identical.
Proof: see Rockafellar, Corollary 9.2.1.
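As an illustration with the 1-D example $f(y)=|y|$ and $M=1/\lambda$ used above, $F$ is the Huber function: $F(x)=\frac{x^{2}}{2\lambda}$ for $|x|\leq\lambda$ and $F(x)=|x|-\frac{\lambda}{2}$ otherwise. Its recession function is $F^{'}_{\infty}(d)=|d|=f^{'}_{\infty}(d)$, as Theorem 1 predicts.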

Theorem 2:
For a finite-valued convex function $f$, the following statements are equivalent:
(i) $f$ is strongly convex with modulus $\frac{1}{l}$;
(ii) $f$ has a Lipschitzian gradient with Lipschitz constant $l$;
(iii) $F$ has a Lipschitzian gradient with Lipschitz constant $L$;
(iv) $F$ is strongly convex with modulus $\frac{1}{L}$.
Furthermore, we have the inequalities $\frac{l-1}{\lambda}<L<\frac{l-1}{\lambda}$.

Proof: see reference [1] (C. Lemaréchal and C. Sagastizábal, Practical Aspects of the Moreau–Yosida Regularization, SIAM Journal on Optimization, 1997).

Theorem 3:
The function $F$ is finite everywhere, convex and differentiable; its gradient is
$G(y)=\nabla F(y)=s_{M}(y)=M(y-p(y)).$
Furthermore, with $W=M^{-1}$, there holds for all $y$ and $y^{'}$ in $R^{n}$
$\langle s_{M}(y)-s_{M}(y^{'}),W(s_{M}(y)-s_{M}(y^{'}))\rangle \leq \langle s_{M}(y)-s_{M}(y^{'}),y-y^{'}\rangle$
and
$\parallel \nabla F(y)-\nabla F(y^{'})\parallel\leq \parallel M\parallel \parallel y-y^{'} \parallel.$
Proof: see reference [1].
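A quick finite-difference check of the gradient formula in Theorem 3, again with the purely illustrative choices $f(y)=|y|$ and $M=1/\lambda$ (not from the text):

```python
from scipy.optimize import minimize_scalar

lam = 0.5
M = 1.0 / lam
f = lambda y: abs(y)
inner = lambda y, x: f(y) + 0.5 * M * (y - x) ** 2

def F(x):
    return minimize_scalar(lambda y: inner(y, x)).fun   # value of the regularization

def p(x):
    return minimize_scalar(lambda y: inner(y, x)).x     # proximal point

x, h = 1.3, 1e-6
fd_grad = (F(x + h) - F(x - h)) / (2 * h)   # central finite difference of F
formula = M * (x - p(x))                    # Theorem 3: grad F(x) = M * (x - p(x))
print(fd_grad, formula)                     # the two numbers should agree closely
```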

Theorem 4:
Minimizing $f$ and $F$ are equivalent problems, in the sense that
$\inf_{x\in R^{n}} F(x) = \inf_{x\in R^{n}} f(x),$
an equality in $R\cup\lbrace -\infty\rbrace$, and the following statements are equivalent:
(i) $x$ minimizes $f$;
(ii) $p_{M}(x)=x$;
(iii) $s_{M}(x)=0$;
(iv) $x$ minimizes $F$;
(v) $f(p_{M}(x))=f(x)$;
(vi) $F(x)=f(x)$.
We can use Corollary 2.1.4 to prove the equality of $\inf F(x)$ and $\inf f(x)$.

Proof: see J.-B. Hiriart-Urruty and C. Lemaréchal, Convex Analysis and Minimization Algorithms, Springer-Verlag, Berlin, New York, 1993 (reference [2]), Chapter XV, Theorem 4.1.7.
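Using the illustrative 1-D choices from above ($f(y)=|y|$, $M=1/\lambda$ with $\lambda=0.5$, where the proximal point has the closed soft-thresholding form), one can check items (ii), (iii) and (vi) of Theorem 4 at the minimizer $x^{*}=0$, and the fact that $F\leq f$ everywhere else:

```python
import numpy as np

lam, M = 0.5, 2.0                          # illustrative: f = |.| , M = 1/lam
f = lambda y: abs(y)
prox = lambda x: np.sign(x) * max(abs(x) - lam, 0.0)     # p_M(x) for f = |.|
F = lambda x: f(prox(x)) + 0.5 * M * (prox(x) - x) ** 2  # Moreau-Yosida value

x_star = 0.0                               # the unique minimizer of f
print(prox(x_star) == x_star)              # (ii): p_M(x*) = x*
print(M * (x_star - prox(x_star)) == 0.0)  # (iii): s_M(x*) = 0
print(F(x_star) == f(x_star))              # (vi): F(x*) = f(x*)
print(F(1.0) <= f(1.0))                    # away from the minimizer, F <= f
```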

Proposition 1:
Assume $f$ is a closed convex function. Then $\nabla F(\cdot)$ has directional derivatives if and only if $p(\cdot)$ has directional derivatives. The Hessian $\nabla^{2}F(x)$ exists if and only if the Jacobian $\nabla p(x)$ exists:
$\nabla^{2}F(x)=M(I-\nabla p(x))$ for all $x\in R^{n}$.
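A small sketch of Proposition 1 in a case where $\nabla p$ exists everywhere: for a quadratic $f(y)=\frac{1}{2}y^{T}Ay$ (an illustrative choice, not from the text) the proximal map is affine, $p(x)=(A+M)^{-1}Mx$, so $\nabla p(x)=(A+M)^{-1}M$, and a finite-difference Hessian of $F$ can be compared against $M(I-\nabla p(x))$.

```python
import numpy as np

# Illustrative quadratic: f(y) = 0.5 * y^T A y, so p(x) = (A + M)^{-1} M x is affine.
rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))
A = B @ B.T + np.eye(n)          # symmetric positive definite
M = 2.0 * np.eye(n)              # metric of the regularization

def p(x):
    return np.linalg.solve(A + M, M @ x)

def F(x):
    y = p(x)
    return 0.5 * y @ A @ y + 0.5 * (y - x) @ M @ (y - x)

# Hessian of F by central finite differences.
x0, h = rng.standard_normal(n), 1e-4
H_fd = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        e_i, e_j = np.eye(n)[i], np.eye(n)[j]
        H_fd[i, j] = (F(x0 + h*e_i + h*e_j) - F(x0 + h*e_i - h*e_j)
                      - F(x0 - h*e_i + h*e_j) + F(x0 - h*e_i - h*e_j)) / (4*h*h)

# Proposition 1: Hessian of F equals M (I - grad p(x)).
H_formula = M @ (np.eye(n) - np.linalg.solve(A + M, M))
print(np.allclose(H_fd, H_formula, atol=1e-4))
```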

Reference:
[1] C. Lemaréchal and C. Sagastizábal. Practical Aspects of the Moreau–Yosida Regularization. SIAM Journal on Optimization, 1997.
[2] J.-B. Hiriart-Urruty and C. Lemaréchal. Convex Analysis and Minimization Algorithms. Springer-Verlag, Berlin, New York, 1993.
