[笔记] Convex Optimization 2015.11.18

Original post, 2015-11-21 16:33:07

Let $\{f_a : a \in A\}$ be a collection of convex functions from $\mathbb{R}^n$ to $\mathbb{R}$ with the same domain. Then $f(x) = \sup_{a \in A} f_a(x)$ is a convex function.

  • Proof 1: Take $x, y \in \operatorname{dom} f$, $\theta \in [0,1]$. Then

    $$f(\theta x + (1-\theta)y) = \sup_a f_a(\theta x + (1-\theta)y) \le \sup_a \left[\theta f_a(x) + (1-\theta) f_a(y)\right] \le \theta \sup_a f_a(x) + (1-\theta) \sup_a f_a(y) = \theta f(x) + (1-\theta) f(y).$$

(The proposition is true for $\operatorname{dom} f = \bigcap_{a \in A} \operatorname{dom} f_a$, but false in general for $\operatorname{dom} f = \bigcup_{a \in A} \operatorname{dom} f_a$.)

  • Proof 2:

    $$\operatorname{epi}(f) = \{(x,t) : x \in \operatorname{dom} f,\ t \ge f_a(x)\ \forall a \in A\} = \{(x,t) : (x,t) \in \operatorname{epi}(f_a)\ \forall a \in A\} = \bigcap_{a \in A} \operatorname{epi}(f_a).$$

    An intersection of convex sets is convex, so $f$ is convex.
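A quick numeric spot check, not from the original notes (assumes numpy is available): the pointwise supremum of a family of affine (hence convex) functions $f_a(x) = ax - a^2$, taken over a grid of values of $a$, satisfies the midpoint convexity inequality.

```python
# Spot check: sup of affine functions f_a(x) = a*x - a^2 is convex.
import numpy as np

rng = np.random.default_rng(5)
avals = np.linspace(-2, 2, 41)  # finite family of slopes a

def f(x):
    # pointwise sup over the family: sup_a (a*x - a^2)
    return float(np.max(avals * x - avals ** 2))

for _ in range(100):
    x, y = rng.uniform(-5, 5, size=2)
    # midpoint convexity: f((x+y)/2) <= (f(x) + f(y))/2
    assert f((x + y) / 2) <= (f(x) + f(y)) / 2 + 1e-9
```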

  • Example: Let $x_{[i]}$ denote the $i$-th largest component of $x = (x_1, \ldots, x_n) \in \mathbb{R}^n$.
    Then $\operatorname{maxsum}_r(x) = x_{[1]} + x_{[2]} + \cdots + x_{[r]}$ is convex.

  • Proof: $\operatorname{maxsum}_r(x) = \max\left\{ x_{i_1} + x_{i_2} + \cdots + x_{i_r} : \{i_1, \ldots, i_r\} \subseteq \{1, \ldots, n\},\ i_j \ne i_k \text{ for } j \ne k \right\}$,
    a pointwise maximum of finitely many linear functions,
    so it is convex.
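The two descriptions above can be checked against each other numerically (my own sketch, not from the notes): summing the $r$ largest components agrees with the maximum over all size-$r$ index subsets.

```python
# maxsum_r(x): sum of the r largest components, vs. the sup-of-linear form.
from itertools import combinations

def maxsum_r(x, r):
    # direct form: sort descending and take the top r
    return sum(sorted(x, reverse=True)[:r])

def maxsum_r_as_sup(x, r):
    # max over all choices of r distinct indices of the linear function
    # x -> x_{i_1} + ... + x_{i_r}
    return max(sum(x[i] for i in idx) for idx in combinations(range(len(x)), r))

x = [3.0, -1.0, 4.0, 1.0, 5.0]
assert maxsum_r(x, 3) == maxsum_r_as_sup(x, 3)  # both equal 12.0
```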

  • Example: Let $C \subseteq \mathbb{R}^n$ and define the support function $S_C(x) = \sup\{ y^T x : y \in C \}$. Then $S_C$ is convex, as a pointwise supremum of functions linear in $x$.
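A numeric sketch of this example (my own, assuming numpy): for a finite point set $C$, $S_C$ is a max of linear functions, and a midpoint spot check confirms convexity.

```python
# Support function of a finite point set C (rows of the matrix are the y's).
import numpy as np

rng = np.random.default_rng(6)
C = rng.standard_normal((10, 3))  # 10 points y in R^3

def S(x):
    # S_C(x) = max_y y^T x over the finite set C
    return float(np.max(C @ x))

for _ in range(100):
    x, y = rng.standard_normal(3), rng.standard_normal(3)
    # midpoint convexity check
    assert S((x + y) / 2) <= (S(x) + S(y)) / 2 + 1e-9
```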

  • Example: Let $f : S^n \to \mathbb{R}$ be given by $f(X) = \lambda_{\max}(X)$, the largest eigenvalue of $X$.
    Claim: $f$ is convex.

  • Proof: First claim that $f(X) = \sup\{ y^T X y : \|y\|_2 = 1 \}$. Since $y^T X y$ is linear in $X$ for each fixed $y$, $f$ is a pointwise supremum of linear functions, hence convex.
  • Proof of claim: Diagonalize $X = P D P^T$ with $P$ orthogonal and $D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$, and substitute $v = P^T y$ (so $\|v\|_2 = \|y\|_2$):

    $$\sup_{\|y\|_2=1} y^T X y = \sup_{\|y\|_2=1} y^T P D P^T y = \sup_{\|v\|_2=1} v^T D v = \sup_{\|v\|_2=1} \sum_{i=1}^n \lambda_i v_i^2 \le \max_i(\lambda_i) \sum_{i=1}^n v_i^2 = \max_i(\lambda_i),$$

    with equality attained at a unit eigenvector for the largest eigenvalue.
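A numeric spot check of this claim (my own sketch, assuming numpy): on random symmetric matrices, $\lambda_{\max}(\theta X + (1-\theta)Y) \le \theta\,\lambda_{\max}(X) + (1-\theta)\,\lambda_{\max}(Y)$.

```python
# Convexity of the largest eigenvalue on random symmetric matrices.
import numpy as np

rng = np.random.default_rng(0)

def lam_max(S):
    # eigvalsh returns eigenvalues in ascending order; take the last
    return np.linalg.eigvalsh(S)[-1]

for _ in range(100):
    A = rng.standard_normal((5, 5)); X = (A + A.T) / 2  # symmetrize
    B = rng.standard_normal((5, 5)); Y = (B + B.T) / 2
    theta = rng.uniform()
    lhs = lam_max(theta * X + (1 - theta) * Y)
    rhs = theta * lam_max(X) + (1 - theta) * lam_max(Y)
    assert lhs <= rhs + 1e-9  # convex combination inequality
```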

  • Example: Let $f : \mathbb{R}^{m \times n} \to \mathbb{R}$ be defined by $f(X) = \|X\|_2$, where $\|X\|_2 = \sup_{\|v\|_2 = 1} \|Xv\|_2$ is the spectral norm of $X \in \mathbb{R}^{m \times n}$.

  • Claim: $f(X) = \sup_{u,v}\{ u^T X v : \|u\|_2 = 1,\ \|v\|_2 = 1 \}$,
    because $\|Xv\|_2 = \sup\{ u^T X v : u \in \mathbb{R}^m,\ \|u\|_2 = 1 \}$.
    More generally, $\|X\|_{a,b} = \sup\{ \|Xv\|_b : \|v\|_a = 1 \} = \sup\{ u^T X v : \|v\|_a = 1,\ \|u\|_{b^*} = 1 \}$
    (since $\|Xv\|_b = \sup_u\{ u^T X v : \|u\|_{b^*} = 1 \}$, where $\|\cdot\|_{b^*}$ is the dual norm of $\|\cdot\|_b$). In each case $f$ is a supremum of functions linear in $X$, hence convex.
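A numeric sketch of the claim (my own, assuming numpy): the spectral norm equals the largest singular value, and $u^T X v \le \|X\|_2$ for any sampled unit vectors $u, v$.

```python
# Spectral norm as sup of u^T X v over unit vectors (sampled lower bounds).
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 6))

spec = np.linalg.norm(X, 2)  # spectral norm = largest singular value
assert np.isclose(spec, np.linalg.svd(X, compute_uv=False)[0])

for _ in range(200):
    u = rng.standard_normal(4); u /= np.linalg.norm(u)  # unit vector in R^m
    v = rng.standard_normal(6); v /= np.linalg.norm(v)  # unit vector in R^n
    # every u^T X v with unit u, v is bounded above by the spectral norm
    assert u @ X @ v <= spec + 1e-9
```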

  • Composition: Given $f(x) = h(g(x))$ with $x \in \mathbb{R}^n$ and $g(x) \in \mathbb{R}$, when is $f$ convex?

  • Would-be proof: Take $x, y \in \operatorname{dom} f$, $\theta \in [0,1]$. Then

    $$f(\theta x + (1-\theta)y) = h(g(\theta x + (1-\theta)y)) \le h(\theta g(x) + (1-\theta) g(y)) \le \theta h(g(x)) + (1-\theta) h(g(y)) = \theta f(x) + (1-\theta) f(y).$$

    The first equality uses that $\operatorname{dom} f$ is convex; the first inequality uses that $g$ is convex/concave and $h$ is nondecreasing/nonincreasing; the second inequality uses that $h$ is convex.

| Case | 1 | 2 | 3 | 4 |
|---|---|---|---|---|
| $g$ is | convex | concave | convex | concave |
| $h$ is | nondecreasing | nonincreasing | nonincreasing | nondecreasing |
| $h$ is | convex | convex | concave | concave |
| Result: $f$ is | convex | convex | concave | concave |

  • Example: $g(x) = x^2 - 1$, $h(x) = x^{3/2}$, $\operatorname{dom} h = \mathbb{R}_+$.
    Then $\operatorname{dom}(h \circ g) = (-\infty, -1] \cup [1, +\infty)$ is not convex.

  • Example 3.13:
    If $g$ is convex then $e^{g(x)}$ is convex. (case 1)
    If $g$ is concave and positive then $\log g(x)$ is concave. (case 4)
    If $g$ is concave and positive then $1/g(x)$ is convex. (case 2)
    If $g$ is convex and nonnegative and $p \ge 1$, then $g(x)^p$ is convex. (case 1)
    For $g : \mathbb{R}^n \to \mathbb{R}^m$, say $g$ is $K$-convex, where $K$ is a cone in $\mathbb{R}^m$,
    if $\operatorname{dom} g$ is convex and $g(\theta x + (1-\theta)y) \preceq_K \theta g(x) + (1-\theta) g(y)$.
    $h$ is $K$-nondecreasing if $x \preceq_K y \Rightarrow h(x) \le h(y)$.

  • Example 3.14:
    $h(z) = \log\left(\sum_{i=1}^k e^{z_i}\right)$ is convex and nondecreasing in each argument, so $\log\left(\sum_{i=1}^k e^{g_i(x)}\right)$ is convex if $g_1, \ldots, g_k$ are convex.
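A quick numeric check of this composition rule (my own sketch, assuming numpy; the specific $g_i$ are chosen for illustration): with convex $g_1(x) = x^2$ and $g_2(x) = |x|$, the function $f(x) = \log(e^{x^2} + e^{|x|})$ passes a midpoint convexity test.

```python
# Log-sum-exp composed with convex g_i stays convex.
import numpy as np

def f(x):
    # log(e^{x^2} + e^{|x|}), computed stably via logaddexp
    return float(np.logaddexp(x * x, abs(x)))

rng = np.random.default_rng(2)
for _ in range(100):
    x, y = rng.uniform(-3, 3, size=2)
    # midpoint convexity check
    assert f((x + y) / 2) <= (f(x) + f(y)) / 2 + 1e-9
```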

  • Minimization: Let $f : \mathbb{R}^n \times \mathbb{R}^m \to \mathbb{R}$ be a convex function. Then
    $g(x) = \inf_{y : (x,y) \in \operatorname{dom} f} f(x,y)$ is convex.

  • Proof: Let $x_1, x_2 \in \operatorname{dom} g$, $\theta \in [0,1]$.
    Then for any $\varepsilon > 0$ there exist $y_1, y_2$ such that
    $f(x_1, y_1) \le g(x_1) + \varepsilon$, $f(x_2, y_2) \le g(x_2) + \varepsilon$, and

    $$g(\theta x_1 + (1-\theta) x_2) = \inf_y f(\theta x_1 + (1-\theta) x_2,\ y) \le f(\theta x_1 + (1-\theta) x_2,\ \theta y_1 + (1-\theta) y_2) \le \theta f(x_1, y_1) + (1-\theta) f(x_2, y_2) \le \theta g(x_1) + (1-\theta) g(x_2) + \varepsilon.$$

    Letting $\varepsilon \to 0$ gives convexity.

  • Example: Let $C \subseteq \mathbb{R}^n$ be a convex set. Then
    $g(x) = \inf_{y \in C} \|x - y\|$ is a convex function.

  • Proof: Use $f(x,y) = \|x - y\|$ with $\operatorname{dom} f = \mathbb{R}^n \times C$, which is convex in $(x,y)$:

    $$f(\theta x_1 + (1-\theta) x_2,\ \theta y_1 + (1-\theta) y_2) = \|\theta x_1 + (1-\theta) x_2 - \theta y_1 - (1-\theta) y_2\| = \|\theta (x_1 - y_1) + (1-\theta)(x_2 - y_2)\| \le \theta \|x_1 - y_1\| + (1-\theta) \|x_2 - y_2\| = \theta f(x_1, y_1) + (1-\theta) f(x_2, y_2).$$
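A numeric sketch of this example (my own, assuming numpy), with $C$ the unit Euclidean ball so the distance has the closed form $\max(\|x\| - 1, 0)$:

```python
# Distance to the unit ball is convex: midpoint spot check.
import numpy as np

def dist_ball(x):
    # dist(x, C) for C = {y : ||y||_2 <= 1} is max(||x|| - 1, 0)
    return max(float(np.linalg.norm(x)) - 1.0, 0.0)

rng = np.random.default_rng(3)
for _ in range(100):
    x = rng.standard_normal(3)
    y = rng.standard_normal(3)
    # midpoint convexity check
    assert dist_ball((x + y) / 2) <= (dist_ball(x) + dist_ball(y)) / 2 + 1e-9
```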

Consider the function $g(w) = \inf_x \sum_{i=1}^m w_i (a_i^T x - b_i)^2$, "weighted least squares",
a concave function of $w$ (it is an infimum of functions linear in $w$).
Write $g(w) = \inf_x (Ax - b)^T W (Ax - b) = \inf_x \left( x^T A^T W A x - 2 b^T W A x + b^T W b \right)$.
Assume $A^T W A \succ 0$; then the optimal $x = (A^T W A)^{-1} A^T W b$.
(For $\min_x (x^T A x + 2 b^T x)$ with $A \succ 0$: setting the gradient $2Ax + 2b = 0$ gives the optimal $x = -A^{-1} b$.)

$$\begin{aligned}
\text{optimal value} &= b^T W A (A^T W A)^{-1} A^T W b - 2 b^T W A (A^T W A)^{-1} A^T W b + b^T W b \\
&= b^T W b - b^T W A (A^T W A)^{-1} A^T W b \\
&= \sum_{i=1}^m w_i b_i^2 - \left( \sum_{i=1}^m w_i b_i a_i^T \right) \left( \sum_{i=1}^m w_i a_i a_i^T \right)^{-1} \left( \sum_{i=1}^m w_i b_i a_i \right) \\
&= \sum_{i=1}^m w_i b_i^2 - \sum_{i,j} w_i w_j b_i b_j\, a_i^T \left( \sum_{k=1}^m w_k a_k a_k^T \right)^{-1} a_j
\end{aligned}$$

(Here $A^T W A = \sum_i a_i (w_i a_i^T) = \sum_i w_i a_i a_i^T$.)
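The closed form above can be verified numerically (my own sketch, assuming numpy; $A$, $b$, $w$ are random illustrative data): the formula $b^T W b - b^T W A (A^T W A)^{-1} A^T W b$ matches the minimum found by solving the normal equations, and $g$ passes a midpoint concavity check in $w$.

```python
# Weighted least squares: closed-form optimal value and concavity in w.
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)

def g(w):
    # g(w) = min_x (Ax - b)^T W (Ax - b), via the normal equations
    W = np.diag(w)
    x = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)  # optimal x
    r = A @ x - b
    return float(r @ W @ r)

w1 = rng.uniform(0.5, 2.0, size=8)
W1 = np.diag(w1)
closed = b @ W1 @ b - b @ W1 @ A @ np.linalg.solve(A.T @ W1 @ A, A.T @ W1 @ b)
assert np.isclose(g(w1), closed)  # closed form matches the minimization

w2 = rng.uniform(0.5, 2.0, size=8)
# midpoint concavity: g((w1+w2)/2) >= (g(w1) + g(w2))/2
assert g((w1 + w2) / 2) >= (g(w1) + g(w2)) / 2 - 1e-9
```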

Copyright notice: this is the author's original post; reproduction without the author's permission is prohibited.
