Stephen P. Boyd convex lecture notes

   Yesterday I missed the first convex optimization lecture by Stanford's Professor Stephen Boyd (author of the Convex Optimization textbook), which was a real pity. Fortunately that lecture mainly covered basic concepts and the framework of convex optimization. Today's talk was his specialty; it was titled "Distributed Optimization via the Alternating Direction Method of Multipliers," the core algorithm being ADMM for short. It was my first time hearing a Stanford heavyweight lecture in person, which was quite exciting, so here is a blog post summarizing my class notes.

    To start, the professor gave a brief outline of the talk: dual decomposition, the method of multipliers, ADMM (the main topic), common patterns in applying ADMM, examples of ADMM in use, then a discussion of consensus and exchange problems, and finally conclusions.

Part 1

Duality is a standard tool for solving constrained optimization problems, the most common instance being the Lagrange dual. Consider a convex problem

    minimize f(x)
    subject to Ax = b.

Its Lagrangian is L(x, y) = f(x) + y^T (Ax - b), and the dual function is g(y) = inf_x L(x, y), where inf denotes the infimum. The dual problem is then maximize g(y), and dual ascent solves it by iterating

    x^{k+1} = argmin_x L(x, y^k)
    y^{k+1} = y^k + a^k (A x^{k+1} - b),

where a^k > 0 is a step size.

If f is separable, i.e. f(x) = f_1(x_1) + ... + f_N(x_N) with x = (x_1, ..., x_N), then the Lagrangian is separable as well:

    L(x, y) = sum_i L_i(x_i, y) - y^T b,  where  L_i(x_i, y) = f_i(x_i) + y^T A_i x_i

and A = [A_1 ... A_N] is partitioned conformably with x. The minimization over x then splits into N independent subproblems that can be solved in parallel, a divide-and-conquer strategy known as dual decomposition. It requires fairly strong assumptions, however, and converges slowly in practice, which is part of what motivated ADMM.
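Dual decomposition can be sketched on a toy instance. The problem, step size, and data below are my own illustration (not from the lecture): each f_i(x_i) = (1/2)(x_i - c_i)^2 and the coupling constraint is sum_i x_i = b, so every x_i-update has a closed form and the N minimizations run independently.

```python
import numpy as np

# Toy dual decomposition (illustrative instance, not from the talk):
#   minimize sum_i (1/2)(x_i - c_i)^2  subject to  sum_i x_i = b.
# Each x_i-update is independent, so the minimization splits across i.

def dual_decomposition(c, b, alpha=0.1, iters=500):
    y = 0.0                      # dual variable for the coupling constraint
    for _ in range(iters):
        # N separate minimizations: argmin_x (1/2)(x - c_i)^2 + y * x
        x = c - y
        # dual gradient-ascent step on the residual of sum(x) = b
        y += alpha * (x.sum() - b)
    return x, y

c = np.array([1.0, 2.0, 3.0])
x, y = dual_decomposition(c, b=3.0)
print(np.round(x, 4))   # closed-form optimum: x_i = c_i + (b - sum(c)) / N
```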


Part 2

    With Part 1 done, the professor moved on to Part 2, the method of multipliers. It improves on dual ascent by using the augmented Lagrangian, which adds a quadratic penalty term to the original Lagrangian:

    L_rho(x, y) = f(x) + y^T (Ax - b) + (rho/2) ||Ax - b||_2^2,  with rho > 0.

The iterations become

    x^{k+1} = argmin_x L_rho(x, y^k)
    y^{k+1} = y^k + rho (A x^{k+1} - b).

Compared with the method of Part 1, the method of multipliers converges under much weaker conditions, but the quadratic term in the augmented Lagrangian couples the variables, so decomposition is no longer possible. A method combining the strengths of Parts 1 and 2 is exactly what came next: ADMM.
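A minimal sketch of the method of multipliers on the same toy problem (again my own instance, not from the talk). Note how the quadratic penalty forces a joint linear solve in the x-update, which is exactly the loss of decomposability mentioned above.

```python
import numpy as np

# Method of multipliers on the toy problem (illustrative instance):
#   minimize (1/2)||x - c||^2  subject to  1^T x = b.
# The quadratic penalty couples the x_i, so the x-update is a joint
# linear solve rather than N separate scalar problems.

def method_of_multipliers(c, b, rho=1.0, iters=50):
    n = len(c)
    y = 0.0
    ones = np.ones(n)
    # normal equations of the x-update: (I + rho * 1 1^T) x = c - y*1 + rho*b*1
    M = np.eye(n) + rho * np.outer(ones, ones)
    for _ in range(iters):
        x = np.linalg.solve(M, c - y * ones + rho * b * ones)
        y += rho * (x.sum() - b)   # dual update with fixed step rho
    return x, y

c = np.array([1.0, 2.0, 3.0])
x, y = method_of_multipliers(c, b=3.0)
print(np.round(x, 4))   # same optimum as before: c + (b - sum(c)) / n
```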

Part 3

       We would like a method with the decomposability of the dual approach and the robustness of the method of multipliers; in 1976, Gabay and coauthors proposed ADMM. The ADMM problem form is

    minimize f(x) + g(z)
    subject to Ax + Bz = c,

with two sets of variables and two objective terms. The augmented Lagrangian is

    L_rho(x, z, y) = f(x) + g(z) + y^T (Ax + Bz - c) + (rho/2) ||Ax + Bz - c||_2^2,

and ADMM iterates

    x^{k+1} = argmin_x L_rho(x, z^k, y^k)
    z^{k+1} = argmin_z L_rho(x^{k+1}, z, y^k)
    y^{k+1} = y^k + rho (A x^{k+1} + B z^{k+1} - c).

In the spirit of Gauss-Seidel, x and z are optimized alternately rather than jointly.

Following the augmented Lagrangian idea, we can combine the linear and quadratic terms of the Lagrangian: with the scaled dual variable u = y / rho and the residual r = Ax + Bz - c,

    y^T r + (rho/2) ||r||_2^2 = (rho/2) ||r + u||_2^2 - (rho/2) ||u||_2^2,

so ADMM in scaled form becomes

    x^{k+1} = argmin_x ( f(x) + (rho/2) ||A x + B z^k - c + u^k||_2^2 )
    z^{k+1} = argmin_z ( g(z) + (rho/2) ||A x^{k+1} + B z - c + u^k||_2^2 )
    u^{k+1} = u^k + A x^{k+1} + B z^{k+1} - c.

Note that the x-update requires solving minimize_x f(x) + (rho/2) ||Ax - v||_2^2 for some vector v, and likewise for z. In several special cases this update simplifies considerably.

Case 1: f is block separable, so the x-update itself splits into independent parallel subproblems. Case 2: f is a smooth function, so standard methods for smooth optimization apply to the update.
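One commonly cited instance of the smooth case is quadratic f, where the x-update is a linear system whose coefficient matrix never changes across iterations, so its Cholesky factorization can be computed once and reused. The names P, q, rho, v below are placeholders of my own, not notation from the talk.

```python
import numpy as np

# When f(x) = (1/2) x^T P x + q^T x is quadratic, the ADMM x-update
#   argmin_x f(x) + (rho/2)||x - v||^2
# is the linear system (P + rho*I) x = rho*v - q. Since P and rho are
# fixed across iterations, factor P + rho*I once and reuse the factor.
# (Illustrative data; P, q, rho, v are placeholders of my own.)

rng = np.random.default_rng(0)
n = 5
G = rng.standard_normal((n, n))
P = G @ G.T + np.eye(n)          # symmetric positive definite
q = rng.standard_normal(n)
rho = 1.0

L = np.linalg.cholesky(P + rho * np.eye(n))   # cached factorization

def x_update(v):
    # two triangular solves per iteration instead of a fresh factorization
    w = np.linalg.solve(L, rho * v - q)
    return np.linalg.solve(L.T, w)

v = rng.standard_normal(n)
x = x_update(v)
print(np.allclose((P + rho * np.eye(n)) @ x, rho * v - q))
```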


Part 4

   With ADMM covered, it was time for examples. Here are a couple of them briefly:

Example 1: a generic constrained problem. The original form

    minimize f(x)
    subject to x in C

becomes, in ADMM form,

    minimize f(x) + g(z)
    subject to x - z = 0,

where g is the indicator function of C. The scaled updates are

    x^{k+1} = argmin_x ( f(x) + (rho/2) ||x - z^k + u^k||_2^2 )
    z^{k+1} = Pi_C(x^{k+1} + u^k)      (Euclidean projection onto C)
    u^{k+1} = u^k + x^{k+1} - z^{k+1}.
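A sketch of these projection-based updates on a concrete instance of my own choosing (not from the talk): f(x) = (1/2)||x - c||^2 and C the nonnegative orthant, so the z-update is simply clipping at zero and the iterates converge to the projection of c onto C.

```python
import numpy as np

# ADMM for minimize f(x) s.t. x in C, written as
#   minimize f(x) + g(z),  x - z = 0,  g = indicator of C.
# Toy instance (my own): f(x) = (1/2)||x - c||^2 and
# C = nonnegative orthant, so the z-update is max(., 0).

def admm_constrained(c, rho=1.0, iters=100):
    n = len(c)
    z = np.zeros(n)
    u = np.zeros(n)                      # scaled dual variable u = y / rho
    for _ in range(iters):
        # x-update: argmin (1/2)||x - c||^2 + (rho/2)||x - z + u||^2
        x = (c + rho * (z - u)) / (1 + rho)
        # z-update: Euclidean projection onto C
        z = np.maximum(x + u, 0.0)
        # scaled dual update
        u = u + x - z
    return z

c = np.array([-1.0, 0.5, 2.0])
z = admm_constrained(c)
print(np.round(z, 4))   # projection of c onto the orthant: [0., 0.5, 2.]
```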

Example 2: the lasso. The original form is

    minimize (1/2)||Ax - b||_2^2 + lambda ||x||_1,

which in ADMM form becomes

    minimize f(x) + g(z),  with f(x) = (1/2)||Ax - b||_2^2 and g(z) = lambda ||z||_1
    subject to x - z = 0.

The updates are

    x^{k+1} = (A^T A + rho I)^{-1} (A^T b + rho (z^k - u^k))
    z^{k+1} = S_{lambda/rho}(x^{k+1} + u^k)      (soft thresholding)
    u^{k+1} = u^k + x^{k+1} - z^{k+1}.
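The lasso updates above can be sketched directly in code; the data and parameter choices (lam, rho, iteration count) below are mine, not from the talk.

```python
import numpy as np

# ADMM for the lasso: minimize (1/2)||Ax - b||^2 + lam*||x||_1,
# split as f(x) = (1/2)||Ax - b||^2, g(z) = lam*||z||_1, x - z = 0.
# The x-update is a linear solve with a fixed matrix, and the
# z-update is elementwise soft thresholding.

def soft_threshold(v, kappa):
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def lasso_admm(A, b, lam, rho=1.0, iters=500):
    n = A.shape[1]
    Atb = A.T @ b
    M = A.T @ A + rho * np.eye(n)    # fixed across iterations
    z = np.zeros(n)
    u = np.zeros(n)                  # scaled dual variable
    for _ in range(iters):
        x = np.linalg.solve(M, Atb + rho * (z - u))
        z = soft_threshold(x + u, lam / rho)
        u = u + x - z
    return z

# synthetic sparse-recovery data (my own choice)
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
x_true = np.array([1.5, 0.0, 0.0, -2.0, 0.0])
b = A @ x_true
z = lasso_admm(A, b, lam=0.1)
print(np.round(z, 2))   # close to x_true, with slight l1 shrinkage
```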

Part 5

Conclusion: ADMM is closely related to many algorithms from the 1970s; it is fast in practice and can handle large-scale problems.

