for all x, y ∈ Rⁿ and all α, β ∈ R with α + β = 1, α ≥ 0, β ≥ 0.
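The defining inequality f(αx + βy) ≤ αf(x) + βf(y) can be checked numerically. A minimal sketch (not from the text), using the convex function f(x) = ‖x‖² and random points and weights:

```python
import random

def f(x):
    # A simple convex function: f(x) = ||x||^2 (sum of squares).
    return sum(v * v for v in x)

random.seed(0)
for _ in range(1000):
    x = [random.uniform(-5, 5) for _ in range(3)]
    y = [random.uniform(-5, 5) for _ in range(3)]
    a = random.random()   # alpha in [0, 1]
    b = 1.0 - a           # beta = 1 - alpha, so alpha + beta = 1
    z = [a * xi + b * yi for xi, yi in zip(x, y)]
    # Convexity: f(alpha*x + beta*y) <= alpha*f(x) + beta*f(y)
    assert f(z) <= a * f(x) + b * f(y) + 1e-9
print("convexity inequality held on all samples")
```

For an affine function the inequality holds with equality, which is why linear programs satisfy the definition.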
Since any linear program is a convex optimization problem,
we can consider convex optimization to be a generalization of linear programming.
Our ability to solve the optimization problem (1.1) varies considerably, and depends on factors such as the particular forms of the objective and constraint functions, how many variables and constraints there are, and special structure, such as sparsity. (A problem is sparse if each constraint function depends on only a small number of the variables.)
There are, however, some important exceptions to the general rule that most
optimization problems are difficult to solve. As with least-squares and linear programming, there are very effective algorithms that can reliably and efficiently solve even large convex problems.
Least-squares and linear programming are special subclasses of convex optimization.
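Least-squares is the simplest of these subclasses: it can be solved exactly by solving a set of linear equations (the normal equations AᵀAc = Aᵀy). A minimal sketch (my own illustration, not from the text) for fitting a line y ≈ c0 + c1·x:

```python
def least_squares_line(xs, ys):
    # Fit y ≈ c0 + c1*x by solving the 2x2 normal equations
    # (A^T A) c = A^T y, where A has rows [1, x_i].
    n = len(xs)
    sx = sum(xs); sxx = sum(x * x for x in xs)
    sy = sum(ys); sxy = sum(x * y for x, y in zip(xs, ys))
    # Normal equations:
    #   [ n   sx  ] [c0]   [ sy  ]
    #   [ sx  sxx ] [c1] = [ sxy ]
    det = n * sxx - sx * sx
    c0 = (sy * sxx - sx * sxy) / det
    c1 = (n * sxy - sx * sy) / det
    return c0, c1

# Data lying exactly on y = 1 + 2x, so the residual is zero.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
c0, c1 = least_squares_line(xs, ys)
print(round(c0, 6), round(c1, 6))  # 1.0 2.0
```

The key point is that no iterative search is needed: the minimizer comes directly from a linear solve.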
Here, problem (1.8) is the same as problem (1.3): the convex optimization problem.
Since there is no good general-purpose method for arbitrary optimization problems, some compromises must be made, such as settling for the "local optimum" discussed in the figure below.
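As an illustration of local optimality (my own example, not from the text): gradient descent on the nonconvex function f(x) = x⁴ − 3x² + x converges to a different minimum depending on where it starts, and only one of them is global.

```python
def grad_descent(x, step=0.01, iters=5000):
    # Minimize f(x) = x^4 - 3x^2 + x, which has two local minima.
    for _ in range(iters):
        g = 4 * x**3 - 6 * x + 1   # f'(x)
        x -= step * g
    return x

left = grad_descent(-2.0)   # converges to the global minimum near x ≈ -1.30
right = grad_descent(2.0)   # converges to the local minimum near x ≈ 1.13
print(round(left, 2), round(right, 2))
```

A local method accepts whichever of these two points it happens to reach; only a global method would distinguish them.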
Optimization can also be used to judge whether a critical system is safe and reliable (by checking whether the worst case is acceptable).
https://www.cnblogs.com/yanganling/p/8007050.html (distance from an arbitrary point in space to a hyperplane)
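The standard formula from that reference: the distance from a point x to the hyperplane {z : aᵀz = b} is |aᵀx − b| / ‖a‖. A minimal sketch:

```python
import math

def dist_to_hyperplane(x, a, b):
    # Distance from point x to the hyperplane {z : a·z = b}
    # is |a·x - b| / ||a||.
    dot = sum(ai * xi for ai, xi in zip(a, x))
    norm = math.sqrt(sum(ai * ai for ai in a))
    return abs(dot - b) / norm

# Distance from the origin to the plane x + y + z = 3 is 3/sqrt(3) = sqrt(3).
d = dist_to_hyperplane([0.0, 0.0, 0.0], [1.0, 1.0, 1.0], 3.0)
print(round(d, 6))  # 1.732051
```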
Besides solving problems that can be reduced to convex optimization, convex optimization can also provide:
Initialization for local optimization
Convex heuristics for nonconvex optimization
Bounds for global optimization
Unconstrained optimization, equality constrained optimization, and inequality constrained optimization follow a natural hierarchy, in which solving a problem is reduced to solving a sequence of simpler problems. Quadratic optimization problems (including, e.g., least-squares) form the base of the hierarchy; they can be solved exactly by solving a set of linear equations. Newton’s method is the next level in the hierarchy. In Newton’s method, solving an unconstrained or equality constrained problem is reduced to solving a sequence of quadratic problems. Interior-point methods form the top level of the hierarchy. These methods solve an inequality constrained problem by solving a sequence of unconstrained or equality constrained problems.
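The middle level of the hierarchy can be sketched in one dimension (my own illustration, assuming a smooth strictly convex f): each Newton step minimizes the local quadratic model of f, which amounts to solving the linear equation f″(x)·Δx = −f′(x).

```python
import math

def newton_min(df, d2f, x, iters=20):
    # Newton's method for unconstrained minimization: each step
    # minimizes the local quadratic model, i.e. solves d2f(x)*dx = -df(x).
    for _ in range(iters):
        x -= df(x) / d2f(x)
    return x

# Minimize f(x) = e^x + e^(-x) - 3x, which is strictly convex.
df  = lambda x: math.exp(x) - math.exp(-x) - 3.0   # f'(x)
d2f = lambda x: math.exp(x) + math.exp(-x)         # f''(x) > 0 everywhere
xstar = newton_min(df, d2f, 1.0)
print(round(df(xstar), 9))  # gradient ≈ 0 at the minimizer
```

Each iteration is one "quadratic problem" (here a trivial 1-D linear solve); interior-point methods, in turn, wrap a sequence of such solves inside an outer loop over barrier subproblems.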