Convexity Detection in Halcon
Machine Learning
The first interesting thing you come across when starting out with machine learning is usually an optimization algorithm, and specifically gradient descent, a first-order iterative optimization algorithm used to minimize a cost function.
The intuition behind gradient descent is that it converges to a solution, which may be a local minimum in the neighborhood of the starting point or, in the best case, the global minimum.
Everything seems fine until you start questioning yourself about the convergence problem. A good understanding of convexity helps you prove the intuition behind gradient descent, so let us discuss it.
I hope you have a good understanding of gradient descent. Check out this article for a recap.
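As a quick refresher, gradient descent can be sketched in a few lines. The quadratic cost function, learning rate, and step count below are illustrative assumptions, not from the article:

```python
# Minimal gradient descent sketch on a convex cost f(x) = (x - 3)^2.
# Because f is convex, the iterates converge to the global minimum at x = 3.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # step against the gradient direction
    return x

# f(x) = (x - 3)^2 has gradient f'(x) = 2 * (x - 3)
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # → 3.0
```

With a convex cost and a small enough learning rate, the iterates approach the unique global minimum; on a non-convex cost the same loop may stall in a local minimum, which is exactly why convexity matters here.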
Convex Sets
To simplify things, think of a convex set as a shape in which the line segment joining any 2 points of the set never goes outside the set.
Take a look at the examples below.
It is evident that the line segment joining any 2 points of a circle or a square (the shapes on the extreme left and in the middle) lies entirely within the shape. These are examples of convex sets.
On the other hand, for the shape on the extreme right in the figure above, part of such a line segment falls outside the shape. Thus, it is not a convex set.
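The segment test above can be checked numerically. The sketch below samples points on the segment between two members of a set and verifies membership; the unit disk (convex) and an annulus (not convex) are illustrative example sets, not shapes from the article's figure:

```python
import math

# Convex-set definition, numerically: C is convex if for any p, q in C,
# every point (1 - t) * p + t * q with t in [0, 1] is also in C.

def lerp(p, q, t):
    """Point on the segment from p to q at parameter t."""
    return tuple((1 - t) * a + t * b for a, b in zip(p, q))

def segment_stays_inside(inside, p, q, samples=51):
    """Check membership of sampled points along the segment p-q."""
    return all(inside(lerp(p, q, i / (samples - 1))) for i in range(samples))

in_disk = lambda x: math.hypot(*x) <= 1.0            # convex: unit disk
in_annulus = lambda x: 0.5 <= math.hypot(*x) <= 1.0  # not convex: ring with a hole

p, q = (-0.9, 0.0), (0.9, 0.0)  # both points lie in the disk and in the annulus
print(segment_stays_inside(in_disk, p, q))     # → True
print(segment_stays_inside(in_annulus, p, q))  # → False: segment crosses the hole
```

Sampling only demonstrates non-convexity when a violating point is found; a proof of convexity requires the algebraic definition, not finitely many samples.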