- Optimization is a fascinating topic and plays a major role in applications such as machine learning.
- In this section we use just a few lines of code to find the minimum of a quadratic function via gradient descent. The core idea is to repeatedly move the variable in the direction of the negative gradient until it converges to an extremum.
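Before iterating, it helps to know the answer in closed form: for y = x^2 + x + 1, setting the derivative 2x + 1 to zero gives x = -1/2 and y = 3/4. A quick sketch of this check (the variable names here are illustrative, not part of the original code):

```python
# Analytic minimum of a quadratic y = a*x^2 + b*x + c with a > 0:
# the vertex sits at x = -b / (2*a).
a, b, c = 1, 1, 1
x_min = -b / (2 * a)                   # -0.5
y_min = a * x_min**2 + b * x_min + c   # 0.75
print(x_min, y_min)
```

The gradient-descent result below should approach these values.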
# TODO:
# find the minimum of y described below
# y = x^2 + x + 1
x0 = 1           # starting point
lr = 0.1         # learning rate (the step size lambda)
g = 2*x0 + 1     # gradient dy/dx at x0
# gradient-descent loop
while abs(g) > 0.001:  # stop once the gradient falls below a small epsilon
    g = 2*x0 + 1       # recompute the gradient at the current x0
    x0 = x0 - lr*g     # step in the negative-gradient direction
    print(g)           # watch the gradient shrink toward zero
y = x0**2 + x0 + 1
print('\nx_final is {0}, and min_y is {1}'.format(x0, y))
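The same loop can be factored into a reusable function that takes the gradient as a parameter, so any differentiable one-variable function can be minimized the same way. This is a minimal sketch; the function name, defaults, and `max_iter` safeguard are my own additions, not part of the original code:

```python
# A minimal, generic gradient-descent loop for a 1-D function.
def gradient_descent(grad, x0, lr=0.1, eps=1e-3, max_iter=10_000):
    """Move x against its gradient until |grad(x)| <= eps."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) <= eps:       # gradient is nearly zero: close to an extremum
            break
        x = x - lr * g          # step in the negative-gradient direction
    return x

# Same problem as above: y = x^2 + x + 1, so dy/dx = 2x + 1.
x_star = gradient_descent(lambda x: 2*x + 1, x0=1.0)
print(x_star)  # converges toward -0.5
```

Capping the iteration count with `max_iter` guards against a learning rate that is too large, where the update would overshoot and the loop would never terminate.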