import tensorflow as tf
import matplotlib.pyplot as plt
init = tf.global_variables_initializer()  # initialize_all_variables() is the old, deprecated spelling
sess = tf.Session()
sess.run(init)
# figure frame
fig = plt.figure()
ax = fig.add_subplot(1,1,1)
ax.scatter(x_data,y_data)
# to keep updating the plot inside the loop, turn on interactive mode with ion() before show()
plt.ion()
plt.show()
for i in range(1000):
    # train_step, prediction, xs and ys are assumed to come from the
    # network built earlier in these notes
    sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
    if i % 50 == 0:
        # remove the previously drawn fitted line before plotting a new one
        try:
            lines[0].remove()
        except Exception:
            pass
        prediction_value = sess.run(prediction, feed_dict={xs: x_data})
        lines = ax.plot(x_data, prediction_value, 'r-', lw=5)  # lw: line width
        plt.pause(0.1)
Stochastic Gradient Descent (SGD)
W += -learning_rate * dx
(but the path it takes is very winding, so training can take a long time)
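The update rule can be tried out directly. A minimal sketch, assuming a made-up one-dimensional toy loss (W - 3)^2 with gradient dx = 2 * (W - 3):

# plain SGD: step straight down the negative gradient
learning_rate = 0.1
W = 0.0
for _ in range(100):
    dx = 2 * (W - 3)          # gradient of the toy loss (W - 3)^2
    W += -learning_rate * dx
print(W)                      # converges toward the minimum at W = 3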
Momentum
m = b1 * m - learning_rate * dx
W += m
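A minimal sketch on the same made-up toy loss, with b1 as the momentum coefficient:

# Momentum: keep a velocity m that remembers part of the previous step,
# so consecutive gradients in the same direction build up speed
learning_rate = 0.1
b1 = 0.9
W, m = 0.0, 0.0
for _ in range(200):
    dx = 2 * (W - 3)
    m = b1 * m - learning_rate * dx
    W += m
print(W)                      # overshoots and oscillates, then settles near 3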
AdaGrad
v += dx^2
W += -learning_rate * dx / sqrt(v)
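A minimal sketch on the same toy loss; the small eps term is an added safeguard against division by zero, not part of the formula above:

# AdaGrad: accumulate all squared gradients, so parameters that have
# already moved a lot get a smaller effective learning rate
learning_rate = 0.5
eps = 1e-8
W, v = 0.0, 0.0
for _ in range(100):
    dx = 2 * (W - 3)
    v += dx ** 2
    W += -learning_rate * dx / (v ** 0.5 + eps)
print(W)                      # approaches 3 with steadily shrinking steps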
RMSProp
Momentum + AdaGrad (roughly: AdaGrad's gradient scaling, but with a decaying average instead of a full sum)
v = b1 * v + (1 - b1) * dx^2
W += -learning_rate * dx / sqrt(v)
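Sketch on the same toy loss (eps is again an added safeguard, not part of the formula):

# RMSProp: like AdaGrad, but v is a decaying average instead of a full sum,
# so the effective learning rate does not shrink forever
learning_rate = 0.1
b1 = 0.9
eps = 1e-8
W, v = 0.0, 0.0
for _ in range(200):
    dx = 2 * (W - 3)
    v = b1 * v + (1 - b1) * dx ** 2
    W += -learning_rate * dx / (v ** 0.5 + eps)
print(W)                      # hovers near 3; steps stay roughly learning-rate-sized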
Adam
m = b1 * m + (1 - b1) * dx
v = b2 * v + (1 - b2) * dx^2
W += -learning_rate * m / sqrt(v)
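Sketch on the same toy loss; this follows the simplified formulas above and omits Adam's bias-correction terms (eps is an added safeguard):

# Adam: Momentum-style first moment m plus RMSProp-style second moment v
learning_rate = 0.1
b1, b2 = 0.9, 0.999
eps = 1e-8
W, m, v = 0.0, 0.0, 0.0
for _ in range(200):
    dx = 2 * (W - 3)
    m = b1 * m + (1 - b1) * dx        # decaying average of gradients
    v = b2 * v + (1 - b2) * dx ** 2   # decaying average of squared gradients
    W += -learning_rate * m / (v ** 0.5 + eps)
print(W)                              # ends up near 3

In TensorFlow 1.x all of these are available as ready-made optimizers: tf.train.GradientDescentOptimizer, tf.train.MomentumOptimizer, tf.train.AdagradOptimizer, tf.train.RMSPropOptimizer and tf.train.AdamOptimizer.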