1. Automatic gradient computation

# The loss is the square of w; with an initial value of 3, the derivative is 2w, so the gradient is 6
import tensorflow as tf

with tf.GradientTape() as tape:
    w = tf.Variable(tf.constant(3.0))
    loss = tf.pow(w, 2)
grad = tape.gradient(loss, w)
print(grad.numpy())
Output:
6.0
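The tape's result can be sanity-checked without TensorFlow: a central finite difference approximates the derivative of w² at w = 3. This is a minimal sketch in plain Python; `numeric_grad`, `f`, and `h` are illustrative names, not part of the original code.

```python
# Approximate f'(x) with a central difference: (f(x+h) - f(x-h)) / (2h)
def numeric_grad(f, x, h=1e-5):
    return (f(x + h) - f(x - h)) / (2 * h)

grad = numeric_grad(lambda w: w ** 2, 3.0)
print(grad)  # approximately 6.0, matching tape.gradient above
```

For a quadratic, the central difference is exact up to floating-point rounding, so it lands essentially on 6.0.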
2. Iteratively finding the minimum via automatic gradients
# Initial value 5, loss = (w + 1)^2, learning rate 0.03, 30 epochs
# Epoch 0: the gradient is 2 * (5 + 1) = 12; times the learning rate that is 0.36,
# so w is updated to 5 - 0.36 = 4.64, while the printed loss is the pre-update value 36
w = tf.Variable(tf.constant(5.0))
lr = 0.03
epochs = 30
for epoch in range(epochs):
    with tf.GradientTape() as tape:
        loss = tf.square(w + 1)
    grad = tape.gradient(loss, w)
    w.assign_sub(lr * grad)
    print('After %s epoch,w is %f,loss is %f' % (epoch, w.numpy(), loss))
Output:
After 0 epoch,w is 4.640000,loss is 36.000000
After 1 epoch,w is 4.301600,loss is 31.809599
After 2 epoch,w is 3.983504,loss is 28.106962
After 3 epoch,w is 3.684494,loss is 24.835316
After 4 epoch,w is 3.403424,loss is 21.944485
After 5 epoch,w is 3.139219,loss is 19.390144
After 6 epoch,w is 2.890866,loss is 17.133133
After 7 epoch,w is 2.657414,loss is 15.138837
After 8 epoch,w is 2.437969,loss is 13.376677
After 9 epoch,w is 2.231691,loss is 11.819633
After 10 epoch,w is 2.037790,loss is 10.443828
After 11 epoch,w is 1.855522,loss is 9.228166
After 12 epoch,w is 1.684191,loss is 8.154007
After 13 epoch,w is 1.523139,loss is 7.204880
After 14 epoch,w is 1.371751,loss is 6.366233
After 15 epoch,w is 1.229446,loss is 5.625203
After 16 epoch,w is 1.095679,loss is 4.970429
After 17 epoch,w is 0.969939,loss is 4.391871
After 18 epoch,w is 0.851742,loss is 3.880658
After 19 epoch,w is 0.740638,loss is 3.428949
After 20 epoch,w is 0.636199,loss is 3.029819
After 21 epoch,w is 0.538027,loss is 2.677149
After 22 epoch,w is 0.445746,loss is 2.365529
After 23 epoch,w is 0.359001,loss is 2.090181
After 24 epoch,w is 0.277461,loss is 1.846884
After 25 epoch,w is 0.200813,loss is 1.631907
After 26 epoch,w is 0.128765,loss is 1.441953
After 27 epoch,w is 0.061039,loss is 1.274109
After 28 epoch,w is -0.002624,loss is 1.125803
After 29 epoch,w is -0.062466,loss is 0.994760
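Because the loss is (w + 1)², its gradient is known in closed form, 2(w + 1), and the whole loop can be reproduced in plain Python to verify the printed numbers. This is a sketch under the same settings (initial w = 5, lr = 0.03, 30 epochs), not the TensorFlow code itself.

```python
# Reproduce the gradient-descent loop with the analytic gradient 2*(w + 1)
w = 5.0
lr = 0.03
for epoch in range(30):
    loss = (w + 1) ** 2   # loss before the update, as in the printed output
    grad = 2 * (w + 1)    # d/dw (w + 1)^2
    w -= lr * grad        # same update as w.assign_sub(lr * grad)
print(round(w, 6))  # -0.062466, matching the final TensorFlow value
```

Each step multiplies (w + 1) by 1 - 2·lr = 0.94, so after 30 epochs w = 6 · 0.94³⁰ - 1 ≈ -0.0625, which is why the loss shrinks geometrically toward 0.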