**
1 Computing second-order gradients
**
Create each variable whose gradient is needed as a tf.Variable(), and place the computation inside a with tf.GradientTape() as tape block.
import tensorflow as tf

x = tf.Variable(tf.constant([2.]))
w = tf.Variable(tf.constant([2.]))
b = tf.Variable(tf.constant([2.]))

with tf.GradientTape(persistent=True) as tape1:
    with tf.GradientTape(persistent=True) as tape2:
        y = 3*x*w**3 + 2*x*w**2 + w*x + b**2
    # First-order gradients; computed inside tape1's context
    # so that tape1 can differentiate them again.
    dw, db = tape2.gradient(y, [w, b])
# Second-order gradients
dw2 = tape1.gradient(dw, w)
db2 = tape1.gradient(db, b)
print(dw)
print(dw2)
print(db)
print(db2)
The output is:
tf.Tensor([90.], shape=(1,), dtype=float32)
tf.Tensor([80.], shape=(1,), dtype=float32)
tf.Tensor([4.], shape=(1,), dtype=float32)
tf.Tensor([2.], shape=(1,), dtype=float32)
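The example above relies on w and b being tf.Variable objects, which GradientTape tracks automatically. If a tensor is created with tf.constant() instead, the tape ignores it unless you register it with tape.watch(). A minimal sketch of this (my own example, not from the original):

```python
import tensorflow as tf

# A plain constant tensor is NOT tracked automatically.
x = tf.constant(3.0)

with tf.GradientTape() as tape:
    tape.watch(x)   # explicitly track the constant tensor
    y = x ** 2      # y = x^2, so dy/dx = 2x

dy_dx = tape.gradient(y, x)
print(dy_dx)        # tf.Tensor(6.0, shape=(), dtype=float32)
```

Without the tape.watch(x) call, tape.gradient(y, x) would return None.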
**
2 Activation functions and their gradients
**
2.1 The sigmoid function: tf.sigmoid()
import tensorflow as tf

a = tf.Variable(tf.linspace(-10., 10., 10))
with tf.GradientTape() as tape:
    y = tf.sigmoid(a)
grads = tape.gradient(y, [a])
print(grads)
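The gradients returned by the tape can be checked against the closed-form derivative of the sigmoid, sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)). A small sketch comparing the two (my own verification, not part of the original):

```python
import tensorflow as tf

x = tf.Variable(tf.constant([0., 1., -2.]))

with tf.GradientTape() as tape:
    s = tf.sigmoid(x)

# Element-wise gradient computed by automatic differentiation
grad_auto = tape.gradient(s, x)

# Analytic derivative: sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x))
grad_analytic = s * (1. - s)

# The two should agree up to floating-point error
print(tf.reduce_max(tf.abs(grad_auto - grad_analytic)))
```

Note that the gradient peaks at 0.25 (at x = 0) and vanishes for large |x|, which is the source of the vanishing-gradient problem in deep sigmoid networks.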