Copyright note: all of the examples below are based on the work of a GitHub expert; I am just reposting them.
https://github.com/YunYang1994/TensorFlow2.0-Examples
1. Four activation functions in TF 2.0
They are relu, sigmoid, tanh, and softplus.
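For reference (these standard definitions are my addition, not from the original post): relu(x) = max(0, x); sigmoid(x) = 1/(1 + e^(-x)); tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x)); softplus(x) = ln(1 + e^x). Note that softplus is a smooth approximation of relu, and sigmoid is the derivative of softplus.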
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
# generate 100 evenly spaced points between -5 and 5
x = np.linspace(-5, 5, 100)
# below we demonstrate some TF 2.0 activation functions
# 1. relu
y_relu = tf.nn.relu(x)
y_sigmoid = tf.nn.sigmoid(x)
y_tanh = tf.nn.tanh(x)
y_softplus = tf.nn.softplus(x)
# use matplotlib to visualize these activation functions
plt.figure(1, figsize=(8, 6))
plt.subplot(221)
plt.plot(x, y_relu, c='red', label='relu')
plt.ylim((-1, 5))
plt.legend(loc='best')
plt.subplot(222)
plt.plot(x, y_sigmoid, c='red', label='sigmoid')
plt.ylim((-0.2, 1.2))
plt.legend(loc='best')
plt.subplot(223)
plt.plot(x, y_tanh, c='red', label='tanh')
plt.ylim((-1.2, 1.2))
plt.legend(loc='best')
plt.subplot(224)
plt.plot(x, y_softplus, c='red', label='softplus')
plt.ylim((-0.2, 6))
plt.legend(loc='best')
plt.show()
The resulting figure (four subplots, one per activation function) is shown below.
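As a quick sanity check (a minimal sketch I added, not part of the original example), you can evaluate each function at a single point and compare against the closed-form definitions above:

import tensorflow as tf
x0 = tf.constant(2.0)
print(tf.nn.relu(x0).numpy())      # 2.0, since max(0, 2) = 2
print(tf.nn.sigmoid(x0).numpy())   # ~0.8808, i.e. 1 / (1 + e^-2)
print(tf.nn.tanh(x0).numpy())      # ~0.9640
print(tf.nn.softplus(x0).numpy())  # ~2.1269, i.e. ln(1 + e^2)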
2. TF 2.0's automatic differentiation tool: tf.GradientTape
import tensorflow as tf
x = tf.constant(3.0)
with tf.GradientTape(persistent=True) as t:
    t.watch(x)  # ensure that `x` is traced by this tape (constants are not watched by default)
    y = x * x
    z = y * y
dz_dx = t.gradient(z, x) # 108.0 (4*x^3 at x = 3)
dy_dx = t.gradient(y, x) # 6.0
print("dz/dx=", dz_dx.numpy())
print("dy/dx=", dy_dx.numpy())
The output is:
dz/dx= 108.0
dy/dx= 6.0
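Two notes on the example above (my additions, not from the original post). First, t.watch(x) is needed only because x is a tf.constant; a trainable tf.Variable is watched automatically. Second, persistent=True is what allows calling t.gradient() twice on the same tape; once you are done, the tape's resources can be released with del t. A minimal sketch of the same computation using a variable instead:

import tensorflow as tf
x = tf.Variable(3.0)          # variables are traced automatically, no watch() needed
with tf.GradientTape() as t:  # default persistent=False: gradient() may be called only once
    y = x * x
dy_dx = t.gradient(y, x)      # dy/dx = 2x = 6.0 at x = 3
print(dy_dx.numpy())          # 6.0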
For a detailed walkthrough of GradientTape, see:
TensorFlow学习(四):梯度带(GradientTape),优化器(Optimizer)和损失函数(losses)