Deep Learning with Python study notes, Chapter 7: Inspecting and monitoring deep-learning models with TensorBoard


Visualizing deep learning with TensorBoard


TensorBoard is a browser-based visualization tool. It can only be used with Keras models when Keras is running on the TensorFlow backend. Its features include:

  • Visually monitoring metrics during training
  • Visualizing the model architecture
  • Visualizing histograms of activations and gradients
  • Exploring learned embeddings in three dimensions

The IMDB sentiment-analysis task is used below to demonstrate these features.


7-7 A text-classification model to use with TensorBoard


import keras
from keras import layers
from keras.datasets import imdb
from keras.preprocessing import sequence

# number of words to consider as features
max_features = 20000
# cut reviews off after this many words
max_len = 500

# load the data as lists of integers and pad them to a fixed length
(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_features)
x_train = sequence.pad_sequences(x_train, maxlen=max_len)
x_test = sequence.pad_sequences(x_test, maxlen=max_len)

# build the model
model = keras.models.Sequential()
model.add(layers.Embedding(max_features, 128,
                           input_length=max_len,
                           name='embed'))
model.add(layers.Conv1D(32, 7, activation='relu'))
model.add(layers.MaxPooling1D(5))
model.add(layers.Conv1D(32, 7, activation='relu'))
model.add(layers.GlobalMaxPooling1D())
# note: following the book's listing, no sigmoid is applied here; with
# binary_crossentropy this makes the logged accuracy hard to interpret
model.add(layers.Dense(1))
model.summary()
model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['acc'])
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embed (Embedding)            (None, 500, 128)          2560000   
_________________________________________________________________
conv1d_3 (Conv1D)            (None, 494, 32)           28704     
_________________________________________________________________
max_pooling1d_2 (MaxPooling1 (None, 98, 32)            0         
_________________________________________________________________
conv1d_4 (Conv1D)            (None, 92, 32)            7200      
_________________________________________________________________
global_max_pooling1d_2 (Glob (None, 32)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 1)                 33        
=================================================================
Total params: 2,595,937
Trainable params: 2,595,937
Non-trainable params: 0
_________________________________________________________________
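The output lengths and parameter counts in the summary above can be checked by hand. A minimal sketch (pure Python, no Keras required), using the defaults of Conv1D ('valid' padding, stride 1) and MaxPooling1D (stride equal to pool size):

```python
def conv1d_len(length, kernel):
    # Conv1D with 'valid' padding (the default) and stride 1
    return length - kernel + 1

def pool1d_len(length, pool):
    # MaxPooling1D's default stride equals the pool size
    return length // pool

n = conv1d_len(500, 7)   # 494, matches conv1d_3
n = pool1d_len(n, 5)     # 98,  matches max_pooling1d_2
n = conv1d_len(n, 7)     # 92,  matches conv1d_4

# parameter counts: kernel * in_channels * filters + bias
embed = 20000 * 128        # 2,560,000
conv1 = 7 * 128 * 32 + 32  # 28,704
conv2 = 7 * 32 * 32 + 32   # 7,200
dense = 32 * 1 + 1         # 33
print(n, embed + conv1 + conv2 + dense)  # 92 2595937
```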

7-8 Creating a directory for TensorBoard log files


mkdir my_log_dir
A subdirectory or file my_log_dir already exists.
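A portable alternative that does not complain when the directory already exists (the error shown above) is a small sketch using Python's standard library:

```python
import os

# create the log directory; exist_ok silences the
# "already exists" error seen with the shell mkdir above
os.makedirs('my_log_dir', exist_ok=True)
print(os.path.isdir('my_log_dir'))  # True
```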

7-9 Training the model with a TensorBoard callback


# log directory (an absolute Windows path on the author's machine)
my_log = r'C:\Users\19924\my_log_dir'

callbacks = [
    keras.callbacks.TensorBoard(
        log_dir=my_log,
        histogram_freq=1,   # record activation histograms every epoch
        embeddings_freq=1,  # record embedding data every epoch
    )
]

history = model.fit(x_train, y_train,
                    epochs=20,
                    batch_size=128,
                    validation_split=0.2,
                    callbacks=callbacks)
E:\develop_tools\Anaconda\envs\py36\lib\site-packages\keras\callbacks\tensorboard_v2.py:102: UserWarning: The TensorBoard callback does not support embeddings display when using TensorFlow 2.0. Embeddings-related arguments are ignored.
  warnings.warn('The TensorBoard callback does not support '


Train on 20000 samples, validate on 5000 samples
Epoch 1/20
20000/20000 [==============================] - 33s 2ms/step - loss: 0.2647 - acc: 0.7224 - val_loss: 0.5293 - val_acc: 0.6618
Epoch 2/20
20000/20000 [==============================] - 31s 2ms/step - loss: 0.1860 - acc: 0.6292 - val_loss: 0.6009 - val_acc: 0.5730
Epoch 3/20
20000/20000 [==============================] - 31s 2ms/step - loss: 0.1427 - acc: 0.5043 - val_loss: 0.7279 - val_acc: 0.4718
Epoch 4/20
20000/20000 [==============================] - 30s 2ms/step - loss: 0.1077 - acc: 0.3997 - val_loss: 0.8791 - val_acc: 0.3640
Epoch 5/20
20000/20000 [==============================] - 30s 2ms/step - loss: 0.0950 - acc: 0.2855 - val_loss: 0.9829 - val_acc: 0.3152
Epoch 6/20
20000/20000 [==============================] - 30s 2ms/step - loss: 0.0893 - acc: 0.2062 - val_loss: 1.0653 - val_acc: 0.2818
Epoch 7/20
20000/20000 [==============================] - 36s 2ms/step - loss: 0.0873 - acc: 0.1567 - val_loss: 1.1621 - val_acc: 0.2578
Epoch 8/20
20000/20000 [==============================] - 32s 2ms/step - loss: 0.0858 - acc: 0.1302 - val_loss: 1.1040 - val_acc: 0.2312
Epoch 9/20
20000/20000 [==============================] - 36s 2ms/step - loss: 0.0914 - acc: 0.1077 - val_loss: 1.2115 - val_acc: 0.2070
Epoch 10/20
20000/20000 [==============================] - 35s 2ms/step - loss: 0.0859 - acc: 0.0903 - val_loss: 1.1452 - val_acc: 0.1992
Epoch 11/20
20000/20000 [==============================] - 32s 2ms/step - loss: 0.0837 - acc: 0.0785 - val_loss: 1.2037 - val_acc: 0.1760
Epoch 12/20
20000/20000 [==============================] - 31s 2ms/step - loss: 0.0824 - acc: 0.0689 - val_loss: 1.4837 - val_acc: 0.1760
Epoch 13/20
20000/20000 [==============================] - 33s 2ms/step - loss: 0.0819 - acc: 0.0599 - val_loss: 1.5078 - val_acc: 0.1650
Epoch 14/20
20000/20000 [==============================] - 34s 2ms/step - loss: 0.0808 - acc: 0.0516 - val_loss: 1.2435 - val_acc: 0.1534
Epoch 15/20
20000/20000 [==============================] - 31s 2ms/step - loss: 0.0800 - acc: 0.0436 - val_loss: 1.3941 - val_acc: 0.1482
Epoch 16/20
20000/20000 [==============================] - 31s 2ms/step - loss: 0.0814 - acc: 0.0358 - val_loss: 1.3475 - val_acc: 0.1468
Epoch 17/20
20000/20000 [==============================] - 31s 2ms/step - loss: 0.0838 - acc: 0.0340 - val_loss: 1.3086 - val_acc: 0.1286
Epoch 18/20
20000/20000 [==============================] - 31s 2ms/step - loss: 0.0808 - acc: 0.0304 - val_loss: 1.3466 - val_acc: 0.1298
Epoch 19/20
20000/20000 [==============================] - 31s 2ms/step - loss: 0.0817 - acc: 0.0264 - val_loss: 1.3077 - val_acc: 0.1326
Epoch 20/20
20000/20000 [==============================] - 42s 2ms/step - loss: 0.0788 - acc: 0.0281 - val_loss: 1.4667 - val_acc: 0.1228
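One thing the TensorBoard scalar curves (and the log above) make obvious: validation loss bottoms out almost immediately and then climbs while training loss keeps falling, the classic overfitting signature. A quick check on the values printed above:

```python
# val_loss per epoch, copied from the training log above
val_loss = [0.5293, 0.6009, 0.7279, 0.8791, 0.9829, 1.0653, 1.1621,
            1.1040, 1.2115, 1.1452, 1.2037, 1.4837, 1.5078, 1.2435,
            1.3941, 1.3475, 1.3086, 1.3466, 1.3077, 1.4667]

# epoch (1-based) with the lowest validation loss
best_epoch = min(range(len(val_loss)), key=val_loss.__getitem__) + 1
print(best_epoch)  # 1 -- every later epoch only overfits further
```

An EarlyStopping callback (not used in the book's listing) would stop such a run long before epoch 20.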

How to launch TensorBoard


If TensorBoard does not open properly, see my other post, "The right way to launch TensorBoard (with troubleshooting, very detailed)":

(backup link)
https://blog.csdn.net/LeungSr/article/details/120800763?spm=1001.2014.3001.5501


Note: the launch command has changed in newer versions:

tensorboard --logdir "my_log_dir"

The quoted path can be either absolute or relative.


Final result

(Screenshots of the TensorBoard dashboard are omitted here.)


Closing remarks

Note: the code in this post comes from Deep Learning with Python and is shared here as study notes for learning and reference only. The author has run all of it successfully; if anything is missing, please contact the author.

Dear readers, since you have made it this far, please take a second to give this post a like; your support is the author's greatest motivation!
<(^-^)>
My knowledge is limited, so corrections are most welcome.
This post is intended solely for learning and exchange, not for any commercial use. If any copyright issue is involved, please contact the author promptly.
