A minimal usage example of the TensorBoard visualization tool (tf2.6: easy to pick up)

1 Preface

Experiment environment: tf2.6, with the matching Keras version 2.6.0.

Use case: the simplest possible usage, passing a callback to model.fit; very straightforward.

2 Code example

The code follows the official TensorFlow documentation~

#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time : 2022/7/13 20:48
# @Author : hqc
# @File : tensorboard_usage.py
# @Software: PyCharm

# import necessary modules
import tensorflow as tf
import datetime

# load and process the mnist dataset
mnist = tf.keras.datasets.mnist

(x_train, y_train),(x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# create and compile a model
def create_model():
  return tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(512, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation='softmax')
  ])

model = create_model()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# set the log directory and define a tensorboard callback
log_dir = "logs/fit/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)

# training
model.fit(x=x_train,
          y=y_train,
          epochs=5,
          validation_data=(x_test, y_test),
          callbacks=[tensorboard_callback])

Run output:

"D:\app install\anaconda3\envs\tf\pythonw.exe" "D:/research/python learning/tensorflow learning/tensorboard/tensorboard_usage.py"
2022-07-13 20:54:33.527731: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2022-07-13 20:54:33.530076: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
2022-07-13 20:54:33.605376: I tensorflow/core/profiler/lib/profiler_session.cc:131] Profiler session initializing.
2022-07-13 20:54:33.605630: I tensorflow/core/profiler/lib/profiler_session.cc:146] Profiler session started.
2022-07-13 20:54:33.617606: I tensorflow/core/profiler/lib/profiler_session.cc:164] Profiler session tear down.
2022-07-13 20:54:34.305415: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:185] None of the MLIR Optimization Passes are enabled (registered 2)
Epoch 1/5
   1/1875 [..............................] - ETA: 14:01 - loss: 2.3267 - accuracy: 0.09382022-07-13 20:54:34.886541: I tensorflow/core/profiler/lib/profiler_session.cc:131] Profiler session initializing.
2022-07-13 20:54:34.886726: I tensorflow/core/profiler/lib/profiler_session.cc:146] Profiler session started.
   2/1875 [..............................] - ETA: 4:06 - loss: 2.3347 - accuracy: 0.1094 2022-07-13 20:54:34.949857: I tensorflow/core/profiler/lib/profiler_session.cc:66] Profiler session collecting data.
2022-07-13 20:54:35.008305: I tensorflow/core/profiler/lib/profiler_session.cc:164] Profiler session tear down.
2022-07-13 20:54:35.082561: I tensorflow/core/profiler/rpc/client/save_profile.cc:136] Creating directory: logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35

2022-07-13 20:54:35.103060: I tensorflow/core/profiler/rpc/client/save_profile.cc:142] Dumped gzipped tool data for trace.json.gz to logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35\HQC.trace.json.gz
2022-07-13 20:54:35.130759: I tensorflow/core/profiler/rpc/client/save_profile.cc:136] Creating directory: logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35

2022-07-13 20:54:35.153171: I tensorflow/core/profiler/rpc/client/save_profile.cc:142] Dumped gzipped tool data for memory_profile.json.gz to logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35\HQC.memory_profile.json.gz
2022-07-13 20:54:35.250805: I tensorflow/core/profiler/rpc/client/capture_profile.cc:251] Creating directory: logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35
Dumped tool data for xplane.pb to logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35\HQC.xplane.pb
Dumped tool data for overview_page.pb to logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35\HQC.overview_page.pb
Dumped tool data for input_pipeline.pb to logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35\HQC.input_pipeline.pb
Dumped tool data for tensorflow_stats.pb to logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35\HQC.tensorflow_stats.pb
Dumped tool data for kernel_stats.pb to logs/fit/20220713-205433\train\plugins\profile\2022_07_13_12_54_35\HQC.kernel_stats.pb

1875/1875 [==============================] - 10s 5ms/step - loss: 0.2177 - accuracy: 0.9347 - val_loss: 0.1125 - val_accuracy: 0.9644
Epoch 2/5
1875/1875 [==============================] - 14s 8ms/step - loss: 0.0959 - accuracy: 0.9703 - val_loss: 0.0868 - val_accuracy: 0.9739
Epoch 3/5
1875/1875 [==============================] - 10s 5ms/step - loss: 0.0677 - accuracy: 0.9784 - val_loss: 0.0695 - val_accuracy: 0.9789
Epoch 4/5
1875/1875 [==============================] - 16s 9ms/step - loss: 0.0529 - accuracy: 0.9826 - val_loss: 0.0703 - val_accuracy: 0.9777
Epoch 5/5
1875/1875 [==============================] - 15s 8ms/step - loss: 0.0433 - accuracy: 0.9861 - val_loss: 0.0636 - val_accuracy: 0.9812

A log folder is also generated in the current working directory:
(screenshot: the generated logs folder)

3 Notes

  1. When viewing TensorBoard from the terminal, pay attention to the path of the log folder; you cannot just copy the tutorial's command verbatim.
    (screenshot: the tensorboard command run in the terminal)
    As the screenshots show, only the second attempt, with --logdir adjusted to my own layout (tensorboard/logs/fit in my case), brings up the normal TensorBoard page.
    (screenshot: the TensorBoard web interface)
  2. Before retraining, you generally want to clear the previous logs first.

The tutorial clears them by running rm -rf ./tensorboard/logs/ in the terminal, but that appears to be a Linux command, and I am studying on a Windows system.

So you can simply delete the logs folder by hand instead.

If you do not delete it, nothing errors out; the logs of both runs will simply be plotted in the same charts.
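Instead of deleting the folder by hand, you can also clear it from Python itself, which works the same on Windows and Linux. A minimal standard-library sketch; the "logs" folder name is an assumption matching the log_dir root used in the script above:

```python
import shutil
from pathlib import Path

def clear_logs(log_root="logs"):
    """Delete the whole log folder if it exists; safe no-op if it does not."""
    path = Path(log_root)
    if path.exists():
        shutil.rmtree(path)

# Call once before model.fit to start from a clean slate, e.g.:
# clear_logs("logs")
```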

What this looks like in TensorBoard:
(screenshot: both runs drawn in the same charts)
You can use the run filter to view them separately~
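Related to note 1: to double-check which path to pass to tensorboard --logdir, you can scan for event files from Python before launching. A small stdlib-only sketch; the default "logs" root is an assumption matching the script above:

```python
from pathlib import Path

def find_event_runs(log_root="logs"):
    """Return directories under log_root that contain TensorBoard event files."""
    root = Path(log_root)
    if not root.exists():
        return []
    # Event files written by the TensorBoard callback start with "events.out.tfevents".
    runs = {str(f.parent) for f in root.rglob("events.out.tfevents*")}
    return sorted(runs)

# Pass any returned directory (or their common parent) to: tensorboard --logdir <path>
```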
