【TensorFlow】Converting a .ckpt Model to a .pb Model (BERT)

Contents

1. Model File Conversion

1.1 Getting the node names in the model

1.2 Converting a ckpt model to a pb model

1.3 Inspecting the tensors stored in a .ckpt file

1.4 Listing all tensors in a .pb file

2. Model File Visualization

2.1 Visualizing a ckpt model

2.2 Visualizing a pb model



Google_BERT

 

1. Model File Conversion

The .index file is the model's index file; it records the correspondence between the graph structure in the .meta file and the variable data in the .data file.

The .data-00000-of-00001 file stores the value of every TensorFlow variable, saved in SSTable format as a list of (key, value) pairs.

The .meta file stores the network structure of the TensorFlow computation graph (the MetaGraph), serialized in protocol buffer format.

To convert a TensorFlow ckpt model to a pb file, you must know the names of the network's output nodes. If no output node names are specified, the program does not know which nodes to freeze and therefore cannot save the model.
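As a quick sanity check, you can list the checkpoint files on disk to confirm all three parts are present. A minimal sketch, assuming the BERT directory used later in this post:

import os

ckpt_dir = './uncased_L-12_H-768_A-12'
# A full checkpoint consists of .index, .data-00000-of-00001 and .meta files
for name in sorted(os.listdir(ckpt_dir)):
    if name.startswith('bert_model.ckpt'):
        print(name)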

 

1.1 Getting the node names in the model

# function: get the node names of a ckpt model
from tensorflow.python import pywrap_tensorflow

# checkpoint_path = 'model.ckpt-xxx'
checkpoint_path = './uncased_L-12_H-768_A-12/bert_model.ckpt'
reader = pywrap_tensorflow.NewCheckpointReader(checkpoint_path)
# Map each variable name in the checkpoint to its shape, then print the names
var_to_shape_map = reader.get_variable_to_shape_map()
for key in var_to_shape_map:
    print("tensor_name: ", key)

Reference: getting TensorFlow model node names and converting .ckpt to .pb

1.2 Converting a ckpt model to a pb model

import tensorflow as tf
from tensorflow.python.framework import graph_util

def freeze_graph(ckpt, output_graph):
    # Comma-separated output node name(s) to keep when freezing; replace this
    # with the real output node(s) of your network (a BERT layer-7 kernel is used here as an example)
    output_node_names = 'bert/encoder/layer_7/output/dense/kernel'
    # saver = tf.train.import_meta_graph(ckpt+'.meta', clear_devices=True)
    saver = tf.compat.v1.train.import_meta_graph(ckpt + '.meta', clear_devices=True)
    graph = tf.compat.v1.get_default_graph()
    input_graph_def = graph.as_graph_def()

    with tf.compat.v1.Session() as sess:
        saver.restore(sess, ckpt)
        # Freeze: turn variables into constants so the graph becomes self-contained
        output_graph_def = graph_util.convert_variables_to_constants(
            sess=sess,
            input_graph_def=input_graph_def,
            output_node_names=output_node_names.split(',')
        )
        with tf.io.gfile.GFile(output_graph, 'wb') as fw:
            fw.write(output_graph_def.SerializeToString())
        print('{} ops in the final graph.'.format(len(output_graph_def.node)))

ckpt = '/home/jie/gitdir/ckpt_pb/uncased_L-12_H-768_A-12/bert_model.ckpt'
pb   = '/home/jie/gitdir/ckpt_pb/bert_model.pb'

if __name__ == '__main__':
    freeze_graph(ckpt, pb)
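After freezing, a quick sanity check is to reload the pb and fetch the frozen constant. A minimal sketch, assuming TF1-style graph mode; the tensor name is the output node chosen above, with the default 'import/' prefix and the ':0' output suffix:

import tensorflow as tf

pb = '/home/jie/gitdir/ckpt_pb/bert_model.pb'
with tf.io.gfile.GFile(pb, 'rb') as f:
    graph_def = tf.compat.v1.GraphDef()
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def)  # nodes are imported under the default 'import/' prefix
    kernel = graph.get_tensor_by_name('import/bert/encoder/layer_7/output/dense/kernel:0')
    with tf.compat.v1.Session(graph=graph) as sess:
        # The frozen node is a constant, so it can be fetched without feeding any inputs
        print(sess.run(kernel).shape)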

 

1.3 Inspecting the tensors stored in a .ckpt file

from tensorflow.python import pywrap_tensorflow

# inspect the final ckpt
checkpoint_path = "./uncased_L-12_H-768_A-12/bert_model.ckpt"
# Read data from the checkpoint file
reader = pywrap_tensorflow.NewCheckpointReader(checkpoint_path)
var_to_shape_map = reader.get_variable_to_shape_map()
# Print each tensor's name and value
for key in var_to_shape_map:
    print("tensor_name: ", key)
    print(reader.get_tensor(key))

[reference]

1.4 Listing all tensors in a .pb file

# param: path to the .pb file
import argparse
import tensorflow as tf

def print_tensors(pb_file):
    print('Model File: {}\n'.format(pb_file))
    # read the pb file into a GraphDef
    with tf.io.gfile.GFile(pb_file, "rb") as f:
        graph_def = tf.compat.v1.GraphDef()
        graph_def.ParseFromString(f.read())

    # import the GraphDef into a fresh graph
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def)

    # print every operation name and its output tensors
    for op in graph.get_operations():
        print(op.name + '\t' + str(op.values()))


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument("--pb_file", type=str, required=True, help="Pb file")
    args = parser.parse_args()
    print_tensors(args.pb_file)
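Usage (the script name is hypothetical; save the code above as, for example, print_pb_tensors.py):

python print_pb_tensors.py --pb_file ./bert_model.pb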

 

 

2. Model File Visualization

2.1 Visualizing a ckpt model
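A ckpt model can be visualized by importing its .meta graph and writing the graph to an event file for TensorBoard. A minimal sketch, assuming TF1-style graph mode (under TensorFlow 2, call tf.compat.v1.disable_eager_execution() first) and the BERT checkpoint path used above:

import tensorflow as tf

ckpt = './uncased_L-12_H-768_A-12/bert_model.ckpt'
# Rebuild the graph structure from the .meta file (weights are not needed for visualization)
saver = tf.compat.v1.train.import_meta_graph(ckpt + '.meta', clear_devices=True)
graph = tf.compat.v1.get_default_graph()
# Write the graph to an event file that TensorBoard can read
writer = tf.compat.v1.summary.FileWriter('log_ckpt/', graph)
writer.close()

Then start TensorBoard as in section 2.2: tensorboard --logdir ./log_ckpt/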

 

2.2 Visualizing a pb model

1. Restore the computation graph from the pb file

import tensorflow as tf

# path of the pb file
model = './bert_model.pb'
# graph = tf.get_default_graph()
graph = tf.compat.v1.get_default_graph()
# Parse the serialized GraphDef from the pb file
graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile(model, 'rb') as f:
    graph_def.ParseFromString(f.read())
# Import the nodes into the default graph under the 'graph' name scope
tf.import_graph_def(graph_def, name='graph')
# Write the graph to an event file for TensorBoard
# summaryWriter = tf.summary.FileWriter('log/', graph)
summaryWriter = tf.compat.v1.summary.FileWriter('log/', graph)

2. View the computation graph with TensorBoard

tensorboard --logdir ./log/

Reference: analyzing TensorFlow pb files

3. Print the tensor info of a pb model

# coding:utf-8
import tensorflow as tf
from tensorflow.python.platform import gfile

tf.reset_default_graph()  # reset the default graph
output_graph_path = '1.pb'
with tf.Session() as sess:
    tf.global_variables_initializer().run()
    output_graph_def = tf.GraphDef()
    # get the default graph
    graph = tf.get_default_graph()
    with gfile.FastGFile(output_graph_path, 'rb') as f:
        output_graph_def.ParseFromString(f.read())
        _ = tf.import_graph_def(output_graph_def, name="")
        # print how many op nodes the imported graph contains
        print("%d ops in the final graph." % len(output_graph_def.node))

        tensor_name = [tensor.name for tensor in output_graph_def.node]
        print(tensor_name)
        print('---------------------------')
        # Write an event file under log_graph/ so the model can be visualized in TensorBoard
        # summaryWriter = tf.summary.FileWriter('log_graph/', graph)

        for op in graph.get_operations():
            # print each op's name and output tensors
            print(op.name, op.values())

Reference: inspecting a TensorFlow pb model file and visualizing it with TensorBoard
