TensorFlow .pb: Save and Display Models

This article explains what *.pb files are used for in TensorFlow and how to load them. A *.pb file stores the model definition and weights in Google's Protocol Buffer format. Python example code for loading *.pb files is provided, along with an explanation of how to load models as SavedModels in TensorFlow 2.x.

What is the use of a *.pb file in TensorFlow and how does it work? - Stack Overflow

Basics

pb stands for protobuf. In TensorFlow, the protobuf file contains the graph definition as well as the weights of the model. Thus, a pb file is all you need to be able to run a given trained model.

Given a pb file, you can load it as follows.

import tensorflow as tf

def load_pb(path_to_pb):
    # TF 1.x API; in TF 2.x use tf.io.gfile.GFile and tf.compat.v1.GraphDef.
    with tf.gfile.GFile(path_to_pb, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name='')
        return graph

Once you have loaded the graph, you can basically do anything. For instance, you can retrieve tensors of interest with

input = graph.get_tensor_by_name('input:0')
output = graph.get_tensor_by_name('output:0')

and use regular TensorFlow routines like:

sess.run(output, feed_dict={input: some_data})
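
Putting the pieces together, a minimal end-to-end sketch, using the TF 1.x session API and assuming (purely for illustration) a frozen graph file frozen_model.pb whose input and output tensors really are named input:0 and output:0 and which expects a 224x224x3 image batch:

import numpy as np
import tensorflow as tf

# Assumes the load_pb() helper defined above.
graph = load_pb('frozen_model.pb')
input_tensor = graph.get_tensor_by_name('input:0')
output_tensor = graph.get_tensor_by_name('output:0')

# Dummy data matching the (assumed) expected input shape.
some_data = np.zeros((1, 224, 224, 3), dtype=np.float32)

with tf.Session(graph=graph) as sess:  # tf.compat.v1.Session in TF 2.x
    result = sess.run(output_tensor, feed_dict={input_tensor: some_data})
    print(result.shape)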

Explanation

The .pb format is the protocol buffer (protobuf) format, and in TensorFlow, this format is used to hold models. Protobuf is a general-purpose serialization format from Google that is well suited for transport: it compacts data efficiently and enforces a structure on it. When used in TensorFlow, it's called the SavedModel protocol buffer, which is the default format when saving Keras/TensorFlow 2.x models. More information about this format can be found in the official SavedModel guide linked below.

For example, the following code (specifically, m.save) will create a folder called my_new_model and save in it the saved_model.pb file, an assets/ folder, and a variables/ folder.

import tensorflow as tf
import tensorflow_hub as hub

# first download a SavedModel from TFHub.dev, a website with models
m = tf.keras.Sequential([
    hub.KerasLayer("https://tfhub.dev/google/imagenet/mobilenet_v2_130_224/classification/4")
])
m.build([None, 224, 224, 3])  # Batch input shape.
m.save("my_new_model")  # defaults to the SavedModel format in TensorFlow 2
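
Because saved_model.pb is just a serialized protocol buffer, you can also parse it directly with the generated protobuf classes to see what it contains. A minimal sketch, assuming the my_new_model directory produced above:

import tensorflow as tf
from tensorflow.core.protobuf import saved_model_pb2

# Parse the raw SavedModel protocol buffer written by m.save() above.
sm = saved_model_pb2.SavedModel()
with tf.io.gfile.GFile('my_new_model/saved_model.pb', 'rb') as f:
    sm.ParseFromString(f.read())

# Each MetaGraphDef bundles a graph with tags such as 'serve'.
for meta_graph in sm.meta_graphs:
    print(meta_graph.meta_info_def.tags)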

In some places, you may also see .h5 models, which was the default save format for Keras models in TF 1.x (source).
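
For comparison, a plain Keras model can still be written in the HDF5 format simply by giving the save path an .h5 extension. A small sketch with a toy model (the layer sizes are arbitrary; hub.KerasLayer models generally need the SavedModel format instead):

import tensorflow as tf

# Toy model used only to illustrate the HDF5 save path.
toy = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
toy.save('toy_model.h5')                               # single HDF5 file
restored = tf.keras.models.load_model('toy_model.h5')  # load it back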


Extra information: In TensorFlow Lite, the library for running models on mobile and IoT devices, FlatBuffers are used instead of protocol buffers. This is what the TensorFlow Lite Converter converts to (the .tflite format). FlatBuffers is another Google format that is also very efficient: it allows access to any part of the message without deserialization (unlike JSON or XML). For devices with less memory (RAM), it makes more sense to load only what you need from the model file, instead of loading the entire thing into memory to deserialize it.
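
As a concrete illustration, converting a SavedModel directory into the FlatBuffer-based .tflite format goes through tf.lite.TFLiteConverter; a rough sketch, reusing the my_new_model directory from above as the input path:

import tensorflow as tf

# Convert the SavedModel directory into a FlatBuffer .tflite file.
converter = tf.lite.TFLiteConverter.from_saved_model('my_new_model')
tflite_model = converter.convert()

with open('my_new_model.tflite', 'wb') as f:
    f.write(tflite_model)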


Loading SavedModels in TensorFlow 2

I noticed that BiBi's answer showing how to load models was popular, and there is a shorter way to do this in TF2:

import tensorflow as tf
model_path = "/path/to/directory/inception_v1_224_quant_20181026"
model = tf.saved_model.load(model_path)

Note,

  • the directory (i.e. inception_v1_224_quant_20181026) has to contain a saved_model.pb or saved_model.pbtxt, otherwise the code will crash. You cannot specify the .pb path; specify the directory.
  • you might get TypeError: 'AutoTrackable' object is not callable for older models, fix here.

If you load a TF1 model, I found that I don't get any errors, but the loaded object doesn't behave as expected (e.g. it doesn't have any functions on it, like predict).
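
When tf.saved_model.load does work, the returned object typically exposes its entry points through signatures rather than Keras-style methods like predict. Continuing from the snippet above, a sketch for inspecting them (the signature name may differ per model, and the input name and shape in the commented call are hypothetical):

print(list(model.signatures.keys()))          # often ['serving_default']

infer = model.signatures['serving_default']
print(infer.structured_input_signature)       # expected argument names/shapes
print(infer.structured_outputs)               # output names

# Inputs are passed as keyword arguments matching the names printed above,
# for example (hypothetical name and shape):
# outputs = infer(input=tf.zeros([1, 224, 224, 3]))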

More on Saved Models: Official Docs

https://www.tensorflow.org/guide/saved_model

covers creating, saving, loading, and fine-tuning; a minimal round trip is sketched below.
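
The basic round trip from that guide, sketched with a minimal tf.Module (the class and paths here are illustrative):

import tensorflow as tf

class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.float32)])
    def __call__(self, x):
        return 2.0 * x

module = Doubler()
tf.saved_model.save(module, 'doubler_model')   # writes saved_model.pb, variables/, assets/

restored = tf.saved_model.load('doubler_model')
print(restored(tf.constant(3.0)))              # tf.Tensor(6.0, shape=(), dtype=float32)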

View the Model on TensorBoard

tensorboard: view graph from saved_model.pb file [feature request] · Issue #8854 · tensorflow/tensorflow · GitHub

Soln 1 (March 31st, 2017)

As far as I know, you don't need to create any summaries to load the graph into TensorBoard. If you create a summary writer and then add the graph to it, you should see the graph appear in TensorBoard. I have some code that does that (as part of another project). I do, however, agree that being able to import something into TensorBoard directly would be handy.

I've quickly created a bit of code to load a graph into TensorBoard. See how that goes. Also available as a gist here.

# TF 1.x script (the issue dates from 2017); in TF 2.x use tf.compat.v1.
import tensorflow as tf
from tensorflow.python.platform import gfile

with tf.Session() as sess:
    model_filename = 'PATH_TO_PB.pb'
    with gfile.FastGFile(model_filename, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        g_in = tf.import_graph_def(graph_def)

LOGDIR = 'YOUR_LOG_LOCATION'
train_writer = tf.summary.FileWriter(LOGDIR)
train_writer.add_graph(sess.graph)

 ==> variations in the .pb file and TF version might cause the example script to fail; see the further discussion in the issue for how to deal with these, such as:

Don't forget to flush ;) (Nov. 29th 2019)

train_writer.flush()

Soln 2 (Dec. 8th 2020)

New Readers/Visitors

Please note that my original solution was thrown together to temporarily answer the issue and help the user. I then PR'd a helper function into TF, where much smarter people have taken it over and seemingly converted it for 2.0. I've been fairly dormant on TF now too. Judging by the reactions to my original solution, this is clearly a high-traffic issue, but please make sure you check out the latest version of tensorflow/import_pb_to_tensorboard.py at master · tensorflow/tensorflow · GitHub

You may be best off raising a new issue with that code explicitly (or other TB bits; TB has become a lot more sophisticated since the early days when I was still working with TF regularly!). Do still reference this issue so those who land here can link over to your issue and find a better solution.

Also note that this method has most likely been superseded by more accessible/superior ways of loading your model into TB ==> so far it seems that loading models has been simplified, but working with TB still requires crutches. I've never actually needed to use my own helper function, especially if you are using the Keras functionality, so I would recommend taking some time to get your TB integration working in the best, native way possible so you can get the most out of it. I'll keep an eye on this to help where I can. Cheers
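
For recent TF 2.x versions there is also a native way to get a graph into TensorBoard without this helper: tf.summary.graph (added around TF 2.5) accepts a tf.Graph or a GraphDef. A rough sketch, assuming a frozen TF 1.x-style .pb file (check the API docs of your version for the exact accepted types):

import tensorflow as tf

# Load a frozen GraphDef and write it as a graph summary for TensorBoard.
graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile('PATH_TO_PB.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

writer = tf.summary.create_file_writer('YOUR_LOG_LOCATION')
with writer.as_default():
    tf.summary.graph(graph_def)   # requires TF 2.5+ and eager execution
writer.flush()

# then run: tensorboard --logdir YOUR_LOG_LOCATION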

 ==> script usage example from Feb. 13th 2020:

https://colab.research.google.com/drive/13LAUUT9tEH2XeoNA_z9A7uE5omc-lzNv

Local file management and importing/loading local models in Colab:

https://blog.csdn.net/maxzcl/article/details/123731762
