TensorFlow study notes: getting graph information from a checkpoint

    Code to restore a graph from the .meta file and look up a tensor by name:
import tensorflow as tf

sess = tf.Session()
check_point_path = 'variables'
# Rebuild the graph structure from the .meta file.
saver = tf.train.import_meta_graph('variables/save_variables.ckpt.meta')

# Restore the variable values from the latest checkpoint in the directory.
saver.restore(sess, tf.train.latest_checkpoint(check_point_path))

graph = tf.get_default_graph()

#print(graph.get_operations())

#with open('op.txt', 'a') as f:
#    f.write(str(graph.get_operations()))

# Look up a tensor by name; the result is a tf.Tensor object, not its value.
op1 = graph.get_tensor_by_name('fully_connected/biases:0')
print(op1)

    Use graph.get_operations() to list all operations in the graph saved in the .ckpt.meta file; an operation named 'op_name' exposes its first output as the tensor 'op_name:0'.

Then call graph.get_tensor_by_name('op_name:0') to retrieve that tensor.
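
get_tensor_by_name returns the tf.Tensor object, not its value; to get the value, run the tensor in the restored session. Below is a minimal self-contained sketch (TF 1.x), reusing the 'variables/save_variables.ckpt' checkpoint and the 'fully_connected/biases' variable from the example above:

import tensorflow as tf

check_point_path = 'variables'

sess = tf.Session()
# Rebuild the graph and restore the variable values.
saver = tf.train.import_meta_graph('variables/save_variables.ckpt.meta')
saver.restore(sess, tf.train.latest_checkpoint(check_point_path))

graph = tf.get_default_graph()

# Print every operation name; an op's first output is the tensor '<name>:0'.
for op in graph.get_operations():
    print(op.name)

# Evaluate the tensor to obtain its value as a numpy array.
biases = graph.get_tensor_by_name('fully_connected/biases:0')
print(sess.run(biases))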

     Code to read the saved variables' data (tensor names and values) directly from the ckpt files:

import os
import tensorflow as tf
from tensorflow.python import pywrap_tensorflow

check_point_path = 'variables'
#checkpoint_path = os.path.join(logs_train_dir, 'model.ckpt')
# Read the 'checkpoint' file to find the most recent checkpoint prefix.
ckpt = tf.train.get_checkpoint_state(checkpoint_dir=check_point_path)
checkpoint_path = os.path.join('.', ckpt.model_checkpoint_path)
#print(ckpt.model_checkpoint_path)

# Open the checkpoint data files directly, without building a graph or session.
reader = pywrap_tensorflow.NewCheckpointReader(checkpoint_path)
var_to_shape_map = reader.get_variable_to_shape_map()  # {variable name: shape}
for key in var_to_shape_map:
    print("tensor_name: ", key)
    #print(reader.get_tensor(key))  # uncomment to also print the value
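
To read the value of a single variable without listing everything, the reader also provides has_tensor and get_tensor. A minimal sketch, assuming the same 'variables' checkpoint directory and that it contains a variable named 'fully_connected/biases' (the example name used earlier):

import tensorflow as tf
from tensorflow.python import pywrap_tensorflow

ckpt = tf.train.get_checkpoint_state('variables')
reader = pywrap_tensorflow.NewCheckpointReader(ckpt.model_checkpoint_path)

name = 'fully_connected/biases'
if reader.has_tensor(name):          # check first to avoid an error for missing names
    value = reader.get_tensor(name)  # returns a numpy array
    print(name, value.shape)
    print(value)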

Method 2:

from tensorflow.python.tools.inspect_checkpoint import print_tensors_in_checkpoint_file

print_tensors_in_checkpoint_file("variables/save_variables.ckpt", tensor_name='',
                                 all_tensors=False, all_tensor_names=False)
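
In the TF 1.x versions I have checked, calling it with tensor_name='' and both flags False only prints a summary of the checkpoint contents (names, dtypes, shapes). To print values as well, pass a specific tensor_name or set all_tensors=True; a minimal sketch using the same checkpoint and the 'fully_connected/biases' example name:

from tensorflow.python.tools.inspect_checkpoint import print_tensors_in_checkpoint_file

# Print the value of one variable.
print_tensors_in_checkpoint_file("variables/save_variables.ckpt",
                                 tensor_name='fully_connected/biases',
                                 all_tensors=False, all_tensor_names=False)

# Print every tensor name and value stored in the checkpoint.
print_tensors_in_checkpoint_file("variables/save_variables.ckpt",
                                 tensor_name='', all_tensors=True,
                                 all_tensor_names=False)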


Note: tf.train.latest_checkpoint(check_point_path) returns the path of the most recent checkpoint; it is equivalent to

ckpt = tf.train.get_checkpoint_state(check_point_path)
ckpt.model_checkpoint_path

Do not confuse tf.train.latest_checkpoint with tf.train.get_checkpoint_state: the former returns the checkpoint path string directly, while the latter returns a CheckpointState object whose model_checkpoint_path attribute holds that path.
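
A minimal sketch showing both ways of obtaining the path, assuming the same 'variables' directory as above:

import tensorflow as tf

check_point_path = 'variables'

path_a = tf.train.latest_checkpoint(check_point_path)   # a plain path string (or None)

ckpt = tf.train.get_checkpoint_state(check_point_path)  # a CheckpointState proto
path_b = ckpt.model_checkpoint_path

print(path_a)
print(path_b)  # both point to the same latest checkpoint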

