Using flags:
import tensorflow as tf
flags = tf.flags  # flags is a module inside tf
FLAGS = flags.FLAGS  # FLAGS is a global object through which all defined arguments can be accessed
flags.DEFINE_string("flag_name", "default_value", "doc_string")  # DEFINE_string adds an optional string argument
flags.DEFINE_bool("flag_name", False, "doc_string")  # default value should match the declared type
flags.DEFINE_integer("flag_name", 0, "doc_string")  # note: the function is DEFINE_integer, not DEFINE_int
flags.DEFINE_float("flag_name", 0.0, "doc_string")
# Accessing an argument:
FLAGS.flag_name
The source code of TensorFlow's flags module is very simple: internally it does "import argparse as _argparse" and delegates command-line parsing to argparse.
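Since flags is described above as a thin wrapper over argparse, roughly the same behavior can be sketched with the standard library alone. The flag names below (learning_rate, batch_size, model_dir) are made up for illustration:

```python
import argparse

# Hypothetical sketch of the argparse machinery that tf.flags wraps.
parser = argparse.ArgumentParser()
parser.add_argument("--learning_rate", type=float, default=0.01, help="doc string")
parser.add_argument("--batch_size", type=int, default=32, help="doc string")
parser.add_argument("--model_dir", type=str, default="/tmp/model", help="doc string")

# Parsing an empty argv yields all defaults, like accessing FLAGS before overriding.
defaults = parser.parse_args([])
print(defaults.batch_size)  # 32

# Command-line values override the defaults, like passing --batch_size=64 to a script.
args = parser.parse_args(["--batch_size", "64"])
print(args.batch_size)  # 64
```

As with FLAGS.flag_name, the parsed values are read as plain attributes on the returned namespace.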
Using yield:
yield behaves like the return keyword, except that the function becomes a generator.
The function body only executes during iteration; on the next iteration, execution resumes from the statement immediately after yield.
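A minimal sketch of that pause-and-resume behavior (the countdown function is a made-up example):

```python
def countdown(n):
    """Yields n, n-1, ..., 1; the body only runs as the generator is iterated."""
    while n > 0:
        yield n   # execution pauses here and hands n to the caller
        n -= 1    # on the next iteration, execution resumes here

gen = countdown(3)
print(next(gen))   # 3
print(next(gen))   # 2 -- resumed after the yield, decremented, yielded again
print(list(gen))   # [1] -- iteration continues from where it left off
```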
the .ckpt file is the old-version output of saver.save(sess), which is the equivalent of your .ckpt-data (see below).
the "checkpoint" file is only there to tell some TF functions which is the latest checkpoint file.
.ckpt-meta contains the metagraph, i.e. the structure of your computation graph, without the values of the variables (basically what you can see in tensorboard/graph).
.ckpt-data contains the values of all the variables, without the structure. To restore a model in python, you'll usually use the meta and data files (but you can also use the .pb file):
saver = tf.train.import_meta_graph(path_to_ckpt_meta)
saver.restore(sess, path_to_ckpt_data)
I don't know exactly about .ckpt-index; I guess it's some kind of index needed internally to map the two previous files correctly. Anyway it's not usually necessary on its own: you can restore a model with only .ckpt-meta and .ckpt-data.
the .pb file can save your whole graph (meta + data). To load and use (but not train) a graph in c++ you'll usually use a .pb file created with freeze_graph, which builds it from the meta and data files. Be careful: (at least in previous TF versions and for some people) the python function provided by freeze_graph did not work properly, so you'd have to use the script version. Tensorflow also provides a tf.train.Saver.to_proto() method, but I don't know exactly what it does.
- meta file: describes the saved graph structure; includes GraphDef, SaverDef, and so on. Applying tf.train.import_meta_graph('/tmp/model.ckpt.meta') will restore the Saver and the Graph.
- index file: it is a string-string immutable table (tensorflow::table::Table). Each key is the name of a tensor and its value is a serialized BundleEntryProto. Each BundleEntryProto describes the metadata of a tensor: which of the "data" files contains the content of the tensor, the offset into that file, checksum, some auxiliary data, etc.
- data file: it is a TensorBundle collection, which saves the values of all variables.
cp $< $@
$< is the first prerequisite
$@ is the target
Converting a TensorFlow model to numpy arrays:
# c is the checkpoint prefix
var_list = tf.contrib.framework.list_variables(c)
reader = tf.contrib.framework.load_checkpoint(c)
args = {}
for name, shape in var_list:
    if not name.startswith("global_step") and not name.startswith("train") and not name.startswith("losses"):
        tensor = reader.get_tensor(name)
        args[name] = tensor
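The prefix-filtering step above can be sketched without TensorFlow. The variable names below are invented stand-ins for what list_variables(c) would return:

```python
# Hypothetical (name, shape) pairs mimicking tf.contrib.framework.list_variables(c).
var_list = [
    ("global_step", []),
    ("train/beta1_power", []),
    ("losses/total_loss", []),
    ("conv1/weights", [3, 3, 1, 32]),
    ("conv1/biases", [32]),
]

# str.startswith accepts a tuple of prefixes, condensing the three separate checks.
skip_prefixes = ("global_step", "train", "losses")
kept = [name for name, shape in var_list
        if not name.startswith(skip_prefixes)]
print(kept)  # ['conv1/weights', 'conv1/biases']
```

Passing a tuple to startswith is equivalent to the chained "not name.startswith(...) and not name.startswith(...)" conditions in the snippet above.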