TensorFlow basics: a summary of notes

##############################################################
-- tf.summary.FileWriter
  """Writes `Summary` protocol buffers to event files.

  The `FileWriter` class provides a mechanism to create an event file in a
  given directory and add summaries and events to it. The class updates the
  file contents asynchronously. This allows a training program to call methods
  to add data to the file directly from the training loop, without slowing down
  training.

  When constructed with a `tf.compat.v1.Session` parameter, a `FileWriter`
  instead forms a compatibility layer over new graph-based summaries
  (`tf.contrib.summary`) to facilitate the use of new summary writing with
  pre-existing code that expects a `FileWriter` instance.
  
  
-- output
events.out.tfevents.1564104102.DESKTOP-6H9VRDC
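A minimal sketch of producing such an event file, written in TF 1.x graph style via `tf.compat.v1` (the logdir and summary names are arbitrary choices for this example):

```python
# Sketch: write scalar summaries to an events.out.tfevents.* file that
# TensorBoard can read. Uses tf.compat.v1 so it runs under TF 1.x or 2.x.
import glob
import os
import tempfile

import tensorflow as tf

tf1 = tf.compat.v1
if tf.executing_eagerly():       # no-op under TF 1.x graph mode
    tf1.disable_eager_execution()

logdir = tempfile.mkdtemp()
with tf1.Graph().as_default():
    x = tf1.placeholder(tf.float32, name="loss_value")
    summary_op = tf1.summary.scalar("loss", x)

    # FileWriter creates the event file in logdir and appends asynchronously.
    writer = tf1.summary.FileWriter(logdir, graph=tf1.get_default_graph())
    with tf1.Session() as sess:
        for step in range(3):
            s = sess.run(summary_op, feed_dict={x: 1.0 / (step + 1)})
            writer.add_summary(s, global_step=step)
    writer.close()               # flush pending events to disk

event_files = glob.glob(os.path.join(logdir, "events.out.tfevents.*"))
```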


##############################################################
-- tf.train.trainable_variables()
def trainable_variables(scope=None):
  """Returns all variables created with `trainable=True`.

  When passed `trainable=True`, the `Variable()` constructor automatically
  adds new variables to the graph collection
  `GraphKeys.TRAINABLE_VARIABLES`. This convenience function returns the
  contents of that collection.

-- Additional notes
1. Does not include placeholders (a placeholder is not a `Variable`, so it is never added to the `TRAINABLE_VARIABLES` collection)
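A small sketch illustrating the note above: only variables created with `trainable=True` show up, while placeholders and non-trainable variables do not (variable names are arbitrary):

```python
# Sketch: tf.trainable_variables() returns only Variables created with
# trainable=True; placeholders never enter the collection.
import tensorflow as tf

tf1 = tf.compat.v1
if tf.executing_eagerly():
    tf1.disable_eager_execution()

with tf1.Graph().as_default():
    x = tf1.placeholder(tf.float32, shape=[None, 3], name="x")   # not a variable
    w = tf1.get_variable("w", shape=[3, 2])                      # trainable by default
    step = tf1.get_variable("step", shape=[], trainable=False)   # excluded

    names = [v.op.name for v in tf1.trainable_variables()]
```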
  
  
  
  
##############################################################
-- class Saver(object):
  The `Saver` class adds ops to save and restore "variables" to and from *checkpoints*.  
  It also provides convenience methods to run these ops.

  Checkpoints are binary files in a proprietary format which map variable names
  to tensor values.  The best way to examine the contents of a checkpoint is to
  load it using a `Saver`.

  [variables, map variable names to tensor values]
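A minimal save/restore sketch for `Saver` (the checkpoint prefix and variable value are arbitrary for this example):

```python
# Sketch: save a variable to a checkpoint, then restore it into a fresh
# session without running the initializer.
import os
import tempfile

import tensorflow as tf

tf1 = tf.compat.v1
if tf.executing_eagerly():
    tf1.disable_eager_execution()

ckpt_prefix = os.path.join(tempfile.mkdtemp(), "model.ckpt")

with tf1.Graph().as_default():
    v = tf1.get_variable("v", initializer=42.0)
    saver = tf1.train.Saver()    # adds save/restore ops for all variables

    with tf1.Session() as sess:
        sess.run(tf1.global_variables_initializer())
        saver.save(sess, ckpt_prefix)        # write variable name -> value map

    with tf1.Session() as sess:
        saver.restore(sess, ckpt_prefix)     # no initializer needed after restore
        restored = sess.run(v)
```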
  
  
  
##############################################################
-- tf.train.export_meta_graph  
  """Returns `MetaGraphDef` proto.

  Optionally writes it to filename.

  This function exports the graph, saver, and collection objects into
  `MetaGraphDef` protocol buffer with the intention of it being imported
  at a later time or location to restart training, run inference, or be
  a subgraph.
    
  ["graph, saver, and collection objects", "MetaGraphDef protocol buffer"]

  
##############################################################  
-- tf.MetaGraphDef  
-- Class MetaGraphDef
  
Properties
    repeated  AssetFileDef        asset_file_def
    repeated  CollectionDefEntry  collection_def
              GraphDef            graph_def
              MetaInfoDef         meta_info_def
              SavedObjectGraph    object_graph_def
              SaverDef            saver_def
    repeated  SignatureDefEntry   signature_def
  
  
https://github.com/tensorflow/tensorflow/blob/r1.14/tensorflow/core/protobuf/meta_graph.proto  

  
message MetaGraphDef {
...
...
}  
  
// SignatureDef defines the signature of a computation supported by a TensorFlow
// graph.
//
// For example, a model with two loss computations, sharing a single input,
// might have the following signature_def map.
  
message SignatureDef {
  // Named input parameters.
  map<string, TensorInfo> inputs = 1;
  
  // Named output parameters.
  map<string, TensorInfo> outputs = 2;
  
  string method_name = 3;
}

// An asset file def for a single file or a set of sharded files with the same
// name.
message AssetFileDef {
  string filename = 2;
}  
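Since these are ordinary protobuf messages, a `SignatureDef` can be built by hand to see the `inputs`/`outputs`/`method_name` fields from `meta_graph.proto` in action (the tensor and method names below are arbitrary examples):

```python
# Sketch: construct a SignatureDef proto directly. The inputs/outputs
# fields are map<string, TensorInfo>, so entries are created on access.
from tensorflow.core.protobuf import meta_graph_pb2

sig = meta_graph_pb2.SignatureDef()
sig.inputs["x"].name = "input_tensor:0"      # named input parameter
sig.outputs["y"].name = "output_tensor:0"    # named output parameter
sig.method_name = "tensorflow/serving/predict"
```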
    
    
##############################################################
-- tf.GraphDef

Class GraphDef

Properties
              FunctionDefLibrary  library
    repeated  NodeDef             node
              int32               version
              VersionDef          versions

Source: tensorflow/tensorflow/core/framework/graph.proto
  
  
// Represents the graph of operations
message GraphDef {
  repeated NodeDef node = 1;

  // Compatibility versions of the graph.  See core/public/version.h for version
  VersionDef versions = 4;
    
  int32 version = 3 [deprecated = true];

  // EXPERIMENTAL. DO NOT USE OR DEPEND ON THIS YET.
  //
  // "library" provides user-defined functions.
  //
  // Function call semantics:
  //
  FunctionDefLibrary library = 2;
};  
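A sketch showing where the repeated `node` field comes from: every op added to a `Graph` becomes one `NodeDef` in its `GraphDef` (op names are arbitrary):

```python
# Sketch: a Graph's serialized form is a GraphDef whose repeated `node`
# field holds one NodeDef per op in the graph.
import tensorflow as tf

tf1 = tf.compat.v1
if tf.executing_eagerly():
    tf1.disable_eager_execution()

g = tf1.Graph()
with g.as_default():
    a = tf1.constant(1.0, name="a")
    b = tf1.constant(2.0, name="b")
    c = tf1.add(a, b, name="c")

graph_def = g.as_graph_def()                 # GraphDef proto
node_names = [n.name for n in graph_def.node]
```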
  
  
##############################################################
-- tf.io.write_graph  
-- Writes a graph proto to a file.
  
Aliases
    tf.train.write_graph

tf.io.write_graph(
    graph_or_graph_def,
    logdir,
    name,
    as_text=True
)  
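A minimal usage sketch (logdir and filename are arbitrary; `as_text=True` writes a human-readable pbtxt instead of binary):

```python
# Sketch: dump a graph proto to a text .pbtxt file with write_graph.
import os
import tempfile

import tensorflow as tf

tf1 = tf.compat.v1
if tf.executing_eagerly():
    tf1.disable_eager_execution()

g = tf1.Graph()
with g.as_default():
    tf1.constant(3.0, name="three")

logdir = tempfile.mkdtemp()
# tf.train.write_graph is an alias of tf.io.write_graph
tf1.train.write_graph(g.as_graph_def(), logdir, "graph.pbtxt", as_text=True)
out_path = os.path.join(logdir, "graph.pbtxt")
```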
  
  
  

##############################################################
-- Graph vs GraphDef: the difference
  
How do other languages hand a Graph over to the C++ runtime?
They use protobuf, a tool that generates language-specific stubs;
that is where GraphDef comes from.
It is the serialized form of a Graph.
  
  
-- tf.Graph
A TensorFlow computation, represented as a dataflow graph.

  
-- tf.GraphDef(self, /, *args, **kwargs)  
A ProtocolMessage  
  
  
  
-- tf.GraphKeys  
Standard names to use for graph collections.

The standard library uses various well-known names to collect and
retrieve values associated with a graph. For example, the
`tf.Optimizer` subclasses default to optimizing the variables
collected under `tf.GraphKeys.TRAINABLE_VARIABLES` if none is
specified, but it is also possible to pass an explicit list of
variables.  
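A sketch of retrieving values by collection name: every variable lands in `GLOBAL_VARIABLES`, and those with `trainable=True` additionally land in `TRAINABLE_VARIABLES` (variable names are arbitrary):

```python
# Sketch: tf.get_collection with GraphKeys names retrieves the values
# collected under a graph collection.
import tensorflow as tf

tf1 = tf.compat.v1
if tf.executing_eagerly():
    tf1.disable_eager_execution()

with tf1.Graph().as_default():
    w = tf1.get_variable("w", shape=[2])                        # trainable
    c = tf1.get_variable("counter", shape=[], trainable=False)  # global only

    global_vars = tf1.get_collection(tf1.GraphKeys.GLOBAL_VARIABLES)
    train_vars = tf1.get_collection(tf1.GraphKeys.TRAINABLE_VARIABLES)
```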
  
  
  
  
