This note covers the following TensorFlow checkpoint APIs:
saver = tf.train.Saver(max_to_keep=5)
saver.save(sess, 'filepath', global_step=step)
tf.train.latest_checkpoint(checkpoint_dir, latest_filename=None)
saver.restore(sess, save_path)
Loading model parameters
import logging
import os

import tensorflow as tf

def load_weights(checkpoint_dir, sess, saver):
    """Load the weights of a model stored in saver.

    Parameters
    ----------
    checkpoint_dir : str
        The directory of checkpoints.
    sess : tf.Session
        A Session to use to restore the parameters.
    saver : tf.train.Saver

    Returns
    -------
    int
        Training step of the restored checkpoint.
    """
    ckpt = tf.train.get_checkpoint_state(checkpoint_dir)
    if ckpt and ckpt.model_checkpoint_path:
        logging.info(ckpt.model_checkpoint_path)
        file = os.path.basename(ckpt.model_checkpoint_path)
        checkpoint_path = os.path.join(checkpoint_dir, file)
        saver.restore(sess, checkpoint_path)
        return int(file.split('-')[1])
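The final line of load_weights relies on TensorFlow's checkpoint naming convention: files are saved as "<prefix>-<global_step>" (e.g. "model-1000"), so the step can be recovered by splitting on '-'. A minimal pure-Python sketch of just that parsing step (hypothetical filenames, no TensorFlow required):

import os

def step_from_checkpoint(model_checkpoint_path):
    # Checkpoint files follow the "<prefix>-<global_step>" convention,
    # e.g. "model-1000" means the checkpoint saved at step 1000.
    filename = os.path.basename(model_checkpoint_path)
    return int(filename.split('-')[1])

print(step_from_checkpoint('/tmp/ckpt/model-1000'))  # 1000

Note that this simple split breaks if the prefix itself contains a '-'; splitting on the last '-' (filename.rsplit('-', 1)[1]) is more robust.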
ckpt = tf.train.get_checkpoint_state(checkpoint_dir)
ckpt.model_checkpoint_path and ckpt.all_model_checkpoint_paths: this function returns the contents of the checkpoint file as a CheckpointState proto, which has two attributes, model_checkpoint_path and all_model_checkpoint_paths, both holding paths. model_checkpoint_path stores the filename of the most recent TensorFlow model file, while all_model_checkpoint_paths lists the filenames of all model files that have not yet been deleted. If there is no checkpoint file, the function returns None.
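The checkpoint file that get_checkpoint_state reads is itself a small text file with one model_checkpoint_path line and one all_model_checkpoint_paths line per surviving checkpoint. As an illustration only (a hand-rolled parse of an assumed file layout, not TensorFlow's actual parser), a minimal sketch of what gets extracted:

# Hypothetical contents of a "checkpoint" file in a checkpoint directory.
CHECKPOINT_TEXT = '''model_checkpoint_path: "model-1000"
all_model_checkpoint_paths: "model-500"
all_model_checkpoint_paths: "model-1000"
'''

def parse_checkpoint_state(text):
    # Collect the single latest path and the list of all surviving paths.
    state = {'model_checkpoint_path': None, 'all_model_checkpoint_paths': []}
    for line in text.splitlines():
        key, _, value = line.partition(':')
        value = value.strip().strip('"')
        if key == 'model_checkpoint_path':
            state['model_checkpoint_path'] = value
        elif key == 'all_model_checkpoint_paths':
            state['all_model_checkpoint_paths'].append(value)
    return state

state = parse_checkpoint_state(CHECKPOINT_TEXT)
print(state['model_checkpoint_path'])        # model-1000
print(state['all_model_checkpoint_paths'])   # ['model-500', 'model-1000']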
os.path.basename(ckpt.model_checkpoint_path)
Returns the final component of a pathname, i.e. the part after the last '/' separator. It is equivalent to:
ckpt.model_checkpoint_path.split('/')[-1]
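A quick check of that equivalence on a sample checkpoint path (the path here is made up for illustration):

import os

path = '/tmp/checkpoints/model-1000'
print(os.path.basename(path))  # model-1000
print(path.split('/')[-1])     # model-1000

os.path.basename is preferable in real code because it also handles the platform's native separator (e.g. '\\' on Windows), which the manual split does not.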
saver.restore(sess, checkpoint_path)
sess: the current session; the previously saved variable values are loaded into this session.
checkpoint_path: the path prefix of the saved model files. You do not have to hard-code the model name: tf.train.latest_checkpoint(checkpoint_dir) (or tf.train.get_checkpoint_state) reads the checkpoint file in that directory to find which model is the newest and what it is called, and the returned path can be passed straight to restore.