TensorFlow: Adam or RMSProp variable reported as not found ("Did you mean to set reuse=None in VarScope?")

https://stackoverflow.com/questions/43183850/tensorflow-valueerror-variable-does-not-exist-or-was-not-created-with-tf-get-v

Traceback (most recent call last):
  File "/home/chuwei/PycharmProjects/GAN-segan/main.py", line 125, in <module>
    tf.app.run()
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/platform/app.py", line 44, in run
    _sys.exit(main(_sys.argv[:1] + flags_passthrough))
  File "/home/chuwei/PycharmProjects/GAN-segan/main.py", line 91, in main
    se_model = SEGAN(sess, FLAGS, udevices)   # takes a long time here, then errors out
  File "/home/chuwei/PycharmProjects/GAN-segan/model.py", line 118, in __init__
    self.build_model(args)
  File "/home/chuwei/PycharmProjects/GAN-segan/model.py", line 144, in build_model
    self.d_opt = d_opt.apply_gradients(avg_d_grads)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/training/optimizer.py", line 412, in apply_gradients
    self._create_slots(var_list)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/training/rmsprop.py", line 101, in _create_slots
    self._get_or_make_slot(v, val_rms, "rms", self._name)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/training/optimizer.py", line 639, in _get_or_make_slot
    named_slots[var] = slot_creator.create_slot(var, val, op_name)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/training/slot_creator.py", line 101, in create_slot
    return _create_slot_var(primary, val, '')
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/training/slot_creator.py", line 55, in _create_slot_var
    slot = variable_scope.get_variable(scope, initializer=val, trainable=False)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/ops/variable_scope.py", line 988, in get_variable
    custom_getter=custom_getter)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/ops/variable_scope.py", line 890, in get_variable
    custom_getter=custom_getter)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/ops/variable_scope.py", line 348, in get_variable
    validate_shape=validate_shape)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/ops/variable_scope.py", line 333, in _true_getter
    caching_device=caching_device, validate_shape=validate_shape)
  File "/usr/local/lib/python3.4/dist-packages/tensorflow/python/ops/variable_scope.py", line 657, in _get_single_variable
    "VarScope?" % name)
ValueError: Variable d_model/d_block_1/d_vbn_1/gamma/RMSProp/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope

Following the error message, add  with tf.variable_scope(tf.get_variable_scope(), reuse=False):  immediately before the d_opt.apply_gradients call.

(Problem solved.)

with tf.variable_scope(tf.get_variable_scope(), reuse=False):  # added line
    self.d_opt = d_opt.apply_gradients(avg_d_grads)
    self.g_opt = g_opt.apply_gradients(avg_g_grads)

This works under Python 3.4; under Python 2.7 it still fails.

Swapping in  with tf.variable_scope("for_reuse_scope"):  instead (the approach quoted further below) did not work in this case.
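For context, here is a minimal, self-contained sketch of the pattern that triggers the error and of this fix. The names (discriminator, real, fake, and so on) are toy placeholders rather than the SEGAN code, and it assumes the TensorFlow 1.x graph-mode API; the exact behavior of reuse=False in the wrapper is version-dependent.

import tensorflow as tf

def discriminator(x):
    # Toy one-layer "discriminator" built with tf.get_variable.
    with tf.variable_scope("d_model"):
        w = tf.get_variable("w", [4, 1],
                            initializer=tf.truncated_normal_initializer())
        return tf.matmul(x, w)

real = tf.placeholder(tf.float32, [None, 4])
fake = tf.placeholder(tf.float32, [None, 4])

d_real = discriminator(real)                 # first call creates the variables
tf.get_variable_scope().reuse_variables()    # outer scope is now reuse=True
d_fake = discriminator(fake)                 # second call shares the same weights

loss = tf.reduce_mean(d_fake - d_real)
d_opt = tf.train.RMSPropOptimizer(1e-4)
grads = d_opt.compute_gradients(loss)

# Without this wrapper, apply_gradients would try to create its ".../RMSProp"
# slot variables while the scope is still in reuse mode and raise the ValueError.
with tf.variable_scope(tf.get_variable_scope(), reuse=False):
    train_op = d_opt.apply_gradients(grads)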

Other reference material:

https://blog.csdn.net/qq_25737169/article/details/77856961

https://www.baidu.com/link?url=VUL_CLGZzWEuHCKYZs2aF_Qoas3vZ7rkLW5yOhNXlMjj_mBieFmeWMhCf4lkwqCnZboolB30MDvXhJmZlehVje2XzIVz1_IJFOR2aAVD7tqjuS254z--Y94TbG5CX6oOt7gpaVBvFG-P617l44xu_LvF8f0lj49LgK34pFWlf6q&wd=&eqid=cd1c0e4400003e23000000065b28c4af


ValueError: Variable d_model/d_block_2/d_vbn_2/gamma/RMSProp/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
Variable discriminator/conv/weights/RMSProp/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?

This error is triggered by the optimizers

tf.train.RMSPropOptimizer()
tf.train.AdamOptimizer()

Plain gradient descent, by contrast, does not hit the problem (it keeps no per-parameter state), and the code also uses
tf.get_variable_scope().reuse_variables()
The cause: when Adam or RMSProp is used, the optimizer creates slot variables (an "Adam" or "RMSProp" variable for every trainable parameter in the graph) to hold per-parameter state such as the momentum estimates, and it creates them with tf.get_variable. Once the current variable scope has been switched to reuse=True, it cannot be switched back to None or False inside that scope, so every tf.get_variable call there tries to reuse an existing variable. When apply_gradients then requests a slot variable such as .../RMSProp, no variable with that name has ever been created, tf.get_variable refuses to create one, and the ValueError above is raised.
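A minimal illustration of that failure mode, with toy scope and variable names and assuming the TF 1.x API: once a scope is in reuse mode, tf.get_variable can only look up variables that already exist, which is exactly what the optimizer's slot creation runs into.

import tensorflow as tf

with tf.variable_scope("demo"):
    gamma = tf.get_variable("gamma", [1])     # created while reuse is off

with tf.variable_scope("demo", reuse=True):
    same = tf.get_variable("gamma", [1])      # fine: the variable exists, so it is reused
    # Requesting a name that was never created is effectively what slot creation does,
    # and it fails with the error from the traceback above:
    # tf.get_variable("gamma_rms", [1])
    #   -> ValueError: Variable demo/gamma_rms does not exist, or was not created with
    #      tf.get_variable(). Did you mean to set reuse=None in VarScope?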

The places where reuse=True gets set are

tf.get_variable_scope().reuse_variables()

or

with tf.variable_scope(name) as scope:
    scope.reuse_variables()
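A toy example of the two patterns just listed (assumed TF 1.x API); both flip the reuse flag of the enclosing scope to True for the rest of that scope.

import tensorflow as tf

with tf.variable_scope("shared"):
    a = tf.get_variable("a", [1])               # created while reuse is off
    tf.get_variable_scope().reuse_variables()   # pattern 1: current scope -> reuse=True
    a_again = tf.get_variable("a", [1])         # reuses "shared/a"

with tf.variable_scope("shared") as scope:      # pattern 2: via the scope handle
    scope.reuse_variables()
    b = tf.get_variable("a", [1])               # also reuses "shared/a"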

This kind of code typically appears in GAN programs. The fix is to isolate the scope so that reuse=True only takes effect inside it, by wrapping the calls in

with tf.variable_scope(tf.get_variable_scope()):

Put the wrapper around the place where the function in question is called; in my case that is the discriminator.
The code change is shown below:
Wrong:

G = generator(z)
D, D_logits = discriminator(images)
samples = sampler(z)
D_, D_logits_ = discriminator(G, reuse=True)

Correct:

with tf.variable_scope("for_reuse_scope"):
    G = generator(z)
    D, D_logits = discriminator(images)
    samples = sampler(z)
    D_, D_logits_ = discriminator(G, reuse=True)
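For completeness, a self-contained toy sketch (hypothetical names, assumed TF 1.x API) of how the quoted fix is meant to work: the reuse flag flipped inside "for_reuse_scope" stays confined to that scope, so an optimizer built outside it can still create its slot variables. Here the second discriminator call shares weights via tf.get_variable_scope().reuse_variables(), the pattern named above, rather than a reuse argument.

import tensorflow as tf

def discriminator(x):
    with tf.variable_scope("discriminator"):
        w = tf.get_variable("weights", [4, 1],
                            initializer=tf.truncated_normal_initializer())
        return tf.matmul(x, w)

images = tf.placeholder(tf.float32, [None, 4])
G = tf.placeholder(tf.float32, [None, 4])        # stand-in for generator output

with tf.variable_scope("for_reuse_scope"):
    D = discriminator(images)
    tf.get_variable_scope().reuse_variables()    # reuse=True only inside this scope
    D_ = discriminator(G)                        # shares the weights of the first call

d_loss = tf.reduce_mean(D_ - D)
# Built outside "for_reuse_scope": the ".../RMSProp" slot variables are created
# with reuse off, so no ValueError is raised.
d_optim = tf.train.RMSPropOptimizer(1e-4).minimize(d_loss)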

