ValueError: Variable E_conv0/w/Adam/ does not exist, or was not created with tf.get_variable().

Calling tf.train.AdamOptimizer().minimize(), as in the following code:

self.EG_optimizer = tf.train.AdamOptimizer(
    learning_rate=EG_learning_rate,
    beta1=beta1
).minimize(
    loss=self.loss_EG,
    global_step=self.EG_global_step,
    var_list=self.E_variables + self.G_variables
)
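
For context, self.E_variables and self.G_variables are typically collected by filtering tf.trainable_variables() on a name substring. A minimal sketch, assuming the 'E_' / 'G_' prefixes seen in the error message (matching on a substring rather than startswith also survives the scope prefix added by the fix below):

# A sketch under the assumption that encoder/generator variables are
# distinguished by the 'E_' / 'G_' name prefixes from the error message.
trainable = tf.trainable_variables()
self.E_variables = [v for v in trainable if 'E_' in v.name]
self.G_variables = [v for v in trainable if 'G_' in v.name]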



fails with the error:

ValueError: Variable E_conv0/w/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?


Solution:

The root cause is that tf.get_variable_scope().reuse_variables() is executed on the root variable scope, and reuse can never be switched off again once set. When minimize() later calls tf.get_variable() to create Adam's slot variables (such as E_conv0/w/Adam), the root scope only allows reusing existing variables, so creation fails. The fix is to confine the reuse to a named scope by adding the following as the first line of the function body:

with tf.variable_scope("encoder") as scope:
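
To see the mechanism in isolation, here is a small self-contained sketch (TensorFlow 1.x; the names net, w, and "encoder" are illustrative, not the blog's model) showing that the optimizer still works when reuse is confined to a named scope:

import tensorflow as tf

def net(x, reuse_variables=False):
    # Confine reuse to the "encoder" scope; the root scope stays writable.
    with tf.variable_scope("encoder") as scope:
        if reuse_variables:
            scope.reuse_variables()
        w = tf.get_variable("w", shape=[1], initializer=tf.zeros_initializer())
        return x * w

x = tf.placeholder(tf.float32, shape=[1])
y1 = net(x)                         # creates encoder/w
y2 = net(x, reuse_variables=True)   # reuses encoder/w
loss = tf.reduce_sum(tf.square(y2))

# minimize() can still create the slot variables encoder/w/Adam and
# encoder/w/Adam_1, because reuse was never set on the root scope. Had
# net() called tf.get_variable_scope().reuse_variables() without the
# with-block, this line would raise the ValueError above.
train_op = tf.train.AdamOptimizer(0.001).minimize(loss)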




For example, the original encoder() function:

def encoder(self, image, reuse_variables=False):
    if reuse_variables:
        tf.get_variable_scope().reuse_variables()
    num_layers = int(np.log2(self.size_image)) - int(self.size_kernel / 2)
    current = image
    # conv layers with stride 2
    for i in range(num_layers):
        name = 'E_conv' + str(i)
        current = conv2d(
            input_map=current,
            num_output_channels=self.num_encoder_channels * (2 ** i),
            size_kernel=self.size_kernel,
            name=name
        )
        current = tf.nn.relu(current)

    # fully connected layer
    name = 'E_fc'
    current = fc(
        input_vector=tf.reshape(current, [self.size_batch, -1]),
        num_output_length=self.num_z_channels,
        name=name
    )

    # output
    return tf.nn.tanh(current)



After the fix:

def encoder(self, image, reuse_variables=False):
    with tf.variable_scope("encoder") as scope:
        if reuse_variables:
            scope.reuse_variables()
        num_layers = int(np.log2(self.size_image)) - int(self.size_kernel / 2)
        current = image
        # conv layers with stride 2
        for i in range(num_layers):
            name = 'E_conv' + str(i)
            current = conv2d(
                input_map=current,
                num_output_channels=self.num_encoder_channels * (2 ** i),
                size_kernel=self.size_kernel,
                name=name
            )
            current = tf.nn.relu(current)

        # fully connected layer
        name = 'E_fc'
        current = fc(
            input_vector=tf.reshape(current, [self.size_batch, -1]),
            num_output_length=self.num_z_channels,
            name=name
        )

        # output
        return tf.nn.tanh(current)
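
With the scope in place, the encoder can be built once to create its variables and again with shared weights. A hedged usage sketch (self.input_image and self.generated_image are hypothetical attribute names, not from the original post):

# Hypothetical usage: the first call creates encoder/E_conv*/... and
# encoder/E_fc/...; the second call reuses them instead of failing.
z_real = self.encoder(self.input_image)
z_fake = self.encoder(self.generated_image, reuse_variables=True)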




