Alternating training of selected layers in TensorFlow

Problem: the model is constrained by multiple losses, and each loss should update only certain layers, so those layers need to be trained separately. Two ways to build per-group train ops follow; a sketch of the alternating loop comes after Method 2.

Code:

Method 1:

    import tensorflow as tf
    import tensorflow.contrib.slim as slim

    # The `scope` argument is a regex (matched with re.match), so several
    # scopes can be selected with an alternation. Chaining strings with
    # `and`, as in 'conv0' and 'dense_1', evaluates to the last string only.
    first_train_vars = tf.get_collection(
        tf.GraphKeys.TRAINABLE_VARIABLES,
        scope='conv0|dense_1|trans_1|dense_2|trans_2|dense_3|trans_3|dense_4|linear_batch|linear')
    optimizer_1 = tf.train.AdamOptimizer(learning_rate=lr)
    # Restrict the update to these variables with `variables_to_train`;
    # `update_ops` is for ops such as batch-norm moving-average updates,
    # not for variable lists.
    train_op_1 = slim.learning.create_train_op(loss, optimizer_1, variables_to_train=first_train_vars)

    second_train_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='conv_0|conv_1|conv_2')
    optimizer_2 = tf.train.AdamOptimizer(learning_rate=lr / 1000)
    train_op_2 = slim.learning.create_train_op(loss_cng, optimizer_2, variables_to_train=second_train_vars)
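If slim is not in use, a plain optimizer can apply the same restriction, since `minimize` accepts a `var_list`. A minimal equivalent sketch, assuming `loss`, `loss_cng`, `lr`, and the two variable lists from above:

    # minimize() computes and applies gradients only for var_list.
    train_op_1 = tf.train.AdamOptimizer(learning_rate=lr).minimize(
        loss, var_list=first_train_vars)
    train_op_2 = tf.train.AdamOptimizer(learning_rate=lr / 1000).minimize(
        loss_cng, var_list=second_train_vars)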

Method 2:

    all_vars = tf.trainable_variables()
    group_vars = ...     # a name prefix, or a tuple of prefixes (str.startswith accepts either)
    second_vars = [...]  # list of scope-name prefixes for the second group

    first_train_vars = [var for var in all_vars if var.name.startswith(group_vars)]
    optimizer_1 = tf.train.AdamOptimizer(learning_rate=5e-6)
    train_op_1 = slim.learning.create_train_op(m_loss, optimizer_1, variables_to_train=first_train_vars)

    # Collect every variable whose name starts with one of the second-group prefixes.
    second_train_vars = [var for var in all_vars
                         if any(var.name.startswith(prefix) for prefix in second_vars)]
    optimizer_2 = tf.train.AdamOptimizer(learning_rate=lr)
    train_op_2 = slim.learning.create_train_op(loss, optimizer_2, variables_to_train=second_train_vars)
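Neither method shows the alternation itself. A minimal sketch of one possible schedule, assuming a TF 1.x `Session`, a per-batch `feed_dict`, and a hypothetical `num_steps`; the 1:1 ratio here is only an illustrative choice:

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for step in range(num_steps):
            # Even steps update the first group, odd steps the second;
            # each train op touches only its own variables_to_train subset.
            if step % 2 == 0:
                sess.run(train_op_1, feed_dict=feed_dict)
            else:
                sess.run(train_op_2, feed_dict=feed_dict)

Printing `[v.name for v in first_train_vars]` (and likewise for the second group) before training is a quick way to confirm each group contains exactly the intended layers.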

