tf.contrib.layers.l2_regularizer(regularization_rate)(weight)
Applies L2 regularization with rate regularization_rate to weight (L1 works the same way via l1_regularizer). Note that there is no comma (,) between the two pairs of parentheses: the first call, l2_regularizer(regularization_rate), constructs a regularizer function, and the second call applies that function to weight. The result is a scalar tensor holding the regularization loss term, which is added to the total loss.
regularization = tf.contrib.layers.l2_regularizer(REGULARIZATION_RATE)(weight1)
loss = cross_entropy + regularization
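To make the two-call pattern concrete without a TensorFlow dependency, here is a minimal NumPy sketch of what the regularizer computes, assuming TF's l2_loss convention (sum of squared weights divided by 2, scaled by the rate); the variable names mirror the snippet above, and the cross_entropy value is a made-up placeholder:

```python
import numpy as np

def l2_regularizer(scale):
    # Mirrors tf.contrib.layers.l2_regularizer: the outer call builds a
    # regularizer function; applying it to a weight array returns the
    # scalar loss term scale * sum(w**2) / 2 (the tf.nn.l2_loss convention).
    def regularize(weights):
        return scale * np.sum(np.square(weights)) / 2.0
    return regularize

REGULARIZATION_RATE = 0.001
weight1 = np.array([[1.0, -2.0], [3.0, 4.0]])

# Two adjacent call sites, no comma between them: build, then apply.
regularization = l2_regularizer(REGULARIZATION_RATE)(weight1)
# sum of squares = 1 + 4 + 9 + 16 = 30, so 0.001 * 30 / 2 = 0.015

cross_entropy = 0.7  # placeholder value for illustration only
loss = cross_entropy + regularization
```

This also shows why the regularization term can simply be added to the data loss: both are plain scalars once evaluated.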