kernel_regularizer computes the corresponding penalty (l1, l2, etc.) on the layer's weights.
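To make this concrete, here is a minimal sketch of what a regularizer object computes when called on a weight tensor directly (the tensor `w` is just an illustrative example):

```python
import tensorflow as tf
from tensorflow import keras

reg = keras.regularizers.l2(1e-2)
w = tf.constant([[1.0, 2.0], [3.0, 4.0]])

# l2 penalty = 1e-2 * sum(w**2) = 0.01 * (1 + 4 + 9 + 16) = 0.3
penalty = reg(w)
print(float(penalty))  # 0.3
```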
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

class Outlayer(layers.Layer):
    def __init__(self):
        super(Outlayer, self).__init__()
        self.dense = layers.Dense(32, kernel_regularizer=keras.regularizers.l2(1e-2))
        self.dense1 = layers.Dense(10, kernel_regularizer=keras.regularizers.l1(1e-2))

    def call(self, inputs):
        h1 = self.dense(inputs)
        return self.dense1(h1)

my_layer = Outlayer()
y = my_layer(tf.ones([2, 2]))
print(y)
print(my_layer.losses)
# reproduce the two regularization terms by hand
print(tf.square(tf.norm(my_layer.dense.kernel)) * 1e-2)  # l2 penalty on dense
print(tf.norm(my_layer.dense1.kernel, ord=1) * 1e-2)     # l1 penalty on dense1
y:<tf.Tensor: id=233, shape=(2, 10), dtype=float32, numpy=
array([[ 0.36865774, 0.5883919 , 0.2711479 , 0.3792193 , -0.3248419 ,
0.5872762 , -0.02513709, -0.1538085 , 0.02563459, 1.1253573 ],
[ 0.36865774, 0.5883919 , 0.2711479 , 0.3792193 , -0.3248419 ,
0.5872762 , -0.02513709, -0.1538085 , 0.02563459, 1.1253573 ]],
dtype=float32)>
my_layer.losses:
[<tf.Tensor: id=241, shape=(), dtype=float32, numpy=0.030209353>,
<tf.Tensor: id=249, shape=(), dtype=float32, numpy=0.6088533>]
tf.square(tf.norm(my_layer.dense.kernel)) * 1e-2:
<tf.Tensor: id=309, shape=(), dtype=float32, numpy=0.030209355>
tf.norm(my_layer.dense1.kernel, ord=1) * 1e-2:
<tf.Tensor: id=334, shape=(), dtype=float32, numpy=0.6088533>
my_layer.losses is returned as a list, so to add regularization to the loss function, compute each layer's regularization terms and add their sum to the loss.
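A minimal training-step sketch of this idea, reusing the layer above (the labels `y_true` and the cross-entropy loss are illustrative assumptions, not part of the original example):

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

class Outlayer(layers.Layer):
    def __init__(self):
        super(Outlayer, self).__init__()
        self.dense = layers.Dense(32, kernel_regularizer=keras.regularizers.l2(1e-2))
        self.dense1 = layers.Dense(10, kernel_regularizer=keras.regularizers.l1(1e-2))

    def call(self, inputs):
        return self.dense1(self.dense(inputs))

model = Outlayer()
x = tf.ones([2, 2])
y_true = tf.one_hot([3, 7], depth=10)  # hypothetical labels

with tf.GradientTape() as tape:
    logits = model(x)
    data_loss = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=y_true, logits=logits))
    # sum the per-layer regularization terms and add them to the data loss
    total_loss = data_loss + tf.add_n(model.losses)

grads = tape.gradient(total_loss, model.trainable_variables)
```

Note that `model.losses` is repopulated on each forward call, so it must be read inside the tape after `model(x)`.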
If anything here is wrong, corrections from readers are welcome.