BN (BatchNorm) and LN (LayerNorm) are the two most commonly used normalization methods. Both transform the input features to zero mean and unit variance (followed by a learnable scale and shift when affine parameters are enabled, which is the default).
1. BatchNorm2d
Parameters:
num_features: for an input of shape (N, C, H, W), this should be C
eps: a value added to the denominator for numerical stability; default 1e-5
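A minimal sketch of the behavior described above: in training mode, BatchNorm2d normalizes each of the C channels using statistics computed over the (N, H, W) dimensions, so each channel of the output has roughly zero mean and unit variance (the shift/scale shown here are arbitrary test values, not anything from the original text).

```python
import torch
import torch.nn as nn

# Input of shape (N, C, H, W) with an arbitrary non-zero mean and scale
N, C, H, W = 4, 3, 8, 8
x = torch.randn(N, C, H, W) * 5 + 2

bn = nn.BatchNorm2d(num_features=C, eps=1e-5)
bn.train()            # use batch statistics, not running averages
y = bn(x)

# Each channel's output statistics over (N, H, W) are ~0 mean, ~1 std
print(y.mean(dim=(0, 2, 3)))   # close to [0, 0, 0]
print(y.std(dim=(0, 2, 3)))    # close to [1, 1, 1]
```

Note that the default learnable weight and bias are initialized to 1 and 0, so freshly constructed layers leave the normalized values unchanged.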
2. LayerNorm
Parameters:
normalized_shape (note: not num_features as in BatchNorm2d): the trailing shape to normalize over; there are two common ways to specify it
eps: a value added to the denominator for numerical stability; default 1e-5
(1) NLP style
# NLP Example
>>> batch, sentence_length, embedding_dim = 20, 5, 10
>>> embedding = torch.randn(batch, sentence_length, embedding_dim)
>>> layer_norm = nn.LayerNorm(embedding_dim)
>>> # Activate module
>>> layer_norm(embedding)
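To make the NLP usage concrete, here is a small runnable check: with `nn.LayerNorm(embedding_dim)`, each token's embedding vector (the last dimension) is normalized independently, regardless of the batch or sentence position.

```python
import torch
import torch.nn as nn

batch, sentence_length, embedding_dim = 20, 5, 10
embedding = torch.randn(batch, sentence_length, embedding_dim)

layer_norm = nn.LayerNorm(embedding_dim)
out = layer_norm(embedding)

# Every (batch, position) slice has ~0 mean over the embedding dimension
print(out.mean(dim=-1).abs().max())   # close to 0
```

Unlike BatchNorm, no statistics are shared across examples, which is why LayerNorm behaves identically in training and evaluation mode.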
(2) Image style
# Image Example
>>> N, C, H, W = 20, 5, 10, 10
>>> input = torch.randn(N, C, H, W)
>>> # Normalize over the last three dimensions
>>> # (i.e. the channel and spatial dimensions)
>>> layer_norm = nn.LayerNorm([C, H, W])
>>> output = layer_norm(input)
To use the first (NLP-style) call on image data, the input must first be permuted to (N, H, W, C), because LayerNorm normalizes over the trailing dimensions and the embedding (channel) dimension has to come last.
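A sketch of that permutation, assuming the goal is per-pixel normalization over the C channels only (the NLP-style behavior, which differs from `nn.LayerNorm([C, H, W])` normalizing over all three trailing dimensions):

```python
import torch
import torch.nn as nn

N, C, H, W = 20, 5, 10, 10
x = torch.randn(N, C, H, W)

# Move channels last so LayerNorm(C) normalizes over them
ln = nn.LayerNorm(C)
y = ln(x.permute(0, 2, 3, 1))   # (N, H, W, C)
y = y.permute(0, 3, 1, 2)       # back to (N, C, H, W)

# Each pixel's C channel values now have ~0 mean
print(y.mean(dim=1).abs().max())   # close to 0
```

This channels-last trick is the common way to get transformer-style LayerNorm in convolutional architectures.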