References:
"What are the pitfalls/bugs in PyTorch?"
"The relative order of batch norm, relu, dropout, etc., and some questions and thoughts on BN and dropout"
## Wrong!!
import torch.nn as nn
import torch.nn.functional as F

class DropoutFC(nn.Module):
    def __init__(self):
        super(DropoutFC, self).__init__()
        self.fc = nn.Linear(100, 20)

    def forward(self, input):
        out = self.fc(input)
        # This line is the problem!!! F.dropout defaults to training=True,
        # so dropout stays active even after Net.eval() — the functional
        # call knows nothing about the module's train/eval state.
        out = F.dropout(out, p=0.5)
        return out

Net = DropoutFC()
Net.train()
# train the Net
## Correct, version 1!!
import torch.nn as nn
import torch.nn.functional as F

class DropoutFC(nn.Module):
    def __init__(self):
        super(DropoutFC, self).__init__()
        self.fc = nn.Linear(100, 20)

    def forward(self, input):
        out = self.fc(input)
        # Correct: pass the module's training flag, so Net.eval()
        # actually disables dropout at inference time.
        out = F.dropout(out, p=0.5, training=self.training)
        return out

Net = DropoutFC()
Net.train()
# train the Net
## Correct, version 2!!
import torch.nn as nn

class DropoutFC(nn.Module):
    def __init__(self):
        super(DropoutFC, self).__init__()
        self.fc = nn.Linear(100, 20)
        # nn.Dropout is a registered submodule, so train()/eval()
        # on the parent module toggles it automatically.
        self.dropout = nn.Dropout(p=0.5)

    def forward(self, input):
        out = self.fc(input)
        out = self.dropout(out)
        return out

Net = DropoutFC()
Net.train()
# train the Net
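A quick way to see the difference (a minimal sketch, not from the original answer, using an arbitrary 100-to-20 linear layer like the one above): bare F.dropout keeps dropping and rescaling activations even in eval mode, while nn.Dropout becomes the identity once eval() is called.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.ones(1, 100)
fc = nn.Linear(100, 20)

# Buggy pattern: F.dropout defaults to training=True, so it keeps
# zeroing (and rescaling) activations regardless of any eval() call.
out_buggy = F.dropout(fc(x), p=0.5)

# Fixed pattern: nn.Dropout is a submodule; after .eval() it passes
# its input through unchanged.
drop = nn.Dropout(p=0.5)
drop.eval()
out_fixed = drop(fc(x))

print(torch.equal(out_fixed, fc(x)))  # dropout disabled in eval mode
print(torch.equal(out_buggy, fc(x)))  # bare F.dropout still altered the output
```

The same reasoning explains why version 1 works: `training=self.training` ties the functional call to the flag that `Net.train()`/`Net.eval()` flips.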
Author: 雷杰
Link: https://www.zhihu.com/question/67209417/answer/302434279
Source: Zhihu
Copyright belongs to the author. For commercial reproduction, please contact the author for authorization; for non-commercial reproduction, please credit the source.