Goal: constrain a layer's weight values to the range 0–1.
Reference answer: https://discuss.pytorch.org/t/set-constraints-on-parameters-or-layers/23620/11
There is no way to declare a constrained parameter directly in PyTorch. Instead, in __init__ you declare an unconstrained parameter, e.g.:
self.my_param = nn.Parameter(torch.zeros(1))
Then in forward(), you apply a transformation that maps it into the desired range:
my_param_limited = torch.sigmoid(self.my_param)
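Putting the two snippets together, a minimal runnable sketch might look like this (the class name `BoundedLayer` and the use of the parameter as a multiplicative scale are illustrative assumptions, not from the thread):

```python
import torch
import torch.nn as nn

class BoundedLayer(nn.Module):
    def __init__(self):
        super().__init__()
        # Raw, unconstrained parameter; the optimizer updates this freely.
        self.my_param = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        # sigmoid maps (-inf, inf) into (0, 1), so the effective
        # weight always stays within the desired range.
        my_param_limited = torch.sigmoid(self.my_param)
        return x * my_param_limited

layer = BoundedLayer()
out = layer(torch.ones(3))
```

Because gradients flow through the sigmoid, the optimizer still trains `my_param` normally; only the value actually used in the computation is bounded.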