1. Error message
Traceback (most recent call last):
  File "D:/Python_practice/CIRFAR-10/daily_test4.py", line 33, in <module>
    perception = Perception(3,4,1)
  File "D:/Python_practice/CIRFAR-10/daily_test4.py", line 27, in __init__
    self.layer1 = Linear(in_features,hidden_features)
  File "D:/Python_practice/CIRFAR-10/daily_test4.py", line 10, in __init__
    nn.Module().__init__(self)
TypeError: __init__() takes 1 positional argument but 2 were given
2. My code
import torch as t
from torch import nn
from torch.autograd import Variable as V

'''A single fully connected layer'''
class Linear(nn.Module):  # inherit from nn.Module
    # In __init__() you must define the learnable parameters yourself
    # and wrap them as nn.Parameter
    def __init__(self, in_features, out_features):
        # super(Linear, self).__init__()  # vs. nn.Module().__init__(self)
        nn.Module().__init__(self)  # <-- this line raises the TypeError
        self.w = nn.Parameter(t.randn(in_features, out_features))
        self.b = nn.Parameter(t.randn(out_features))

    def forward(self, x):  # the method must be spelled "forward"
        x = x.mm(self.w)
        return x + self.b.expand_as(x)

'''Perceptron'''
class Perception(nn.Module):
    def __init__(self, in_features, hidden_features, out_features):
        nn.Module.__init__(self)  # class-level (unbound) call: this form works
        # super(Perception, self).__init__()
        self.layer1 = Linear(in_features, hidden_features)
        self.layer2 = Linear(hidden_features, out_features)

    def forward(self, x):
        x = self.layer1(x)
        x = t.sigmoid(x)
        return self.layer2(x)

perception = Perception(3, 4, 1)
for name, param in perception.named_parameters():
    print(name, param.size())
3. Cause of the error
`nn.Module()` first constructs a brand-new Module instance, so `nn.Module().__init__(self)` calls `__init__` as a *bound* method of that instance: Python has already filled in the new instance as the first argument, and the `self` we pass explicitly then arrives as a second positional argument. That is exactly the error: `__init__() takes 1 positional argument but 2 were given`. By contrast, `super(Linear, self).__init__()` and the class-level call `nn.Module.__init__(self)` both invoke the method *unbound*, so our `self` fills the single expected slot.
This is just my own understanding; if anything is wrong, corrections are welcome. Thanks, everyone!
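The bound-versus-unbound distinction can be shown without PyTorch at all. Below, `Base` is a made-up stand-in for `nn.Module`; the pattern of calls mirrors the ones in the code above:

```python
# Minimal demo of bound vs. unbound __init__ calls.
# `Base` is a hypothetical stand-in for nn.Module.
class Base:
    def __init__(self):
        self.initialized = True

obj = object.__new__(Base)  # allocate an instance without running __init__

Base.__init__(obj)          # unbound call via the class: we supply self ourselves
print(obj.initialized)      # True

try:
    Base().__init__(obj)    # bound call on a NEW instance: obj becomes an extra arg
except TypeError as e:
    print(e)                # ... takes 1 positional argument but 2 were given
```

The `try` branch reproduces the exact `TypeError` from the traceback, while the class-level call succeeds.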
4. Fix
Inside the `__init__()` method of the `Linear` class, replace the call to the nn.Module constructor with the form `super(Linear, self).__init__()`.
In general, prefer `super(ClassName, self).__init__()` (or simply `super().__init__()` in Python 3) to invoke the `nn.Module` constructor; it avoids this class of error.
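With that one-line change applied to the code in section 2, construction succeeds and all four parameters are registered. A sketch of the corrected version (`Variable` dropped, since it is not needed here):

```python
import torch as t
from torch import nn

class Linear(nn.Module):
    def __init__(self, in_features, out_features):
        super(Linear, self).__init__()  # correct: unbound call through the MRO
        self.w = nn.Parameter(t.randn(in_features, out_features))
        self.b = nn.Parameter(t.randn(out_features))

    def forward(self, x):
        x = x.mm(self.w)
        return x + self.b.expand_as(x)

class Perception(nn.Module):
    def __init__(self, in_features, hidden_features, out_features):
        super(Perception, self).__init__()
        self.layer1 = Linear(in_features, hidden_features)
        self.layer2 = Linear(hidden_features, out_features)

    def forward(self, x):
        x = self.layer1(x)
        x = t.sigmoid(x)
        return self.layer2(x)

perception = Perception(3, 4, 1)
for name, param in perception.named_parameters():
    print(name, param.size())
```

`named_parameters()` now lists `layer1.w`, `layer1.b`, `layer2.w`, and `layer2.b`, confirming that the submodules and their `nn.Parameter`s were registered properly.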