The exact error is:
  File "/usr/local/lib64/python3.6/site-packages/torch/nn/modules/module.py", line 541, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib64/python3.6/site-packages/torch/nn/modules/container.py", line 92, in forward
    input = module(input)
  File "/usr/local/lib64/python3.6/site-packages/torch/nn/modules/module.py", line 541, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib64/python3.6/site-packages/torch/nn/modules/module.py", line 97, in forward
    raise NotImplementedError
NotImplementedError
The offending code looks roughly like this:
        self.enc = nn.Sequential(*list(model.children())[:-2])
        print(self.enc)
        # Replace the network's head with my own for transfer learning:
        nc = list(model.children())[-1].in_features
        self.head = nn.Sequential(AdaptiveConcatPool2d(),
                                  Flatten(),
                                  nn.Linear(2*nc, 512),
                                  Mish(),
                                  nn.BatchNorm1d(512),
                                  nn.Dropout(0.5),
                                  nn.Linear(512, n))

    def forward(self, x):
        # ...
        x = self.enc(x)
        # ...
        x = self.head(x)
        return x
The cause of the error is that nn.Sequential cannot necessarily repackage the components pulled out of model; one of them ends up without a usable forward, so the call falls through and fails. You can see why in the nn.Sequential source:
class Sequential(Module):
    r"""A sequential container.

    Modules will be added to it in the order they are passed in the constructor.
    Alternatively, an ordered dict of modules can also be passed in.

    To make it easier to understand, here is a small example::

        # Example of using Sequential
        model = nn.Sequential(
                  nn.Conv2d(1,20,5),
                  nn.ReLU(),
                  nn.Conv2d(20,64,5),
                  nn.ReLU()
                )

        # Example of using Sequential with OrderedDict
        model = nn.Sequential(OrderedDict([
                  ('conv1', nn.Conv2d(1,20,5)),
                  ('relu1', nn.ReLU()),
                  ('conv2', nn.Conv2d(20,64,5)),
                  ('relu2', nn.ReLU())
                ]))
    """

    @overload
    def __init__(self, *args: Module) -> None:
        ...

    @overload
    def __init__(self, arg: 'OrderedDict[str, Module]') -> None:
        ...

    def __init__(self, *args: Any):
        super(Sequential, self).__init__()
        if len(args) == 1 and isinstance(args[0], OrderedDict):
            for key, module in args[0].items():
                self.add_module(key, module)
        else:
            for idx, module in enumerate(args):
                self.add_module(str(idx), module)
In __init__, every argument is registered with add_module, and Sequential.forward then simply calls each registered child in order. Each child must therefore implement its own forward. If one of the modules pulled out of model is a container whose forward was never defined (because it relied on the parent model's forward to wire its pieces together), calling it falls through to the base nn.Module.forward, which just raises NotImplementedError (module.py line 97 in the traceback above).
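This dispatch chain can be reproduced without PyTorch at all. Below is a deliberately simplified sketch of Module/Sequential (stand-in classes, not the real PyTorch source) showing how a child without its own forward triggers exactly this error:

```python
# Minimal sketch of the Module/Sequential dispatch that produces the error.
# These are simplified stand-ins, not the real PyTorch classes.
class Module:
    def __call__(self, *args, **kwargs):
        # __call__ always delegates to forward ...
        return self.forward(*args, **kwargs)

    def forward(self, *args, **kwargs):
        # ... and the base class deliberately has no implementation.
        raise NotImplementedError

class Sequential(Module):
    def __init__(self, *modules):
        self._modules = list(modules)

    def forward(self, x):
        for m in self._modules:
            x = m(x)  # each child must implement forward itself
        return x

class ChildWithoutForward(Module):
    pass  # forward never overridden, like a container ripped out of a model

try:
    Sequential(ChildWithoutForward())(42)
except NotImplementedError:
    print("NotImplementedError, same as the traceback above")
```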
The fix is to stop repackaging the components with nn.Sequential and instead call the model directly inside forward.
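Here is a minimal sketch of that fix. `Backbone` is a hypothetical stand-in for the pretrained model (the original isn't shown); instead of unpacking model.children() into nn.Sequential, we keep the model itself as the encoder and swap its classifier for nn.Identity, so its own (implemented) forward runs:

```python
import torch
import torch.nn as nn

# Hypothetical backbone standing in for the pretrained model in the post;
# like torchvision's ResNet, its classifier exposes .in_features.
class Backbone(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(8, 10)

    def forward(self, x):
        x = self.pool(self.conv(x))
        return self.fc(torch.flatten(x, 1))

class Net(nn.Module):
    def __init__(self, model, n):
        super().__init__()
        nc = model.fc.in_features
        model.fc = nn.Identity()  # drop the old classifier instead of
                                  # unpacking model.children() into Sequential
        self.enc = model          # keep the module itself
        self.head = nn.Linear(nc, n)

    def forward(self, x):
        x = self.enc(x)           # dispatches to Backbone's own forward
        return self.head(x)

net = Net(Backbone(), n=5)
out = net(torch.randn(2, 3, 16, 16))
print(out.shape)  # torch.Size([2, 5])
```

Keeping the backbone module intact also means any custom wiring inside its forward (skip connections, reshapes) is preserved, which nn.Sequential would have silently discarded.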