Conclusion first: it turns out these loss functions are not tied to any particular number of dimensions; as long as the shapes are compatible, they can be computed. Haha!
Test code:
import torch
import torch.nn as nn

# Minimal stand-in definitions (assumed; the original post does not show
# Net or Config): a two-layer MLP whose Linear layers act on the last dim.
class Net(nn.Module):
    def __init__(self, in_feat, hidden, out_feat):
        super().__init__()
        self.fc1 = nn.Linear(in_feat, hidden)
        self.fc2 = nn.Linear(hidden, out_feat)

    def forward(self, x):
        # squeeze the trailing feature dim so (3, 5, 1) -> (3, 5)
        return self.fc2(torch.relu(self.fc1(x))).squeeze(-1)

class Config:
    num_output_feat = 1  # assumed value; original not shown

model = Net(10, 128, Config.num_output_feat)
inputs = torch.randn(3, 5, 10)  # Linear layers act on the last dim (10)
res = model(inputs)
# shape 3 x 5
print(res)
label = torch.randn(3, 5)
# MSELoss works as long as the shapes match -- question settled, haha
loss_fn = nn.MSELoss()
print(loss_fn(res, label))
a = torch.randn(1)  # torch.Tensor(1) would be uninitialized memory; use randn
print("aaaa===", a)
print("bbbb===", loss_fn(a, a))  # loss of a tensor with itself is 0
print("cccc====", torch.tensor([1., 2.]))
print("dddd=====", loss_fn(torch.tensor([1., 2.]), torch.tensor([1., 2.])))
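A minimal sketch of my own (not from the original post) to drive the point home: `nn.MSELoss` accepts any pair of same-shaped tensors, whatever the number of dimensions, and reduces them to a single scalar mean.

```python
import torch
import torch.nn as nn

loss_fn = nn.MSELoss()

# 1-D, 2-D, and 3-D tensors all work, as long as input and target
# have the same shape.
for shape in [(1,), (3, 5), (2, 4, 6)]:
    pred = torch.randn(*shape)
    target = torch.randn(*shape)
    loss = loss_fn(pred, target)
    # The result is always a 0-dim scalar: the mean over all elements.
    assert loss.dim() == 0
    # Equivalent manual computation: mean of squared differences.
    manual = ((pred - target) ** 2).mean()
    assert torch.allclose(loss, manual)
```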
### Verifying the classification loss
loss_fn = nn.CrossEntropyLoss()
# logits, shape 3 x 4
inputs = torch.tensor([[1., 2., 3., 4.],
                       [4., 3., 2., 1.],
                       [0., 0., 1., 2.]])
# One-hot float targets: accepted as class probabilities (PyTorch >= 1.10)
labels = torch.tensor([[0., 1., 0., 0.],
                       [1., 0., 0., 0.],
                       [1., 0., 0., 0.]])
res1 = loss_fn(inputs, labels)
res = torch.argmax(inputs, dim=1)     # predicted class per row
label1 = torch.argmax(labels, dim=1)  # true class per row
print("res1=", res1, "res=", res, "labels=", labels, "label1=", label1,
      (res == label1).float().mean())  # accuracy over the 3 samples
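The one-hot experiment above can be cross-checked against the classic class-index form of the targets. A small sketch (assumed setup, PyTorch >= 1.10 for probability targets) showing both formats give the same loss:

```python
import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()

logits = torch.tensor([[1., 2., 3., 4.],
                       [4., 3., 2., 1.],
                       [0., 0., 1., 2.]])

# Class-index targets (LongTensor): the classic form.
idx_targets = torch.tensor([1, 0, 0])
loss_idx = loss_fn(logits, idx_targets)

# Probability targets (FloatTensor): one-hot rows encode the same
# labels, so the loss comes out identical.
onehot_targets = torch.tensor([[0., 1., 0., 0.],
                               [1., 0., 0., 0.],
                               [1., 0., 0., 0.]])
loss_onehot = loss_fn(logits, onehot_targets)

assert torch.allclose(loss_idx, loss_onehot)
```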
So for regression problems, the linear layer might as well be replaced with a convolution!!
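As a sanity check on this closing claim, a `Conv1d` with `kernel_size=1` computes exactly the same map as a `Linear` layer over the feature dimension; the sizes below are illustrative, not from the original post.

```python
import torch
import torch.nn as nn

# A Linear over the feature dim of a (batch, length, feat) tensor is
# equivalent to a Conv1d with kernel_size=1 over (batch, feat, length).
in_feat, out_feat = 10, 4
linear = nn.Linear(in_feat, out_feat)
conv = nn.Conv1d(in_feat, out_feat, kernel_size=1)

# Copy the linear weights into the conv so both compute the same map.
with torch.no_grad():
    conv.weight.copy_(linear.weight.unsqueeze(-1))  # (out, in, 1)
    conv.bias.copy_(linear.bias)

x = torch.randn(3, 5, in_feat)                     # (batch, length, feat)
y_linear = linear(x)                               # (3, 5, out_feat)
y_conv = conv(x.transpose(1, 2)).transpose(1, 2)   # feat -> channel dim and back

assert torch.allclose(y_linear, y_conv, atol=1e-6)
```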