ulatu = torch.stack(ulats, dim=1)
ulatuu = torch.mean(ulatu, dim=1)
ilati = torch.stack(ilats, dim=1)
ilatii = torch.mean(ilati, dim=1)
ulats = torch.stack(ulats, dim=1)
An error seems to occur when stacking the tensors like this, and I am not sure why. If the code is rewritten to accumulate with a loop instead, the problem does not appear. For example:
# Initialize the accumulators with zeros; torch.empty returns uninitialized
# memory, so summing into it would start from garbage values.
all_userEmbeddings = torch.zeros(uEmbed0.shape[0], uEmbed0.shape[1]).to('cuda:0')
for i in ulats:
    all_userEmbeddings += i

all_itemEmbeddings = torch.zeros(iEmbed0.shape[0], iEmbed0.shape[1]).to('cuda:0')
for j in ilats:
    all_itemEmbeddings += j

# Note: the loop computes a sum, while the original stack + mean computed an
# average; divide by len(ulats) / len(ilats) if the mean is what is needed.
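A minimal sketch of the equivalence, assuming `ulats` is a list of same-shaped tensors (the names and shapes here are hypothetical, stand-ins for the per-layer user embeddings above): stacking along a new dimension and taking the mean over it gives the same result as loop accumulation into a zero tensor followed by division by the list length.

```python
import torch

# Hypothetical stand-in for the list of per-layer user embeddings.
ulats = [torch.randn(4, 8) for _ in range(3)]

# stack + mean, as in the original snippet: shape (4, 3, 8) -> (4, 8)
stacked_mean = torch.mean(torch.stack(ulats, dim=1), dim=1)

# Loop accumulation: start from zeros, not torch.empty (uninitialized memory).
acc = torch.zeros_like(ulats[0])
for t in ulats:
    acc += t

# Dividing the loop sum by the number of tensors recovers the mean.
print(torch.allclose(stacked_mean, acc / len(ulats)))
```

If only the sum is needed (as in the loop version above), the division step can simply be dropped.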