UserWarning: An output with one or more elements was resized since it had shape [3211264], which does not match the required output shape [1, 64, 224, 224]. This behavior is deprecated, and in a future PyTorch release outputs will not be resized unless they have zero elements. You can explicitly reuse an out tensor t by resizing it, inplace, to zero elements with t.resize_(0). (Triggered internally at ../aten/src/ATen/native/Resize.cpp:26.)
return torch.stack(batch, 1, out=out)
This warning is raised because the `out` tensor handed to `torch.stack` is built on a shared storage whose size cannot be changed in place here. The workaround is to clone that storage into an identical copy and shrink the clone to zero elements with `.resize_(0)`: an output with zero elements is still allowed to be resized, so the warning goes away.
# Old allocation: the tensor created from the shared storage already has
# numel elements, so torch.stack must resize it and emits the warning.
# storage = batch[0].storage()._new_shared(numel)
# out = batch[0].new(storage)

# Workaround: untyped storage is sized in bytes, hence the element_size()
# factor; clone it and shrink the clone to zero elements so the resulting
# out tensor starts empty and may be resized silently.
storage = batch[0].untyped_storage()._new_shared(numel * batch[0].element_size())
storage1 = storage.clone().resize_(0)
out = batch[0].new(storage1)
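A minimal sketch of the workaround end to end, outside of `default_collate`. The batch shapes here are small illustrative stand-ins (not the original 1×64×224×224 case), and building the shared storage via `untyped_storage()._new_shared` mirrors what the collate code does; the key point is that the zero-element `out` tensor is resized by `torch.stack` without triggering the deprecation warning:

```python
import warnings

import torch

# A toy batch: stacking three (1, 4, 4) tensors along dim 1 gives (1, 3, 4, 4),
# mirroring the dim-1 stack from the traceback (shapes are assumptions).
batch = [torch.randn(1, 4, 4) for _ in range(3)]
numel = sum(t.numel() for t in batch)

# Allocate shared storage the way the collate code does
# (untyped storage is sized in bytes, hence element_size()).
storage = batch[0].untyped_storage()._new_shared(numel * batch[0].element_size())

# The workaround: clone the storage and shrink the clone to zero elements,
# so `out` starts empty and torch.stack may resize it freely.
out = batch[0].new(storage.clone().resize_(0))

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    stacked = torch.stack(batch, 1, out=out)

# No "output ... was resized" deprecation warning should have fired.
assert not any("resized" in str(w.message) for w in caught)
print(stacked.shape)  # torch.Size([1, 3, 4, 4])
```

Note that `storage.clone()` copies the data out of shared memory, so the resulting `out` tensor no longer lives in the shared-memory region; this silences the warning at the cost of the zero-copy behavior the shared storage was meant to provide.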