import torch
device = torch.device('cuda:0')
x = torch.tensor([1, 2, 3])
# to() returns a copy of the tensor on cuda:0; reassigning x keeps that GPU copy
x = x.to(device)
Equivalently, the same move can be written with cuda():

import torch
x = torch.tensor([1, 2, 3])
# cuda(0) also returns a copy on cuda:0; the reassignment keeps it
x = x.cuda(0)
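In practice the target device is often chosen at runtime so that the same code also runs on a CPU-only machine. A minimal sketch of that pattern (the torch.cuda.is_available() check and the final print are added here just for illustration):

import torch
# fall back to the CPU when no GPU is available
device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')
x = torch.tensor([1, 2, 3])
x = x.to(device)
print(x.device)  # cuda:0 on a GPU machine, cpu otherwise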
The code below does not change where the original data lives:
import torch
device = torch.device('cuda:0')
x = torch.tensor([1, 2, 3])
# the GPU copy returned by to() is discarded, so x itself stays on the CPU
x.to(device)
The same holds for the cuda() form:

import torch
x = torch.tensor([1, 2, 3])
# the return value of cuda(0) is never assigned, so it is simply thrown away
x.cuda(0)
These calls only produce a temporary copy on cuda:0: to() and cuda() are not in-place operations, they return a new tensor, and because that return value is not assigned to anything, x itself remains on the CPU throughout.
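This is easy to verify by printing the device attribute of each tensor. A minimal sketch, assuming a machine with a CUDA GPU:

import torch
device = torch.device('cuda:0')
x = torch.tensor([1, 2, 3])
x.to(device)      # return value discarded
print(x.device)   # cpu  -- x was never moved
y = x.to(device)  # return value kept under a new name
print(x.device)   # cpu  -- the original tensor is unchanged
print(y.device)   # cuda:0 -- y is the copy on the GPU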