PyTorch: Tensor and Variable

Input code

import torch
from torch.autograd import Variable

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x.sum()
print(x)
print(y)

Resulting warning

UserWarning: torch.autograd.variable(...) is deprecated, use torch.tensor(...) instead

  warnings.warn("torch.autograd.variable(...) is deprecated, use torch.tensor(...) instead")

In earlier versions of PyTorch, a Tensor had to be wrapped in a Variable in order to compute gradients by backpropagation. Since PyTorch 0.4, however, Variable and Tensor have been merged: a Tensor no longer needs to be wrapped in a Variable for its gradients to be computed, because Tensors now carry the properties that Variable used to provide.
The flag that controls whether autograd applies, requires_grad, is now an attribute of Tensor itself. As long as any input Tensor of an operation has requires_grad=True, autograd automatically tracks the history and can backpropagate through it.
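For reference, the snippet from the beginning of this post can be rewritten without Variable, using only the requires_grad attribute. This is a minimal sketch assuming PyTorch 0.4 or later:

import torch

# no Variable wrapper needed: requires_grad is a Tensor attribute
x = torch.ones(2, 2, requires_grad=True)
y = x.sum()
print(x)        # 2x2 tensor of ones, requires_grad=True
print(y)        # tensor(4., grad_fn=<SumBackward0>)

y.backward()    # backpropagate through the sum
print(x.grad)   # gradient of y w.r.t. x: a 2x2 tensor of ones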

Official example code

# by default, Tensors are created with requires_grad=False
x = torch.ones(1)   # create a tensor with requires_grad=False (default)
x.requires_grad
# out: False

# create another Tensor, also with requires_grad=False
y = torch.ones(1)  # another tensor with requires_grad=False
# both inputs have requires_grad=False, so does the output
z = x + y
# since neither input x nor y requires grad, the result of the
# operation z = x + y cannot be differentiated either: z.requires_grad=False
z.requires_grad
# out: False

# then autograd won't track this computation. let's verify!
# calling backward() therefore raises an error
z.backward()
# out: RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

# now create a tensor with requires_grad=True
w = torch.ones(1, requires_grad=True)
w.requires_grad
# out: True

# add to the previous result that has requires_grad=False
# because the input Tensor w has requires_grad=True, the operation
# can be backpropagated and differentiated automatically
total = w + z
# the total sum now requires grad!
total.requires_grad
# out: True
# autograd can compute the gradients as well
total.backward()
w.grad
# out: tensor([ 1.])

# and no computation is wasted to compute gradients for x, y and z, which don't require grad
# since x, y and z have requires_grad=False, their gradients are not computed
z.grad == x.grad == y.grad == None
# out: True
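Consistent with the merge described above, the Variable class is still importable for backward compatibility, but wrapping a Tensor in it simply gives back an ordinary Tensor. A minimal sketch assuming PyTorch 0.4 or later (newer versions may print the deprecation warning shown at the top of this post):

import torch
from torch.autograd import Variable

t = torch.ones(2, 2)
v = Variable(t, requires_grad=True)   # legacy API; may emit the deprecation warning

# after the merge, the wrapped object is simply a Tensor
print(isinstance(v, torch.Tensor))    # True
print(v.requires_grad)                # True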
