PyTorch Broadcasting Semantics

Many PyTorch operations support NumPy Broadcasting Semantics.

In short, if a PyTorch operation supports broadcast, then its Tensor arguments can be automatically expanded to be of equal sizes (without making copies of the data).
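
A quick way to observe the "without making copies" claim is torch.broadcast_tensors, which returns broadcast views of its inputs: the expanded result shares storage with the original and has stride 0 along each broadcast dimension. A minimal sketch (the shapes here are arbitrary):

>>> import torch
>>> x = torch.ones(4, 1)
>>> y = torch.ones(3)
>>> xb, yb = torch.broadcast_tensors(x, y)
>>> xb.size()
torch.Size([4, 3])
>>> xb.data_ptr() == x.data_ptr()  # a view of x; no data was copied
True
>>> xb.stride()  # stride 0 along the broadcast (second) dimension
(1, 0)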

General semantics

Two tensors are “broadcastable” if the following rules hold:

  • Each tensor has at least one dimension.
  • When iterating over the dimension sizes, starting at the trailing dimension, the dimension sizes must be equal, one of them must be 1, or one of them must not exist.

For example:

>>> x=torch.empty(5,7,3)
>>> y=torch.empty(5,7,3)
# same shapes are always broadcastable (i.e. the above rules always hold)

>>> x=torch.empty((0,))
>>> y=torch.empty(2,2)
# x and y are not broadcastable, because x does not have at least 1 dimension

# can line up trailing dimensions
>>> x=torch.empty(5,3,4,1)
>>> y=torch.empty(  3,1,1)
# x and y are broadcastable.
# 1st trailing dimension: both have size 1
# 2nd trailing dimension: y has size 1
# 3rd trailing dimension: x size == y size
# 4th trailing dimension: y dimension doesn't exist

# but:
>>> x=torch.empty(5,2,4,1)
>>> y=torch.empty(  3,1,1)
# x and y are not broadcastable, because in the 3rd trailing dimension 2 != 3

If two tensors x, y are “broadcastable”, the resulting tensor size is calculated as follows:

  • If the number of dimensions of x and y are not equal, prepend 1 to the dimensions of the tensor with fewer dimensions to make them equal length.
  • Then, for each dimension size, the resulting dimension size is the max of the sizes of x and y along that dimension.

For example:

# can line up trailing dimensions to make reading easier
>>> x=torch.empty(5,1,4,1)
>>> y=torch.empty(  3,1,1)
>>> (x+y).size()
torch.Size([5, 3, 4, 1])

# but not necessary:
>>> x=torch.empty(1)
>>> y=torch.empty(3,1,7)
>>> (x+y).size()
torch.Size([3, 1, 7])

>>> x=torch.empty(5,2,4,1)
>>> y=torch.empty(3,1,1)
>>> (x+y).size()
RuntimeError: The size of tensor a (2) must match the size of tensor b (3) at non-singleton dimension 1
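
The two rules above are easy to spell out in plain Python. The sketch below is for illustration only (broadcast_shape is a made-up helper, not a torch API): it prepends 1s to the shorter shape, verifies the sizes are compatible, and takes the per-dimension max. torch.broadcast_shapes (available in PyTorch 1.8+) performs the same computation:

>>> def broadcast_shape(a, b):  # hypothetical helper, for illustration
...     ndim = max(len(a), len(b))
...     a = (1,) * (ndim - len(a)) + tuple(a)  # rule 1: prepend 1s
...     b = (1,) * (ndim - len(b)) + tuple(b)
...     if any(m != n and m != 1 and n != 1 for m, n in zip(a, b)):
...         raise RuntimeError(f"shapes {a} and {b} are not broadcastable")
...     return tuple(max(m, n) for m, n in zip(a, b))  # rule 2: per-dimension max
...
>>> broadcast_shape((5, 1, 4, 1), (3, 1, 1))
(5, 3, 4, 1)
>>> torch.broadcast_shapes((5, 1, 4, 1), (3, 1, 1))  # built-in equivalent
torch.Size([5, 3, 4, 1])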

In-place semantics

One complication is that in-place operations do not allow the in-place tensor to change shape as a result of the broadcast.

For example:

>>> x=torch.empty(5,3,4,1)
>>> y=torch.empty(3,1,1)
>>> (x.add_(y)).size()
torch.Size([5, 3, 4, 1])

# but:
>>> x=torch.empty(1,3,1)
>>> y=torch.empty(3,1,7)
>>> (x.add_(y)).size()
RuntimeError: The expanded size of the tensor (1) must match the existing size (7) at non-singleton dimension 2.
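
Since the in-place tensor must already have the final broadcast shape, one way to check in advance whether an in-place operation will succeed is to compare that shape against the broadcast result. A sketch, again assuming torch.broadcast_shapes is available (PyTorch 1.8+):

>>> x = torch.empty(5, 3, 4, 1)
>>> y = torch.empty(3, 1, 1)
>>> torch.broadcast_shapes(x.size(), y.size()) == x.size()  # x.add_(y) is allowed
True
>>> torch.broadcast_shapes(x.size(), y.size()) == y.size()  # y.add_(x) would need y to grow
False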

Backwards compatibility

Prior versions of PyTorch allowed certain pointwise functions to execute on tensors with different shapes, as long as the number of elements in each tensor was equal. The pointwise operation would then be carried out by viewing each tensor as 1-dimensional. PyTorch now supports broadcasting and the “1-dimensional” pointwise behavior is considered deprecated and will generate a Python warning in cases where tensors are not broadcastable, but have the same number of elements.

Note that the introduction of broadcasting can cause backwards incompatible changes in the case where two tensors do not have the same shape, but are broadcastable and have the same number of elements. For Example:

>>> torch.add(torch.ones(4,1), torch.randn(4))

would previously produce a Tensor with size: torch.Size([4,1]), but now produces a Tensor with size: torch.Size([4,4]). In order to help identify cases in your code where backwards incompatibilities introduced by broadcasting may exist, you may set torch.utils.backcompat.broadcast_warning.enabled to True, which will generate a Python warning in such cases.

For example:

>>> torch.utils.backcompat.broadcast_warning.enabled=True
>>> torch.add(torch.ones(4,1), torch.ones(4))
__main__:1: UserWarning: self and other do not have the same shape, but are broadcastable, and have the same number of elements.
Changing behavior in a backwards incompatible manner to broadcasting rather than viewing as 1-dimensional.