Values are allocated in contiguous chunks of memory, managed by torch.Storage instances.
A PyTorch Tensor is a view over such a Storage that is capable of indexing into that storage by using an offset and per-dimension strides. (It indexes into memory using an offset plus a stride for each dimension.)
The underlying memory is allocated only once, however, so creating alternative tensor views on the data can be done quickly, regardless of the size of the data managed by the Storage instance. (Once a tensor's data is allocated, its location in memory does not change, no matter how many views are created over it.)
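A minimal sketch of this idea: transposing a tensor produces a new view without copying any data, which we can check by comparing the two tensors' underlying memory addresses (the tensor values here are just an illustration).

```python
import torch

point = torch.tensor([[1, 2], [5, 4]])
point_t = point.t()  # transpose: a new view, not a copy

# Both tensors index into the same memory buffer,
# so they report the same base data pointer.
print(point.data_ptr() == point_t.data_ptr())  # True
```

Because no memory is copied, creating such a view is cheap even for very large tensors.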
You can access the storage for a given tensor by calling its .storage() method:
>>> import torch
>>> point = torch.tensor([[1, 2], [5, 4]])
>>> point
tensor([[1, 2],
        [5, 4]])
>>> point.storage()
 1
 2
 5
 4
[torch.LongStorage of size 4]
1. You can't index the storage of a 2D tensor by using two indices.
2. The layout of a storage is always one-dimensional, irrespective of the dimensionality of any tensors that may refer to it. (In memory, there is only one dimension.)
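A short sketch of how the flat storage relates to 2D indexing: element (i, j) of the tensor lives at storage position storage_offset + i * stride(0) + j * stride(1). The tensor values below are just an illustration.

```python
import torch

point = torch.tensor([[1, 2], [5, 4]])
storage = point.storage()

# The storage accepts only a single flat index;
# storage[1][0] or storage[1, 0] would fail.
print(storage[2])  # 5

# Recover the flat position of tensor element (1, 1) by hand.
i, j = 1, 1
flat = point.storage_offset() + i * point.stride(0) + j * point.stride(1)
print(storage[flat])  # 4, same as point[1, 1]
```

This arithmetic is exactly what the tensor performs internally when you write point[i, j].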
At this point, it shouldn’t come as a surprise that changing the value of a storage changes the content of its referring tensor:
>>> point_storage = point.storage()
>>> point_storage[1] = 1000
>>> point_storage
 1
 1000
 5
 4
[torch.LongStorage of size 4]
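To see the change propagate to the tensor itself, we can print the tensor after writing through its storage; a self-contained sketch (the values mirror the running example):

```python
import torch

point = torch.tensor([[1, 2], [5, 4]])
point_storage = point.storage()
point_storage[1] = 1000  # write through the storage...

# ...and the tensor view reflects the new value immediately,
# because it reads from the same memory.
print(point)
# tensor([[   1, 1000],
#         [   5,    4]])
```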