A quick midday note on how a Torch Tensor and the NumPy array derived from it share the same underlying memory:
from __future__ import print_function
import torch

x = torch.rand((5, 3))
y = x.numpy()   # y is a view of x's storage, not a copy
print("x:", x)
print("y:", y)
print("-" * 20)
x.add_(1)       # in-place add on the tensor
print("x:", x)
print("y:", y)  # y reflects the change, because both share the same memory
Output:
x: tensor([[0.7742, 0.7696, 0.3913],
[0.8454, 0.9364, 0.0193],
[0.9225, 0.6411, 0.8034],
[0.8698, 0.0743, 0.0633],
[0.0689, 0.6693, 0.1203]])
y: [[0.7741955 0.76957345 0.39132845]
[0.84536415 0.93640083 0.01926541]
[0.922459 0.64109427 0.8033993 ]
[0.8697707 0.07434338 0.06329459]
[0.06892037 0.66925764 0.12028617]]
--------------------
x: tensor([[1.7742, 1.7696, 1.3913],
[1.8454, 1.9364, 1.0193],
[1.9225, 1.6411, 1.8034],
[1.8698, 1.0743, 1.0633],
[1.0689, 1.6693, 1.1203]])
y: [[1.7741954 1.7695735 1.3913285]
[1.8453641 1.9364009 1.0192654]
[1.922459 1.6410942 1.8033993]
[1.8697708 1.0743434 1.0632946]
[1.0689204 1.6692576 1.1202862]]
In a way, this mechanism feels a lot like a shallow copy in Python; for details, have a look around my earlier blog…
Blog link: https://blog.csdn.net/qq_41475067/article/details/113854217
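Pushing the shallow-copy analogy one step further: if you want an independent array rather than a view, calling `.copy()` on the NumPy side (or `.clone()` on the tensor side first) breaks the link. A small sketch of the difference:

```python
import torch

x = torch.zeros(3)
y = x.numpy().copy()  # .copy() gives an independent NumPy array
x.add_(1)             # in-place change to the tensor
print(x)              # tensor([1., 1., 1.])
print(y)              # still [0. 0. 0.]; the copy is unaffected
```

This mirrors the `copy.copy()` vs. `copy.deepcopy()` distinction for nested Python objects: a plain `x.numpy()` behaves like the shared case, while `.copy()` gives you your own data.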