tensorflow2(2)

第二天学习tensorflow2

import numpy as np
import tensorflow as tf

类型转换

a = tf.constant(np.pi,dtype=tf.float16)
a
<tf.Tensor: id=0, shape=(), dtype=float16, numpy=3.14>
tf.cast(a,tf.double)
<tf.Tensor: id=1, shape=(), dtype=float64, numpy=3.140625>

bool类型和putongleixi普通类型也可以互相转换

a = tf.constant([True,False])
tf.cast(a,tf.int32)
<tf.Tensor: id=3, shape=(2,), dtype=int32, numpy=array([1, 0])>

从上面可以看出转化为1和0,对于bool来说,主要不为0均为True

待优化张量

为了区分需要计算梯度信息的张量与不需要计算梯度信息的张量,TensorFlow 增加了
一种专门的数据类型来支持梯度信息的记录:tf.Variable。tf.Variable 类型在普通的张量类
型基础上添加了 name,trainable 等属性来支持计算图的构建。由于梯度运算会消耗大量的
计算资源,而且会自动更新相关参数,对于不需要的优化的张量,如神经网络的输入𝒀,
不需要通过 tf.Variable 封装;相反,对于需要计算梯度并优化的张量

a = tf.constant([-1,0,1,2])   #创建张量来计算梯度
aa = tf.Variable(a)
aa
<tf.Variable 'Variable:0' shape=(4,) dtype=int32, numpy=array([-1,  0,  1,  2])>
aa.name,aa.trainable
('Variable:0', True)

其中张量的 name 和 trainable 属性是 Variable 特有的属性,name 属性用于命名计算图中的
变量,这套命名体系是 TensorFlow 内部维护的,一般不需要用户关注 name 属性;trainable
属性表征当前张量是否需要被优化,创建 Variable 对象时是默认启用优化标志,可以设置
trainable=False 来设置张量不需要优化

直接创建Variable

a = tf.Variable([[1,2],[3,4]])
a
<tf.Variable 'Variable:0' shape=(2, 2) dtype=int32, numpy=
array([[1, 2],
       [3, 4]])>

待优化张量可视为普通张量的特殊类型,普通张量其实也可以通过 GradientTape.watch()方
法临时加入跟踪梯度信息的列表,从而支持自动求导功能

创建张量

Numpy Array 数组和 Python List 列表是 Python 程序中间非常重要的数据载体容器,很
多数据都是通过 Python 语言将数据加载至 Array 或者 List 容器,再转换到 Tensor 类型,通
过 TensorFlow 运算处理后导出到 Array 或者 List 容器,方便其他模块调用。

通过 tf.convert_to_tensor 函数可以创建新 Tensor,并将保存在 Python List 对象或者
Numpy Array 对象中的数据导入到新 Tensor 中,例如:

tf.convert_to_tensor([1,2.])  #从列表创建张量
<tf.Tensor: id=21, shape=(2,), dtype=float32, numpy=array([1., 2.], dtype=float32)>
tf.convert_to_tensor(np.array([[1,2],[3,4]])) #从数组中创建 张量
<tf.Tensor: id=22, shape=(2, 2), dtype=int32, numpy=
array([[1, 2],
       [3, 4]])>

实际上,tf.constant()和 tf.convert_to_tensor()都能够自动的把 Numpy 数组或者 Python
列表数据类型转化为 Tensor 类型,这两个 API 命名来自 TensorFlow 1.x 的命名习惯,在
TensorFlow 2 中函数的名字并不是很贴切,使用其一即可

创建全0 或者全1张量

𝒛 = 𝑿𝒚+ 𝒃,将权值矩阵𝑿初始化为全 1 矩阵,偏置 b 初始化为全 0 向量,此时线性变化
层输出𝒛 = 𝒚,因此是一种比较好的层初始化状态。通过 tf.zeros()和 tf.ones()即可创建任意
形状,且内容全 0 或全 1 的张量。例如,创建为 0 和为 1 的标量:

tf.zeros([]),tf.ones([]) # 创建标量
(<tf.Tensor: id=23, shape=(), dtype=float32, numpy=0.0>,
 <tf.Tensor: id=24, shape=(), dtype=float32, numpy=1.0>)

创建全0或者全1的向量

tf.zeros([1]),tf.ones([1])
(<tf.Tensor: id=27, shape=(1,), dtype=float32, numpy=array([0.], dtype=float32)>,
 <tf.Tensor: id=30, shape=(1,), dtype=float32, numpy=array([1.], dtype=float32)>)
tf.zeros([2,2])
<tf.Tensor: id=33, shape=(2, 2), dtype=float32, numpy=
array([[0., 0.],
       [0., 0.]], dtype=float32)>

通过 tf.zeros_like, tf.ones_like 可以方便地新建与某个张量 shape 一致,且内容为全 0 或1

a = tf.ones([3,2])
a
<tf.Tensor: id=36, shape=(3, 2), dtype=float32, numpy=
array([[1., 1.],
       [1., 1.],
       [1., 1.]], dtype=float32)>
b = tf.ones_like(a)
b
<tf.Tensor: id=39, shape=(3, 2), dtype=float32, numpy=
array([[1., 1.],
       [1., 1.],
       [1., 1.]], dtype=float32)>

小贴士: tf.*_like 是一系列的便捷函数,可以通过 tf.zeros(a.shape)等方式实现。

创建自定义数值张量

除了初始化为全 0,或全 1 的张量之外,有时也需要全部初始化为某个自定义数值的
张量,比如将张量的数值全部初始化为−1等。

通过 tf.fill(shape, value)可以创建全为自定义数值 value 的张量,形状由 shape 参数指定。例如,创建元素为−1的标量:

c = tf.fill([],-1)
c
<tf.Tensor: id=42, shape=(), dtype=int32, numpy=-1>
tf.fill([2,2],99)#创建数值均为99的向量
<tf.Tensor: id=45, shape=(2, 2), dtype=int32, numpy=
array([[99, 99],
       [99, 99]])>

创建已知分布的张量

正态分布(Normal Distribution,或 Gaussian Distribution)和均匀分布(Uniform
Distribution)是最常见的分布之一,创建采样自这 2 种分布的张量非常有用,比如在卷积神
经网络中,卷积核张量𝑿初始化为正态分布有利于网络的训练;在对抗生成网络中,隐藏
变量𝒜一般采样自均匀分布。

通过 tf.random.normal(shape, mean=0.0, stddev=1.0)可以创建形状为 shape,均值为mean,标准差为 stddev 的正态分布𝒩(mean,stddev 2 )。例如,创建均值为 0,标准差为 1的正态分布:

tf.random.normal([2,2])  #创建标准正态分布
<tf.Tensor: id=51, shape=(2, 2), dtype=float32, numpy=
array([[ 0.11410699, -0.19630972],
       [ 1.3169473 ,  0.52485913]], dtype=float32)>
tf.random.normal([2,2],mean=1,stddev=2)  #创建一个均值为1 标准差为2的正态分布 张量
<tf.Tensor: id=57, shape=(2, 2), dtype=float32, numpy=
array([[ 1.7531555, -1.417537 ],
       [-0.890833 , -0.3266158]], dtype=float32)>

通过 tf.random.uniform(shape, minval=0, maxval=None, dtype=tf.float32)可以创建采样自
[minval,maxval)区间的均匀分布的张量。例如创建采样自区间[0,1),shape 为[2,2]的矩

    tf.random.uniform([2,2]) #创建0,1的均匀分布
<tf.Tensor: id=64, shape=(2, 2), dtype=float32, numpy=
array([[0.7129042 , 0.01164734],
       [0.28311265, 0.5645001 ]], dtype=float32)>
tf.random.uniform([2,2],maxval=10)   #创建【0,10)的均匀分布
<tf.Tensor: id=71, shape=(2, 2), dtype=float32, numpy=
array([[8.9780855, 5.2519846],
       [1.245681 , 1.2312984]], dtype=float32)>
tf.random.uniform([2,2],minval=5,maxval=10)
<tf.Tensor: id=78, shape=(2, 2), dtype=float32, numpy=
array([[8.810197 , 8.602655 ],
       [8.298012 , 6.9268036]], dtype=float32)>

如果需要均匀采样整形类型的数据,必须指定采样区间的最大值 maxval 参数,同时指
定数据类型为 tf.int*型:

tf.random.uniform([2,2],maxval=10,dtype=tf.int32)
<tf.Tensor: id=82, shape=(2, 2), dtype=int32, numpy=
array([[3, 1],
       [4, 7]])>

创建序列

在循环计算或者对张量进行索引时,经常需要创建一段连续的整型序列,可以通过
tf.range()函数实现。tf.range(limit, delta=1)可以创建[0,limit)之间,步长为 delta 的整型序

tf.range(10)
<tf.Tensor: id=86, shape=(10,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])>
tf.range(10,delta=2)  #创建0-10补偿为2的序列
<tf.Tensor: id=90, shape=(5,), dtype=int32, numpy=array([0, 2, 4, 6, 8])>
tf.range(1,10,delta=2)
<tf.Tensor: id=94, shape=(5,), dtype=int32, numpy=array([1, 3, 5, 7, 9])>

张量的典型应用

在介绍完张量的相关属性和创建方式后,下面将介绍每种维度数下张量的典型应用,
让读者在看到每种张量时,能够直观地联想到它主要的物理意义和用途,对后续张量的维
度变换等一系列抽象操作的学习打下基础。

标量

在 TensorFlow 中,标量最容易理解,它就是一个简单的数字,维度数为 0,shape 为
[]。标量的一些典型用途是误差值的表示、各种测量指标的表示,比如准确度(Accuracy,
简称 acc),精度(Precision)和召回率(Recall)等。

以均方差误差函数为例,经过 tf.keras.losses.mse(或 tf.keras.losses.MSE,两者相同功能)返回每个样本上的误差值,最后取误差的均值作为当前 Batch 的误差,它是一个标量:
out = tf.random.uniform([4,10])  #随机化网路输出
y = tf.constant([2,3,2,0])  #随机构造真实样本标签
y
<tf.Tensor: id=102, shape=(4,), dtype=int32, numpy=array([2, 3, 2, 0])>
y = tf.one_hot(y,depth=10) #one_hot编码
y
<tf.Tensor: id=106, shape=(4, 10), dtype=float32, numpy=
array([[0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],
       [0., 0., 0., 1., 0., 0., 0., 0., 0., 0.],
       [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],
       [1., 0., 0., 0., 0., 0., 0., 0., 0., 0.]], dtype=float32)>
loss = tf.keras.losses.mse(y,out)#计算每个输出值
loss
<tf.Tensor: id=109, shape=(4,), dtype=float32, numpy=array([0.19174023, 0.22819547, 0.46747884, 0.1701625 ], dtype=float32)>
out
<tf.Tensor: id=101, shape=(4, 10), dtype=float32, numpy=
array([[0.08711052, 0.55813754, 0.42524672, 0.05114841, 0.5157341 ,
        0.928483  , 0.02206361, 0.31170142, 0.19607961, 0.03442168],
       [0.35133588, 0.2679038 , 0.1766653 , 0.67773247, 0.4002055 ,
        0.38266802, 0.4300121 , 0.69154036, 0.24248421, 0.9608028 ],
       [0.93188715, 0.29051602, 0.6420455 , 0.95685744, 0.9914843 ,
        0.38391495, 0.7828622 , 0.30895233, 0.00389087, 0.9162401 ],
       [0.527434  , 0.7365136 , 0.11024797, 0.61184895, 0.3517245 ,
        0.5346036 , 0.15987802, 0.15788758, 0.2023567 , 0.21998096]],
      dtype=float32)>
loss = tf.reduce_mean(loss)
loss
<tf.Tensor: id=111, shape=(), dtype=float32, numpy=0.26439428>

向量

向量是一种非常常见的数据载体,如在全连接层和卷积神经网络层中,偏置张量𝒃就
使用向量来表示。如图 4.2 所示,每个全连接层的输出节点都添加了一个偏置值,把所有
输出节点的偏置表示成向量形式:𝒃 = [𝑐 1 ,𝑐 2 ] T 。

#z= Wx 就是输入向量
z = tf.random.normal([4,2])
z
<tf.Tensor: id=117, shape=(4, 2), dtype=float32, numpy=
array([[-0.04479945, -0.9851824 ],
       [ 0.68836296, -0.57058215],
       [-1.6266129 ,  0.03281204],
       [ 0.7666135 ,  0.9352049 ]], dtype=float32)>
b = tf.ones([2]) #偏执向量
b
<tf.Tensor: id=120, shape=(2,), dtype=float32, numpy=array([1., 1.], dtype=float32)>
z = z+b
z
<tf.Tensor: id=121, shape=(4, 2), dtype=float32, numpy=
array([[ 0.95520055,  0.0148176 ],
       [ 1.688363  ,  0.42941785],
       [-0.6266129 ,  1.032812  ],
       [ 1.7666135 ,  1.935205  ]], dtype=float32)>
通过高层接口类 Dense()方式创建的网络层,张量𝑿和𝒃存储在类的内部,由类自动创建并管理。可以通过全连接层的 bias 成员变量查看偏置变量𝒃,例如创建输入节点数为 4,输出节点数为 3 的线性层网络,那么它的偏置向量 b 的长度应为 3,
fc = tf.keras.layers.Dense(3) # 创建一层 Wx+b,输出节点为 3
fc
<tensorflow.python.keras.layers.core.Dense at 0x7835588eb8>
fc.build(input_shape=(2,4))  #通过build创建W,b的张量输入节点为4
fc
<tensorflow.python.keras.layers.core.Dense at 0x7835588eb8>
fc.bias
<tf.Variable 'bias:0' shape=(3,) dtype=float32, numpy=array([0., 0., 0.], dtype=float32)>

可以看到,类的偏置成员 bias 为长度为 3 的向量,初始化为全 0,这也是偏置𝒃的默认初始

矩阵

x = tf.random.normal([2,4])

输入两个向量4个特征值

w = tf.ones([4,3])
b = tf.zeros([3])
o = x@w+b
o
<tf.Tensor: id=158, shape=(2, 3), dtype=float32, numpy=
array([[ 2.9301972 ,  2.9301972 ,  2.9301972 ],
       [-0.54779565, -0.54779565, -0.54779565]], dtype=float32)>

其中𝒀和𝑿张量均是矩阵,上述代码实现了一个线性变换的网络层,激活函数为空。一般
地,𝜎(𝒀@𝑿+ 𝒃)网络层称为全连接层,在 TensorFlow 中可以通过 Dense 类直接实现,特
别地,当激活函数𝜎为空时,全连接层也称为线性层。我们通过 Dense 类创建输入

fc = tf.keras.layers.Dense(3) #定义全连接层输出节点为3
fc.build(input_shape=(2,4))  #定义全连接层的输入节点为4
fc.kernel  #查看权值矩阵
<tf.Variable 'kernel:0' shape=(4, 3) dtype=float32, numpy=
array([[ 0.8466822 , -0.31858903, -0.88609904],
       [ 0.4471711 , -0.8811533 , -0.7778356 ],
       [-0.03076637, -0.8946478 ,  0.45518613],
       [ 0.08795929, -0.3508975 ,  0.5263113 ]], dtype=float32)>

三维张量

三维的张量一个典型应用是表示序列信号,它的格式是
𝒀 = [𝑐,sequence len,feature len]
其中𝑐表示序列信号的数量,sequence len 表示序列信号在时间维度上的采样点数或步数,

考虑自然语言处理(Natural Language Processing,简称 NLP)中句子的表示,如评价句
子的是否为正面情绪的情感分类任务网络,如图 4.3 所示。为了能够方便字符串被神经网
络处理,一般将单词通过嵌入层(Embedding Layer)编码为固定长度的向量,比如“a”编码
为某个长度 3 的向量,那么 2 个等长(单词数量为 5)的句子序列可以表示为 shape 为[2,5,3]
的 3 维张量,其中 2 表示句子个数,5 表示单词数量,3 表示单词向量的长度。我们通过
IMDB 数据集来演示如何表示句子,

(x_train,y_train),(x_test,y_test)=tf.keras.datasets.imdb.load_data(num_words=10000)  #加载IMDB数据集
x_train
array([list([1, 14, 22, 16, 43, 530, 973, 1622, 1385, 65, 458, 4468, 66, 3941, 4, 173, 36, 256, 5, 25, 100, 43, 838, 112, 50, 670, 2, 9, 35, 480, 284, 5, 150, 4, 172, 112, 167, 2, 336, 385, 39, 4, 172, 4536, 1111, 17, 546, 38, 13, 447, 4, 192, 50, 16, 6, 147, 2025, 19, 14, 22, 4, 1920, 4613, 469, 4, 22, 71, 87, 12, 16, 43, 530, 38, 76, 15, 13, 1247, 4, 22, 17, 515, 17, 12, 16, 626, 18, 2, 5, 62, 386, 12, 8, 316, 8, 106, 5, 4, 2223, 5244, 16, 480, 66, 3785, 33, 4, 130, 12, 16, 38, 619, 5, 25, 124, 51, 36, 135, 48, 25, 1415, 33, 6, 22, 12, 215, 28, 77, 52, 5, 14, 407, 16, 82, 2, 8, 4, 107, 117, 5952, 15, 256, 4, 2, 7, 3766, 5, 723, 36, 71, 43, 530, 476, 26, 400, 317, 46, 7, 4, 2, 1029, 13, 104, 88, 4, 381, 15, 297, 98, 32, 2071, 56, 26, 141, 6, 194, 7486, 18, 4, 226, 22, 21, 134, 476, 26, 480, 5, 144, 30, 5535, 18, 51, 36, 28, 224, 92, 25, 104, 4, 226, 65, 16, 38, 1334, 88, 12, 16, 283, 5, 16, 4472, 113, 103, 32, 15, 16, 5345, 19, 178, 32]),
       list([1, 194, 1153, 194, 8255, 78, 228, 5, 6, 1463, 4369, 5012, 134, 26, 4, 715, 8, 118, 1634, 14, 394, 20, 13, 119, 954, 189, 102, 5, 207, 110, 3103, 21, 14, 69, 188, 8, 30, 23, 7, 4, 249, 126, 93, 4, 114, 9, 2300, 1523, 5, 647, 4, 116, 9, 35, 8163, 4, 229, 9, 340, 1322, 4, 118, 9, 4, 130, 4901, 19, 4, 1002, 5, 89, 29, 952, 46, 37, 4, 455, 9, 45, 43, 38, 1543, 1905, 398, 4, 1649, 26, 6853, 5, 163, 11, 3215, 2, 4, 1153, 9, 194, 775, 7, 8255, 2, 349, 2637, 148, 605, 2, 8003, 15, 123, 125, 68, 2, 6853, 15, 349, 165, 4362, 98, 5, 4, 228, 9, 43, 2, 1157, 15, 299, 120, 5, 120, 174, 11, 220, 175, 136, 50, 9, 4373, 228, 8255, 5, 2, 656, 245, 2350, 5, 4, 9837, 131, 152, 491, 18, 2, 32, 7464, 1212, 14, 9, 6, 371, 78, 22, 625, 64, 1382, 9, 8, 168, 145, 23, 4, 1690, 15, 16, 4, 1355, 5, 28, 6, 52, 154, 462, 33, 89, 78, 285, 16, 145, 95]),
       list([1, 14, 47, 8, 30, 31, 7, 4, 249, 108, 7, 4, 5974, 54, 61, 369, 13, 71, 149, 14, 22, 112, 4, 2401, 311, 12, 16, 3711, 33, 75, 43, 1829, 296, 4, 86, 320, 35, 534, 19, 263, 4821, 1301, 4, 1873, 33, 89, 78, 12, 66, 16, 4, 360, 7, 4, 58, 316, 334, 11, 4, 1716, 43, 645, 662, 8, 257, 85, 1200, 42, 1228, 2578, 83, 68, 3912, 15, 36, 165, 1539, 278, 36, 69, 2, 780, 8, 106, 14, 6905, 1338, 18, 6, 22, 12, 215, 28, 610, 40, 6, 87, 326, 23, 2300, 21, 23, 22, 12, 272, 40, 57, 31, 11, 4, 22, 47, 6, 2307, 51, 9, 170, 23, 595, 116, 595, 1352, 13, 191, 79, 638, 89, 2, 14, 9, 8, 106, 607, 624, 35, 534, 6, 227, 7, 129, 113]),
       ...,
       list([1, 11, 6, 230, 245, 6401, 9, 6, 1225, 446, 2, 45, 2174, 84, 8322, 4007, 21, 4, 912, 84, 2, 325, 725, 134, 2, 1715, 84, 5, 36, 28, 57, 1099, 21, 8, 140, 8, 703, 5, 2, 84, 56, 18, 1644, 14, 9, 31, 7, 4, 9406, 1209, 2295, 2, 1008, 18, 6, 20, 207, 110, 563, 12, 8, 2901, 2, 8, 97, 6, 20, 53, 4767, 74, 4, 460, 364, 1273, 29, 270, 11, 960, 108, 45, 40, 29, 2961, 395, 11, 6, 4065, 500, 7, 2, 89, 364, 70, 29, 140, 4, 64, 4780, 11, 4, 2678, 26, 178, 4, 529, 443, 2, 5, 27, 710, 117, 2, 8123, 165, 47, 84, 37, 131, 818, 14, 595, 10, 10, 61, 1242, 1209, 10, 10, 288, 2260, 1702, 34, 2901, 2, 4, 65, 496, 4, 231, 7, 790, 5, 6, 320, 234, 2766, 234, 1119, 1574, 7, 496, 4, 139, 929, 2901, 2, 7750, 5, 4241, 18, 4, 8497, 2, 250, 11, 1818, 7561, 4, 4217, 5408, 747, 1115, 372, 1890, 1006, 541, 9303, 7, 4, 59, 2, 4, 3586, 2]),
       list([1, 1446, 7079, 69, 72, 3305, 13, 610, 930, 8, 12, 582, 23, 5, 16, 484, 685, 54, 349, 11, 4120, 2959, 45, 58, 1466, 13, 197, 12, 16, 43, 23, 2, 5, 62, 30, 145, 402, 11, 4131, 51, 575, 32, 61, 369, 71, 66, 770, 12, 1054, 75, 100, 2198, 8, 4, 105, 37, 69, 147, 712, 75, 3543, 44, 257, 390, 5, 69, 263, 514, 105, 50, 286, 1814, 23, 4, 123, 13, 161, 40, 5, 421, 4, 116, 16, 897, 13, 2, 40, 319, 5872, 112, 6700, 11, 4803, 121, 25, 70, 3468, 4, 719, 3798, 13, 18, 31, 62, 40, 8, 7200, 4, 2, 7, 14, 123, 5, 942, 25, 8, 721, 12, 145, 5, 202, 12, 160, 580, 202, 12, 6, 52, 58, 2, 92, 401, 728, 12, 39, 14, 251, 8, 15, 251, 5, 2, 12, 38, 84, 80, 124, 12, 9, 23]),
       list([1, 17, 6, 194, 337, 7, 4, 204, 22, 45, 254, 8, 106, 14, 123, 4, 2, 270, 2, 5, 2, 2, 732, 2098, 101, 405, 39, 14, 1034, 4, 1310, 9, 115, 50, 305, 12, 47, 4, 168, 5, 235, 7, 38, 111, 699, 102, 7, 4, 4039, 9245, 9, 24, 6, 78, 1099, 17, 2345, 2, 21, 27, 9685, 6139, 5, 2, 1603, 92, 1183, 4, 1310, 7, 4, 204, 42, 97, 90, 35, 221, 109, 29, 127, 27, 118, 8, 97, 12, 157, 21, 6789, 2, 9, 6, 66, 78, 1099, 4, 631, 1191, 5, 2642, 272, 191, 1070, 6, 7585, 8, 2197, 2, 2, 544, 5, 383, 1271, 848, 1468, 2, 497, 2, 8, 1597, 8778, 2, 21, 60, 27, 239, 9, 43, 8368, 209, 405, 10, 10, 12, 764, 40, 4, 248, 20, 12, 16, 5, 174, 1791, 72, 7, 51, 6, 1739, 22, 4, 204, 131, 9])],
      dtype=object)
x_train = tf.keras.preprocessing.sequence.pad_sequences(x_train,maxlen=80)  #将句子阶段为80个单词
x_train.shape
(25000, 80)
x_train
array([[  15,  256,    4, ...,   19,  178,   32],
       [ 125,   68,    2, ...,   16,  145,   95],
       [ 645,  662,    8, ...,    7,  129,  113],
       ...,
       [ 529,  443,    2, ...,    4, 3586,    2],
       [ 286, 1814,   23, ...,   12,    9,   23],
       [  97,   90,   35, ...,  204,  131,    9]])

可以看到 x_train 张量的 shape 为[25000,80],其中 25000 表示句子个数,80 表示每个句子
共 80 个单词,每个单词使用数字编码方式表示。我们通过 layers.Embedding 层将数字编码
的单词转换为长度为 100 个词向量

Embeding = tf.keras.layers.Embedding(10000,100)  #转换为100维度
out = Embeding(x_train)
out.shape
TensorShape([25000, 80, 100])

可以看到,经过 Embedding 层编码后,句子张量的 shape 变为[25000,80,100],其中 100 表
示每个单词编码为长度是 100 的向量。

四维张量

这里只讨论三、四维张量,大于四维的张量一般应用的比较少,如在元学习(Meta
Learning)中会采用五维的张量表示方法,理解方法与三、四维张量类似,不再赘述。
四维张量在卷积神经网络中应用非常广泛,它用于保存特征图(Feature maps)数据,格
式一般定义为
[𝑐,ℎ, ,𝑑]

#创建32x32的图片的输入
x = tf.random.normal([4,32,32,3])
#创建卷积神经网络
layer = tf.keras.layers.Conv2D(16,kernel_size=3)
out = layer(x) #前向计算
out.shape  #输出大小
TensorShape([4, 30, 30, 16])

其中卷积核张量也是 4 维张量,可以通过 kernel 成员变量访问:

layer.kernel.shape
TensorShape([3, 3, 3, 16])

索引与切片

通过索引与切片操作可以提取张量的部分数据,它们的使用频率非常高。

索引

式。考虑输入𝒀为 4 张32 × 32大小的彩色图片(为了方便演示,大部分张量都使用随机
分布模拟产生,后文同),shape 为[4,32,32,3],首先创建张量

x = tf.random.normal([4,32,32,3])
x[0] #第一个索引号为0
<tf.Tensor: id=239, shape=(32, 32, 3), dtype=float32, numpy=
array([[[ 0.15887685,  1.0638021 ,  0.24262176],
        [ 0.5162131 ,  0.3146141 ,  1.30261   ],
        [ 0.32958803, -0.07544474, -0.798697  ],
        ...,
        [ 0.5430354 ,  0.06708364,  0.18718515],
        [ 0.15137158,  0.82485485, -0.07353581],
        [ 0.06174438, -0.28900453, -0.17766367]],

       [[-0.8288241 , -0.43492955,  0.12549026],
        [-0.53608227, -1.3839718 , -0.38280237],
        [-1.0408336 , -0.7138202 , -1.2163761 ],
        ...,
        [-0.42730084, -0.5418567 , -1.4307729 ],
        [-0.5901603 , -0.20020208, -1.1485943 ],
        [ 0.6754496 ,  1.1658471 ,  1.254681  ]],

       [[ 0.8199817 , -0.13179821, -1.0026433 ],
        [ 0.20038375,  2.0983407 , -1.6608505 ],
        [-1.2044364 , -0.18790387,  0.99040943],
        ...,
        [-0.3662675 , -2.1195405 , -0.4394893 ],
        [-0.80587035, -0.34881762, -0.7255154 ],
        [ 0.05374175, -0.36799756,  1.710843  ]],

       ...,

       [[ 2.349938  , -1.0341504 ,  0.69676656],
        [ 1.1879286 , -0.01511606,  1.159321  ],
        [ 2.0477667 ,  0.13849233,  1.3389939 ],
        ...,
        [-0.56973726, -0.705439  ,  1.019128  ],
        [ 0.6739323 , -1.3462726 , -0.54695696],
        [ 0.7926641 , -0.03149584, -0.36437288]],

       [[ 0.03053406,  0.18028776, -0.01941211],
        [-0.08558384,  0.63365006,  0.9841086 ],
        [ 1.0553024 ,  1.1736095 ,  0.31384602],
        ...,
        [ 1.5592244 , -0.42086074,  0.6394359 ],
        [ 1.1481735 ,  0.4548075 ,  0.94918376],
        [ 1.1662008 ,  0.7955788 ,  0.81457335]],

       [[-1.4271562 ,  0.28228137, -0.28983933],
        [-0.23206161,  0.53333867,  0.8283403 ],
        [-0.56755936,  0.53504246, -0.3638036 ],
        ...,
        [-0.09928504,  0.9733174 ,  0.830424  ],
        [-0.8019797 , -0.41261557, -1.7591631 ],
        [-0.02050705, -0.18762097,  2.5975325 ]]], dtype=float32)>
x[0][1] #取第一张图片的第二行
<tf.Tensor: id=247, shape=(32, 3), dtype=float32, numpy=
array([[-0.8288241 , -0.43492955,  0.12549026],
       [-0.53608227, -1.3839718 , -0.38280237],
       [-1.0408336 , -0.7138202 , -1.2163761 ],
       [ 0.23205307,  0.31211147, -0.84631824],
       [ 0.12927896,  0.04636699, -0.7936061 ],
       [ 1.5422859 ,  0.25741068, -0.07476163],
       [ 1.4029118 ,  0.31732893,  0.4828657 ],
       [ 0.89409274, -0.54846984, -0.7361864 ],
       [-0.14061902, -0.17393057, -0.5558882 ],
       [-0.27859995,  0.857022  ,  1.4367795 ],
       [-0.38748017, -0.04675261, -1.254084  ],
       [ 0.08614758, -0.87526405,  1.1958708 ],
       [ 1.4687527 , -0.00503634,  2.0082822 ],
       [ 0.9783337 , -0.35418537, -0.34260166],
       [-0.60871214, -0.36148658,  0.23423466],
       [ 0.8484768 ,  1.3149838 , -0.6583954 ],
       [ 1.8823738 ,  1.0567163 ,  1.5663629 ],
       [ 1.4113243 ,  0.49838263,  1.3852206 ],
       [ 0.74519825,  1.7728142 , -0.3431991 ],
       [ 2.4443567 , -1.0441684 ,  1.0484878 ],
       [ 0.2322934 ,  0.06976946, -0.28763255],
       [-0.2368255 ,  0.6252483 , -0.5599113 ],
       [ 1.4904399 ,  0.38021138,  1.4142625 ],
       [ 0.02857532,  0.36857373, -0.9020619 ],
       [ 0.4648273 , -0.5392223 ,  2.3752363 ],
       [-0.83271945,  1.5225884 , -0.40676245],
       [ 0.6770792 , -0.4404919 , -2.7374938 ],
       [ 0.34658864, -0.38342553,  1.2044828 ],
       [-0.32053944,  0.06078859, -0.8694738 ],
       [-0.42730084, -0.5418567 , -1.4307729 ],
       [-0.5901603 , -0.20020208, -1.1485943 ],
       [ 0.6754496 ,  1.1658471 ,  1.254681  ]], dtype=float32)>
x[0][1][2] #取第一张图片地二行第三列
<tf.Tensor: id=259, shape=(3,), dtype=float32, numpy=array([-1.0408336, -0.7138202, -1.2163761], dtype=float32)>

取第 3 张图片,第 2 行,第 1 列的像素,B 通道(第 2 个通道)颜色强度值,实现如
下:

x[0][1][2][2]
<tf.Tensor: id=275, shape=(), dtype=float32, numpy=-1.2163761>

当张量的维度数较高时,使用[𝑗][𝑘]…[𝑙]的方式书写不方便,可以采用[𝑗,𝑘,…,𝑙]的方式索引,它们是等价的

#取第 2 张图片,第 10 行,第 3 列的数据,实现如下
x[1,9,2]
<tf.Tensor: id=279, shape=(3,), dtype=float32, numpy=array([-0.20183265, -0.8208802 , -0.75587183], dtype=float32)>
x[0,1,2,2]
<tf.Tensor: id=283, shape=(), dtype=float32, numpy=-1.2163761>

切片

通过start🔚step切片方式可以方便地提取一段数据,其中 start 为开始读取位置的索
引,end 为结束读取位置的索引(不包含 end 位),step 为采样步长

#以 shape 为[4,32,32,3]的图片张量为例,我们解释如果切片获得不同位置的数据。例如读取第 2,3 张图片,实现如
x[1:3]
<tf.Tensor: id=287, shape=(2, 32, 32, 3), dtype=float32, numpy=
array([[[[-8.0690390e-01,  7.4619069e-03, -1.2501005e+00],
         [ 2.3920364e+00, -1.1271025e+00,  2.7366141e-02],
         [ 3.5499024e-01, -1.1693124e+00,  2.0713241e+00],
         ...,
         [ 1.1476884e+00,  1.8161348e+00,  2.3483117e+00],
         [-8.0073857e-01,  5.9193856e-01, -1.3970522e+00],
         [-1.6983844e+00,  1.3193607e-01,  8.1776273e-01]],

        [[-1.7185298e+00,  2.9431930e+00,  1.4190791e+00],
         [ 1.1683645e+00,  1.2225899e+00,  9.2498493e-01],
         [ 1.6828652e-01, -4.8972711e-01,  1.4887863e+00],
         ...,
         [-3.1197420e-01,  8.1653464e-01,  8.8568050e-01],
         [-4.8885244e-01, -1.9139925e-01,  5.6082678e-01],
         [ 7.1493495e-01, -2.7520308e-01,  1.4879217e+00]],

        [[ 8.5356426e-01,  5.7057571e-01, -8.7584412e-01],
         [ 1.4214903e+00,  2.9921460e-01, -2.2303348e+00],
         [-4.9077800e-01, -3.7876591e-01, -7.1144062e-01],
         ...,
         [ 1.2371563e-01,  1.5320536e+00, -9.4040227e-01],
         [-4.2562485e-01,  1.5302038e+00, -4.3112147e-01],
         [ 1.9879790e-02,  1.2160345e-01, -1.8441306e-01]],

        ...,

        [[-3.2652920e-01, -5.6869745e-01, -4.5405510e-01],
         [-1.3042223e+00, -2.4090929e+00, -7.2417903e-01],
         [-1.3885345e+00,  5.2970779e-01,  1.5601419e+00],
         ...,
         [ 5.6834185e-01,  1.3871250e+00,  1.9315037e+00],
         [-1.0512487e+00, -1.7243301e+00, -4.0546840e-01],
         [-6.6690624e-01, -8.8478714e-01, -1.5375769e+00]],

        [[ 1.8121783e+00,  4.5676398e-01,  8.1859183e-01],
         [-5.9176052e-01,  1.0563352e+00, -1.2553442e+00],
         [ 3.2731283e-01,  7.3690069e-01,  5.3006865e-02],
         ...,
         [-4.5931679e-01, -7.6018726e-03,  2.3428485e+00],
         [ 9.6573007e-01, -5.5759960e-01, -1.0437362e+00],
         [ 9.6895421e-01,  1.4537536e+00,  1.5870560e+00]],

        [[ 1.7425136e-01, -4.8647013e-01, -2.0756158e-01],
         [-1.5267868e+00,  1.2506720e+00,  1.6387273e+00],
         [ 5.6755042e-01,  1.9707994e-01, -2.1612637e-01],
         ...,
         [-1.0404299e+00, -2.3283431e-01,  3.6714733e-01],
         [-2.6692331e-01, -9.3290025e-01, -8.0070978e-01],
         [-1.0560746e+00, -5.6154871e-01, -4.5454651e-01]]],


       [[[ 5.4996252e-01, -7.4528590e-02, -1.7633947e+00],
         [ 1.4020704e+00, -3.1120751e-02,  1.9022448e-01],
         [-4.7699863e-01, -1.7304105e+00,  1.1736192e-01],
         ...,
         [ 1.8769185e-01,  8.5350543e-01,  4.7741164e-02],
         [-1.4541264e-01, -9.9898869e-01,  7.3640913e-02],
         [ 5.3035522e-01, -8.3856702e-01,  1.2727956e+00]],

        [[ 7.5868833e-01,  4.5974392e-01,  2.8253782e-01],
         [ 1.7432643e-02, -1.5800915e+00, -1.3651814e+00],
         [ 4.4899422e-01,  1.2600594e+00,  8.3453131e-01],
         ...,
         [-5.9219682e-01, -3.0797318e-01, -3.5230178e-01],
         [-3.5032326e-01, -9.3916798e-01, -1.0232027e+00],
         [ 6.6426206e-01, -7.5940812e-01, -1.5368392e-01]],

        [[-1.8304984e-01,  3.0128349e-02,  2.6022443e-01],
         [-3.2168972e-01, -3.7524790e-02,  1.7678227e+00],
         [-9.2479807e-01, -2.5103426e+00,  3.8181350e+00],
         ...,
         [-1.1613452e+00, -5.0601840e-01,  6.7459375e-02],
         [ 6.7571604e-01,  6.2570554e-01,  7.1416867e-01],
         [-7.0441771e-01, -2.3878381e-01,  1.7144620e-01]],

        ...,

        [[ 1.2203118e+00, -3.1659630e-01, -2.2068007e+00],
         [ 6.7880291e-01, -5.5849835e-02,  3.6364025e-01],
         [ 1.1501849e+00, -4.5225367e-01,  2.8305292e+00],
         ...,
         [ 9.3994401e-02, -1.7224660e+00,  9.1594011e-01],
         [-6.3822865e-02, -3.7926456e-01,  5.0507355e-02],
         [-8.5975856e-01, -1.3075988e+00, -1.2428442e+00]],

        [[-7.5654340e-01,  1.5932541e+00,  1.0407473e+00],
         [ 7.3774874e-01,  2.5088508e+00, -2.1571748e+00],
         [ 1.0017579e+00, -1.2218058e+00,  6.2611049e-01],
         ...,
         [-7.3354578e-01, -1.4394432e-01,  8.9386606e-01],
         [ 8.7287289e-01, -6.8871593e-01,  6.4629629e-02],
         [-9.2307025e-01, -6.5930080e-01,  2.3725896e-01]],

        [[-1.9912511e-01, -7.0831543e-01,  1.2712439e+00],
         [-2.8995559e-01, -4.2042613e-01, -7.0997375e-01],
         [ 3.0826831e-01,  6.8689781e-01, -6.2766171e-01],
         ...,
         [-6.4379460e-01,  1.3598800e-03, -1.5731211e-01],
         [ 1.8480361e-01,  9.9664742e-01, -2.1612684e-01],
         [-1.1284094e-01,  5.5496919e-01, -1.0352247e+00]]]],
      dtype=float32)>

选择性地省略,全部省略时即为::,表示从最开始读取到最末尾,步长为 1,即不跳过任何
元素。如 x[0,::]表示读取第 1 张图片的所有行,其中::表示在行维度上读取所有行,它等价
于 x[0]的写法

x[0,::]
<tf.Tensor: id=291, shape=(32, 32, 3), dtype=float32, numpy=
array([[[ 0.15887685,  1.0638021 ,  0.24262176],
        [ 0.5162131 ,  0.3146141 ,  1.30261   ],
        [ 0.32958803, -0.07544474, -0.798697  ],
        ...,
        [ 0.5430354 ,  0.06708364,  0.18718515],
        [ 0.15137158,  0.82485485, -0.07353581],
        [ 0.06174438, -0.28900453, -0.17766367]],

       [[-0.8288241 , -0.43492955,  0.12549026],
        [-0.53608227, -1.3839718 , -0.38280237],
        [-1.0408336 , -0.7138202 , -1.2163761 ],
        ...,
        [-0.42730084, -0.5418567 , -1.4307729 ],
        [-0.5901603 , -0.20020208, -1.1485943 ],
        [ 0.6754496 ,  1.1658471 ,  1.254681  ]],

       [[ 0.8199817 , -0.13179821, -1.0026433 ],
        [ 0.20038375,  2.0983407 , -1.6608505 ],
        [-1.2044364 , -0.18790387,  0.99040943],
        ...,
        [-0.3662675 , -2.1195405 , -0.4394893 ],
        [-0.80587035, -0.34881762, -0.7255154 ],
        [ 0.05374175, -0.36799756,  1.710843  ]],

       ...,

       [[ 2.349938  , -1.0341504 ,  0.69676656],
        [ 1.1879286 , -0.01511606,  1.159321  ],
        [ 2.0477667 ,  0.13849233,  1.3389939 ],
        ...,
        [-0.56973726, -0.705439  ,  1.019128  ],
        [ 0.6739323 , -1.3462726 , -0.54695696],
        [ 0.7926641 , -0.03149584, -0.36437288]],

       [[ 0.03053406,  0.18028776, -0.01941211],
        [-0.08558384,  0.63365006,  0.9841086 ],
        [ 1.0553024 ,  1.1736095 ,  0.31384602],
        ...,
        [ 1.5592244 , -0.42086074,  0.6394359 ],
        [ 1.1481735 ,  0.4548075 ,  0.94918376],
        [ 1.1662008 ,  0.7955788 ,  0.81457335]],

       [[-1.4271562 ,  0.28228137, -0.28983933],
        [-0.23206161,  0.53333867,  0.8283403 ],
        [-0.56755936,  0.53504246, -0.3638036 ],
        ...,
        [-0.09928504,  0.9733174 ,  0.830424  ],
        [-0.8019797 , -0.41261557, -1.7591631 ],
        [-0.02050705, -0.18762097,  2.5975325 ]]], dtype=float32)>
#为了更加简洁,::可以简写为单个冒号:,例如:
x[:,0:28:2,0:28:2,:]
<tf.Tensor: id=295, shape=(4, 14, 14, 3), dtype=float32, numpy=
array([[[[ 0.15887685,  1.0638021 ,  0.24262176],
         [ 0.32958803, -0.07544474, -0.798697  ],
         [ 0.5564957 , -1.3757765 ,  0.6403469 ],
         ...,
         [ 0.21362567,  0.90268373, -0.9598748 ],
         [-1.5960621 ,  0.04354888,  1.5154463 ],
         [-0.99656004,  0.65905744, -2.1828988 ]],

        [[ 0.8199817 , -0.13179821, -1.0026433 ],
         [-1.2044364 , -0.18790387,  0.99040943],
         [-1.0860363 ,  0.7975513 ,  0.65343034],
         ...,
         [ 0.65947324, -0.31727815, -1.1517917 ],
         [ 0.37630206,  1.7621716 ,  0.12167298],
         [-0.27297562, -0.2967891 ,  0.26511705]],

        [[ 0.8985541 , -0.2362703 , -1.767671  ],
         [ 1.9127101 , -0.3876528 , -0.58510786],
         [ 0.6726209 , -1.1121641 , -1.532782  ],
         ...,
         [-1.1396053 , -0.79089534, -0.71180403],
         [-1.617079  , -1.1581329 ,  0.9971896 ],
         [-0.8432604 , -1.0547616 , -0.32770067]],

        ...,

        [[-0.11135279,  0.6382316 , -0.09314802],
         [-0.7863504 , -0.60625976, -0.22311987],
         [ 0.9794706 ,  2.0968647 , -1.9315486 ],
         ...,
         [ 0.01740818, -0.626683  ,  0.6311278 ],
         [ 2.2090154 ,  2.1570942 , -0.13342246],
         [ 0.37910503, -0.6176556 , -1.4818878 ]],

        [[-1.4722553 ,  1.3793535 ,  0.04639049],
         [-1.1332452 , -0.21064432,  0.10829575],
         [-0.64298946, -0.23143098,  0.94136727],
         ...,
         [-0.49181092, -0.9117892 , -1.5097189 ],
         [ 0.3968739 , -0.5277835 , -0.09638919],
         [-0.5999648 ,  0.11105305, -1.3502094 ]],

        [[-0.30277148,  0.24492094,  0.7838737 ],
         [-0.96122885,  0.49086136,  1.0475286 ],
         [-0.20484306,  1.34876   ,  0.9071389 ],
         ...,
         [ 0.29213133, -0.27826986,  0.600105  ],
         [-0.82881486, -0.9858647 ,  1.0007055 ],
         [-1.068334  , -1.0641605 , -0.1473667 ]]],


       [[[-0.8069039 ,  0.00746191, -1.2501005 ],
         [ 0.35499024, -1.1693124 ,  2.071324  ],
         [ 0.3120519 ,  0.45302156, -0.07594787],
         ...,
         [ 0.286556  , -1.8745283 ,  0.57682574],
         [ 0.12936789,  0.33494765,  1.6998888 ],
         [-0.7598867 ,  0.9585704 , -0.56586885]],

        [[ 0.85356426,  0.5705757 , -0.8758441 ],
         [-0.490778  , -0.3787659 , -0.7114406 ],
         [ 1.3272129 , -0.6036801 , -0.9600941 ],
         ...,
         [-0.67802536,  0.6861715 ,  1.4260198 ],
         [-0.31445473, -0.5349083 , -0.57425916],
         [ 0.39839053, -0.39068264,  1.7121905 ]],

        [[ 0.23872656, -0.15054561, -0.0458181 ],
         [ 0.14625135, -0.4049384 , -0.14186974],
         [ 0.05129207,  1.0203384 ,  0.13865176],
         ...,
         [-0.90658844, -0.4950846 ,  0.17458636],
         [-1.2691858 , -1.3299806 ,  0.3671368 ],
         [-0.56088233,  0.6397849 ,  1.5009962 ]],

        ...,

        [[-0.03738111, -0.7738488 , -0.26276183],
         [ 2.3279767 , -0.67886794,  1.559117  ],
         [-0.5135724 , -0.5762953 ,  0.15755104],
         ...,
         [ 0.30737424, -1.2174454 , -0.6815703 ],
         [ 0.67446476, -1.1067839 ,  0.5712222 ],
         [ 1.3038222 , -0.60405934, -0.53372014]],

        [[ 0.20993517,  0.78506285,  0.8735096 ],
         [-0.8074077 , -0.40923628, -0.70715094],
         [-1.1693084 , -0.3493604 , -0.6675655 ],
         ...,
         [-0.2307581 ,  0.2838166 , -0.82147557],
         [-1.3473532 ,  0.6485388 ,  0.0704335 ],
         [-0.41859862,  0.39183897, -0.8804382 ]],

        [[-0.34710523,  0.49272516, -1.2784463 ],
         [ 0.5125653 , -0.86366785, -1.3109303 ],
         [-0.1936582 , -0.22554746,  0.00571271],
         ...,
         [ 0.2314209 ,  0.18028627,  0.03390534],
         [ 0.5337963 , -1.604414  ,  0.84392726],
         [-0.87709856,  0.14362933, -0.71602184]]],


       [[[ 0.5499625 , -0.07452859, -1.7633947 ],
         [-0.47699863, -1.7304105 ,  0.11736192],
         [ 0.82264894, -0.2708066 ,  1.5604311 ],
         ...,
         [-0.4140058 , -1.1708884 ,  0.79029524],
         [-0.02007888,  0.48785588,  0.77818274],
         [-0.6029282 ,  0.14837039, -0.29615375]],

        [[-0.18304984,  0.03012835,  0.26022443],
         [-0.9247981 , -2.5103426 ,  3.818135  ],
         [ 0.44077075, -0.7850783 ,  0.4808445 ],
         ...,
         [ 1.3806164 , -0.7654911 ,  0.75263834],
         [-0.26342234, -0.70678735, -0.23413028],
         [ 0.37442288, -0.95262015, -0.07785683]],

        [[ 1.1646081 , -2.2333117 , -0.90204155],
         [-0.63897634, -1.0743266 , -0.37296316],
         [-0.33931142, -0.35908872,  2.0006905 ],
         ...,
         [-0.681646  ,  0.48462653, -0.73086166],
         [-0.5486749 , -1.3432366 ,  1.3641402 ],
         [ 1.8827593 ,  1.123424  ,  0.08247144]],

        ...,

        [[-1.172418  , -0.54893184, -0.1982384 ],
         [ 0.11114006, -1.4053111 ,  1.1397101 ],
         [ 0.03683241, -0.06735677, -0.39457574],
         ...,
         [-0.42947474,  0.19043003,  0.4715683 ],
         [ 1.3612753 , -0.8206614 , -0.0915319 ],
         [-0.87690365,  0.3337372 ,  0.08546929]],

        [[-1.3603598 , -0.03191262,  0.6463607 ],
         [ 1.3535389 , -0.37833852,  0.52540874],
         [ 2.0996022 , -0.41821897, -0.8002851 ],
         ...,
         [ 0.4086669 , -0.13528611, -0.9753031 ],
         [-0.01599073, -1.0717136 ,  1.4852026 ],
         [-0.86911243,  0.48376665,  0.9228629 ]],

        [[-0.14626777, -0.5788868 ,  2.212906  ],
         [-0.26023954,  0.35533088, -2.0547736 ],
         [ 0.86007774,  1.8946397 , -1.0837168 ],
         ...,
         [-1.1737899 ,  0.27013153,  0.16612251],
         [ 0.449273  , -0.5145797 ,  0.86907375],
         [-0.5543534 , -0.1062188 ,  0.4306154 ]]],


       [[[ 0.49462888, -0.6546244 , -1.7220471 ],
         [ 1.7542757 , -1.5229647 ,  0.02799832],
         [ 0.34576586, -0.68108815, -0.9376163 ],
         ...,
         [ 0.5911611 , -0.9711958 ,  0.28559965],
         [ 1.8167363 ,  0.09494361,  0.78475004],
         [-1.1180531 ,  0.7017198 , -1.3113427 ]],

        [[ 0.59054863, -0.36850604, -0.7498821 ],
         [ 0.47888842,  0.9133491 , -1.8785608 ],
         [ 0.6818354 ,  0.46285114, -0.15768681],
         ...,
         [ 0.05651945,  0.29372808,  1.103691  ],
         [ 0.2757879 ,  0.21998271,  0.3068704 ],
         [ 1.1049832 , -1.0403694 , -0.6311421 ]],

        [[-0.8536621 , -0.49601755,  0.8962184 ],
         [ 1.2591157 ,  0.85147136, -1.7603171 ],
         [ 2.4509947 , -0.22679244, -1.8956126 ],
         ...,
         [ 2.1961102 ,  0.38456282,  3.1728263 ],
         [ 1.7703277 , -0.64611787,  1.3456304 ],
         [ 0.33494255,  1.4633365 , -0.3434589 ]],

        ...,

        [[ 0.10294417, -0.2253732 , -0.53565925],
         [ 0.9502148 ,  0.55071384, -1.416617  ],
         [-0.8826971 , -1.6315796 , -0.49298233],
         ...,
         [-0.17867398, -1.0845338 , -0.5998716 ],
         [-0.9191785 ,  0.00466048,  0.37791595],
         [ 0.06293341, -0.6218224 ,  0.00583568]],

        [[-0.08573319,  0.60246944, -0.90955913],
         [ 0.6438663 , -0.13751327,  0.7928952 ],
         [ 0.11398036, -0.87621975,  0.59329295],
         ...,
         [-1.0285736 ,  0.7909276 , -0.9698808 ],
         [-0.6169408 , -0.14468005, -0.85658306],
         [ 1.5153418 , -0.73712003,  0.8751484 ]],

        [[-1.0867566 ,  0.43356776,  1.3535215 ],
         [-0.6166509 , -0.98992044, -0.69931316],
         [-0.9269953 , -0.9682326 , -1.228167  ],
         ...,
         [ 0.9433455 , -1.2464502 , -1.3953815 ],
         [ 1.1135478 , -0.01275251, -0.45150715],
         [ 0.65627646, -1.054595  , -0.40598097]]]], dtype=float32)>
x = tf.range(9)
x[8:0:-1]
<tf.Tensor: id=303, shape=(8,), dtype=int32, numpy=array([8, 7, 6, 5, 4, 3, 2, 1])>
x[::-1] # 逆序全部元素
<tf.Tensor: id=307, shape=(9,), dtype=int32, numpy=array([8, 7, 6, 5, 4, 3, 2, 1, 0])>
x[::-2] # 逆序间隔采样
<tf.Tensor: id=311, shape=(5,), dtype=int32, numpy=array([8, 6, 4, 2, 0])>
 x = tf.random.normal([4,32,32,3])
x[0,::-2,::-2] # 行、列逆序间隔采样
<tf.Tensor: id=321, shape=(16, 16, 3), dtype=float32, numpy=
array([[[-5.12858629e-01, -1.32675040e+00, -1.33697355e+00],
        [ 8.31341088e-01, -9.16649461e-01,  9.32324529e-01],
        [ 6.71549797e-01,  1.66459358e+00,  1.38182676e+00],
        [-8.60066473e-01,  1.14311111e+00,  6.23336136e-01],
        [-1.68301058e+00, -9.67067301e-01, -4.69439477e-01],
        [-4.30352598e-01, -5.66130817e-01,  4.92661819e-02],
        [-2.81552583e-01,  8.43149304e-01, -8.60508144e-01],
        [-6.12139583e-01, -2.08047971e-01,  4.72213104e-02],
        [ 4.00718898e-02, -3.40393066e-01,  1.15577018e+00],
        [-7.73570955e-01, -8.05181861e-01, -1.37376678e+00],
        [ 1.55765843e+00,  1.08528352e+00, -4.50809181e-01],
        [-1.32817268e-01,  1.53884733e+00,  1.35703236e-01],
        [ 2.12556815e+00, -5.24271309e-01, -5.43570876e-01],
        [ 8.20920765e-01,  7.34462082e-01,  1.79477334e-01],
        [ 3.98115873e-01, -1.40981698e+00,  5.36015511e-01],
        [ 6.95917308e-01, -5.66721261e-01,  1.61542201e+00]],

       [[ 1.30037308e+00,  4.05710965e-01, -9.92437422e-01],
        [ 6.97564244e-01, -1.03721142e+00,  1.28508317e+00],
        [ 8.73169243e-01, -1.57073462e+00, -4.67298508e-01],
        [-8.92316878e-01, -2.95966923e-01,  9.10675406e-01],
        [-8.36626053e-01, -1.96540758e-01, -9.75416422e-01],
        [-1.05574680e-02, -9.30420756e-01,  4.71875817e-01],
        [ 5.16441226e-01, -1.97359145e-01, -8.47413421e-01],
        [-1.75481892e+00,  1.34345043e+00,  1.45030463e+00],
        [ 1.76945016e-01,  6.59373522e-01,  5.70764169e-02],
        [ 3.94355297e-01,  5.87972514e-02,  4.69360888e-01],
        [-4.00711417e-01,  3.11920333e+00,  4.14017886e-01],
        [-4.20988083e-01, -4.72355217e-01, -4.71134722e-01],
        [ 1.04904854e+00,  1.17252529e+00, -8.40087056e-01],
        [ 9.01208282e-01,  9.69512582e-01,  8.79506171e-01],
        [ 4.01288271e-01,  8.59772339e-02,  6.74215496e-01],
        [ 7.28564620e-01,  5.78985870e-01,  1.36003792e+00]],

       [[ 1.00393102e-01, -2.10702315e-01,  9.14948434e-03],
        [-3.64452243e-01, -2.04911038e-01, -9.47110951e-01],
        [-4.57592607e-01,  6.94192171e-01, -1.61121333e+00],
        [ 9.47298527e-01, -2.77903862e-02,  5.12000918e-01],
        [ 1.72435129e+00, -2.53748447e-01, -3.72150950e-02],
        [-1.18907213e+00,  2.68956777e-02,  5.02674937e-01],
        [-1.30086350e+00,  4.56655651e-01,  1.17382956e+00],
        [ 2.55916178e-01, -7.83826649e-01,  5.54520845e-01],
        [ 8.03767052e-03,  8.86991978e-01, -3.42536062e-01],
        [-1.82633507e+00,  2.15081960e-01,  3.56852775e-03],
        [ 4.66198802e-01,  1.42127466e+00,  8.26655984e-01],
        [-1.02861357e+00, -6.88878298e-01, -8.07294130e-01],
        [ 1.05638993e+00, -5.82328022e-01,  5.37092030e-01],
        [-1.26770282e+00, -7.51018286e-01,  1.17214811e+00],
        [ 7.21696794e-01, -6.93379223e-01,  6.51117504e-01],
        [ 1.36706114e+00,  6.41243160e-01, -1.44741154e+00]],

       [[ 7.93220878e-01,  2.70831048e-01,  3.44419360e-01],
        [-2.40225816e+00, -5.86368680e-01, -8.93629491e-01],
        [-1.08578479e+00, -1.23838007e-01, -7.95995831e-01],
        [ 6.84047282e-01,  2.76426785e-02,  5.91857195e-01],
        [-8.65850031e-01,  5.19776940e-01, -8.59846830e-01],
        [ 6.65789843e-01, -2.43236776e-02,  7.24116266e-01],
        [-1.08223259e+00, -1.05693889e+00,  5.50808132e-01],
        [ 6.23720139e-02, -6.00328408e-02, -1.85542893e+00],
        [-8.33110034e-01, -7.95776010e-01, -3.11213404e-01],
        [-2.24416590e+00, -1.43613309e-01,  4.67960490e-03],
        [-1.67391884e+00,  4.26855505e-01,  4.09049541e-01],
        [-8.14227819e-01, -1.64458287e+00,  3.98572147e-01],
        [ 2.74991870e-01,  6.26605213e-01,  1.13287187e+00],
        [ 1.98929999e-02, -1.57774401e+00, -2.32633069e-01],
        [-1.59658110e+00, -8.02877247e-01,  4.74463366e-02],
        [-2.75298953e-01,  1.48242986e+00, -1.29947111e-01]],

       [[-5.49136877e-01,  1.45316315e+00, -3.06085020e-01],
        [ 8.00148189e-01, -7.47697949e-01,  1.00875366e+00],
        [ 3.91878575e-01, -6.91525519e-01,  1.16107237e+00],
        [-6.79570735e-01, -8.19280565e-01,  3.47965956e-02],
        [-8.73226464e-01, -4.07751650e-01, -4.50293303e-01],
        [-7.90950358e-01,  3.61499786e-01,  1.39569175e+00],
        [ 1.55323851e+00,  1.09538209e+00, -1.51868057e+00],
        [-5.08049846e-01, -1.40525472e+00,  8.01847339e-01],
        [-3.70299041e-01,  7.56911516e-01, -1.14244986e+00],
        [ 7.23651946e-01,  1.53758657e+00, -2.49552965e-01],
        [ 7.42164403e-02, -3.86257350e-01,  1.88116157e+00],
        [ 1.41864979e+00, -2.91464269e-01,  3.41553450e-01],
        [ 4.65657227e-02,  3.46146300e-02,  9.63111401e-01],
        [ 3.17835093e-01, -8.39324474e-01,  1.25640607e+00],
        [ 2.71349400e-01,  1.91866308e-01,  1.81787741e+00],
        [ 9.24407661e-01, -4.20070827e-01,  1.17567563e+00]],

       [[ 2.36003146e-01, -3.49181652e-01,  1.36427090e-01],
        [-9.00793672e-02, -3.53315920e-01, -1.92875624e+00],
        [-1.78954041e+00,  1.84840366e-01,  3.49817157e-01],
        [ 8.17180574e-01, -6.08494997e-01,  2.57253289e-01],
        [ 3.06243092e-01,  4.15356636e-01, -1.24045447e-01],
        [ 1.41851163e+00,  8.93023014e-01, -1.98486447e-01],
        [ 5.98474443e-01, -9.95723367e-01, -4.18219805e-01],
        [-1.26175463e+00,  3.88196558e-02,  8.71327043e-01],
        [ 5.41638494e-01,  9.08082664e-01,  2.82411671e+00],
        [ 3.76620561e-01,  6.71772361e-01, -1.92329729e+00],
        [-1.06631410e+00, -1.56089199e+00,  1.28559530e+00],
        [-8.74469936e-01,  1.69978961e-01,  2.13683939e+00],
        [-3.80789906e-01,  2.04378557e+00,  3.65629457e-02],
        [-1.30368543e+00, -1.79424226e+00,  6.68191314e-01],
        [-5.23736060e-01,  2.14872345e-01,  2.98764650e-02],
        [ 1.22676218e+00,  8.44560489e-02,  9.79913116e-01]],

       [[ 2.05020666e-01,  6.29583538e-01,  2.07532898e-01],
        [-5.75563431e-01, -8.82000744e-01, -5.89373887e-01],
        [ 5.92088044e-01,  5.59074640e-01,  2.97833085e-01],
        [-9.24980760e-01,  7.00818673e-02,  8.58127892e-01],
        [-6.33320570e-01, -5.85945100e-02,  9.83602762e-01],
        [ 2.24870467e+00,  9.08229686e-03, -1.84816039e+00],
        [ 8.60568583e-01, -3.52092117e-01, -4.26065892e-01],
        [-2.01408237e-01,  6.86502934e-01,  1.89810288e+00],
        [ 4.66678053e-01, -8.84282812e-02,  1.86781704e+00],
        [-1.87390193e-01,  1.00713968e+00, -4.72037852e-01],
        [-5.78431368e-01,  9.62767839e-01, -1.11948502e+00],
        [-5.10407723e-02, -1.19529998e+00, -1.02296686e+00],
        [-5.13843000e-01,  7.36129522e-01,  1.90848017e+00],
        [ 5.10215402e-01, -6.50578514e-02,  5.98527342e-02],
        [ 1.55760312e+00,  1.29025948e+00,  2.32096529e+00],
        [ 1.82811832e+00, -8.50238085e-01, -1.64907038e+00]],

       [[ 3.27760458e-01,  9.01203156e-01,  4.17243719e-01],
        [-6.38067186e-01, -1.90164059e-01,  1.06154263e-01],
        [ 7.52179682e-01,  8.09487581e-01, -1.86576903e+00],
        [ 9.69614565e-01, -5.12210011e-01, -9.08970296e-01],
        [-2.55323100e+00,  1.08321774e+00,  9.57793295e-01],
        [-2.61134151e-02, -1.82595742e+00, -1.85256016e+00],
        [-4.81456161e-01,  7.38383234e-01, -3.39342880e+00],
        [ 1.95254326e-01,  3.68862569e-01,  2.47258091e+00],
        [-1.49648738e+00,  2.05746627e+00, -2.84938281e-03],
        [-1.59891248e-01,  8.89002204e-01,  4.31419648e-02],
        [ 1.07003999e+00, -8.31308544e-01, -1.60301065e+00],
        [ 9.95886803e-01,  6.92649722e-01, -1.03685729e-01],
        [ 1.48837328e-01, -2.00259313e-02,  7.74835944e-01],
        [ 1.64104730e-01,  2.59704173e-01,  1.05614209e+00],
        [ 4.54318851e-01, -2.94152051e-01,  1.04833090e+00],
        [ 1.18135512e+00,  8.37986827e-01,  7.61772573e-01]],

       [[-1.20815742e+00,  7.39185572e-01,  1.67155874e+00],
        [ 8.03316534e-01,  2.80814111e-01,  1.53534985e+00],
        [ 1.29337895e+00,  2.45484501e-01,  6.34474754e-01],
        [-8.73428464e-01,  1.06092429e+00, -5.30627429e-01],
        [-1.29879192e-01, -6.95716202e-01, -1.04920518e+00],
        [-1.80375373e+00,  1.23836732e+00, -1.65295839e+00],
        [ 7.41421998e-01,  9.79211554e-02, -1.08601868e+00],
        [ 1.00783803e-01,  1.22072406e-01, -6.06758237e-01],
        [-9.96779621e-01, -1.37761843e+00,  7.54637897e-01],
        [-4.81967598e-01, -1.27421820e+00,  2.47721225e-01],
        [-1.55993536e-01, -2.75669694e-01, -2.44097054e-01],
        [-1.51471400e+00, -1.21171594e+00, -6.86530769e-02],
        [ 8.64279985e-01, -6.85522318e-01,  1.57672751e+00],
        [-4.01571423e-01, -3.29137135e+00, -8.24092746e-01],
        [ 9.05608714e-01,  1.88336241e+00, -3.24371040e-01],
        [-1.12480211e+00,  5.84363818e-01, -1.70251295e-01]],

       [[ 3.37083757e-01, -7.79814780e-01,  1.17843688e+00],
        [ 2.57556343e+00, -3.08598399e-01,  5.71095526e-01],
        [-1.14723504e+00,  1.51846394e-01, -3.31831634e-01],
        [-5.05951166e-01,  5.00925601e-01,  1.30445793e-01],
        [-7.39884615e-01,  4.34239395e-02, -5.63407280e-02],
        [ 5.31273007e-01,  1.24979806e+00,  7.88931131e-01],
        [-2.47994542e-01, -3.82653713e-01,  7.10633278e-01],
        [ 1.65640652e+00,  3.82380486e-01,  1.28080201e+00],
        [-1.80479681e+00,  1.27170587e+00, -1.28015792e+00],
        [ 4.30591464e-01,  1.29270589e+00,  2.48598143e-01],
        [ 4.52580787e-02,  4.49906677e-01,  5.55632591e-01],
        [-8.84447813e-01,  6.50327086e-01, -8.97721350e-01],
        [-1.44865870e+00, -2.37699819e+00, -7.57218182e-01],
        [ 1.67246842e+00,  1.81695208e-01,  1.12071657e+00],
        [ 1.80570975e-01,  3.57616454e-01,  6.86495721e-01],
        [-4.02043939e-01,  8.16699088e-01,  9.23376679e-01]],

       [[ 3.99261057e-01,  1.08122385e+00,  1.44043756e+00],
        [ 1.20331287e+00,  1.01484406e+00, -3.90487432e-01],
        [ 1.32636809e+00, -5.59104323e-01, -1.02056718e+00],
        [ 2.80945778e+00,  1.37894064e-01, -1.53881931e+00],
        [-1.21396112e+00, -1.91403568e-01,  1.73667490e+00],
        [ 3.32253277e-01,  1.95369339e+00,  2.68321663e-01],
        [-1.72506452e-01,  5.79592407e-01,  1.52968019e-02],
        [-3.02448153e-01,  5.39278448e-01,  1.38653052e+00],
        [ 7.37487972e-01,  6.79241955e-01, -7.99145520e-01],
        [ 6.53870761e-01, -2.11292791e+00,  2.95682222e-01],
        [-1.11456025e+00,  9.22115684e-01, -1.04801702e+00],
        [-1.51963556e+00,  5.28072476e-01,  2.78489971e+00],
        [ 7.68236756e-01, -6.65140375e-02,  4.24144007e-02],
        [ 1.05252457e+00,  1.06121294e-01,  1.12995647e-01],
        [-1.13180864e+00,  2.94206679e-01, -1.98684096e-01],
        [ 1.23381329e+00,  1.05745524e-01,  1.36180475e-01]],

       [[-1.21883428e+00,  2.80695260e-02,  1.44445395e+00],
        [ 5.40190160e-01,  4.31556135e-01,  5.63819110e-01],
        [ 8.99560988e-01,  4.39921230e-01, -8.19515586e-02],
        [ 1.47566283e+00, -5.17122746e-01,  1.68671310e+00],
        [ 8.93421948e-01, -1.82934833e+00,  1.45627356e+00],
        [-1.43394932e-01, -6.35605212e-03,  1.93333011e-02],
        [ 7.79834092e-01, -9.28719163e-01, -2.25391197e+00],
        [ 1.44192314e+00, -4.00518835e-01,  4.07484859e-01],
        [ 4.31960285e-01, -7.41328299e-01, -1.06466281e+00],
        [-9.40677822e-01, -1.08484042e+00,  6.90169185e-02],
        [ 1.07374322e+00, -5.49093843e-01, -2.21619773e+00],
        [ 1.15938544e+00, -1.90980837e-01,  4.39361185e-01],
        [ 7.95709848e-01,  1.11644119e-01, -3.22367907e-01],
        [-5.63396752e-01, -2.19989347e+00, -1.20200741e+00],
        [ 5.35169125e-01,  6.19813442e-01,  1.01975262e+00],
        [ 7.69046307e-01, -4.97781038e-01, -2.67138869e-01]],

       [[ 8.18550467e-01, -3.23144376e-01,  1.85840917e+00],
        [-6.88673198e-01,  1.55853498e+00,  1.10943043e+00],
        [-2.60069788e-01, -6.23827219e-01, -1.76365626e+00],
        [-2.10912490e+00,  6.61493897e-01,  7.79901564e-01],
        [-1.63351655e+00,  4.79646295e-01, -4.69270974e-01],
        [ 6.40578330e-01, -1.14313424e+00,  5.48534729e-02],
        [ 2.32353425e+00, -1.08948505e+00, -8.77904356e-01],
        [-8.93575311e-01,  1.13692999e+00, -9.36922908e-01],
        [ 1.15496743e+00, -6.28629565e-01, -3.17590237e-01],
        [ 8.38947237e-01,  1.38581479e+00,  1.01983666e+00],
        [ 2.86714017e-01, -9.96401787e-01,  7.23477960e-01],
        [-6.32666707e-01, -2.35953641e+00, -1.67708710e-01],
        [-1.42971432e+00,  1.62384748e+00, -1.63778269e+00],
        [-3.14870417e-01,  3.46172482e-01,  4.28715050e-01],
        [-4.13033515e-01, -1.62611198e+00,  7.18329728e-01],
        [-7.69768208e-02,  6.02496006e-02,  1.94374669e+00]],

       [[ 1.07356524e+00, -3.56402367e-01,  6.15161538e-01],
        [ 6.01745486e-01,  2.01817608e+00, -9.49906781e-02],
        [ 3.25398624e-01, -6.61825299e-01,  5.80669343e-02],
        [-1.21191585e+00,  1.29405797e-01,  7.43842304e-01],
        [-4.04925913e-01,  6.19568467e-01,  7.03277290e-01],
        [-1.13940656e+00, -2.63668954e-01, -3.92803550e-01],
        [ 5.42838089e-02,  3.93184155e-01, -1.74112514e-01],
        [-8.40804279e-01, -1.76327896e+00,  2.27612779e-01],
        [-4.17193949e-01,  3.40831757e-01,  6.36482358e-01],
        [ 7.70674407e-01, -6.54981136e-01, -1.38170719e+00],
        [-3.77197474e-01,  2.10867429e+00,  2.25425184e-01],
        [-1.43056989e-01, -1.04891777e+00, -2.19663531e-02],
        [-5.30137904e-02,  2.18721652e+00, -1.35581923e+00],
        [-1.20569074e+00, -9.82609689e-02,  3.20044577e-01],
        [-1.13968039e+00,  3.58640999e-01,  8.02457750e-01],
        [ 5.93103409e-01, -4.91118252e-01, -8.14898372e-01]],

       [[-6.91259205e-01,  8.54395568e-01, -1.19995535e-03],
        [ 2.27573156e-01,  5.21366969e-02, -6.91265047e-01],
        [-7.97660887e-01,  5.71649909e-01,  1.01897120e+00],
        [-2.05846286e+00, -1.17296588e+00,  1.53154171e+00],
        [-3.03904116e-01, -8.37112010e-01, -2.57611182e-02],
        [ 4.46517140e-01, -2.41748601e-01, -5.75058162e-01],
        [-7.84437731e-02,  2.17079258e+00, -1.11774576e+00],
        [ 6.92619607e-02, -1.33171529e-01, -6.01373613e-01],
        [-4.10314441e-01,  6.29732162e-02,  2.66803741e-01],
        [ 8.39434683e-01,  3.58227223e-01,  1.02784407e+00],
        [ 2.19815302e+00,  8.35144147e-02,  6.57926440e-01],
        [ 1.71179569e+00, -6.62919343e-01, -1.11403000e+00],
        [ 2.15544319e+00,  2.47308642e-01, -3.71033102e-01],
        [-2.97141761e-01,  8.92091811e-01, -4.54532892e-01],
        [ 1.06450908e-01, -6.36038005e-01,  4.26029056e-01],
        [-1.12084985e-01, -1.67425299e+00,  3.05158496e-01]],

       [[-9.74530220e-01,  2.95405531e+00,  1.12516677e+00],
        [ 1.69857323e+00,  6.60117984e-01,  2.61585414e-01],
        [ 6.76234245e-01, -7.57119834e-01,  3.70493114e-01],
        [-2.68636793e-01,  2.91401386e-01,  1.60869968e+00],
        [ 3.50089967e-02, -2.02207708e+00,  2.89290063e-02],
        [ 2.92572886e-01,  4.98813659e-01, -9.94279832e-02],
        [-6.38059616e-01,  1.12768340e+00,  3.28210980e-01],
        [-4.84653026e-01,  1.06180221e-01, -7.32409120e-01],
        [ 3.27100307e-01, -5.84893763e-01, -2.58985132e-01],
        [ 2.63800412e-01,  1.64166687e-03,  1.80988431e+00],
        [-4.37855780e-01,  8.42429757e-01, -1.18915582e+00],
        [ 1.12210608e+00,  3.11320350e-02, -6.40212536e-01],
        [ 1.49150407e+00,  2.29620194e+00, -2.10182738e+00],
        [ 1.73180139e+00, -1.58428323e+00,  9.87730265e-01],
        [ 8.01699013e-02,  1.00351930e+00, -1.10668981e+00],
        [ 1.12284362e-01,  1.08892906e+00,  9.22663033e-01]]],
      dtype=float32)>
x[:,:,:,1] # 取 G 通道数据
<tf.Tensor: id=325, shape=(4, 32, 32), dtype=float32, numpy=
array([[[-0.5098771 ,  1.0855609 ,  2.2097447 , ...,  2.0824249 ,
          0.03865028,  0.17514655],
        [ 0.11415657,  1.088929  , -1.7078295 , ...,  0.660118  ,
         -0.3400708 ,  2.9540553 ],
        [-0.4019389 ,  0.05790788,  0.08566959, ...,  0.70855254,
         -0.24059962, -0.6154288 ],
        ...,
        [ 0.22458512,  0.57898587, -0.16411702, ..., -1.0372114 ,
          2.1850038 ,  0.40571097],
        [ 1.3330071 ,  0.771887  ,  0.6989723 , ..., -0.18121673,
         -1.762427  ,  1.0092487 ],
        [ 1.7831248 , -0.56672126, -1.4117293 , ..., -0.91664946,
          0.5346296 , -1.3267504 ]],

       [[ 0.7838936 ,  0.08441457, -0.10567595, ..., -0.62921995,
          0.3537821 , -1.02496   ],
        [-0.8774562 , -0.16845766, -0.25303665, ..., -0.7026532 ,
         -0.79756373,  1.6321977 ],
        [-2.2251005 , -0.5713922 ,  1.4825715 , ..., -0.46648985,
         -0.60295   , -0.80748904],
        ...,
        [ 0.04091079, -0.2438825 ,  0.33083466, ..., -0.21061322,
          0.01396879, -0.60872215],
        [-1.4021204 , -0.7055435 , -0.28904566, ...,  0.25936556,
         -0.13118915,  2.401897  ],
        [-0.2845105 , -0.05943837,  0.18080406, ...,  0.25488204,
          0.5879935 ,  0.07765544]],

       [[-1.2347841 , -1.0714035 ,  0.6181078 , ..., -1.1582766 ,
          2.611402  , -1.0139518 ],
        [ 0.80162483,  1.2351211 , -0.39131325, ..., -0.91507584,
          0.85332286,  0.49019217],
        [ 0.58013743,  0.24763763,  0.6982535 , ...,  0.49509358,
         -0.02501035, -0.5125354 ],
        ...,
        [ 0.21925269, -0.6466106 ,  0.42671174, ...,  1.5173709 ,
          0.71534723, -0.44796994],
        [-1.57441   ,  0.9493397 , -1.0193949 , ..., -0.99153405,
         -2.1873395 ,  0.06027158],
        [-0.19085257, -0.16040516,  0.27712902, ..., -0.00295419,
         -1.2427148 ,  0.8207905 ]],

       [[-1.2573439 , -1.0373671 , -0.7934127 , ...,  0.5131228 ,
          0.25776967, -1.0780028 ],
        [ 0.6763729 ,  2.2258997 , -1.2256166 , ..., -1.1156107 ,
         -1.4238712 , -0.44870865],
        [-0.7616951 ,  0.6207081 , -0.12389632, ..., -1.3940469 ,
          0.14629437,  1.9813092 ],
        ...,
        [-0.7648322 , -0.34840062, -1.1783606 , ...,  0.5241032 ,
          1.1209637 ,  0.48544505],
        [ 0.8830715 , -1.4620808 , -0.67027754, ...,  0.33656323,
         -0.9691739 ,  1.3170173 ],
        [ 1.4038192 ,  0.8117516 ,  0.89982706, ..., -0.529901  ,
          0.991487  ,  0.39819124]]], dtype=float32)>
x[0:2,...,1:] # 高宽维度全部采集
<tf.Tensor: id=329, shape=(2, 32, 32, 2), dtype=float32, numpy=
array([[[[-0.5098771 ,  1.5363066 ],
         [ 1.0855609 , -1.2425263 ],
         [ 2.2097447 , -0.3455262 ],
         ...,
         [ 2.0824249 ,  2.7878735 ],
         [ 0.03865028,  1.5230906 ],
         [ 0.17514655,  0.0449919 ]],

        [[ 0.11415657, -1.4338398 ],
         [ 1.088929  ,  0.92266303],
         [-1.7078295 ,  0.11560173],
         ...,
         [ 0.660118  ,  0.2615854 ],
         [-0.3400708 ,  0.18243966],
         [ 2.9540553 ,  1.1251668 ]],

        [[-0.4019389 ,  0.04989729],
         [ 0.05790788, -0.28769502],
         [ 0.08566959,  0.1837446 ],
         ...,
         [ 0.70855254, -0.33452046],
         [-0.24059962, -1.152884  ],
         [-0.6154288 ,  0.96971357]],

        ...,

        [[ 0.22458512, -0.08382434],
         [ 0.57898587,  1.3600379 ],
         [-0.16411702, -0.23663063],
         ...,
         [-1.0372114 ,  1.2850832 ],
         [ 2.1850038 ,  1.219216  ],
         [ 0.40571097, -0.9924374 ]],

        [[ 1.3330071 , -0.54214746],
         [ 0.771887  ,  0.5319002 ],
         [ 0.6989723 ,  0.39816257],
         ...,
         [-0.18121673, -0.05700704],
         [-1.762427  ,  0.00431028],
         [ 1.0092487 ,  0.36563486]],

        [[ 1.7831248 , -1.7124783 ],
         [-0.56672126,  1.615422  ],
         [-1.4117293 ,  0.22843245],
         ...,
         [-0.91664946,  0.9323245 ],
         [ 0.5346296 ,  0.2788624 ],
         [-1.3267504 , -1.3369735 ]]],


       [[[ 0.7838936 , -0.03984008],
         [ 0.08441457, -1.335493  ],
         [-0.10567595, -0.14938474],
         ...,
         [-0.62921995,  0.21898726],
         [ 0.3537821 ,  1.4961141 ],
         [-1.02496   ,  0.85588825]],

        [[-0.8774562 ,  0.7640021 ],
         [-0.16845766, -0.06334937],
         [-0.25303665,  0.21530381],
         ...,
         [-0.7026532 ,  0.70077586],
         [-0.79756373, -2.2731397 ],
         [ 1.6321977 , -0.31151387]],

        [[-2.2251005 , -0.6970247 ],
         [-0.5713922 ,  1.8029691 ],
         [ 1.4825715 , -0.5640481 ],
         ...,
         [-0.46648985,  0.7609164 ],
         [-0.60295   ,  0.5238442 ],
         [-0.80748904,  0.7934278 ]],

        ...,

        [[ 0.04091079, -0.09855065],
         [-0.2438825 , -0.7801842 ],
         [ 0.33083466,  0.39071873],
         ...,
         [-0.21061322,  3.6896062 ],
         [ 0.01396879,  1.0106264 ],
         [-0.60872215,  0.04586332]],

        [[-1.4021204 ,  0.3486169 ],
         [-0.7055435 , -0.02770707],
         [-0.28904566, -0.11832161],
         ...,
         [ 0.25936556,  0.88856804],
         [-0.13118915,  0.37146634],
         [ 2.401897  , -1.0959843 ]],

        [[-0.2845105 , -0.0567197 ],
         [-0.05943837, -0.5847497 ],
         [ 0.18080406,  0.13932958],
         ...,
         [ 0.25488204,  0.8308334 ],
         [ 0.5879935 , -0.01285188],
         [ 0.07765544, -2.3112228 ]]]], dtype=float32)>
x[2:,...] # 高、宽、通道维度全部采集,等价于 x[2:]
<tf.Tensor: id=333, shape=(2, 32, 32, 3), dtype=float32, numpy=
array([[[[ 0.7945949 , -1.2347841 ,  0.47249708],
         [-0.2615968 , -1.0714035 ,  1.1448572 ],
         [-1.0418502 ,  0.6181078 , -0.07101548],
         ...,
         [ 0.04804359, -1.1582766 ,  1.2429018 ],
         [-0.30174834,  2.611402  ,  0.6804233 ],
         [-1.240855  , -1.0139518 ,  1.4055531 ]],

        [[-1.707563  ,  0.80162483, -1.3517904 ],
         [-0.93573076,  1.2351211 ,  1.0933201 ],
         [-1.147424  , -0.39131325,  0.7713079 ],
         ...,
         [ 0.64066964, -0.91507584,  0.1737621 ],
         [ 1.1623266 ,  0.85332286, -0.30063167],
         [ 0.7187028 ,  0.49019217, -0.9206615 ]],

        [[ 0.15926458,  0.58013743,  0.60102236],
         [ 0.59153324,  0.24763763, -1.9495552 ],
         [-0.31749198,  0.6982535 , -0.8042736 ],
         ...,
         [ 0.8707879 ,  0.49509358, -0.6867304 ],
         [-1.4305444 , -0.02501035, -0.99929893],
         [ 0.17384645, -0.5125354 ,  0.10659274]],

        ...,

        [[ 0.1064951 ,  0.21925269,  1.9000882 ],
         [-0.20990048, -0.6466106 ,  0.73544794],
         [-1.4809626 ,  0.42671174,  0.16862532],
         ...,
         [-1.1037835 ,  1.5173709 ,  0.1528409 ],
         [-0.54681784,  0.71534723,  1.9614433 ],
         [-0.9203462 , -0.44796994,  1.4050108 ]],

        [[-0.32906428, -1.57441   , -1.3997592 ],
         [-0.6523299 ,  0.9493397 , -1.3803322 ],
         [-0.7689747 , -1.0193949 ,  0.33281428],
         ...,
         [ 1.0886828 , -0.99153405,  0.36371905],
         [-1.0587267 , -2.1873395 ,  0.03044855],
         [-0.9652771 ,  0.06027158,  1.032748  ]],

        [[-0.6559332 , -0.19085257,  0.05694088],
         [ 0.06698092, -0.16040516, -0.02334103],
         [-0.8136688 ,  0.27712902, -1.2000872 ],
         ...,
         [-0.08051116, -0.00295419, -0.69620883],
         [-0.18664676, -1.2427148 ,  0.7336129 ],
         [-2.2412434 ,  0.8207905 , -1.5172895 ]]],


       [[[ 1.2161547 , -1.2573439 , -1.2718428 ],
         [-1.6064076 , -1.0373671 ,  0.25578052],
         [ 0.09028834, -0.7934127 , -0.9797417 ],
         ...,
         [ 0.46159938,  0.5131228 , -0.30064222],
         [-1.5939561 ,  0.25776967, -0.9606728 ],
         [-1.082976  , -1.0780028 ,  0.10589355]],

        [[ 0.7987019 ,  0.6763729 ,  1.8023363 ],
         [-0.20554051,  2.2258997 , -2.2459066 ],
         [ 1.6492949 , -1.2256166 , -0.82575357],
         ...,
         [ 0.52679706, -1.1156107 ,  0.9085335 ],
         [ 0.59762496, -1.4238712 , -0.47958985],
         [-0.27010065, -0.44870865, -1.6766595 ]],

        [[ 0.637566  , -0.7616951 ,  0.7511949 ],
         [-1.5865582 ,  0.6207081 , -0.5361186 ],
         [ 0.10845064, -0.12389632, -0.1997273 ],
         ...,
         [-0.26762375, -1.3940469 , -1.500582  ],
         [ 0.15319414,  0.14629437,  0.2699665 ],
         [ 1.0115746 ,  1.9813092 ,  1.0171049 ]],

        ...,

        [[ 1.4482028 , -0.7648322 ,  0.06379025],
         [ 1.8510162 , -0.34840062, -0.10682938],
         [-0.09289207, -1.1783606 ,  0.42853847],
         ...,
         [ 0.22264096,  0.5241032 , -0.48502067],
         [-0.4213207 ,  1.1209637 , -1.0214124 ],
         [ 1.630685  ,  0.48544505,  1.5306816 ]],

        [[-0.00818674,  0.8830715 ,  0.4825708 ],
         [-0.7329668 , -1.4620808 ,  1.0514579 ],
         [ 0.38649738, -0.67027754, -0.7106546 ],
         ...,
         [ 0.9241392 ,  0.33656323, -0.9770989 ],
         [-1.3609507 , -0.9691739 ,  0.11406882],
         [ 0.34177753,  1.3170173 , -0.7805903 ]],

        [[-0.41540006,  1.4038192 , -0.57709444],
         [ 0.39822683,  0.8117516 ,  0.6483183 ],
         [ 1.1203729 ,  0.89982706, -0.3563947 ],
         ...,
         [-1.8458656 , -0.529901  , -0.12438922],
         [-0.70390856,  0.991487  ,  0.29451367],
         [-0.16656587,  0.39819124, -0.32156795]]]], dtype=float32)>
x[...,:2]
<tf.Tensor: id=337, shape=(4, 32, 32, 2), dtype=float32, numpy=
array([[[[-4.32419658e-01, -5.09877086e-01],
         [ 7.07078397e-01,  1.08556092e+00],
         [ 2.03751564e+00,  2.20974469e+00],
         ...,
         [-1.73798287e+00,  2.08242488e+00],
         [ 1.04848671e+00,  3.86502817e-02],
         [ 1.94842041e+00,  1.75146550e-01]],

        [[-1.22242415e+00,  1.14156567e-01],
         [ 1.12284362e-01,  1.08892906e+00],
         [-8.34530950e-01, -1.70782948e+00],
         ...,
         [ 1.69857323e+00,  6.60117984e-01],
         [-1.20697165e+00, -3.40070814e-01],
         [-9.74530220e-01,  2.95405531e+00]],

        [[ 1.53859055e+00, -4.01938885e-01],
         [-2.44982779e-01,  5.79078756e-02],
         [ 5.64583302e-01,  8.56695920e-02],
         ...,
         [-1.02372777e+00,  7.08552539e-01],
         [ 7.16332570e-02, -2.40599617e-01],
         [-6.86471701e-01, -6.15428805e-01]],

        ...,

        [[ 3.49730015e-01,  2.24585116e-01],
         [ 7.28564620e-01,  5.78985870e-01],
         [-9.29911435e-01, -1.64117023e-01],
         ...,
         [ 6.97564244e-01, -1.03721142e+00],
         [-9.32126760e-01,  2.18500376e+00],
         [ 1.30037308e+00,  4.05710965e-01]],

        [[ 4.41274285e-01,  1.33300710e+00],
         [-5.07285185e-02,  7.71887004e-01],
         [ 1.36698377e+00,  6.98972285e-01],
         ...,
         [-2.28845787e+00, -1.81216732e-01],
         [ 1.69670284e+00, -1.76242697e+00],
         [-3.43527651e+00,  1.00924873e+00]],

        [[-1.85264707e+00,  1.78312480e+00],
         [ 6.95917308e-01, -5.66721261e-01],
         [-1.15507853e+00, -1.41172934e+00],
         ...,
         [ 8.31341088e-01, -9.16649461e-01],
         [ 1.07535470e+00,  5.34629583e-01],
         [-5.12858629e-01, -1.32675040e+00]]],


       [[[ 5.15374005e-01,  7.83893585e-01],
         [-5.10668635e-01,  8.44145715e-02],
         [-7.38578796e-01, -1.05675951e-01],
         ...,
         [-1.68340653e-01, -6.29219949e-01],
         [ 8.27257056e-03,  3.53782088e-01],
         [-1.75835520e-01, -1.02496004e+00]],

        [[ 5.39848745e-01, -8.77456188e-01],
         [ 1.17818582e+00, -1.68457657e-01],
         [-8.92112613e-01, -2.53036648e-01],
         ...,
         [ 4.76836145e-01, -7.02653229e-01],
         [-1.21396482e+00, -7.97563732e-01],
         [ 1.68496937e-01,  1.63219774e+00]],

        [[ 6.14707656e-02, -2.22510052e+00],
         [ 6.60790026e-01, -5.71392179e-01],
         [ 4.46515530e-01,  1.48257148e+00],
         ...,
         [ 1.59548461e-01, -4.66489851e-01],
         [ 5.70508897e-01, -6.02949977e-01],
         [-9.61358771e-02, -8.07489038e-01]],

        ...,

        [[ 1.11200154e+00,  4.09107879e-02],
         [ 1.92398891e-01, -2.43882507e-01],
         [-6.16861522e-01,  3.30834657e-01],
         ...,
         [ 1.08092427e+00, -2.10613221e-01],
         [-1.08140218e+00,  1.39687918e-02],
         [ 1.24228811e+00, -6.08722150e-01]],

        [[ 7.05049694e-01, -1.40212035e+00],
         [-1.06533384e+00, -7.05543518e-01],
         [ 1.22273874e+00, -2.89045662e-01],
         ...,
         [ 5.75357139e-01,  2.59365559e-01],
         [-9.37040001e-02, -1.31189153e-01],
         [-6.05499923e-01,  2.40189695e+00]],

        [[-1.26868278e-01, -2.84510493e-01],
         [-8.98756832e-03, -5.94383702e-02],
         [ 1.31652176e+00,  1.80804059e-01],
         ...,
         [ 1.25307858e+00,  2.54882038e-01],
         [ 2.48023421e-01,  5.87993503e-01],
         [-3.11426550e-01,  7.76554421e-02]]],


       [[[ 7.94594884e-01, -1.23478413e+00],
         [-2.61596799e-01, -1.07140350e+00],
         [-1.04185021e+00,  6.18107796e-01],
         ...,
         [ 4.80435863e-02, -1.15827656e+00],
         [-3.01748335e-01,  2.61140203e+00],
         [-1.24085498e+00, -1.01395178e+00]],

        [[-1.70756304e+00,  8.01624835e-01],
         [-9.35730755e-01,  1.23512113e+00],
         [-1.14742398e+00, -3.91313255e-01],
         ...,
         [ 6.40669644e-01, -9.15075839e-01],
         [ 1.16232657e+00,  8.53322864e-01],
         [ 7.18702793e-01,  4.90192175e-01]],

        [[ 1.59264579e-01,  5.80137432e-01],
         [ 5.91533244e-01,  2.47637630e-01],
         [-3.17491978e-01,  6.98253512e-01],
         ...,
         [ 8.70787919e-01,  4.95093584e-01],
         [-1.43054438e+00, -2.50103455e-02],
         [ 1.73846453e-01, -5.12535393e-01]],

        ...,

        [[ 1.06495105e-01,  2.19252691e-01],
         [-2.09900483e-01, -6.46610618e-01],
         [-1.48096263e+00,  4.26711738e-01],
         ...,
         [-1.10378349e+00,  1.51737094e+00],
         [-5.46817839e-01,  7.15347230e-01],
         [-9.20346200e-01, -4.47969943e-01]],

        [[-3.29064280e-01, -1.57440996e+00],
         [-6.52329922e-01,  9.49339688e-01],
         [-7.68974721e-01, -1.01939487e+00],
         ...,
         [ 1.08868277e+00, -9.91534054e-01],
         [-1.05872667e+00, -2.18733954e+00],
         [-9.65277076e-01,  6.02715760e-02]],

        [[-6.55933201e-01, -1.90852568e-01],
         [ 6.69809207e-02, -1.60405159e-01],
         [-8.13668787e-01,  2.77129024e-01],
         ...,
         [-8.05111602e-02, -2.95418687e-03],
         [-1.86646760e-01, -1.24271476e+00],
         [-2.24124336e+00,  8.20790529e-01]]],


       [[[ 1.21615469e+00, -1.25734389e+00],
         [-1.60640764e+00, -1.03736711e+00],
         [ 9.02883410e-02, -7.93412685e-01],
         ...,
         [ 4.61599380e-01,  5.13122797e-01],
         [-1.59395611e+00,  2.57769674e-01],
         [-1.08297598e+00, -1.07800281e+00]],

        [[ 7.98701882e-01,  6.76372886e-01],
         [-2.05540508e-01,  2.22589970e+00],
         [ 1.64929485e+00, -1.22561657e+00],
         ...,
         [ 5.26797056e-01, -1.11561072e+00],
         [ 5.97624958e-01, -1.42387116e+00],
         [-2.70100653e-01, -4.48708653e-01]],

        [[ 6.37565970e-01, -7.61695087e-01],
         [-1.58655822e+00,  6.20708108e-01],
         [ 1.08450644e-01, -1.23896316e-01],
         ...,
         [-2.67623752e-01, -1.39404690e+00],
         [ 1.53194144e-01,  1.46294370e-01],
         [ 1.01157463e+00,  1.98130918e+00]],

        ...,

        [[ 1.44820285e+00, -7.64832199e-01],
         [ 1.85101616e+00, -3.48400623e-01],
         [-9.28920656e-02, -1.17836058e+00],
         ...,
         [ 2.22640961e-01,  5.24103224e-01],
         [-4.21320707e-01,  1.12096369e+00],
         [ 1.63068497e+00,  4.85445052e-01]],

        [[-8.18674359e-03,  8.83071482e-01],
         [-7.32966781e-01, -1.46208084e+00],
         [ 3.86497378e-01, -6.70277536e-01],
         ...,
         [ 9.24139202e-01,  3.36563230e-01],
         [-1.36095071e+00, -9.69173908e-01],
         [ 3.41777533e-01,  1.31701732e+00]],

        [[-4.15400058e-01,  1.40381920e+00],
         [ 3.98226827e-01,  8.11751604e-01],
         [ 1.12037289e+00,  8.99827063e-01],
         ...,
         [-1.84586561e+00, -5.29901028e-01],
         [-7.03908563e-01,  9.91487026e-01],
         [-1.66565865e-01,  3.98191243e-01]]]], dtype=float32)>
张量的索引与切片方式多种多样,尤其是切片操作,初学者容易犯迷糊。但本质上切片操作只有start🔚step这一种基本形式,通过这种基本形式有目的地省略掉默认参数,从而衍生出多种简写方法,这也是很好理解的。它衍生的简写形式熟练后一看就能推测出省略掉的信息,书写起来也更方便快捷。由于深度学习一般处理的维度数在四维以内,⋯

维度变换

在神经网络运算过程中,维度变换是最核心的张量操作,通过维度变换可以将数据任
意地切换形式,满足不同场合的运算需求。

算法的每个模块对于数据张量的格式有不同的逻辑要求,当现有的数据格式不满足算
法要求时,需要通过维度变换将数据调整为正确的格式。这就是维度变换的功能。
基本的维度变换操作函数包含了改变视图 reshape、插入新维度 expand_dims,删除维
度 squeeze、交换维度 transpose、复制数据 tile 等函数。

改变试图

通过 tf.range()模拟生成一个向量数据,并通过 tf.reshape 视图改变函数产生不同的
视图,例如:

x = tf.range(96)
x = tf.reshape(x,[2,4,4,3])
x
<tf.Tensor: id=345, shape=(2, 4, 4, 3), dtype=int32, numpy=
array([[[[ 0,  1,  2],
         [ 3,  4,  5],
         [ 6,  7,  8],
         [ 9, 10, 11]],

        [[12, 13, 14],
         [15, 16, 17],
         [18, 19, 20],
         [21, 22, 23]],

        [[24, 25, 26],
         [27, 28, 29],
         [30, 31, 32],
         [33, 34, 35]],

        [[36, 37, 38],
         [39, 40, 41],
         [42, 43, 44],
         [45, 46, 47]]],


       [[[48, 49, 50],
         [51, 52, 53],
         [54, 55, 56],
         [57, 58, 59]],

        [[60, 61, 62],
         [63, 64, 65],
         [66, 67, 68],
         [69, 70, 71]],

        [[72, 73, 74],
         [75, 76, 77],
         [78, 79, 80],
         [81, 82, 83]],

        [[84, 85, 86],
         [87, 88, 89],
         [90, 91, 92],
         [93, 94, 95]]]])>

例如,张量𝑩按着初始视图[𝑐,ℎ, ,𝑑]写入的内存布局,我们改变𝑩的理解方式,它可以
有如下多种合法的理解方式:
❑ [𝑐,ℎ ∙ ,𝑑] 张量理解为𝑐张图片,ℎ ∙ 个像素点,𝑑个通道
❑ [𝑐,ℎ, ∙ 𝑑] 张量理解为𝑐张图片,ℎ行,每行的特征长度为 ∙ 𝑑
❑ [𝑐,ℎ ∙ ∙ 𝑑] 张量理解为𝑐张图片,每张图片的特征长度为ℎ ∙ ∙ 𝑑
上述新视图的存储都不需要改变,因此是合法的。
从语法上来说,视图变换只需要满足新视图的元素总量与存储区域大小相等即可,即
新视图的元素数量等于
𝑐 ∙ ℎ ∙ ∙

小贴士:这里存储和恢复的顺序一定不能搞混。 在 TensorFlow 中,可以通过张量的 ndim 和 shape 成员属性获得张量的维度数和形

x.ndim
4
x.shape
TensorShape([2, 4, 4, 3])

通过 tf.reshape(x, new_shape),可以将张量的视图任意地合法改变,例如:

tf.reshape(x,[2,-1])
<tf.Tensor: id=348, shape=(2, 48), dtype=int32, numpy=
array([[ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11, 12, 13, 14, 15,
        16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31,
        32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47],
       [48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63,
        64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79,
        80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95]])>

其中的参数−1表示当前轴上长度需要根据张量总元素不变的法则自动推导,从而方便用户
书写。比如,上面的−1可以推导为
(2 ∙ 4 ∙ 4 ∙ 3)/2= 48

再次改变数据的视图为[2,4,12],实现如下

tf.reshape(x,[2,4,12])
<tf.Tensor: id=350, shape=(2, 4, 12), dtype=int32, numpy=
array([[[ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11],
        [12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23],
        [24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35],
        [36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47]],

       [[48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59],
        [60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71],
        [72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83],
        [84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95]]])>
tf.reshape(x,[2,-1,3])
<tf.Tensor: id=352, shape=(2, 16, 3), dtype=int32, numpy=
array([[[ 0,  1,  2],
        [ 3,  4,  5],
        [ 6,  7,  8],
        [ 9, 10, 11],
        [12, 13, 14],
        [15, 16, 17],
        [18, 19, 20],
        [21, 22, 23],
        [24, 25, 26],
        [27, 28, 29],
        [30, 31, 32],
        [33, 34, 35],
        [36, 37, 38],
        [39, 40, 41],
        [42, 43, 44],
        [45, 46, 47]],

       [[48, 49, 50],
        [51, 52, 53],
        [54, 55, 56],
        [57, 58, 59],
        [60, 61, 62],
        [63, 64, 65],
        [66, 67, 68],
        [69, 70, 71],
        [72, 73, 74],
        [75, 76, 77],
        [78, 79, 80],
        [81, 82, 83],
        [84, 85, 86],
        [87, 88, 89],
        [90, 91, 92],
        [93, 94, 95]]])>

通过上述的一系列连续变换视图操作时需要意识到,张量的存储顺序始终没有改变,数据
在内存中仍然是按着初始写入的顺序0,1,2,⋯,95保存的

增删维度

增加维度

增加一个长度为 1 的维度相当于给原有的数据添加一个新维度的概念,维度
长度为 1,故数据并不需要改变,仅仅是改变数据的理解方式,因此它其实可以理解为改
变视图的一种特殊方式。

考虑一个具体例子,一张28 × 28大小的灰度图片的数据保存为 shape 为[28,28]的张
量,在末尾给

x = tf.random.uniform([28,28],maxval=10,dtype=tf.int32)
x
<tf.Tensor: id=356, shape=(28, 28), dtype=int32, numpy=
array([[7, 6, 8, 0, 5, 7, 3, 7, 1, 0, 2, 9, 6, 2, 7, 6, 2, 4, 4, 2, 3, 9,
        1, 8, 3, 6, 0, 7],
       [8, 2, 4, 5, 3, 3, 2, 5, 4, 0, 7, 1, 3, 9, 0, 8, 8, 9, 8, 5, 1, 1,
        0, 6, 6, 5, 6, 6],
       [5, 9, 2, 0, 9, 0, 4, 0, 5, 2, 3, 8, 0, 2, 5, 5, 1, 3, 5, 9, 3, 9,
        3, 6, 8, 7, 4, 4],
       [6, 6, 0, 5, 0, 7, 2, 3, 8, 7, 4, 9, 3, 2, 4, 5, 9, 0, 2, 7, 5, 9,
        3, 1, 8, 7, 3, 5],
       [5, 8, 5, 0, 1, 0, 5, 4, 3, 1, 2, 0, 2, 9, 0, 1, 7, 4, 3, 7, 9, 1,
        9, 5, 4, 4, 6, 2],
       [4, 3, 4, 3, 4, 8, 8, 2, 6, 6, 7, 2, 6, 4, 7, 0, 8, 4, 0, 3, 5, 0,
        1, 6, 3, 3, 5, 9],
       [0, 8, 8, 4, 0, 6, 0, 8, 2, 3, 1, 3, 6, 8, 5, 5, 1, 9, 5, 8, 7, 5,
        3, 4, 5, 2, 1, 7],
       [0, 7, 3, 0, 1, 8, 9, 2, 8, 3, 3, 3, 7, 1, 4, 9, 7, 0, 7, 6, 6, 8,
        4, 0, 2, 6, 2, 2],
       [8, 7, 9, 1, 6, 8, 2, 5, 4, 9, 4, 8, 6, 6, 0, 3, 8, 3, 7, 5, 3, 3,
        1, 3, 4, 0, 8, 4],
       [7, 7, 2, 7, 3, 1, 0, 3, 5, 6, 5, 4, 1, 4, 1, 3, 7, 8, 6, 9, 9, 0,
        9, 9, 5, 3, 9, 2],
       [0, 5, 2, 1, 8, 0, 3, 0, 0, 4, 6, 8, 6, 8, 9, 1, 2, 7, 9, 0, 7, 0,
        5, 5, 5, 4, 4, 5],
       [5, 1, 8, 2, 8, 9, 6, 3, 7, 0, 3, 1, 3, 2, 4, 3, 0, 7, 5, 4, 0, 6,
        3, 2, 7, 1, 5, 7],
       [2, 6, 9, 7, 2, 9, 2, 0, 2, 9, 9, 6, 1, 9, 8, 0, 3, 9, 4, 5, 9, 0,
        3, 3, 8, 5, 0, 6],
       [4, 5, 2, 0, 7, 2, 8, 3, 6, 7, 3, 9, 6, 6, 0, 0, 5, 6, 3, 5, 2, 0,
        9, 7, 1, 0, 0, 8],
       [9, 0, 4, 4, 7, 1, 3, 2, 2, 7, 1, 3, 4, 7, 8, 4, 1, 3, 4, 4, 2, 3,
        2, 1, 1, 7, 7, 1],
       [0, 0, 3, 6, 9, 6, 9, 9, 4, 7, 7, 8, 7, 8, 4, 0, 8, 3, 5, 8, 3, 4,
        7, 6, 1, 8, 8, 5],
       [3, 1, 9, 6, 8, 7, 6, 8, 6, 0, 5, 7, 9, 5, 9, 0, 1, 8, 7, 9, 0, 3,
        8, 4, 7, 8, 4, 7],
       [0, 0, 4, 9, 3, 1, 5, 3, 0, 1, 6, 5, 8, 2, 1, 1, 8, 7, 9, 4, 2, 5,
        6, 0, 6, 4, 4, 3],
       [1, 4, 3, 2, 9, 9, 0, 0, 3, 3, 8, 4, 0, 3, 7, 4, 5, 1, 2, 0, 5, 2,
        2, 9, 8, 7, 5, 5],
       [4, 3, 1, 8, 8, 6, 0, 1, 9, 0, 5, 9, 8, 5, 8, 0, 2, 7, 8, 7, 1, 9,
        4, 9, 4, 7, 2, 6],
       [3, 4, 8, 8, 4, 6, 8, 8, 8, 2, 0, 2, 7, 6, 8, 8, 9, 5, 1, 8, 9, 2,
        7, 3, 8, 0, 3, 4],
       [3, 1, 9, 9, 3, 4, 0, 0, 5, 3, 6, 6, 9, 2, 0, 6, 4, 4, 4, 5, 7, 4,
        7, 3, 9, 1, 3, 5],
       [3, 3, 3, 9, 2, 5, 5, 0, 9, 4, 0, 9, 2, 6, 5, 1, 3, 8, 5, 4, 0, 6,
        4, 7, 5, 1, 6, 0],
       [4, 2, 6, 7, 2, 1, 4, 5, 9, 3, 0, 8, 0, 3, 6, 1, 8, 1, 9, 0, 6, 6,
        8, 2, 2, 2, 1, 6],
       [3, 2, 3, 1, 6, 2, 5, 5, 4, 3, 5, 8, 5, 4, 1, 5, 2, 2, 8, 3, 5, 3,
        2, 9, 9, 0, 0, 7],
       [4, 6, 9, 1, 4, 5, 7, 0, 5, 5, 6, 3, 8, 1, 1, 7, 1, 3, 1, 0, 2, 0,
        2, 3, 3, 3, 5, 5],
       [7, 1, 4, 3, 9, 9, 5, 0, 5, 2, 4, 3, 9, 4, 8, 9, 2, 0, 8, 9, 4, 5,
        3, 0, 5, 0, 2, 6],
       [8, 9, 6, 9, 1, 4, 7, 0, 9, 8, 7, 0, 6, 9, 4, 2, 4, 1, 1, 9, 5, 7,
        9, 9, 7, 5, 4, 7]])>

通过 tf.expand_dims(x, axis)可在指定的 axis 轴前可以插入一个新的维度:

x = tf.expand_dims(x,axis=2) ## axis=2 表示宽维度后面的一个维度
x = tf.expand_dims(x,axis=0)
x
<tf.Tensor: id=366, shape=(1, 1, 28, 28, 1, 1, 1), dtype=int32, numpy=
array([[[[[[[7]]],


          [[[6]]],


          [[[8]]],


          [[[0]]],


          [[[5]]],


          [[[7]]],


          [[[3]]],


          [[[7]]],


          [[[1]]],


          [[[0]]],


          [[[2]]],


          [[[9]]],


          [[[6]]],


          [[[2]]],


          [[[7]]],


          [[[6]]],


          [[[2]]],


          [[[4]]],


          [[[4]]],


          [[[2]]],


          [[[3]]],


          [[[9]]],


          [[[1]]],


          [[[8]]],


          [[[3]]],


          [[[6]]],


          [[[0]]],


          [[[7]]]],



         [[[[8]]],


          [[[2]]],


          [[[4]]],


          [[[5]]],


          [[[3]]],


          [[[3]]],


          [[[2]]],


          [[[5]]],


          [[[4]]],


          [[[0]]],


          [[[7]]],


          [[[1]]],


          [[[3]]],


          [[[9]]],


          [[[0]]],


          [[[8]]],


          [[[8]]],


          [[[9]]],


          [[[8]]],


          [[[5]]],


          [[[1]]],


          [[[1]]],


          [[[0]]],


          [[[6]]],


          [[[6]]],


          [[[5]]],


          [[[6]]],


          [[[6]]]],



         [[[[5]]],


          [[[9]]],


          [[[2]]],


          [[[0]]],


          [[[9]]],


          [[[0]]],


          [[[4]]],


          [[[0]]],


          [[[5]]],


          [[[2]]],


          [[[3]]],


          [[[8]]],


          [[[0]]],


          [[[2]]],


          [[[5]]],


          [[[5]]],


          [[[1]]],


          [[[3]]],


          [[[5]]],


          [[[9]]],


          [[[3]]],


          [[[9]]],


          [[[3]]],


          [[[6]]],


          [[[8]]],


          [[[7]]],


          [[[4]]],


          [[[4]]]],



         [[[[6]]],


          [[[6]]],


          [[[0]]],


          [[[5]]],


          [[[0]]],


          [[[7]]],


          [[[2]]],


          [[[3]]],


          [[[8]]],


          [[[7]]],


          [[[4]]],


          [[[9]]],


          [[[3]]],


          [[[2]]],


          [[[4]]],


          [[[5]]],


          [[[9]]],


          [[[0]]],


          [[[2]]],


          [[[7]]],


          [[[5]]],


          [[[9]]],


          [[[3]]],


          [[[1]]],


          [[[8]]],


          [[[7]]],


          [[[3]]],


          [[[5]]]],



         [[[[5]]],


          [[[8]]],


          [[[5]]],


          [[[0]]],


          [[[1]]],


          [[[0]]],


          [[[5]]],


          [[[4]]],


          [[[3]]],


          [[[1]]],


          [[[2]]],


          [[[0]]],


          [[[2]]],


          [[[9]]],


          [[[0]]],


          [[[1]]],


          [[[7]]],


          [[[4]]],


          [[[3]]],


          [[[7]]],


          [[[9]]],


          [[[1]]],


          [[[9]]],


          [[[5]]],


          [[[4]]],


          [[[4]]],


          [[[6]]],


          [[[2]]]],



         [[[[4]]],


          [[[3]]],


          [[[4]]],


          [[[3]]],


          [[[4]]],


          [[[8]]],


          [[[8]]],


          [[[2]]],


          [[[6]]],


          [[[6]]],


          [[[7]]],


          [[[2]]],


          [[[6]]],


          [[[4]]],


          [[[7]]],


          [[[0]]],


          [[[8]]],


          [[[4]]],


          [[[0]]],


          [[[3]]],


          [[[5]]],


          [[[0]]],


          [[[1]]],


          [[[6]]],


          [[[3]]],


          [[[3]]],


          [[[5]]],


          [[[9]]]],



         [[[[0]]],


          [[[8]]],


          [[[8]]],


          [[[4]]],


          [[[0]]],


          [[[6]]],


          [[[0]]],


          [[[8]]],


          [[[2]]],


          [[[3]]],


          [[[1]]],


          [[[3]]],


          [[[6]]],


          [[[8]]],


          [[[5]]],


          [[[5]]],


          [[[1]]],


          [[[9]]],


          [[[5]]],


          [[[8]]],


          [[[7]]],


          [[[5]]],


          [[[3]]],


          [[[4]]],


          [[[5]]],


          [[[2]]],


          [[[1]]],


          [[[7]]]],



         [[[[0]]],


          [[[7]]],


          [[[3]]],


          [[[0]]],


          [[[1]]],


          [[[8]]],


          [[[9]]],


          [[[2]]],


          [[[8]]],


          [[[3]]],


          [[[3]]],


          [[[3]]],


          [[[7]]],


          [[[1]]],


          [[[4]]],


          [[[9]]],


          [[[7]]],


          [[[0]]],


          [[[7]]],


          [[[6]]],


          [[[6]]],


          [[[8]]],


          [[[4]]],


          [[[0]]],


          [[[2]]],


          [[[6]]],


          [[[2]]],


          [[[2]]]],



         [[[[8]]],


          [[[7]]],


          [[[9]]],


          [[[1]]],


          [[[6]]],


          [[[8]]],


          [[[2]]],


          [[[5]]],


          [[[4]]],


          [[[9]]],


          [[[4]]],


          [[[8]]],


          [[[6]]],


          [[[6]]],


          [[[0]]],


          [[[3]]],


          [[[8]]],


          [[[3]]],


          [[[7]]],


          [[[5]]],


          [[[3]]],


          [[[3]]],


          [[[1]]],


          [[[3]]],


          [[[4]]],


          [[[0]]],


          [[[8]]],


          [[[4]]]],



         [[[[7]]],


          [[[7]]],


          [[[2]]],


          [[[7]]],


          [[[3]]],


          [[[1]]],


          [[[0]]],


          [[[3]]],


          [[[5]]],


          [[[6]]],


          [[[5]]],


          [[[4]]],


          [[[1]]],


          [[[4]]],


          [[[1]]],


          [[[3]]],


          [[[7]]],


          [[[8]]],


          [[[6]]],


          [[[9]]],


          [[[9]]],


          [[[0]]],


          [[[9]]],


          [[[9]]],


          [[[5]]],


          [[[3]]],


          [[[9]]],


          [[[2]]]],



         [[[[0]]],


          [[[5]]],


          [[[2]]],


          [[[1]]],


          [[[8]]],


          [[[0]]],


          [[[3]]],


          [[[0]]],


          [[[0]]],


          [[[4]]],


          [[[6]]],


          [[[8]]],


          [[[6]]],


          [[[8]]],


          [[[9]]],


          [[[1]]],


          [[[2]]],


          [[[7]]],


          [[[9]]],


          [[[0]]],


          [[[7]]],


          [[[0]]],


          [[[5]]],


          [[[5]]],


          [[[5]]],


          [[[4]]],


          [[[4]]],


          [[[5]]]],



         [[[[5]]],


          [[[1]]],


          [[[8]]],


          [[[2]]],


          [[[8]]],


          [[[9]]],


          [[[6]]],


          [[[3]]],


          [[[7]]],


          [[[0]]],


          [[[3]]],


          [[[1]]],


          [[[3]]],


          [[[2]]],


          [[[4]]],


          [[[3]]],


          [[[0]]],


          [[[7]]],


          [[[5]]],


          [[[4]]],


          [[[0]]],


          [[[6]]],


          [[[3]]],


          [[[2]]],


          [[[7]]],


          [[[1]]],


          [[[5]]],


          [[[7]]]],



         [[[[2]]],


          [[[6]]],


          [[[9]]],


          [[[7]]],


          [[[2]]],


          [[[9]]],


          [[[2]]],


          [[[0]]],


          [[[2]]],


          [[[9]]],


          [[[9]]],


          [[[6]]],


          [[[1]]],


          [[[9]]],


          [[[8]]],


          [[[0]]],


          [[[3]]],


          [[[9]]],


          [[[4]]],


          [[[5]]],


          [[[9]]],


          [[[0]]],


          [[[3]]],


          [[[3]]],


          [[[8]]],


          [[[5]]],


          [[[0]]],


          [[[6]]]],



         [[[[4]]],


          [[[5]]],


          [[[2]]],


          [[[0]]],


          [[[7]]],


          [[[2]]],


          [[[8]]],


          [[[3]]],


          [[[6]]],


          [[[7]]],


          [[[3]]],


          [[[9]]],


          [[[6]]],


          [[[6]]],


          [[[0]]],


          [[[0]]],


          [[[5]]],


          [[[6]]],


          [[[3]]],


          [[[5]]],


          [[[2]]],


          [[[0]]],


          [[[9]]],


          [[[7]]],


          [[[1]]],


          [[[0]]],


          [[[0]]],


          [[[8]]]],



         [[[[9]]],


          [[[0]]],


          [[[4]]],


          [[[4]]],


          [[[7]]],


          [[[1]]],


          [[[3]]],


          [[[2]]],


          [[[2]]],


          [[[7]]],


          [[[1]]],


          [[[3]]],


          [[[4]]],


          [[[7]]],


          [[[8]]],


          [[[4]]],


          [[[1]]],


          [[[3]]],


          [[[4]]],


          [[[4]]],


          [[[2]]],


          [[[3]]],


          [[[2]]],


          [[[1]]],


          [[[1]]],


          [[[7]]],


          [[[7]]],


          [[[1]]]],



         [[[[0]]],


          [[[0]]],


          [[[3]]],


          [[[6]]],


          [[[9]]],


          [[[6]]],


          [[[9]]],


          [[[9]]],


          [[[4]]],


          [[[7]]],


          [[[7]]],


          [[[8]]],


          [[[7]]],


          [[[8]]],


          [[[4]]],


          [[[0]]],


          [[[8]]],


          [[[3]]],


          [[[5]]],


          [[[8]]],


          [[[3]]],


          [[[4]]],


          [[[7]]],


          [[[6]]],


          [[[1]]],


          [[[8]]],


          [[[8]]],


          [[[5]]]],



         [[[[3]]],


          [[[1]]],


          [[[9]]],


          [[[6]]],


          [[[8]]],


          [[[7]]],


          [[[6]]],


          [[[8]]],


          [[[6]]],


          [[[0]]],


          [[[5]]],


          [[[7]]],


          [[[9]]],


          [[[5]]],


          [[[9]]],


          [[[0]]],


          [[[1]]],


          [[[8]]],


          [[[7]]],


          [[[9]]],


          [[[0]]],


          [[[3]]],


          [[[8]]],


          [[[4]]],


          [[[7]]],


          [[[8]]],


          [[[4]]],


          [[[7]]]],



         [[[[0]]],


          [[[0]]],


          [[[4]]],


          [[[9]]],


          [[[3]]],


          [[[1]]],


          [[[5]]],


          [[[3]]],


          [[[0]]],


          [[[1]]],


          [[[6]]],


          [[[5]]],


          [[[8]]],


          [[[2]]],


          [[[1]]],


          [[[1]]],


          [[[8]]],


          [[[7]]],


          [[[9]]],


          [[[4]]],


          [[[2]]],


          [[[5]]],


          [[[6]]],


          [[[0]]],


          [[[6]]],


          [[[4]]],


          [[[4]]],


          [[[3]]]],



         [[[[1]]],


          [[[4]]],


          [[[3]]],


          [[[2]]],


          [[[9]]],


          [[[9]]],


          [[[0]]],


          [[[0]]],


          [[[3]]],


          [[[3]]],


          [[[8]]],


          [[[4]]],


          [[[0]]],


          [[[3]]],


          [[[7]]],


          [[[4]]],


          [[[5]]],


          [[[1]]],


          [[[2]]],


          [[[0]]],


          [[[5]]],


          [[[2]]],


          [[[2]]],


          [[[9]]],


          [[[8]]],


          [[[7]]],


          [[[5]]],


          [[[5]]]],



         [[[[4]]],


          [[[3]]],


          [[[1]]],


          [[[8]]],


          [[[8]]],


          [[[6]]],


          [[[0]]],


          [[[1]]],


          [[[9]]],


          [[[0]]],


          [[[5]]],


          [[[9]]],


          [[[8]]],


          [[[5]]],


          [[[8]]],


          [[[0]]],


          [[[2]]],


          [[[7]]],


          [[[8]]],


          [[[7]]],


          [[[1]]],


          [[[9]]],


          [[[4]]],


          [[[9]]],


          [[[4]]],


          [[[7]]],


          [[[2]]],


          [[[6]]]],



         [[[[3]]],


          [[[4]]],


          [[[8]]],


          [[[8]]],


          [[[4]]],


          [[[6]]],


          [[[8]]],


          [[[8]]],


          [[[8]]],


          [[[2]]],


          [[[0]]],


          [[[2]]],


          [[[7]]],


          [[[6]]],


          [[[8]]],


          [[[8]]],


          [[[9]]],


          [[[5]]],


          [[[1]]],


          [[[8]]],


          [[[9]]],


          [[[2]]],


          [[[7]]],


          [[[3]]],


          [[[8]]],


          [[[0]]],


          [[[3]]],


          [[[4]]]],



         [[[[3]]],


          [[[1]]],


          [[[9]]],


          [[[9]]],


          [[[3]]],


          [[[4]]],


          [[[0]]],


          [[[0]]],


          [[[5]]],


          [[[3]]],


          [[[6]]],


          [[[6]]],


          [[[9]]],


          [[[2]]],


          [[[0]]],


          [[[6]]],


          [[[4]]],


          [[[4]]],


          [[[4]]],


          [[[5]]],


          [[[7]]],


          [[[4]]],


          [[[7]]],


          [[[3]]],


          [[[9]]],


          [[[1]]],


          [[[3]]],


          [[[5]]]],



         [[[[3]]],


          [[[3]]],


          [[[3]]],


          [[[9]]],


          [[[2]]],


          [[[5]]],


          [[[5]]],


          [[[0]]],


          [[[9]]],


          [[[4]]],


          [[[0]]],


          [[[9]]],


          [[[2]]],


          [[[6]]],


          [[[5]]],


          [[[1]]],


          [[[3]]],


          [[[8]]],


          [[[5]]],


          [[[4]]],


          [[[0]]],


          [[[6]]],


          [[[4]]],


          [[[7]]],


          [[[5]]],


          [[[1]]],


          [[[6]]],


          [[[0]]]],



         [[[[4]]],


          [[[2]]],


          [[[6]]],


          [[[7]]],


          [[[2]]],


          [[[1]]],


          [[[4]]],


          [[[5]]],


          [[[9]]],


          [[[3]]],


          [[[0]]],


          [[[8]]],


          [[[0]]],


          [[[3]]],


          [[[6]]],


          [[[1]]],


          [[[8]]],


          [[[1]]],


          [[[9]]],


          [[[0]]],


          [[[6]]],


          [[[6]]],


          [[[8]]],


          [[[2]]],


          [[[2]]],


          [[[2]]],


          [[[1]]],


          [[[6]]]],



         [[[[3]]],


          [[[2]]],


          [[[3]]],


          [[[1]]],


          [[[6]]],


          [[[2]]],


          [[[5]]],


          [[[5]]],


          [[[4]]],


          [[[3]]],


          [[[5]]],


          [[[8]]],


          [[[5]]],


          [[[4]]],


          [[[1]]],


          [[[5]]],


          [[[2]]],


          [[[2]]],


          [[[8]]],


          [[[3]]],


          [[[5]]],


          [[[3]]],


          [[[2]]],


          [[[9]]],


          [[[9]]],


          [[[0]]],


          [[[0]]],


          [[[7]]]],



         [[[[4]]],


          [[[6]]],


          [[[9]]],


          [[[1]]],


          [[[4]]],


          [[[5]]],


          [[[7]]],


          [[[0]]],


          [[[5]]],


          [[[5]]],


          [[[6]]],


          [[[3]]],


          [[[8]]],


          [[[1]]],


          [[[1]]],


          [[[7]]],


          [[[1]]],


          [[[3]]],


          [[[1]]],


          [[[0]]],


          [[[2]]],


          [[[0]]],


          [[[2]]],


          [[[3]]],


          [[[3]]],


          [[[3]]],


          [[[5]]],


          [[[5]]]],



         [[[[7]]],


          [[[1]]],


          [[[4]]],


          [[[3]]],


          [[[9]]],


          [[[9]]],


          [[[5]]],


          [[[0]]],


          [[[5]]],


          [[[2]]],


          [[[4]]],


          [[[3]]],


          [[[9]]],


          [[[4]]],


          [[[8]]],


          [[[9]]],


          [[[2]]],


          [[[0]]],


          [[[8]]],


          [[[9]]],


          [[[4]]],


          [[[5]]],


          [[[3]]],


          [[[0]]],


          [[[5]]],


          [[[0]]],


          [[[2]]],


          [[[6]]]],



         [[[[8]]],


          [[[9]]],


          [[[6]]],


          [[[9]]],


          [[[1]]],


          [[[4]]],


          [[[7]]],


          [[[0]]],


          [[[9]]],


          [[[8]]],


          [[[7]]],


          [[[0]]],


          [[[6]]],


          [[[9]]],


          [[[4]]],


          [[[2]]],


          [[[4]]],


          [[[1]]],


          [[[1]]],


          [[[9]]],


          [[[5]]],


          [[[7]]],


          [[[9]]],


          [[[9]]],


          [[[7]]],


          [[[5]]],


          [[[4]]],


          [[[7]]]]]]])>

删除维度

是增加维度的逆操作,与增加维度一样,删除维度只能删除长度为 1 的维
度,也不会改变张量的存储。继续考虑增加维度后 shape 为[1,28,28,1]的例子,如果希望将
图片数量维度删除,可以通过 tf.squeeze(x, axis)函数,axis 参数为待删除的维度的索引号,
例如,图片数量的维度轴 axis=0:

x = tf.squeeze(x,axis=0) #删除x的数量的维度
x
<tf.Tensor: id=367, shape=(1, 28, 28, 1, 1, 1), dtype=int32, numpy=
array([[[[[[7]]],


         [[[6]]],


         [[[8]]],


         [[[0]]],


         [[[5]]],


         [[[7]]],


         [[[3]]],


         [[[7]]],


         [[[1]]],


         [[[0]]],


         [[[2]]],


         [[[9]]],


         [[[6]]],


         [[[2]]],


         [[[7]]],


         [[[6]]],


         [[[2]]],


         [[[4]]],


         [[[4]]],


         [[[2]]],


         [[[3]]],


         [[[9]]],


         [[[1]]],


         [[[8]]],


         [[[3]]],


         [[[6]]],


         [[[0]]],


         [[[7]]]],



        [[[[8]]],


         [[[2]]],


         [[[4]]],


         [[[5]]],


         [[[3]]],


         [[[3]]],


         [[[2]]],


         [[[5]]],


         [[[4]]],


         [[[0]]],


         [[[7]]],


         [[[1]]],


         [[[3]]],


         [[[9]]],


         [[[0]]],


         [[[8]]],


         [[[8]]],


         [[[9]]],


         [[[8]]],


         [[[5]]],


         [[[1]]],


         [[[1]]],


         [[[0]]],


         [[[6]]],


         [[[6]]],


         [[[5]]],


         [[[6]]],


         [[[6]]]],



        [[[[5]]],


         [[[9]]],


         [[[2]]],


         [[[0]]],


         [[[9]]],


         [[[0]]],


         [[[4]]],


         [[[0]]],


         [[[5]]],


         [[[2]]],


         [[[3]]],


         [[[8]]],


         [[[0]]],


         [[[2]]],


         [[[5]]],


         [[[5]]],


         [[[1]]],


         [[[3]]],


         [[[5]]],


         [[[9]]],


         [[[3]]],


         [[[9]]],


         [[[3]]],


         [[[6]]],


         [[[8]]],


         [[[7]]],


         [[[4]]],


         [[[4]]]],



        [[[[6]]],


         [[[6]]],


         [[[0]]],


         [[[5]]],


         [[[0]]],


         [[[7]]],


         [[[2]]],


         [[[3]]],


         [[[8]]],


         [[[7]]],


         [[[4]]],


         [[[9]]],


         [[[3]]],


         [[[2]]],


         [[[4]]],


         [[[5]]],


         [[[9]]],


         [[[0]]],


         [[[2]]],


         [[[7]]],


         [[[5]]],


         [[[9]]],


         [[[3]]],


         [[[1]]],


         [[[8]]],


         [[[7]]],


         [[[3]]],


         [[[5]]]],



        [[[[5]]],


         [[[8]]],


         [[[5]]],


         [[[0]]],


         [[[1]]],


         [[[0]]],


         [[[5]]],


         [[[4]]],


         [[[3]]],


         [[[1]]],


         [[[2]]],


         [[[0]]],


         [[[2]]],


         [[[9]]],


         [[[0]]],


         [[[1]]],


         [[[7]]],


         [[[4]]],


         [[[3]]],


         [[[7]]],


         [[[9]]],


         [[[1]]],


         [[[9]]],


         [[[5]]],


         [[[4]]],


         [[[4]]],


         [[[6]]],


         [[[2]]]],



        [[[[4]]],


         [[[3]]],


         [[[4]]],


         [[[3]]],


         [[[4]]],


         [[[8]]],


         [[[8]]],


         [[[2]]],


         [[[6]]],


         [[[6]]],


         [[[7]]],


         [[[2]]],


         [[[6]]],


         [[[4]]],


         [[[7]]],


         [[[0]]],


         [[[8]]],


         [[[4]]],


         [[[0]]],


         [[[3]]],


         [[[5]]],


         [[[0]]],


         [[[1]]],


         [[[6]]],


         [[[3]]],


         [[[3]]],


         [[[5]]],


         [[[9]]]],



        [[[[0]]],


         [[[8]]],


         [[[8]]],


         [[[4]]],


         [[[0]]],


         [[[6]]],


         [[[0]]],


         [[[8]]],


         [[[2]]],


         [[[3]]],


         [[[1]]],


         [[[3]]],


         [[[6]]],


         [[[8]]],


         [[[5]]],


         [[[5]]],


         [[[1]]],


         [[[9]]],


         [[[5]]],


         [[[8]]],


         [[[7]]],


         [[[5]]],


         [[[3]]],


         [[[4]]],


         [[[5]]],


         [[[2]]],


         [[[1]]],


         [[[7]]]],



        [[[[0]]],


         [[[7]]],


         [[[3]]],


         [[[0]]],


         [[[1]]],


         [[[8]]],


         [[[9]]],


         [[[2]]],


         [[[8]]],


         [[[3]]],


         [[[3]]],


         [[[3]]],


         [[[7]]],


         [[[1]]],


         [[[4]]],


         [[[9]]],


         [[[7]]],


         [[[0]]],


         [[[7]]],


         [[[6]]],


         [[[6]]],


         [[[8]]],


         [[[4]]],


         [[[0]]],


         [[[2]]],


         [[[6]]],


         [[[2]]],


         [[[2]]]],



        [[[[8]]],


         [[[7]]],


         [[[9]]],


         [[[1]]],


         [[[6]]],


         [[[8]]],


         [[[2]]],


         [[[5]]],


         [[[4]]],


         [[[9]]],


         [[[4]]],


         [[[8]]],


         [[[6]]],


         [[[6]]],


         [[[0]]],


         [[[3]]],


         [[[8]]],


         [[[3]]],


         [[[7]]],


         [[[5]]],


         [[[3]]],


         [[[3]]],


         [[[1]]],


         [[[3]]],


         [[[4]]],


         [[[0]]],


         [[[8]]],


         [[[4]]]],



        [[[[7]]],


         [[[7]]],


         [[[2]]],


         [[[7]]],


         [[[3]]],


         [[[1]]],


         [[[0]]],


         [[[3]]],


         [[[5]]],


         [[[6]]],


         [[[5]]],


         [[[4]]],


         [[[1]]],


         [[[4]]],


         [[[1]]],


         [[[3]]],


         [[[7]]],


         [[[8]]],


         [[[6]]],


         [[[9]]],


         [[[9]]],


         [[[0]]],


         [[[9]]],


         [[[9]]],


         [[[5]]],


         [[[3]]],


         [[[9]]],


         [[[2]]]],



        [[[[0]]],


         [[[5]]],


         [[[2]]],


         [[[1]]],


         [[[8]]],


         [[[0]]],


         [[[3]]],


         [[[0]]],


         [[[0]]],


         [[[4]]],


         [[[6]]],


         [[[8]]],


         [[[6]]],


         [[[8]]],


         [[[9]]],


         [[[1]]],


         [[[2]]],


         [[[7]]],


         [[[9]]],


         [[[0]]],


         [[[7]]],


         [[[0]]],


         [[[5]]],


         [[[5]]],


         [[[5]]],


         [[[4]]],


         [[[4]]],


         [[[5]]]],



        [[[[5]]],


         [[[1]]],


         [[[8]]],


         [[[2]]],


         [[[8]]],


         [[[9]]],


         [[[6]]],


         [[[3]]],


         [[[7]]],


         [[[0]]],


         [[[3]]],


         [[[1]]],


         [[[3]]],


         [[[2]]],


         [[[4]]],


         [[[3]]],


         [[[0]]],


         [[[7]]],


         [[[5]]],


         [[[4]]],


         [[[0]]],


         [[[6]]],


         [[[3]]],


         [[[2]]],


         [[[7]]],


         [[[1]]],


         [[[5]]],


         [[[7]]]],



        [[[[2]]],


         [[[6]]],


         [[[9]]],


         [[[7]]],


         [[[2]]],


         [[[9]]],


         [[[2]]],


         [[[0]]],


         [[[2]]],


         [[[9]]],


         [[[9]]],


         [[[6]]],


         [[[1]]],


         [[[9]]],


         [[[8]]],


         [[[0]]],


         [[[3]]],


         [[[9]]],


         [[[4]]],


         [[[5]]],


         [[[9]]],


         [[[0]]],


         [[[3]]],


         [[[3]]],


         [[[8]]],


         [[[5]]],


         [[[0]]],


         [[[6]]]],



        [[[[4]]],


         [[[5]]],


         [[[2]]],


         [[[0]]],


         [[[7]]],


         [[[2]]],


         [[[8]]],


         [[[3]]],


         [[[6]]],


         [[[7]]],


         [[[3]]],


         [[[9]]],


         [[[6]]],


         [[[6]]],


         [[[0]]],


         [[[0]]],


         [[[5]]],


         [[[6]]],


         [[[3]]],


         [[[5]]],


         [[[2]]],


         [[[0]]],


         [[[9]]],


         [[[7]]],


         [[[1]]],


         [[[0]]],


         [[[0]]],


         [[[8]]]],



        [[[[9]]],


         [[[0]]],


         [[[4]]],


         [[[4]]],


         [[[7]]],


         [[[1]]],


         [[[3]]],


         [[[2]]],


         [[[2]]],


         [[[7]]],


         [[[1]]],


         [[[3]]],


         [[[4]]],


         [[[7]]],


         [[[8]]],


         [[[4]]],


         [[[1]]],


         [[[3]]],


         [[[4]]],


         [[[4]]],


         [[[2]]],


         [[[3]]],


         [[[2]]],


         [[[1]]],


         [[[1]]],


         [[[7]]],


         [[[7]]],


         [[[1]]]],



        [[[[0]]],


         [[[0]]],


         [[[3]]],


         [[[6]]],


         [[[9]]],


         [[[6]]],


         [[[9]]],


         [[[9]]],


         [[[4]]],


         [[[7]]],


         [[[7]]],


         [[[8]]],


         [[[7]]],


         [[[8]]],


         [[[4]]],


         [[[0]]],


         [[[8]]],


         [[[3]]],


         [[[5]]],


         [[[8]]],


         [[[3]]],


         [[[4]]],


         [[[7]]],


         [[[6]]],


         [[[1]]],


         [[[8]]],


         [[[8]]],


         [[[5]]]],



        [[[[3]]],


         [[[1]]],


         [[[9]]],


         [[[6]]],


         [[[8]]],


         [[[7]]],


         [[[6]]],


         [[[8]]],


         [[[6]]],


         [[[0]]],


         [[[5]]],


         [[[7]]],


         [[[9]]],


         [[[5]]],


         [[[9]]],


         [[[0]]],


         [[[1]]],


         [[[8]]],


         [[[7]]],


         [[[9]]],


         [[[0]]],


         [[[3]]],


         [[[8]]],


         [[[4]]],


         [[[7]]],


         [[[8]]],


         [[[4]]],


         [[[7]]]],



        [[[[0]]],


         [[[0]]],


         [[[4]]],


         [[[9]]],


         [[[3]]],


         [[[1]]],


         [[[5]]],


         [[[3]]],


         [[[0]]],


         [[[1]]],


         [[[6]]],


         [[[5]]],


         [[[8]]],


         [[[2]]],


         [[[1]]],


         [[[1]]],


         [[[8]]],


         [[[7]]],


         [[[9]]],


         [[[4]]],


         [[[2]]],


         [[[5]]],


         [[[6]]],


         [[[0]]],


         [[[6]]],


         [[[4]]],


         [[[4]]],


         [[[3]]]],



        [[[[1]]],


         [[[4]]],


         [[[3]]],


         [[[2]]],


         [[[9]]],


         [[[9]]],


         [[[0]]],


         [[[0]]],


         [[[3]]],


         [[[3]]],


         [[[8]]],


         [[[4]]],


         [[[0]]],


         [[[3]]],


         [[[7]]],


         [[[4]]],


         [[[5]]],


         [[[1]]],


         [[[2]]],


         [[[0]]],


         [[[5]]],


         [[[2]]],


         [[[2]]],


         [[[9]]],


         [[[8]]],


         [[[7]]],


         [[[5]]],


         [[[5]]]],



        [[[[4]]],


         [[[3]]],


         [[[1]]],


         [[[8]]],


         [[[8]]],


         [[[6]]],


         [[[0]]],


         [[[1]]],


         [[[9]]],


         [[[0]]],


         [[[5]]],


         [[[9]]],


         [[[8]]],


         [[[5]]],


         [[[8]]],


         [[[0]]],


         [[[2]]],


         [[[7]]],


         [[[8]]],


         [[[7]]],


         [[[1]]],


         [[[9]]],


         [[[4]]],


         [[[9]]],


         [[[4]]],


         [[[7]]],


         [[[2]]],


         [[[6]]]],



        [[[[3]]],


         [[[4]]],


         [[[8]]],


         [[[8]]],


         [[[4]]],


         [[[6]]],


         [[[8]]],


         [[[8]]],


         [[[8]]],


         [[[2]]],


         [[[0]]],


         [[[2]]],


         [[[7]]],


         [[[6]]],


         [[[8]]],


         [[[8]]],


         [[[9]]],


         [[[5]]],


         [[[1]]],


         [[[8]]],


         [[[9]]],


         [[[2]]],


         [[[7]]],


         [[[3]]],


         [[[8]]],


         [[[0]]],


         [[[3]]],


         [[[4]]]],



        [[[[3]]],


         [[[1]]],


         [[[9]]],


         [[[9]]],


         [[[3]]],


         [[[4]]],


         [[[0]]],


         [[[0]]],


         [[[5]]],


         [[[3]]],


         [[[6]]],


         [[[6]]],


         [[[9]]],


         [[[2]]],


         [[[0]]],


         [[[6]]],


         [[[4]]],


         [[[4]]],


         [[[4]]],


         [[[5]]],


         [[[7]]],


         [[[4]]],


         [[[7]]],


         [[[3]]],


         [[[9]]],


         [[[1]]],


         [[[3]]],


         [[[5]]]],



        [[[[3]]],


         [[[3]]],


         [[[3]]],


         [[[9]]],


         [[[2]]],


         [[[5]]],


         [[[5]]],


         [[[0]]],


         [[[9]]],


         [[[4]]],


         [[[0]]],


         [[[9]]],


         [[[2]]],


         [[[6]]],


         [[[5]]],


         [[[1]]],


         [[[3]]],


         [[[8]]],


         [[[5]]],


         [[[4]]],


         [[[0]]],


         [[[6]]],


         [[[4]]],


         [[[7]]],


         [[[5]]],


         [[[1]]],


         [[[6]]],


         [[[0]]]],



        [[[[4]]],


         [[[2]]],


         [[[6]]],


         [[[7]]],


         [[[2]]],


         [[[1]]],


         [[[4]]],


         [[[5]]],


         [[[9]]],


         [[[3]]],


         [[[0]]],


         [[[8]]],


         [[[0]]],


         [[[3]]],


         [[[6]]],


         [[[1]]],


         [[[8]]],


         [[[1]]],


         [[[9]]],


         [[[0]]],


         [[[6]]],


         [[[6]]],


         [[[8]]],


         [[[2]]],


         [[[2]]],


         [[[2]]],


         [[[1]]],


         [[[6]]]],



        [[[[3]]],


         [[[2]]],


         [[[3]]],


         [[[1]]],


         [[[6]]],


         [[[2]]],


         [[[5]]],


         [[[5]]],


         [[[4]]],


         [[[3]]],


         [[[5]]],


         [[[8]]],


         [[[5]]],


         [[[4]]],


         [[[1]]],


         [[[5]]],


         [[[2]]],


         [[[2]]],


         [[[8]]],


         [[[3]]],


         [[[5]]],


         [[[3]]],


         [[[2]]],


         [[[9]]],


         [[[9]]],


         [[[0]]],


         [[[0]]],


         [[[7]]]],



        [[[[4]]],


         [[[6]]],


         [[[9]]],


         [[[1]]],


         [[[4]]],


         [[[5]]],


         [[[7]]],


         [[[0]]],


         [[[5]]],


         [[[5]]],


         [[[6]]],


         [[[3]]],


         [[[8]]],


         [[[1]]],


         [[[1]]],


         [[[7]]],


         [[[1]]],


         [[[3]]],


         [[[1]]],


         [[[0]]],


         [[[2]]],


         [[[0]]],


         [[[2]]],


         [[[3]]],


         [[[3]]],


         [[[3]]],


         [[[5]]],


         [[[5]]]],



        [[[[7]]],


         [[[1]]],


         [[[4]]],


         [[[3]]],


         [[[9]]],


         [[[9]]],


         [[[5]]],


         [[[0]]],


         [[[5]]],


         [[[2]]],


         [[[4]]],


         [[[3]]],


         [[[9]]],


         [[[4]]],


         [[[8]]],


         [[[9]]],


         [[[2]]],


         [[[0]]],


         [[[8]]],


         [[[9]]],


         [[[4]]],


         [[[5]]],


         [[[3]]],


         [[[0]]],


         [[[5]]],


         [[[0]]],


         [[[2]]],


         [[[6]]]],



        [[[[8]]],


         [[[9]]],


         [[[6]]],


         [[[9]]],


         [[[1]]],


         [[[4]]],


         [[[7]]],


         [[[0]]],


         [[[9]]],


         [[[8]]],


         [[[7]]],


         [[[0]]],


         [[[6]]],


         [[[9]]],


         [[[4]]],


         [[[2]]],


         [[[4]]],


         [[[1]]],


         [[[1]]],


         [[[9]]],


         [[[5]]],


         [[[7]]],


         [[[9]]],


         [[[9]]],


         [[[7]]],


         [[[5]]],


         [[[4]]],


         [[[7]]]]]])>

继续删除通道数维度,由于已经删除了图片数量维度,此时的 x 的 shape 为[28,28,1],因

x = tf.squeeze(x, axis=3) # 删除图片通道数维度
x
<tf.Tensor: id=368, shape=(1, 28, 28, 1, 1), dtype=int32, numpy=
array([[[[[7]],

         [[6]],

         [[8]],

         [[0]],

         [[5]],

         [[7]],

         [[3]],

         [[7]],

         [[1]],

         [[0]],

         [[2]],

         [[9]],

         [[6]],

         [[2]],

         [[7]],

         [[6]],

         [[2]],

         [[4]],

         [[4]],

         [[2]],

         [[3]],

         [[9]],

         [[1]],

         [[8]],

         [[3]],

         [[6]],

         [[0]],

         [[7]]],


        [[[8]],

         [[2]],

         [[4]],

         [[5]],

         [[3]],

         [[3]],

         [[2]],

         [[5]],

         [[4]],

         [[0]],

         [[7]],

         [[1]],

         [[3]],

         [[9]],

         [[0]],

         [[8]],

         [[8]],

         [[9]],

         [[8]],

         [[5]],

         [[1]],

         [[1]],

         [[0]],

         [[6]],

         [[6]],

         [[5]],

         [[6]],

         [[6]]],


        [[[5]],

         [[9]],

         [[2]],

         [[0]],

         [[9]],

         [[0]],

         [[4]],

         [[0]],

         [[5]],

         [[2]],

         [[3]],

         [[8]],

         [[0]],

         [[2]],

         [[5]],

         [[5]],

         [[1]],

         [[3]],

         [[5]],

         [[9]],

         [[3]],

         [[9]],

         [[3]],

         [[6]],

         [[8]],

         [[7]],

         [[4]],

         [[4]]],


        [[[6]],

         [[6]],

         [[0]],

         [[5]],

         [[0]],

         [[7]],

         [[2]],

         [[3]],

         [[8]],

         [[7]],

         [[4]],

         [[9]],

         [[3]],

         [[2]],

         [[4]],

         [[5]],

         [[9]],

         [[0]],

         [[2]],

         [[7]],

         [[5]],

         [[9]],

         [[3]],

         [[1]],

         [[8]],

         [[7]],

         [[3]],

         [[5]]],


        [[[5]],

         [[8]],

         [[5]],

         [[0]],

         [[1]],

         [[0]],

         [[5]],

         [[4]],

         [[3]],

         [[1]],

         [[2]],

         [[0]],

         [[2]],

         [[9]],

         [[0]],

         [[1]],

         [[7]],

         [[4]],

         [[3]],

         [[7]],

         [[9]],

         [[1]],

         [[9]],

         [[5]],

         [[4]],

         [[4]],

         [[6]],

         [[2]]],


        [[[4]],

         [[3]],

         [[4]],

         [[3]],

         [[4]],

         [[8]],

         [[8]],

         [[2]],

         [[6]],

         [[6]],

         [[7]],

         [[2]],

         [[6]],

         [[4]],

         [[7]],

         [[0]],

         [[8]],

         [[4]],

         [[0]],

         [[3]],

         [[5]],

         [[0]],

         [[1]],

         [[6]],

         [[3]],

         [[3]],

         [[5]],

         [[9]]],


        [[[0]],

         [[8]],

         [[8]],

         [[4]],

         [[0]],

         [[6]],

         [[0]],

         [[8]],

         [[2]],

         [[3]],

         [[1]],

         [[3]],

         [[6]],

         [[8]],

         [[5]],

         [[5]],

         [[1]],

         [[9]],

         [[5]],

         [[8]],

         [[7]],

         [[5]],

         [[3]],

         [[4]],

         [[5]],

         [[2]],

         [[1]],

         [[7]]],


        [[[0]],

         [[7]],

         [[3]],

         [[0]],

         [[1]],

         [[8]],

         [[9]],

         [[2]],

         [[8]],

         [[3]],

         [[3]],

         [[3]],

         [[7]],

         [[1]],

         [[4]],

         [[9]],

         [[7]],

         [[0]],

         [[7]],

         [[6]],

         [[6]],

         [[8]],

         [[4]],

         [[0]],

         [[2]],

         [[6]],

         [[2]],

         [[2]]],


        [[[8]],

         [[7]],

         [[9]],

         [[1]],

         [[6]],

         [[8]],

         [[2]],

         [[5]],

         [[4]],

         [[9]],

         [[4]],

         [[8]],

         [[6]],

         [[6]],

         [[0]],

         [[3]],

         [[8]],

         [[3]],

         [[7]],

         [[5]],

         [[3]],

         [[3]],

         [[1]],

         [[3]],

         [[4]],

         [[0]],

         [[8]],

         [[4]]],


        [[[7]],

         [[7]],

         [[2]],

         [[7]],

         [[3]],

         [[1]],

         [[0]],

         [[3]],

         [[5]],

         [[6]],

         [[5]],

         [[4]],

         [[1]],

         [[4]],

         [[1]],

         [[3]],

         [[7]],

         [[8]],

         [[6]],

         [[9]],

         [[9]],

         [[0]],

         [[9]],

         [[9]],

         [[5]],

         [[3]],

         [[9]],

         [[2]]],


        [[[0]],

         [[5]],

         [[2]],

         [[1]],

         [[8]],

         [[0]],

         [[3]],

         [[0]],

         [[0]],

         [[4]],

         [[6]],

         [[8]],

         [[6]],

         [[8]],

         [[9]],

         [[1]],

         [[2]],

         [[7]],

         [[9]],

         [[0]],

         [[7]],

         [[0]],

         [[5]],

         [[5]],

         [[5]],

         [[4]],

         [[4]],

         [[5]]],


        [[[5]],

         [[1]],

         [[8]],

         [[2]],

         [[8]],

         [[9]],

         [[6]],

         [[3]],

         [[7]],

         [[0]],

         [[3]],

         [[1]],

         [[3]],

         [[2]],

         [[4]],

         [[3]],

         [[0]],

         [[7]],

         [[5]],

         [[4]],

         [[0]],

         [[6]],

         [[3]],

         [[2]],

         [[7]],

         [[1]],

         [[5]],

         [[7]]],


        [[[2]],

         [[6]],

         [[9]],

         [[7]],

         [[2]],

         [[9]],

         [[2]],

         [[0]],

         [[2]],

         [[9]],

         [[9]],

         [[6]],

         [[1]],

         [[9]],

         [[8]],

         [[0]],

         [[3]],

         [[9]],

         [[4]],

         [[5]],

         [[9]],

         [[0]],

         [[3]],

         [[3]],

         [[8]],

         [[5]],

         [[0]],

         [[6]]],


        [[[4]],

         [[5]],

         [[2]],

         [[0]],

         [[7]],

         [[2]],

         [[8]],

         [[3]],

         [[6]],

         [[7]],

         [[3]],

         [[9]],

         [[6]],

         [[6]],

         [[0]],

         [[0]],

         [[5]],

         [[6]],

         [[3]],

         [[5]],

         [[2]],

         [[0]],

         [[9]],

         [[7]],

         [[1]],

         [[0]],

         [[0]],

         [[8]]],


        [[[9]],

         [[0]],

         [[4]],

         [[4]],

         [[7]],

         [[1]],

         [[3]],

         [[2]],

         [[2]],

         [[7]],

         [[1]],

         [[3]],

         [[4]],

         [[7]],

         [[8]],

         [[4]],

         [[1]],

         [[3]],

         [[4]],

         [[4]],

         [[2]],

         [[3]],

         [[2]],

         [[1]],

         [[1]],

         [[7]],

         [[7]],

         [[1]]],


        [[[0]],

         [[0]],

         [[3]],

         [[6]],

         [[9]],

         [[6]],

         [[9]],

         [[9]],

         [[4]],

         [[7]],

         [[7]],

         [[8]],

         [[7]],

         [[8]],

         [[4]],

         [[0]],

         [[8]],

         [[3]],

         [[5]],

         [[8]],

         [[3]],

         [[4]],

         [[7]],

         [[6]],

         [[1]],

         [[8]],

         [[8]],

         [[5]]],


        [[[3]],

         [[1]],

         [[9]],

         [[6]],

         [[8]],

         [[7]],

         [[6]],

         [[8]],

         [[6]],

         [[0]],

         [[5]],

         [[7]],

         [[9]],

         [[5]],

         [[9]],

         [[0]],

         [[1]],

         [[8]],

         [[7]],

         [[9]],

         [[0]],

         [[3]],

         [[8]],

         [[4]],

         [[7]],

         [[8]],

         [[4]],

         [[7]]],


        [[[0]],

         [[0]],

         [[4]],

         [[9]],

         [[3]],

         [[1]],

         [[5]],

         [[3]],

         [[0]],

         [[1]],

         [[6]],

         [[5]],

         [[8]],

         [[2]],

         [[1]],

         [[1]],

         [[8]],

         [[7]],

         [[9]],

         [[4]],

         [[2]],

         [[5]],

         [[6]],

         [[0]],

         [[6]],

         [[4]],

         [[4]],

         [[3]]],


        [[[1]],

         [[4]],

         [[3]],

         [[2]],

         [[9]],

         [[9]],

         [[0]],

         [[0]],

         [[3]],

         [[3]],

         [[8]],

         [[4]],

         [[0]],

         [[3]],

         [[7]],

         [[4]],

         [[5]],

         [[1]],

         [[2]],

         [[0]],

         [[5]],

         [[2]],

         [[2]],

         [[9]],

         [[8]],

         [[7]],

         [[5]],

         [[5]]],


        [[[4]],

         [[3]],

         [[1]],

         [[8]],

         [[8]],

         [[6]],

         [[0]],

         [[1]],

         [[9]],

         [[0]],

         [[5]],

         [[9]],

         [[8]],

         [[5]],

         [[8]],

         [[0]],

         [[2]],

         [[7]],

         [[8]],

         [[7]],

         [[1]],

         [[9]],

         [[4]],

         [[9]],

         [[4]],

         [[7]],

         [[2]],

         [[6]]],


        [[[3]],

         [[4]],

         [[8]],

         [[8]],

         [[4]],

         [[6]],

         [[8]],

         [[8]],

         [[8]],

         [[2]],

         [[0]],

         [[2]],

         [[7]],

         [[6]],

         [[8]],

         [[8]],

         [[9]],

         [[5]],

         [[1]],

         [[8]],

         [[9]],

         [[2]],

         [[7]],

         [[3]],

         [[8]],

         [[0]],

         [[3]],

         [[4]]],


        [[[3]],

         [[1]],

         [[9]],

         [[9]],

         [[3]],

         [[4]],

         [[0]],

         [[0]],

         [[5]],

         [[3]],

         [[6]],

         [[6]],

         [[9]],

         [[2]],

         [[0]],

         [[6]],

         [[4]],

         [[4]],

         [[4]],

         [[5]],

         [[7]],

         [[4]],

         [[7]],

         [[3]],

         [[9]],

         [[1]],

         [[3]],

         [[5]]],


        [[[3]],

         [[3]],

         [[3]],

         [[9]],

         [[2]],

         [[5]],

         [[5]],

         [[0]],

         [[9]],

         [[4]],

         [[0]],

         [[9]],

         [[2]],

         [[6]],

         [[5]],

         [[1]],

         [[3]],

         [[8]],

         [[5]],

         [[4]],

         [[0]],

         [[6]],

         [[4]],

         [[7]],

         [[5]],

         [[1]],

         [[6]],

         [[0]]],


        [[[4]],

         [[2]],

         [[6]],

         [[7]],

         [[2]],

         [[1]],

         [[4]],

         [[5]],

         [[9]],

         [[3]],

         [[0]],

         [[8]],

         [[0]],

         [[3]],

         [[6]],

         [[1]],

         [[8]],

         [[1]],

         [[9]],

         [[0]],

         [[6]],

         [[6]],

         [[8]],

         [[2]],

         [[2]],

         [[2]],

         [[1]],

         [[6]]],


        [[[3]],

         [[2]],

         [[3]],

         [[1]],

         [[6]],

         [[2]],

         [[5]],

         [[5]],

         [[4]],

         [[3]],

         [[5]],

         [[8]],

         [[5]],

         [[4]],

         [[1]],

         [[5]],

         [[2]],

         [[2]],

         [[8]],

         [[3]],

         [[5]],

         [[3]],

         [[2]],

         [[9]],

         [[9]],

         [[0]],

         [[0]],

         [[7]]],


        [[[4]],

         [[6]],

         [[9]],

         [[1]],

         [[4]],

         [[5]],

         [[7]],

         [[0]],

         [[5]],

         [[5]],

         [[6]],

         [[3]],

         [[8]],

         [[1]],

         [[1]],

         [[7]],

         [[1]],

         [[3]],

         [[1]],

         [[0]],

         [[2]],

         [[0]],

         [[2]],

         [[3]],

         [[3]],

         [[3]],

         [[5]],

         [[5]]],


        [[[7]],

         [[1]],

         [[4]],

         [[3]],

         [[9]],

         [[9]],

         [[5]],

         [[0]],

         [[5]],

         [[2]],

         [[4]],

         [[3]],

         [[9]],

         [[4]],

         [[8]],

         [[9]],

         [[2]],

         [[0]],

         [[8]],

         [[9]],

         [[4]],

         [[5]],

         [[3]],

         [[0]],

         [[5]],

         [[0]],

         [[2]],

         [[6]]],


        [[[8]],

         [[9]],

         [[6]],

         [[9]],

         [[1]],

         [[4]],

         [[7]],

         [[0]],

         [[9]],

         [[8]],

         [[7]],

         [[0]],

         [[6]],

         [[9]],

         [[4]],

         [[2]],

         [[4]],

         [[1]],

         [[1]],

         [[9]],

         [[5]],

         [[7]],

         [[9]],

         [[9]],

         [[7]],

         [[5]],

         [[4]],

         [[7]]]]])>
x = tf.squeeze(x)
x
<tf.Tensor: id=369, shape=(28, 28), dtype=int32, numpy=
array([[7, 6, 8, 0, 5, 7, 3, 7, 1, 0, 2, 9, 6, 2, 7, 6, 2, 4, 4, 2, 3, 9,
        1, 8, 3, 6, 0, 7],
       [8, 2, 4, 5, 3, 3, 2, 5, 4, 0, 7, 1, 3, 9, 0, 8, 8, 9, 8, 5, 1, 1,
        0, 6, 6, 5, 6, 6],
       [5, 9, 2, 0, 9, 0, 4, 0, 5, 2, 3, 8, 0, 2, 5, 5, 1, 3, 5, 9, 3, 9,
        3, 6, 8, 7, 4, 4],
       [6, 6, 0, 5, 0, 7, 2, 3, 8, 7, 4, 9, 3, 2, 4, 5, 9, 0, 2, 7, 5, 9,
        3, 1, 8, 7, 3, 5],
       [5, 8, 5, 0, 1, 0, 5, 4, 3, 1, 2, 0, 2, 9, 0, 1, 7, 4, 3, 7, 9, 1,
        9, 5, 4, 4, 6, 2],
       [4, 3, 4, 3, 4, 8, 8, 2, 6, 6, 7, 2, 6, 4, 7, 0, 8, 4, 0, 3, 5, 0,
        1, 6, 3, 3, 5, 9],
       [0, 8, 8, 4, 0, 6, 0, 8, 2, 3, 1, 3, 6, 8, 5, 5, 1, 9, 5, 8, 7, 5,
        3, 4, 5, 2, 1, 7],
       [0, 7, 3, 0, 1, 8, 9, 2, 8, 3, 3, 3, 7, 1, 4, 9, 7, 0, 7, 6, 6, 8,
        4, 0, 2, 6, 2, 2],
       [8, 7, 9, 1, 6, 8, 2, 5, 4, 9, 4, 8, 6, 6, 0, 3, 8, 3, 7, 5, 3, 3,
        1, 3, 4, 0, 8, 4],
       [7, 7, 2, 7, 3, 1, 0, 3, 5, 6, 5, 4, 1, 4, 1, 3, 7, 8, 6, 9, 9, 0,
        9, 9, 5, 3, 9, 2],
       [0, 5, 2, 1, 8, 0, 3, 0, 0, 4, 6, 8, 6, 8, 9, 1, 2, 7, 9, 0, 7, 0,
        5, 5, 5, 4, 4, 5],
       [5, 1, 8, 2, 8, 9, 6, 3, 7, 0, 3, 1, 3, 2, 4, 3, 0, 7, 5, 4, 0, 6,
        3, 2, 7, 1, 5, 7],
       [2, 6, 9, 7, 2, 9, 2, 0, 2, 9, 9, 6, 1, 9, 8, 0, 3, 9, 4, 5, 9, 0,
        3, 3, 8, 5, 0, 6],
       [4, 5, 2, 0, 7, 2, 8, 3, 6, 7, 3, 9, 6, 6, 0, 0, 5, 6, 3, 5, 2, 0,
        9, 7, 1, 0, 0, 8],
       [9, 0, 4, 4, 7, 1, 3, 2, 2, 7, 1, 3, 4, 7, 8, 4, 1, 3, 4, 4, 2, 3,
        2, 1, 1, 7, 7, 1],
       [0, 0, 3, 6, 9, 6, 9, 9, 4, 7, 7, 8, 7, 8, 4, 0, 8, 3, 5, 8, 3, 4,
        7, 6, 1, 8, 8, 5],
       [3, 1, 9, 6, 8, 7, 6, 8, 6, 0, 5, 7, 9, 5, 9, 0, 1, 8, 7, 9, 0, 3,
        8, 4, 7, 8, 4, 7],
       [0, 0, 4, 9, 3, 1, 5, 3, 0, 1, 6, 5, 8, 2, 1, 1, 8, 7, 9, 4, 2, 5,
        6, 0, 6, 4, 4, 3],
       [1, 4, 3, 2, 9, 9, 0, 0, 3, 3, 8, 4, 0, 3, 7, 4, 5, 1, 2, 0, 5, 2,
        2, 9, 8, 7, 5, 5],
       [4, 3, 1, 8, 8, 6, 0, 1, 9, 0, 5, 9, 8, 5, 8, 0, 2, 7, 8, 7, 1, 9,
        4, 9, 4, 7, 2, 6],
       [3, 4, 8, 8, 4, 6, 8, 8, 8, 2, 0, 2, 7, 6, 8, 8, 9, 5, 1, 8, 9, 2,
        7, 3, 8, 0, 3, 4],
       [3, 1, 9, 9, 3, 4, 0, 0, 5, 3, 6, 6, 9, 2, 0, 6, 4, 4, 4, 5, 7, 4,
        7, 3, 9, 1, 3, 5],
       [3, 3, 3, 9, 2, 5, 5, 0, 9, 4, 0, 9, 2, 6, 5, 1, 3, 8, 5, 4, 0, 6,
        4, 7, 5, 1, 6, 0],
       [4, 2, 6, 7, 2, 1, 4, 5, 9, 3, 0, 8, 0, 3, 6, 1, 8, 1, 9, 0, 6, 6,
        8, 2, 2, 2, 1, 6],
       [3, 2, 3, 1, 6, 2, 5, 5, 4, 3, 5, 8, 5, 4, 1, 5, 2, 2, 8, 3, 5, 3,
        2, 9, 9, 0, 0, 7],
       [4, 6, 9, 1, 4, 5, 7, 0, 5, 5, 6, 3, 8, 1, 1, 7, 1, 3, 1, 0, 2, 0,
        2, 3, 3, 3, 5, 5],
       [7, 1, 4, 3, 9, 9, 5, 0, 5, 2, 4, 3, 9, 4, 8, 9, 2, 0, 8, 9, 4, 5,
        3, 0, 5, 0, 2, 6],
       [8, 9, 6, 9, 1, 4, 7, 0, 9, 8, 7, 0, 6, 9, 4, 2, 4, 1, 1, 9, 5, 7,
        9, 9, 7, 5, 4, 7]])>

交换维度

改变视图、增删维度都不会影响张量的存储。在实现算法逻辑时,在保持维度顺序不
变的条件下,仅仅改变张量的理解方式是不够的,有时需要直接调整的存储顺序,即交换
维度(Transpose)。通过交换维度操作,改变了张量的存储顺序,同时也改变了张量的视
图。

交换维度操作是非常常见的,比如在 TensorFlow 中,图片张量的默认存储格式是通道
后行格式:[𝑐,ℎ, ,𝑑],但是部分库的图片格式是通道先行格式:[𝑐,𝑑,ℎ, ],因此需要完成
[𝑐,ℎ, ,𝑑]到[𝑐,𝑑,ℎ, ]维度交换运算,此时若简单的使用改变视图函数 reshape,则新视图
的存储方式需要改变,因此使用改变视图函数是不合法的。我们以[𝑐,ℎ, ,𝑑]转换到
[𝑐,𝑑,ℎ, ]为例,介绍如何使用 tf.transpose(x, perm)函数完成维度交换操作,其中参数 perm
表示新的顺序 List。考虑图片张量 shape 为[2,32,32,3],“ 图片数量、行、列、通道
数 ”的维度索引分别为 0、1、2、3,如果需要交换为[𝑐,𝑑,ℎ, ]格式,则新维度的排序为
“ 图片数量、通道数、行、列 ”,对应的索引号为[0,3,1,2],因此参数 perm 需设

x = tf.random.normal([2,32,32,3])
x
<tf.Tensor: id=383, shape=(2, 32, 32, 3), dtype=float32, numpy=
array([[[[-1.81754386e+00, -4.70156968e-01, -2.52298117e-01],
         [ 5.58642030e-01,  5.97760499e-01,  1.09302580e+00],
         [ 5.14004827e-01, -1.05498999e-01,  7.38092601e-01],
         ...,
         [ 2.23055840e-01,  1.06471765e+00, -1.38331532e+00],
         [ 1.68199444e+00,  2.46848673e-01,  1.47695792e+00],
         [ 1.15786624e+00, -3.11333984e-01,  4.99992788e-01]],

        [[-6.04505539e-01, -2.47344688e-01,  1.51075566e+00],
         [ 5.44666886e-01,  2.55123287e-01,  5.57059646e-01],
         [-4.57320154e-01,  2.82056046e+00,  6.76047444e-01],
         ...,
         [-5.46135724e-01,  3.81573886e-01,  1.20632327e+00],
         [ 5.61702669e-01,  5.53223789e-01,  3.50575507e-01],
         [ 3.78108084e-01, -6.26158476e-01,  3.21322158e-02]],

        [[-5.26854277e-01,  1.04663074e+00, -1.05008446e-01],
         [-1.09959412e+00,  4.42302287e-01,  1.57041311e+00],
         [ 9.37579453e-01,  1.01151049e+00, -3.10841888e-01],
         ...,
         [ 1.11713850e+00, -1.02515352e+00,  3.16543788e-01],
         [ 8.82856309e-01,  2.22450882e-01, -1.10408616e+00],
         [-5.53659916e-01,  1.26014495e+00,  7.38231480e-01]],

        ...,

        [[ 1.64292350e-01, -5.60496569e-01,  1.57700419e-01],
         [ 5.80713630e-01,  6.30690455e-01,  1.61487031e+00],
         [-2.67562330e-01,  1.97027281e-01,  1.27779508e+00],
         ...,
         [ 4.33750421e-01, -1.61507213e+00,  5.58398664e-01],
         [-7.99253464e-01, -3.76639485e-01,  2.79633366e-02],
         [ 6.45348668e-01, -1.00306714e+00,  8.63744736e-01]],

        [[-5.10530829e-01, -1.02812600e+00, -1.03168976e+00],
         [ 6.63297594e-01, -2.50450993e+00,  1.18919218e+00],
         [-1.44717622e+00,  9.30240005e-02, -1.11981225e+00],
         ...,
         [ 6.39562786e-01,  1.01350045e+00,  8.57544661e-01],
         [-3.98536682e-01,  1.01951730e+00,  7.84884095e-01],
         [ 5.83199382e-01,  2.51417905e-01,  2.73984820e-01]],

        [[ 4.36883122e-01, -1.02460220e-01, -1.13776886e+00],
         [-2.38410875e-01, -6.04024470e-01,  8.58290851e-01],
         [-5.16876161e-01, -8.74156654e-01, -8.32713604e-01],
         ...,
         [ 7.42940605e-01,  1.15333736e+00, -8.03016067e-01],
         [-4.22317475e-01, -2.18287300e-04, -7.64122486e-01],
         [-3.43144268e-01, -2.01973930e-01, -9.90525365e-01]]],


       [[[-1.73000246e-01, -3.88606936e-01, -1.64711010e+00],
         [ 1.16506386e+00, -6.42754436e-01, -2.51510048e+00],
         [ 4.84685868e-01,  1.28097153e+00,  7.27444112e-01],
         ...,
         [-1.20562863e+00,  9.53909993e-01,  7.50858545e-01],
         [ 1.34659231e+00,  9.07264888e-01, -2.05528617e+00],
         [-1.02416945e+00, -1.18705058e+00, -1.20303905e+00]],

        [[ 6.25148937e-02,  2.47662440e-01,  5.13902128e-01],
         [ 9.16073844e-02,  2.31153518e-01, -1.22589672e+00],
         [-1.11717021e+00, -1.92667812e-01,  5.61413467e-01],
         ...,
         [ 9.32722867e-01,  2.79274255e-01, -2.86753029e-01],
         [ 1.64209723e+00, -3.34836036e-01,  1.45347047e+00],
         [ 3.32810372e-01,  4.41387504e-01,  1.92122543e+00]],

        [[ 2.61857867e-01, -1.99202657e+00, -2.98113734e-01],
         [-1.05231297e+00, -1.81242037e+00,  1.55490994e+00],
         [ 1.34826970e+00,  7.61362016e-01,  3.82197738e-01],
         ...,
         [-1.45399928e+00, -1.18456459e+00,  3.48090023e-01],
         [ 5.90160191e-01,  8.67367923e-01, -5.17747998e-01],
         [ 1.61343849e+00,  4.96253192e-01,  3.91920358e-01]],

        ...,

        [[-3.68142515e-01,  2.38423538e+00, -1.51942110e+00],
         [-5.20390034e-01, -1.53226471e+00, -4.22226578e-01],
         [ 1.05757916e+00, -1.68196952e+00,  2.34283400e+00],
         ...,
         [ 7.02120125e-01, -1.36901224e+00, -3.83962899e-01],
         [ 2.52281487e-01,  2.91662157e-01,  1.46380651e+00],
         [ 1.10750830e+00,  2.01109082e-01,  1.64109066e-01]],

        [[-1.00996113e+00,  6.95149958e-01, -1.34099811e-01],
         [-1.18660457e-01,  1.21997319e-01,  4.23581839e-01],
         [ 5.88935852e-01,  4.08603966e-01,  1.89512342e-01],
         ...,
         [ 1.28577542e+00,  1.21233428e+00,  8.59268904e-01],
         [ 9.48506355e-01,  1.17113721e+00,  1.11448359e+00],
         [ 4.18897495e-02, -4.77942944e-01,  6.76567376e-01]],

        [[ 1.77094090e+00,  9.86636400e-01, -1.26546252e+00],
         [ 3.53283381e+00, -4.29592669e-01, -7.84093320e-01],
         [-6.90953672e-01,  2.13375494e-01, -3.11317444e-01],
         ...,
         [-7.92546421e-02, -1.30087025e-02,  2.74553239e-01],
         [-2.79268712e-01, -7.70633876e-01,  2.95856386e-01],
         [ 2.09342480e+00,  1.87956655e+00, -4.51156348e-01]]]],
      dtype=float32)>
tf.transpose(x,perm=[0,3,1,2])
<tf.Tensor: id=377, shape=(2, 3, 32, 32), dtype=float32, numpy=
array([[[[ 4.56660986e-01, -8.64537597e-01,  1.09439838e+00, ...,
           6.54324234e-01, -5.18730402e-01,  7.74429977e-01],
         [ 1.85835406e-01, -1.96778744e-01,  3.69712651e-01, ...,
           2.68401414e-01, -7.61413574e-01, -1.87827909e+00],
         [-6.27159715e-01,  2.90783793e-01, -1.28109038e+00, ...,
          -1.34567618e-01,  7.45671570e-01, -6.64750695e-01],
         ...,
         [-1.33715415e+00, -9.63671088e-01, -4.47890311e-01, ...,
           6.65110350e-01,  3.06763709e-01, -5.51078320e-01],
         [-1.12137091e+00, -6.59661219e-02, -1.94841897e+00, ...,
           5.01195192e-01, -5.91338456e-01,  6.06271863e-01],
         [-2.21048307e+00, -1.54931438e+00, -4.26999331e-01, ...,
          -8.76425087e-01,  3.61407638e-01, -2.69644391e-02]],

        [[-3.88249129e-01,  7.53394842e-01, -4.80105989e-02, ...,
           1.22189499e-01, -6.48842990e-01,  1.02127576e+00],
         [ 1.61169112e+00,  2.48362637e+00,  4.34053928e-01, ...,
          -1.14192891e+00,  2.65340626e-01,  8.88472319e-01],
         [-3.25526309e+00,  4.46957558e-01,  8.75013173e-02, ...,
           2.28511393e-01,  9.11052704e-01, -8.04829299e-01],
         ...,
         [ 9.18760180e-01,  4.97242436e-02,  9.78760242e-01, ...,
          -2.62028527e+00, -1.04077613e+00,  2.69607186e-01],
         [-1.29277635e+00,  1.03052807e+00,  2.52305031e-01, ...,
          -1.05126703e+00,  6.31268203e-01,  8.29544604e-01],
         [ 5.38806081e-01, -6.87905371e-01, -8.22809339e-02, ...,
           1.05049908e-01, -3.62366557e-01, -5.30386686e-01]],

        [[-2.34627679e-01, -6.14984870e-01, -2.04992437e+00, ...,
           1.31570506e+00,  1.53367364e+00,  1.83163786e+00],
         [-8.65115285e-01, -2.59310240e-03, -2.87160248e-01, ...,
           1.97374392e+00,  4.35472608e-01, -7.63692021e-01],
         [ 1.62894979e-01,  2.11612344e-01,  8.03831935e-01, ...,
          -5.56330860e-01,  1.36233971e-01, -2.16648674e+00],
         ...,
         [ 9.62221444e-01,  2.37274155e-01,  4.87596840e-01, ...,
           4.04263288e-01, -9.13134813e-01, -2.87664533e-01],
         [-1.75514960e+00,  4.69974965e-01,  2.56670594e+00, ...,
           1.56467900e-01,  3.41644973e-01,  2.85851270e-01],
         [-5.34506321e-01, -2.38703087e-01, -1.03533435e+00, ...,
          -2.10629031e-01,  2.07872510e-01, -1.03136683e+00]]],


       [[[ 6.01794302e-01,  5.41662455e-01,  4.33888108e-01, ...,
           1.06451726e+00, -1.32466450e-01, -3.43189329e-01],
         [ 2.39515448e+00,  1.51892292e+00, -1.67090249e+00, ...,
           9.33021903e-01,  1.07544208e+00, -6.90970123e-01],
         [ 6.66577756e-01,  8.49103570e-01, -4.49224770e-01, ...,
          -1.12364173e+00, -6.96548700e-01,  1.71072185e-01],
         ...,
         [ 3.79819274e-01, -5.38920999e-01, -7.91214943e-01, ...,
           3.41945916e-01, -8.32802951e-01,  6.33069873e-01],
         [ 1.78350341e+00, -2.30789930e-02,  7.07671762e-01, ...,
          -6.43216312e-01,  1.52743995e+00, -2.08357964e-02],
         [ 6.20469570e-01, -2.96334147e-01, -6.57846749e-01, ...,
          -3.41852218e-01, -2.77065635e+00,  2.11229250e-01]],

        [[ 9.77860212e-01,  1.42131269e+00,  6.57469988e-01, ...,
           1.40364707e+00,  1.74125564e+00, -1.53570080e+00],
         [-1.72540736e+00,  7.98906207e-01,  5.14676332e-01, ...,
           1.07795811e+00, -7.29510784e-01, -1.11854470e+00],
         [ 1.79711199e+00, -5.68214536e-01, -1.15522668e-01, ...,
          -4.81177121e-01, -9.31737870e-02,  1.76382154e-01],
         ...,
         [-1.01227117e+00,  2.22866654e+00, -1.01094282e+00, ...,
          -1.74077600e-01, -1.97279602e-01, -1.01106381e+00],
         [-6.36570603e-02,  1.09268570e+00,  1.15045929e+00, ...,
           9.36924875e-01, -7.51687229e-01, -1.89653844e-01],
         [-2.94862211e-01, -2.76293427e-01,  9.84368384e-01, ...,
          -3.41624618e-02, -3.51431549e-01,  5.85692883e-01]],

        [[-6.83572292e-02, -2.56744474e-01, -4.18020695e-01, ...,
           1.14546752e+00, -5.56915104e-01,  5.97728431e-01],
         [-5.82312405e-01,  1.94600761e-01, -8.18775833e-01, ...,
          -1.27806878e+00, -1.29437804e+00,  1.07816584e-01],
         [ 8.26703191e-01,  3.75836492e-02, -1.38987508e-02, ...,
           6.74908578e-01, -7.69450545e-01,  7.65210748e-01],
         ...,
         [ 8.97288740e-01, -4.02161598e-01, -1.04141319e+00, ...,
          -3.86901379e-01,  1.38070989e+00,  4.38632905e-01],
         [ 1.99656832e+00,  1.92818735e-02, -1.75299495e-01, ...,
          -3.22317219e+00,  1.19741607e+00, -3.82416584e-02],
         [-7.36781061e-01, -2.50357270e+00,  5.11278450e-01, ...,
          -9.69237447e-01, -3.86975676e-01, -4.40756977e-01]]]],
      dtype=float32)>

需要注意的是,通过 tf.transpose 完成维度交换后,张量的存储顺序已经改变,视图也
随之改变,后续的所有操作必须基于新的存续顺序和视图进行。相对于改变视图操作,维
度交换操作的计算代价更高。

复制数据

当通过增加维度操作插入新维度后,可能希望在新的维度上面复制若干份数据,满足
测试版1205
第 4 章 TensorFlow 基础 28
后续算法的格式要求。考虑𝒁 = 𝒀@𝑿+ 𝒃的例子,偏置𝒃插入样本数的新维度后,需要在

度上复制 Batch Size 份数据,将 shape 变为与𝒀@𝑿一致后,才能完成张量相加运算。
可以通过 tf.tile(x, multiples)函数完成数据在指定维度上的复制操作,multiples 分别指
定了每个维度上面的复制倍数,对应位置为 1 表明不复制,为 2 表明新长度为原来长度的
2 倍,即数据复制一份,以此类推。

 b = tf.constant([1,2]) #创建B向量
b
<tf.Tensor: id=385, shape=(2,), dtype=int32, numpy=array([1, 2])>
b = tf.expand_dims(b,axis=0) #编程矩阵
b
<tf.Tensor: id=387, shape=(1, 2), dtype=int32, numpy=array([[1, 2]])>
b = tf.tile(b,multiples=[2,1])
b
<tf.Tensor: id=389, shape=(2, 2), dtype=int32, numpy=
array([[1, 2],
       [1, 2]])>

  • 0
    点赞
  • 0
    收藏
    觉得还不错? 一键收藏
  • 0
    评论

“相关推荐”对你有帮助么?

  • 非常没帮助
  • 没帮助
  • 一般
  • 有帮助
  • 非常有帮助
提交
评论
添加红包

请填写红包祝福语或标题

红包个数最小为10个

红包金额最低5元

当前余额3.43前往充值 >
需支付:10.00
成就一亿技术人!
领取后你会自动成为博主和红包主的粉丝 规则
hope_wisdom
发出的红包
实付
使用余额支付
点击重新获取
扫码支付
钱包余额 0

抵扣说明:

1.余额是钱包充值的虚拟货币,按照1:1的比例进行支付金额的抵扣。
2.余额无法直接购买下载,可以购买VIP、付费专栏及课程。

余额充值