TensorFlow 2: Dimension Transformations (7)

Dimension Transformations

Overview

  • shape, ndim
  • reshape
  • expand_dims/squeeze
  • transpose
  • broadcast_to

View

  • Image data: [b, h, w] (see the sketch after the imports below)
  • [b, 28, 28]
  • →[b, 28*28]
  • →[b, 2, 14*28]
  • →[b, 28, 28, 1] [b,h,w,c]
import tensorflow as tf
import numpy as np
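
A minimal sketch of the view changes listed above; the batch size b=4 is an assumption for illustration:

b = 4                                        # assumed batch size
imgs = tf.random.normal([b, 28, 28])         # [b, h, w]
tf.reshape(imgs, [b, 28*28]).shape           # TensorShape([4, 784])
tf.reshape(imgs, [b, 2, 14*28]).shape        # TensorShape([4, 2, 392])
tf.reshape(imgs, [b, 28, 28, 1]).shape       # [b, h, w, c]: TensorShape([4, 28, 28, 1])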

Reshape

a=tf.random.normal([4,28,28,3])
a.shape, a.ndim
(TensorShape([4, 28, 28, 3]), 4)
# reshape to a new set of dimensions
tf.reshape(a, [4,28*28,3]).shape
TensorShape([4, 784, 3])
tf.reshape(a, [4,784,3]).shape
TensorShape([4, 784, 3])
# -1 lets TensorFlow infer that dimension automatically
tf.reshape(a,[4,-1,3]).shape
TensorShape([4, 784, 3])
tf.reshape(a,[4,784*3]).shape
TensorShape([4, 2352])
tf.reshape(a,[4,-1]).shape
TensorShape([4, 2352])

Reshape is flexible

a=tf.random.normal([4,28,28,3])
a.shape
TensorShape([4, 28, 28, 3])
tf.reshape(tf.reshape(a,[4,-1]),[4,28,28,3]).shape
TensorShape([4, 28, 28, 3])
tf.reshape(tf.reshape(a,[4,-1]),[4,14,56,3]).shape
TensorShape([4, 14, 56, 3])
tf.reshape(tf.reshape(a,[4,-1]),[4,1,784,3]).shape
TensorShape([4, 1, 784, 3])

Reshape could lead to potential bugs!

  • Reshaping can introduce subtle bugs
  • images: [4, 28, 28, 3]
    • [b, h, w, 3]
  • reshape to: [4, 784, 3]
    • [b, pixel, 3]
  • [4, 784, 3] with height: 28, width: 28 → [4, 28, 28, 3]
  • [4, 784, 3] with height: 14, width: 56 → [4, 14, 56, 3]
  • [4, 784, 3] with width: 28, height: 28 → [4, 28, 28, 3] (same shape, but the content is transposed)
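
Once the image is flattened to [b, pixel, 3], the height/width layout lives only in your head; every recovery above has a legal shape, but only the one matching the original layout restores the data. A minimal sketch of the pitfall (the np.allclose check is just one way to verify it):

a = tf.random.normal([4, 28, 28, 3])         # [b, h, w, c]
flat = tf.reshape(a, [4, -1, 3])             # [b, pixel, c]; the h/w layout is forgotten
good = tf.reshape(flat, [4, 28, 28, 3])      # restore with the original layout
print(np.allclose(a, good))                  # True: reshape only reinterprets the flat buffer
# assuming the pixels were width-major forces a transpose, which scrambles the image
bad = tf.transpose(tf.reshape(flat, [4, 28, 28, 3]), [0, 2, 1, 3])
print(np.allclose(a, bad))                   # False for random data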

tf.transpose (matrix transpose)

  • [h,w] -> [w,h]
a=tf.random.normal([4,3,2,1])
a.shape
TensorShape([4, 3, 2, 1])
tf.transpose(a).shape
TensorShape([1, 2, 3, 4])
# transpose with an explicit permutation of the axis indices
tf.transpose(a, perm=[0,1,3,2]).shape
TensorShape([4, 3, 1, 2])
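
For a plain 2-D matrix, the default (axis-reversing) perm is the ordinary matrix transpose:

m = tf.constant([[1, 2, 3],
                 [4, 5, 6]])                 # [h, w] = [2, 3]
print(tf.transpose(m).numpy())               # [w, h] = [3, 2]
# [[1 4]
#  [2 5]
#  [3 6]]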

[b, h, w, c] → [b, 3, h, w] (channels-last to channels-first)

a=tf.random.normal([4,28,28,3])
a.shape
TensorShape([4, 28, 28, 3])
# swaps h and w; the shape looks unchanged only because h == w
tf.transpose(a,[0,2,1,3]).shape
TensorShape([4, 28, 28, 3])
# moves channels first but also swaps h and w: [b, c, w, h]
tf.transpose(a,[0,3,2,1]).shape
TensorShape([4, 3, 28, 28])
# moves channels first and keeps the h, w order: [b, c, h, w]
tf.transpose(a,[0,3,1,2]).shape
TensorShape([4, 3, 28, 28])
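
The last two perms produce identical shapes because h == w, but only [0, 3, 1, 2] preserves the image orientation; a quick check:

x = tf.random.normal([4, 28, 28, 3])
nchw = tf.transpose(x, [0, 3, 1, 2])         # [b, c, h, w]
back = tf.transpose(nchw, [0, 2, 3, 1])      # round-trip back to [b, h, w, c]
print(np.allclose(x, back))                  # True
swapped = tf.transpose(x, [0, 3, 2, 1])      # [b, c, w, h]: h and w swapped
print(np.allclose(nchw, swapped))            # False for random data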

Squeeze VS Expand_dims

Squeezing and expanding dimensions

Expand dim

  • a: [classes, students, subjects]
    • [4, 35, 8]
  • add a school dimension, then stack two schools (see the sketch after the code below):
  • [1, 4, 35, 8] + [1, 4, 35, 8]
    • → [2, 4, 35, 8]
a=tf.random.normal([4,35,8])
a.shape
TensorShape([4, 35, 8])
# insert a new axis at position 0
tf.expand_dims(a,axis=0).shape
TensorShape([1, 4, 35, 8])
# insert a new axis at position 2
tf.expand_dims(a,axis=2).shape
TensorShape([4, 35, 1, 8])
# insert a new axis at position 3
tf.expand_dims(a,axis=3).shape
TensorShape([4, 35, 8, 1])
# negative axes count from the end: -1 appends after the last dim
tf.expand_dims(a,axis=-1).shape
TensorShape([4, 35, 8, 1])
# -4 inserts before the first dim
tf.expand_dims(a,axis=-4).shape
TensorShape([1, 4, 35, 8])
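
A sketch of the school example above: give each school's score tensor a leading axis, then concatenate along it (the second tensor b stands in for an assumed second school):

a = tf.random.normal([4, 35, 8])             # school 1: [classes, students, subjects]
b = tf.random.normal([4, 35, 8])             # school 2, assumed for illustration
schools = tf.concat([tf.expand_dims(a, axis=0),
                     tf.expand_dims(b, axis=0)], axis=0)
print(schools.shape)                         # (2, 4, 35, 8)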

Squeeze dim

  • Only dimensions of size 1 can be squeezed
  • [4, 35, 8, 1]
  • [1, 4, 35, 8]
  • [1, 4, 35, 1]
tf.squeeze(tf.zeros([1,2,1,1,3])).shape
TensorShape([2, 3])
a=tf.zeros([1,2,1,3])
a.shape
TensorShape([1, 2, 1, 3])
tf.squeeze(a, axis=0).shape
TensorShape([2, 1, 3])
tf.squeeze(a, axis=2).shape
TensorShape([1, 2, 3])
tf.squeeze(a, axis=-2).shape
TensorShape([1, 2, 3])
tf.squeeze(a, axis=-4).shape
TensorShape([2, 1, 3])
# only size-1 dims can be squeezed; axis=1 has size 2, so this raises an error:
# tf.squeeze(a, axis=1).shape
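
A sketch of that failure mode, catching the error explicitly (the exact exception type varies with the TF version, hence the broad except):

a = tf.zeros([1, 2, 1, 3])
try:
    tf.squeeze(a, axis=1)                    # dim 1 has size 2: not squeezable
except Exception as e:
    print(type(e).__name__)                  # e.g. InvalidArgumentError
# expand_dims and squeeze are inverses for size-1 axes
print(tf.squeeze(tf.expand_dims(a, axis=0), axis=0).shape)   # (1, 2, 1, 3)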