0. A Worked Example
Here is a problem to start with: using TensorFlow, construct a 5 × 5 grid of coordinates.
Think it over for a moment...
Solution:
import tensorflow as tf

# A row of x-coordinates repeated down 5 rows, and a column of y-coordinates
# repeated across 5 columns; stacking them on a new last axis gives (x, y) pairs.
x = tf.tile(tf.range(5, dtype=tf.int32)[tf.newaxis, :], [5, 1])
y = tf.tile(tf.range(5, dtype=tf.int32)[:, tf.newaxis], [1, 5])
xy_grid = tf.concat([x[:, :, tf.newaxis], y[:, :, tf.newaxis]], axis=-1)

with tf.Session() as sess:  # TensorFlow 1.x graph-mode API
    print(sess.run(xy_grid))
The result, in image-coordinate form (i.e. (0, 0) is the top-left corner):
[[[0 0] [1 0] [2 0] [3 0] [4 0]]
[[0 1] [1 1] [2 1] [3 1] [4 1]]
[[0 2] [1 2] [2 2] [3 2] [4 2]]
[[0 3] [1 3] [2 3] [3 3] [4 3]]
[[0 4] [1 4] [2 4] [3 4] [4 4]]]
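If you want to check the result without spinning up TensorFlow, the same construction carries over almost verbatim to NumPy, where np.tile, np.newaxis, and np.concatenate play the roles of their tf counterparts. This is just a sketch for comparison, not part of the original solution:

```python
import numpy as np

# Same construction as the TensorFlow snippet above, in NumPy:
# tile a row of x-coordinates and a column of y-coordinates, then stack them.
x = np.tile(np.arange(5)[np.newaxis, :], (5, 1))   # x varies along columns
y = np.tile(np.arange(5)[:, np.newaxis], (1, 5))   # y varies along rows
xy_grid = np.concatenate([x[:, :, np.newaxis], y[:, :, np.newaxis]], axis=-1)

print(xy_grid.shape)   # (5, 5, 2)
print(xy_grid[0, 3])   # [3 0], i.e. x=3, y=0 in image coordinates
```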
If any of this looks puzzling, read on.
1. tf.range()
Signature:
tf.range(start, limit=None, delta=1, dtype=None, name="range")
Purpose: this is the TensorFlow counterpart of Python's built-in range(). It produces a 1-D tensor of numbers in the half-open interval [start, limit), stepping by delta.
Examples:
start = 2
limit = 10
delta = 3
tf.range(start, limit, delta) # [2, 5, 8]
start = 3
limit = 1
delta = -0.5
tf.range(start, limit, delta) # [3, 2.5, 2, 1.5]
limit = 5
tf.range(limit) # [0, 1, 2, 3, 4]
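For comparison, NumPy's np.arange follows the same half-open [start, limit) convention, so the three cases above can be checked directly without a TensorFlow session (a side-by-side sketch, not part of the original example):

```python
import numpy as np

# np.arange follows the same [start, limit) convention as tf.range:
print(np.arange(2, 10, 3))    # [2 5 8]            -> limit 10 is excluded
print(np.arange(3, 1, -0.5))  # [3.  2.5 2.  1.5]  -> counts down, stops before 1
print(np.arange(5))           # [0 1 2 3 4]        -> start defaults to 0
```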
2. tf.newaxis
Purpose: tf.newaxis inserts a new axis into a tensor, i.e. it expands the tensor's number of dimensions. The same thing can be done with tf.expand_dims(), but I personally prefer tf.newaxis, because unlike tf.expand_dims() it does not require an explicit axis argument: the new axis goes exactly where you write it in the indexing expression.
Example:
As the code below shows, tf.newaxis lets you expand one or several dimensions at once in a direct, readable way. The indexing expression is written in terms of the desired result, so the shape of the expanded tensor is obvious at a glance.
import tensorflow as tf
x = tf.range(5, dtype=tf.int32)
new_x1 = x[:, tf.newaxis]
new_x2 = x[tf.newaxis, :]
new_x3 = new_x2[:, tf.newaxis, :, tf.newaxis]
print(x.get_shape()) # (5,)
print(new_x1.get_shape()) # (5, 1)
print(new_x2.get_shape()) # (1, 5)
print(new_x3.get_shape()) # (1, 1, 5, 1)
with tf.Session() as sess:
    print(sess.run(x))       # [0 1 2 3 4]
    print(sess.run(new_x1))  # [[0] [1] [2] [3] [4]]
    print(sess.run(new_x2))  # [[0 1 2 3 4]]
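One detail worth knowing: tf.newaxis, like np.newaxis, is simply an alias for None, so indexing with either inserts a length-1 axis exactly where it appears. The NumPy sketch below mirrors the TensorFlow example above and also shows the expand_dims equivalent:

```python
import numpy as np

# np.newaxis (like tf.newaxis) is just an alias for None: indexing with it
# inserts a new length-1 axis at that position.
x = np.arange(5)
print(x[:, np.newaxis].shape)  # (5, 1)
print(x[None, :].shape)        # (1, 5) -- plain None behaves identically
print(x[np.newaxis, :][:, np.newaxis, :, np.newaxis].shape)  # (1, 1, 5, 1)

# Equivalent to expand_dims with an explicit axis argument:
print(np.expand_dims(x, axis=1).shape)  # (5, 1)
```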
3. tf.tile()
Signature:
tf.tile(input, multiples, name=None)
Purpose: tile literally means a floor or wall tile. tf.tile lays copies of the input tensor input side by side, as if tiling a floor, to build the larger tensor we want. You can think of it as a cousin of NumPy's broadcasting mechanism.
Note: multiples is a 1-D tensor, and its length (number of elements) must equal the number of dimensions of input, i.e. one repetition count per axis!
Example:
import tensorflow as tf
x = tf.range(3, dtype=tf.int32)
new_x1 = x[tf.newaxis, :]
# Using new_x1 as the tile, lay it 2 times vertically:
# the number of rows is multiplied by 2.
new_x2 = tf.tile(new_x1, [2, 1])
# Using new_x2 as the tile, lay it 3 times vertically (rows x3)
# and 2 times horizontally (columns x2).
new_x3 = tf.tile(new_x2, [3, 2])
with tf.Session() as sess:
    print(sess.run(x))
    print(sess.run(new_x1))
    print(sess.run(new_x2))
    print(sess.run(new_x3))
Output:
[0 1 2]
[[0 1 2]]
[[0 1 2]
[0 1 2]]
[[0 1 2 0 1 2]
[0 1 2 0 1 2]
[0 1 2 0 1 2]
[0 1 2 0 1 2]
[0 1 2 0 1 2]
[0 1 2 0 1 2]]
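The behavior again mirrors NumPy's np.tile, so the example above can be checked there as well. One caveat for the comparison: np.tile pads a too-short reps tuple with leading 1s, whereas tf.tile insists that multiples has exactly one entry per input dimension. This is a sketch for comparison only:

```python
import numpy as np

x = np.arange(3)[np.newaxis, :]     # shape (1, 3)
new_x2 = np.tile(x, (2, 1))         # 2 copies stacked vertically -> (2, 3)
new_x3 = np.tile(new_x2, (3, 2))    # rows x3, columns x2 -> (6, 6)

print(new_x2)
print(new_x3.shape)  # (6, 6)
```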
4. tf.concat()
Signature:
tf.concat(values, axis, name="concat")
Here is the official documentation of tf.concat(), quoted directly:

"""Concatenates tensors along one dimension.

Concatenates the list of tensors values along dimension axis. If
values[i].shape = [D0, D1, ... Daxis(i), ...Dn], the concatenated
result has shape

    [D0, D1, ... Raxis, ...Dn]

where

    Raxis = sum(Daxis(i))

That is, the data from the input tensors is joined along the axis
dimension.

The number of dimensions of the input tensors must match, and all
dimensions except axis must be equal.

For example:

    t1 = [[1, 2, 3], [4, 5, 6]]
    t2 = [[7, 8, 9], [10, 11, 12]]
    tf.concat([t1, t2], 0)  # [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
    tf.concat([t1, t2], 1)  # [[1, 2, 3, 7, 8, 9], [4, 5, 6, 10, 11, 12]]

    # tensor t3 with shape [2, 3]
    # tensor t4 with shape [2, 3]
    tf.shape(tf.concat([t3, t4], 0))  # [4, 3]
    tf.shape(tf.concat([t3, t4], 1))  # [2, 6]

As in Python, the axis could also be negative numbers. Negative axis
are interpreted as counting from the end of the rank, i.e., the
axis + rank(values)-th dimension.

For example:

    t1 = [[[1, 2], [2, 3]], [[4, 4], [5, 3]]]
    t2 = [[[7, 4], [8, 4]], [[2, 10], [15, 11]]]
    tf.concat([t1, t2], -1)

would produce:

    [[[ 1,  2,  7,  4],
      [ 2,  3,  8,  4]],

     [[ 4,  4,  2, 10],
      [ 5,  3, 15, 11]]]

Note: If you are concatenating along a new axis consider using stack.
E.g.

    tf.concat([tf.expand_dims(t, axis) for t in tensors], axis)

can be rewritten as

    tf.stack(tensors, axis=axis)

Args:
  values: A list of Tensor objects or a single Tensor.
  axis: 0-D int32 Tensor. Dimension along which to concatenate. Must be
    in the range [-rank(values), rank(values)). As in Python, indexing for
    axis is 0-based. Positive axis in the range of [0, rank(values)) refers
    to the axis-th dimension. And negative axis refers to the
    axis + rank(values)-th dimension.
  name: A name for the operation (optional).

Returns:
  A Tensor resulting from concatenation of the input tensors.
"""
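The documented shape rules are easy to verify with NumPy's np.concatenate, which behaves the same way, including for negative axis values (a comparison sketch, not part of the quoted documentation):

```python
import numpy as np

t1 = np.array([[1, 2, 3], [4, 5, 6]])
t2 = np.array([[7, 8, 9], [10, 11, 12]])

# Concatenating along axis 0 sums the row counts; along axis 1, the columns.
print(np.concatenate([t1, t2], axis=0).shape)  # (4, 3)
print(np.concatenate([t1, t2], axis=1).shape)  # (2, 6)

# axis=-1 counts from the end, so here it is the same as axis=1.
print(np.concatenate([t1, t2], axis=-1))
```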
Closing remark: if you now go back and reread the worked example at the start of this article, everything should fall into place.