1209
return photo_filenames, sorted(class_names)
What is the purpose of sorting here?
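For reference, a minimal sketch (with hypothetical class names) of why the sort matters: directory listings are returned in a filesystem-dependent order, so without sorting, the class-name-to-label mapping could differ between runs.

```python
# Order as it might come back from os.listdir (not guaranteed):
class_names = ["dog", "cat", "bird"]
# Sorting first makes the "class name -> integer label" mapping deterministic:
labels = {name: i for i, name in enumerate(sorted(class_names))}
print(labels)  # {'bird': 0, 'cat': 1, 'dog': 2}
```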
1204
What is the difference between batch_size and batch_num? Does setting batch_size to 5 mean that 5 samples are drawn from the total each time?
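A quick sketch of the usual distinction (the variable names here are the common convention, assumed rather than taken from the tutorial's code): batch_size is how many samples are drawn per step, batch_num is how many such steps it takes to cover the dataset.

```python
data = list(range(23))                  # 23 samples total (hypothetical)
batch_size = 5                          # samples drawn per step
batch_num = len(data) // batch_size     # number of full batches: 4
batches = [data[i * batch_size:(i + 1) * batch_size] for i in range(batch_num)]
print(batch_num)    # 4
print(batches[0])   # [0, 1, 2, 3, 4]
```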
1203
Is there a difference between .set_shape and tf.reshape()?
Why does optimizer = tf.train.RMSPropOptimizer(0.001, 0.9) take two arguments?
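On the two arguments: in the TF 1.x API the first two positional parameters of `tf.train.RMSPropOptimizer` are `learning_rate` (0.001 here) and `decay` (0.9, the moving-average coefficient for the squared gradients). A simplified scalar sketch of the update rule (TF's actual implementation differs in details such as where epsilon sits):

```python
def rmsprop_step(w, grad, ms, learning_rate=0.001, decay=0.9, eps=1e-10):
    # decay controls the running average of squared gradients;
    # learning_rate scales the resulting step.
    ms = decay * ms + (1 - decay) * grad ** 2
    w = w - learning_rate * grad / (ms + eps) ** 0.5
    return w, ms

w, ms = 1.0, 0.0
w, ms = rmsprop_step(w, grad=0.5, ms=ms)
print(w, ms)
```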
1201
When I inspect a variable in PyCharm, e.g. the mnist dataset (55000, 28, 28, 1), which should consist of 0s and 1s, the values are collapsed into ellipses. Is there a way to display them all?
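The ellipsis comes from NumPy's own print truncation (default `threshold=1000` elements), which PyCharm's viewer inherits. Raising the threshold prints everything; a small demonstration:

```python
import numpy as np

big = np.arange(2000)                      # large enough to be truncated
truncated = "..." in np.array2string(big)  # True under the default threshold
np.set_printoptions(threshold=np.inf)      # print every element instead
full = "..." in np.array2string(big)       # False after raising the threshold
print(truncated, full)
```

Beware that doing this for a full (55000, 28, 28, 1) array produces an enormous amount of output.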
1130
lines = filter(None, lines)
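For reference, `filter(None, lines)` keeps only the truthy items, i.e. it drops empty strings. Note that in Python 3 `filter` returns an iterator, so wrap it in `list()` if a list is needed:

```python
lines = ["a", "", "b", "", ""]
# Passing None as the function makes filter keep only truthy elements:
lines = list(filter(None, lines))
print(lines)  # ['a', 'b']
```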
Inception V2
min_depth=16,
depth_multiplier=1.0
```
min_depth: Minimum depth value (number of channels) for all convolution ops.
  Enforced when depth_multiplier < 1, and not an active constraint when
  depth_multiplier >= 1.
depth_multiplier: Float multiplier for the depth (number of channels)
  for all convolution ops. The value must be greater than zero. Typical
  usage will be to set this value in (0, 1) to reduce the number of
  parameters or computation cost of the model.
```
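A sketch of how these two parameters interact, following the pattern used in the slim reference implementation (as I recall it, `depth = lambda d: max(int(d * depth_multiplier), min_depth)`): every conv's channel count is scaled by `depth_multiplier`, and `min_depth` only kicks in as a floor when the multiplier shrinks a layer too far.

```python
def depth(d, depth_multiplier=1.0, min_depth=16):
    # Scale the nominal channel count, but never go below min_depth.
    return max(int(d * depth_multiplier), min_depth)

d_full = depth(64)                          # 64: multiplier 1.0 leaves it unchanged
d_small = depth(64, depth_multiplier=0.1)   # 16: int(6.4)=6 is too thin, floor wins
print(d_full, d_small)
```

This is why the docstring says `min_depth` is "not an active constraint" when `depth_multiplier >= 1`.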
Why is this depth parameter introduced? And the docstring says the minimum depth value is a number of channels?
Comparing with V1:
- BN was added; after all, the original paper is all about it. But why does batch_norm barely appear in the code? It only shows up once, in the arg_scope defined at the end:
  `with slim.arg_scope([slim.batch_norm, slim.dropout], is_training=is_training):`
- Another change is that the 5x5 convolutions in the branches were replaced by two 3x3 convolutions. The first 3x3 is given a truncated normal initializer with stddev 0.09 (`weights_initializer=trunc_normal(0.09)`), while in the pool-then-conv branch the 0.09 becomes 0.1. Is there any basis for these values?
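On the 5x5 -> two 3x3 change: two stacked 3x3 convs cover the same 5x5 receptive field while using fewer parameters and adding an extra nonlinearity. A quick count (ignoring biases, with a hypothetical channel width C):

```python
C = 192                          # hypothetical in/out channel count
one_5x5 = 5 * 5 * C * C          # single 5x5 conv
two_3x3 = 2 * (3 * 3 * C * C)    # two stacked 3x3 convs, same receptive field
ratio = two_3x3 / one_5x5
print(ratio)  # 0.72, i.e. ~28% fewer parameters
```

The ratio 18/25 = 0.72 is independent of C. (I have not seen a documented rationale for the specific 0.09 vs 0.1 initializer values.)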