Source
Key contribution
In this paper, we propose an architecture that distills this insight into a simple connectivity pattern: to ensure maximum information flow between layers in the network, we connect all layers (with matching feature-map sizes) directly with each other. To preserve the feed-forward nature, each layer obtains additional inputs from all preceding layers and passes on its own feature-maps to all subsequent layers. Crucially, in contrast to ResNets, we never combine features through summation before they are passed into a layer; instead, we combine features by concatenating them.
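The connectivity pattern described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the `dense_block` function and the random linear map standing in for each BN-ReLU-Conv layer are assumptions made for clarity. What it does show faithfully is the key mechanism: each layer receives the channel-wise concatenation of all preceding feature maps and contributes `growth_rate` new channels, so the channel count grows linearly with depth.

```python
import numpy as np

def dense_block(x, num_layers, growth_rate, rng):
    # Collect every layer's output; each new layer sees the
    # concatenation of ALL previous feature maps (DenseNet-style),
    # rather than a summed shortcut (ResNet-style).
    features = [x]
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=0)  # channel-wise concat
        # Stand-in for a BN-ReLU-Conv composite: a random linear map
        # producing growth_rate new channels (hypothetical, for shape only).
        w = rng.standard_normal((growth_rate, inp.shape[0]))
        flat = inp.reshape(inp.shape[0], -1)
        out = np.maximum(w @ flat, 0).reshape(growth_rate, *x.shape[1:])
        features.append(out)
    # The block's output is the concatenation of the input and
    # every intermediate layer's feature maps.
    return np.concatenate(features, axis=0)

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8, 8))  # 16 input channels, 8x8 maps
y = dense_block(x, num_layers=4, growth_rate=12, rng=rng)
print(y.shape)  # channels grow to 16 + 4 * 12 = 64
```

Note how concatenation (rather than summation) means earlier features remain directly accessible to every later layer, which is the information-flow property the excerpt emphasizes.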
Model
The 121-layer model (DenseNet-121) is visualized in the figure below. To fit more of the network into view, I collapsed the second DenseBlock so that only its first and last layers are shown, and applied the same treatment to the third DenseBlock.