Derivation of the Forward and Backward Propagation Formulas for Convolutional Neural Networks

I recently had some spare time and derived CNN backpropagation by hand. I found that most blogs out there only derive each layer under the most common parameter settings (and copy and repost each other), while in practice CV researchers trying to publish rarely stick to the most basic layer configurations. So besides going over the basic cases that have already been covered to death, this post adds a few more interesting problems.

Note: this post is not accusing anyone else's derivation of being wrong; it only adds some situations that actually come up in practice.

Although the post is long and heavy on formulas, it is really just my handwritten working transcribed here so that it is easier to follow.

The model used here is a CNN with the AlexNet structure; we first go through forward propagation and then work backwards through backpropagation. Most readers who reach a post like this are here for the backward pass, so the forward pass is kept brief, since it has been covered everywhere. The AlexNet structure given on the PyTorch website is:

Conv2d(3, 64, kernel_size=11, stride=4, padding=2),
ReLU(inplace=True),
MaxPool2d(kernel_size=3, stride=2),
Conv2d(64, 192, kernel_size=5, padding=2),
ReLU(inplace=True),
MaxPool2d(kernel_size=3, stride=2),
Conv2d(192, 384, kernel_size=3, padding=1),
ReLU(inplace=True),
Conv2d(384, 256, kernel_size=3, padding=1),
ReLU(inplace=True),
Conv2d(256, 256, kernel_size=3, padding=1),
ReLU(inplace=True),
MaxPool2d(kernel_size=3, stride=2),
Dropout(),
Linear(256 * 6 * 6, 4096),
ReLU(inplace=True),
Dropout(),
Linear(4096, 4096),
ReLU(inplace=True),
Linear(4096, num_classes),

It consists mainly of convolutional layers, nonlinear activation layers, pooling layers, and fully connected layers; newer networks also add BatchNorm layers, and the official PyTorch version drops the Local Response Normalization layer. In short, the model is simple; the more complicated pieces can be added gradually later.

Forward Propagation

For the physical meaning and mathematical essence of these layers, please see blogs dedicated to those topics; my focus here is on deriving the formulas, so I will not go deep into that.

1. Convolutional layer
As the name suggests, a convolutional layer performs a convolution, computed as:
$$y_{n_{out},i,j}=\sum_{n_{in}}\sum_{k_x}\sum_{k_y}w_{n_{out},n_{in},k_x,k_y}\times x_{n_{in},i+k_x,j+k_y}$$
Here $x$ is the input feature map, $w$ the kernel weights, and $y$ the output feature map; $n_{out}$ is the output-channel index, $n_{in}$ the input-channel index, $k_x,k_y$ the position inside the kernel, $i+k_x,j+k_y$ the pixel of the input feature map currently involved in the convolution, and $i,j$ the pixel position in the output feature map.

The input to a convolutional layer is a three-dimensional feature map; in other words, it is a stack of two-dimensional feature maps, each of which is called an input channel. The layer's weights consist of several three-dimensional kernels; the number of kernels is the number of output channels, and the output feature map is a stack of that many two-dimensional feature maps.

Here I have borrowed a classic old figure (I can no longer say where it originally came from). The kernel slides over the input feature map, and each output value is a multiply-accumulate of several inputs with the kernel weights. The kernel weights do not change while sliding, i.e. they are shared across the whole layer, so a convolutional layer needs relatively little storage. On the other hand, every pixel of the input feature map takes part in many multiply-accumulates, so the layer is computationally expensive, which is why many papers call it a compute-intensive layer.

Note that the sliding here is purely two-dimensional; along the input-channel dimension the kernel channels and input channels correspond one-to-one, and there is no sliding.
[Figure: a convolution kernel sliding over the input feature map with stride 1]
Also note that the figure shows a convolution with stride 1; the stride is the number of pixels the kernel moves at each step while sliding over the feature map. In actual use, as in the example model above, the first convolutional layer uses a stride of 4 and the max-pooling layers use a stride of 2.

Beyond that there is also dilated (atrous) convolution. Its parameter "dilation" (I will simply use the argument name from PyTorch's Conv2d class) is the distance between two adjacent weight positions of the 2D kernel. For an ordinary convolution the dilation is 1: the first weight sits at $(0,0)$ and the second at $(0,1)$. When the dilation is greater than 1 the second weight sits at $(0,dilation)$, and so on, with a spacing of dilation in both directions. Below are a $3\times3$ kernel with dilation 1 and with dilation 2:
$$\left[\begin{array}{ccc} 1&1&1\\ 1&1&1\\ 1&1&1 \end{array}\right]\qquad\left[\begin{array}{ccccc} 1&0&1&0&1\\ 0&0&0&0&0\\ 1&0&1&0&1\\ 0&0&0&0&0\\ 1&0&1&0&1 \end{array}\right]$$

Another very common parameter is padding: to make better use of the information at the edges of an image or feature map, zeros are added around its border, and the padding value is the number of zeros added on each side. For example, with padding=1 the image becomes:
$$\left[\begin{array}{ccc} 1&1&1\\ 1&1&1\\ 1&1&1 \end{array}\right]\Rightarrow\left[\begin{array}{ccccc} 0&0&0&0&0\\ 0&1&1&1&0\\ 0&1&1&1&0\\ 0&1&1&1&0\\ 0&0&0&0&0 \end{array}\right]$$

Keep these in mind: stride, padding, and dilation will all be important variables later when we compute gradients.
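To make the three parameters concrete, here is a minimal NumPy sketch of a single-channel 2D convolution with stride, padding and dilation. It is only an illustration of the formula above, not how any framework actually implements convolution, and the function name conv2d_naive is made up for this post:

import numpy as np

def conv2d_naive(x, w, stride=1, padding=0, dilation=1):
    """x: (H, W) input, w: (k, k) kernel; plain sliding-window multiply-accumulate."""
    if padding > 0:
        x = np.pad(x, padding)                       # zero-pad the border
    k_eff = dilation * (w.shape[0] - 1) + 1          # effective kernel size
    h_out = (x.shape[0] - k_eff) // stride + 1
    w_out = (x.shape[1] - k_eff) // stride + 1
    y = np.zeros((h_out, w_out))
    for i in range(h_out):
        for j in range(w_out):
            for u in range(w.shape[0]):
                for v in range(w.shape[1]):
                    y[i, j] += w[u, v] * x[i * stride + u * dilation,
                                           j * stride + v * dilation]
    return y

# settings of the first AlexNet convolution, on a single channel
x = np.random.randn(224, 224)
w = np.random.randn(11, 11)
print(conv2d_naive(x, w, stride=4, padding=2).shape)   # (55, 55)

With the first AlexNet layer's settings (kernel 11, stride 4, padding 2) a 224x224 input gives the familiar 55x55 output.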

2. Nonlinear activation functions
There are many nonlinear activation functions, for example the classic sigmoid:
$$y=\frac{1}{1+e^{-x}}$$
or the now very common ReLU:
$$y=\left\{\begin{array}{lr} x, & x>0 \\ 0, & x\leqslant 0 \end{array}\right.$$
Broadly speaking, nonlinear activation functions increase the nonlinearity of the model and thereby its expressive power. Each activation function was of course proposed with its own physical and mathematical motivation, which cannot be summed up in one sentence; see the original papers for the details.
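For reference, a tiny NumPy sketch of the two activations above (illustrative only):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))   # note the minus sign in the exponent

def relu(x):
    return np.maximum(x, 0.0)         # keeps positives, zeroes the rest

z = np.array([-2.0, 0.0, 3.0])
print(sigmoid(z), relu(z))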

3. Pooling layer

A pooling layer selects or computes a single value from the data inside a pooling window according to some rule, thereby reducing the spatial size. Common pooling layers include average pooling:
$$y_{m,n}=\frac{1}{k_x\times k_y}\times\sum_{i=0}^{k_x-1}\sum_{j=0}^{k_y-1}x_{m+i,n+j}$$
and max pooling:
$$y_{m,n}=\max\{x_{m+i,n+j}\},\quad i\in[0,k_x-1],\ j\in[0,k_y-1]$$
The values being summed or maximized all come from the same pooling window. The operation resembles convolution in that a window slides over the input feature map; only the computation inside the window differs. Also note that pooling is normally a two-dimensional operation performed on each channel separately.
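A naive NumPy sketch of max and average pooling over non-overlapping windows (i.e. the stride equals the pooling size; the overlapping case from our example model reappears in the backward pass). The function name pool2d is made up for illustration:

import numpy as np

def pool2d(x, k, mode="max"):
    h_out, w_out = x.shape[0] // k, x.shape[1] // k
    y = np.zeros((h_out, w_out))
    for i in range(h_out):
        for j in range(w_out):
            window = x[i * k:(i + 1) * k, j * k:(j + 1) * k]
            y[i, j] = window.max() if mode == "max" else window.mean()
    return y

x = np.arange(16, dtype=float).reshape(4, 4)
print(pool2d(x, 2, "max"))   # [[ 5.  7.] [13. 15.]]
print(pool2d(x, 2, "avg"))   # [[ 2.5  4.5] [10.5 12.5]]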

Note that pooling also has a stride, exactly as the convolutional layer does, and it will come up again later. There is no dilation here though; as far as I can tell nobody really uses "dilated pooling" (though, for what it is worth, PyTorch's MaxPool2d does expose a dilation argument). If anyone feels like running that experiment, let me know how it goes.

4. Fully connected layer

The fully connected layer is the easiest to understand. It is also called a linear layer, and it really is just the linear formula:
$$y=wx+b$$

4.1 Dropout layer

This layer is a trick for preventing overfitting: units are dropped at random, so that effectively only part of the network is updated at each step. For pure forward inference it can essentially be ignored; during training, each step trains a sub-network, which does not change any of our formulas; we just compute a subset of them with the sizes adjusted, so I will not dwell on it. For the details see other sources, e.g. Hinton's original paper.

5. Loss functions

A loss function (or cost function) quantitatively measures the gap between the current model and the desired model. Using this gap together with gradient descent (later stochastic gradient descent, SGD, and later still other optimizers such as Adam), the model is optimized. Common loss functions include the absolute error:
$$L=\frac1N\sum_{i=0}^{N-1}|y_i-\hat y_i|$$
the mean squared error:
$$L=\frac1{2N}\sum_{i=0}^{N-1}(y_i-\hat y_i)^2$$
and the cross entropy:
$$L=\frac1N\sum_{i=0}^{N-1}-y_i\log\hat y_i$$

Let me dwell a bit on the cross-entropy formula, since this loss is used extremely widely; it will also be the example when we discuss backpropagation, and the backward computation itself will show one of its advantages.
First, a few concepts:

  1. Probability: how likely an event is to happen. For an event $x$, its probability is $p(x)$.
  2. Information content: a measure of the uncertainty of an event; the higher its probability, the lower the uncertainty and the less information it carries. For an event $x$, the information content is $I(x)=-\log(p(x))$.
  3. Entropy: the expected information content of the whole system. Information content measures a single event, whereas entropy measures the whole system. For a system $X=[x_0,x_1,\dots,x_{N-1}]$, the entropy is $H(p)=-\sum_{i=0}^{N-1}p(x_i)\log(p(x_i))$. Entropy comes from thermodynamics, where it characterizes the disorder of a system; in information theory it analogously characterizes the uncertainty of a system: the greater the uncertainty, the larger the entropy.
  4. KL divergence: also called KL distance, it measures the distance between two probability distributions for the same events. For an event $x$ with two distributions $p(x)$ and $q(x)$, the KL divergence is $$\begin{aligned} D_{KL}(p\|q)&=\sum_{i=0}^{N-1}p(x_i)\log\left(\frac{p(x_i)}{q(x_i)}\right)\\&=\sum_{i=0}^{N-1}p(x_i)\log(p(x_i))-p(x_i)\log(q(x_i))\\&=-H(p)+\left(\sum_{i=0}^{N-1}-p(x_i)\log(q(x_i))\right)\end{aligned}$$ In other words, the KL divergence is some extra term minus the entropy.
  5. Cross entropy: that extra term is exactly the cross entropy, $H(p,q)=-\sum_{i=0}^{N-1}p(x_i)\log(q(x_i))$.

In practice, to assess how the current model compares to the desired model, we measure the KL divergence between them, so we can use that quantity as the loss function and optimize the model to make the KL divergence as small as possible.

Let $p$ be the distribution of the desired model and $q$ that of the current model; we then minimize $D_{KL}(p\|q)=-H(p)+H(p,q)$. Since the desired model never changes, $-H(p)$ is a constant, so what we actually have to minimize is $H(p,q)$, the cross entropy. This is why machine learning uses the cross entropy as a loss function.

However, the cross entropy takes probabilities as inputs, so a softmax layer is appended after the fully connected output to turn it into probabilities (essentially a normalization):
$$y_i=\frac{e^{x_i}}{\sum_{j=0}^{N-1}e^{x_j}}$$
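In practice softmax is usually computed in a numerically stable way by subtracting the maximum logit first, which leaves the result unchanged but avoids overflow in the exponential. A small illustrative NumPy sketch:

import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))   # subtracting the max does not change the result
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])
print(softmax(logits), softmax(logits).sum())   # probabilities that sum to 1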

As for deriving the cross entropy from maximum likelihood and its physical meaning, please see other blogs; it would take us away from the main topic.

Backward Propagation

Now for the main event: backpropagation to compute the gradients. Since it runs backward, we go through the layers in reverse order.

1. Loss function

Continuing with the cross entropy: the entropy term inside the KL divergence never changes, so it can be dropped; put differently, the gradient of a constant is zero, which is another reason this part can be ignored.

Suppose we have obtained the loss value $L$. The target model's output is $Y=[y_0,y_1,\dots,y_{N-1}]$, the current model's output (i.e. the softmax output) is $\hat Y=[\hat y_0,\hat y_1,\dots,\hat y_{N-1}]$, and the softmax input (in the model above, the output of the final fully connected layer) is $x=[x_0,x_1,\dots,x_{N-1}]$.

From the forward pass through softmax, $\hat Y$ and $x$ satisfy:
$$\hat y_i=\frac{e^{x_i}}{\sum_{j=0}^{N-1}e^{x_j}}$$
and $Y$, $\hat Y$ and $L$ satisfy:
$$L=-\sum_{i=0}^{N-1}y_i\log(\hat y_i)$$
As said before, $Y$ is fixed; what we want is the gradient of $L$ with respect to $\hat Y$:
$$\frac{\partial L}{\partial \hat y_i}=-\frac{y_i}{\hat y_i}$$
Going one step further, we compute the derivative of $L$ with respect to $x$:
$$\frac{\partial L}{\partial x_i}=\sum_{j=0}^{N-1}\frac{\partial L}{\partial \hat y_j}\frac{\partial \hat y_j}{\partial x_i}$$
where
$$\frac{\partial \hat y_j}{\partial x_i}=\frac{\partial}{\partial x_i}\left(\frac{e^{x_j}}{\sum_{m=0}^{N-1}e^{x_m}}\right)=\left\{\begin{array}{lr}\hat y_i\left(1-\hat y_i\right), & i=j \\ -\hat y_i\hat y_j, & i\neq j\end{array}\right.$$
so
$$\begin{aligned}\frac{\partial L}{\partial x_i}&=\sum_{j\neq i}\left(-\frac{y_j}{\hat y_j}\right)\times\left(-\hat y_i\hat y_j\right)+\left(-\frac{y_i}{\hat y_i}\right)\times\hat y_i\left(1-\hat y_i\right)\\&=\sum_{j\neq i}y_j\hat y_i+y_i\hat y_i-y_i\\&=\hat y_i\sum_{j=0}^{N-1}y_j-y_i\\&=\hat y_i-y_i\end{aligned}$$
A remarkably clean result (the last step uses $\sum_{j=0}^{N-1}y_j=1$, since the target is a probability distribution such as a one-hot vector). As I said, softmax plus cross entropy is great: not only is it meaningful, its gradient is also almost trivially simple.

Note that we have used the chain rule here, recursing backward layer by layer: first compute the gradient at the softmax output, then the gradient at the fully connected output, and at each step combine only the current layer's local gradient with the derivative passed back from the layer behind it. Rewriting the formula above in this style:
$$\frac{\partial L}{\partial x}=\frac{\partial L}{\partial \hat y}\frac{\partial \hat y}{\partial x}$$
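Before moving on, here is a quick numerical sanity check of the result above: compare the analytic gradient $\hat y - y$ with a finite-difference estimate of the softmax + cross-entropy loss. This is only a sketch with made-up values:

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def loss(x, y):
    return -np.sum(y * np.log(softmax(x)))   # cross entropy on the softmax output

np.random.seed(0)
x = np.random.randn(5)                       # pretend fully connected output
y = np.zeros(5); y[2] = 1.0                  # one-hot target

analytic = softmax(x) - y                    # the result derived above
numeric = np.zeros_like(x)
eps = 1e-6
for i in range(5):
    d = np.zeros(5); d[i] = eps
    numeric[i] = (loss(x + d, y) - loss(x - d, y)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-5))   # True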
2. Fully connected layer

What we obtained above is the partial derivative of the loss with respect to the fully connected layer's output. By the chain rule, we only need the current layer's own gradient, combined with the gradient passed back from behind.

Let the incoming gradient be $\delta_{l+1}$, and look at the forward formula again:
$$y=wx+b$$
so the gradient of the output with respect to the input is
$$\frac{\partial y_j}{\partial x_i}=w_{j,i}$$
As described above, we combine the incoming gradient with the current layer's gradient:
$$\frac{\partial L}{\partial x_i}=\sum_{j=0}^{N-1}\frac{\partial L}{\partial y_j}\frac{\partial y_j}{\partial x_i}$$
From the shapes of the weights, inputs and outputs, the matrix form follows (the constant $b$ drops out when differentiating with respect to $x$):
$$\frac{\partial L}{\partial x}=w^T\delta_{l+1}$$
This gives us the gradient after it has been passed back through the fully connected layer.

Similarly, the other two variables in the formula have the gradients:
$$\frac{\partial L}{\partial w}=\delta_{l+1}x^T\qquad\frac{\partial L}{\partial b}=\delta_{l+1}$$
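A tiny NumPy sketch of the three fully connected gradients, mainly to make the shapes explicit (all names and sizes are made up for illustration):

import numpy as np

n_in, n_out = 4, 3
x = np.random.randn(n_in, 1)
w = np.random.randn(n_out, n_in)
b = np.random.randn(n_out, 1)
delta = np.random.randn(n_out, 1)    # gradient arriving from the layer behind

grad_x = w.T @ delta                 # dL/dx = w^T delta
grad_w = delta @ x.T                 # dL/dw = delta x^T
grad_b = delta                       # dL/db = delta
print(grad_x.shape, grad_w.shape, grad_b.shape)   # (4, 1) (3, 4) (3, 1)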
3. Nonlinear activation functions
Take the ReLU used in the model as an example:
$$y=\left\{\begin{array}{lr} x, & x>0 \\ 0, & x\leqslant0\end{array}\right.$$
Its gradient is easy to compute:
$$\frac{\partial y_i}{\partial x_i}=\left\{\begin{array}{lr} 1, & x_i>0 \\ 0, & x_i\leqslant0\end{array}\right.$$
Clearly the output gradients correspond one-to-one to the inputs. We therefore introduce a new symbol $\odot$, the Hadamard product, which means element-by-element multiplication of vectors or matrices of the same shape.

So for activation functions of this kind, by the chain rule, again writing the incoming gradient as $\delta_{l+1}$ and the activation's derivative as $\sigma'$:
$$\frac{\partial L}{\partial x}=\delta_{l+1}\odot\sigma'$$
This matches the conclusion of other blogs, which is no surprise since we used the same activation function. But a question arises: are all activation functions element-wise like this? I have not seen otherwise; softmax is sometimes called an activation function too, but it is normally used only at the very end together with the cross entropy rather than in the middle of a network, so most of the time the formula above is safe. If some exotic activation ever shows up, you can always fall back to propagating the full Jacobian backward step by step.
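As a concrete example of the Hadamard-product rule, here is the ReLU case in a few NumPy lines (illustrative only; the values are made up):

import numpy as np

x = np.array([-1.0, 2.0, -3.0, 4.0])         # layer input saved from the forward pass
delta = np.array([0.1, 0.2, 0.3, 0.4])       # incoming gradient
relu_grad = (x > 0).astype(float)            # sigma'
print(delta * relu_grad)                     # [0.  0.2 0.  0.4]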

3.1 Dropout layer

Discussed above; skipped.

4. Pooling layer

Moving one step further back in the example model we hit a max-pooling layer with pooling size $3\times3$ and stride 2. Something interesting happens there, so let us first take the more common case: pooling size $2\times2$ with stride 2.

For the data inside one pooling window:
$$y_0=\max\left[\begin{array}{cc} x_{0,0}&x_{0,1}\\ x_{1,0}&x_{1,1}\end{array}\right]$$
Suppose the maximum is $x_{0,1}$; then for the derivative
$$\frac{\partial y_0}{\partial x}=\left[\begin{array}{cc} 0&1\\ 0&0\end{array}\right]$$
That is, the incoming gradient flows to the position that held the maximum during the forward pass. As with the activation functions, the selected maximum corresponds one-to-one to the incoming gradient and everything else is zero, so we can use $\odot$ in the same way. Equivalently, we can view it as upsampling the incoming gradient and then multiplying element-wise with the current layer's gradient.

Likewise for average pooling, within one window:
$$y_0=\frac14\left(x_{0,0}+x_{0,1}+x_{1,0}+x_{1,1}\right)$$
so the gradient is
$$\frac{\partial y_0}{\partial x}=\left[\begin{array}{cc}\frac14&\frac14\\ \frac14&\frac14\end{array}\right]$$
and, just like max pooling, it can still be viewed as upsampling the incoming gradient and multiplying element-wise with the current layer's gradient.

As with the activation functions, writing the current pooling layer's gradient as $\sigma'$, the formula is:
$$\frac{\partial L}{\partial x}=\mathrm{upsample}\left(\delta_{l+1}\right)\odot\sigma'$$
At this point we have the formula that most blogs give, and there is nothing wrong with it. But now consider the pooling layer actually used in our example model: max pooling with size $3\times3$ and stride 2.

For the data inside one pooling window:
$$y_0=\max\left[\begin{array}{ccc} x_{0,0}&x_{0,1}&x_{0,2}\\ x_{1,0}&x_{1,1}&x_{1,2}\\ x_{2,0}&x_{2,1}&x_{2,2}\end{array}\right]$$
Suppose the maximum is $x_{0,2}$; then
$$\frac{\partial y_0}{\partial x}=\left[\begin{array}{ccc} 0&0&1\\ 0&0&0\\ 0&0&0\end{array}\right]$$
So the incoming gradient again flows to the position of the forward-pass maximum. So far, no problem.

But the pooling window has to slide, by two pixels:
$$y_1=\max\left[\begin{array}{ccc} x_{0,2}&x_{0,3}&x_{0,4}\\ x_{1,2}&x_{1,3}&x_{1,4}\\ x_{2,2}&x_{2,3}&x_{2,4}\end{array}\right]$$
Suppose the maximum is again $x_{0,2}$; then
$$\frac{\partial y_1}{\partial x}=\left[\begin{array}{ccc} 1&0&0\\ 0&0&0\\ 0&0&0\end{array}\right]$$
Notice something interesting: both the $y_0$ path and the $y_1$ path send gradient to $x_{0,2}$, so
$$\frac{\partial L}{\partial x_{0,2}}=\frac{\partial L}{\partial y_0}\frac{\partial y_0}{\partial x_{0,2}}+\frac{\partial L}{\partial y_1}\frac{\partial y_1}{\partial x_{0,2}}$$
This clearly no longer fits the "upsample, then multiply element-wise" formula. I have not come up with a single clean formula that covers this case; if anyone who works on deep learning frameworks reads this, I would love to hear how it is implemented there. For my own part I can only loop over the windows and accumulate (I will upload some handwritten Python code later; it uses loops and is painfully slow). A loop-based sketch of this accumulation is shown below.
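Here is the kind of loop-based sketch I mean, for the overlapping case (kernel 3, stride 2): the gradient is simply accumulated at each window's argmax, and the result is compared against PyTorch autograd. This is an illustration, not how frameworks implement it internally (they typically do an equivalent scatter-add using the argmax indices saved in the forward pass):

import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(1, 1, 7, 7, requires_grad=True)
y = F.max_pool2d(x, kernel_size=3, stride=2)   # overlapping 3x3 windows, stride 2
delta = torch.randn_like(y)                    # pretend this is the incoming gradient
y.backward(delta)                              # reference gradient from autograd

grad = torch.zeros_like(x)
k, s = 3, 2
for i in range(y.shape[2]):
    for j in range(y.shape[3]):
        window = x[0, 0, i * s:i * s + k, j * s:j * s + k]
        idx = int(torch.argmax(window))        # position of the forward-pass maximum
        u, v = idx // k, idx % k
        grad[0, 0, i * s + u, j * s + v] += delta[0, 0, i, j]   # accumulate, not overwrite

print(torch.allclose(grad, x.grad))            # True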

5. Convolutional layer

Finally, the main event: the convolutional layer. Recall its parameters: number of input channels, number of output channels, kernel size, stride, padding, and dilation.

5.1 Let us start from the simplest case: 1 input channel, 1 output channel, kernel size $3\times3$, stride 1, padding=0, dilation=1. Let the incoming gradient be $\delta$.

First let us compute a few outputs:
$$\begin{aligned} y_{0,0}&=\left[\begin{array}{ccc} x_{0,0}&x_{0,1}&x_{0,2}\\ x_{1,0}&x_{1,1}&x_{1,2}\\ x_{2,0}&x_{2,1}&x_{2,2}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0}&w_{0,1}&w_{0,2}\\ w_{1,0}&w_{1,1}&w_{1,2}\\ w_{2,0}&w_{2,1}&w_{2,2}\end{array}\right]\\&=x_{0,0}w_{0,0}+x_{0,1}w_{0,1}+x_{0,2}w_{0,2}+x_{1,0}w_{1,0}+x_{1,1}w_{1,1}+x_{1,2}w_{1,2}+x_{2,0}w_{2,0}+x_{2,1}w_{2,1}+x_{2,2}w_{2,2}\end{aligned}$$
$$\begin{aligned} y_{0,1}&=\left[\begin{array}{ccc} x_{0,1}&x_{0,2}&x_{0,3}\\ x_{1,1}&x_{1,2}&x_{1,3}\\ x_{2,1}&x_{2,2}&x_{2,3}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0}&w_{0,1}&w_{0,2}\\ w_{1,0}&w_{1,1}&w_{1,2}\\ w_{2,0}&w_{2,1}&w_{2,2}\end{array}\right]\\&=x_{0,1}w_{0,0}+x_{0,2}w_{0,1}+x_{0,3}w_{0,2}+x_{1,1}w_{1,0}+x_{1,2}w_{1,1}+x_{1,3}w_{1,2}+x_{2,1}w_{2,0}+x_{2,2}w_{2,1}+x_{2,3}w_{2,2}\end{aligned}$$
$$\begin{aligned} y_{1,0}&=\left[\begin{array}{ccc} x_{1,0}&x_{1,1}&x_{1,2}\\ x_{2,0}&x_{2,1}&x_{2,2}\\ x_{3,0}&x_{3,1}&x_{3,2}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0}&w_{0,1}&w_{0,2}\\ w_{1,0}&w_{1,1}&w_{1,2}\\ w_{2,0}&w_{2,1}&w_{2,2}\end{array}\right]\\&=x_{1,0}w_{0,0}+x_{1,1}w_{0,1}+x_{1,2}w_{0,2}+x_{2,0}w_{1,0}+x_{2,1}w_{1,1}+x_{2,2}w_{1,2}+x_{3,0}w_{2,0}+x_{3,1}w_{2,1}+x_{3,2}w_{2,2}\end{aligned}$$
$$\begin{aligned} y_{1,1}&=\left[\begin{array}{ccc} x_{1,1}&x_{1,2}&x_{1,3}\\ x_{2,1}&x_{2,2}&x_{2,3}\\ x_{3,1}&x_{3,2}&x_{3,3}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0}&w_{0,1}&w_{0,2}\\ w_{1,0}&w_{1,1}&w_{1,2}\\ w_{2,0}&w_{2,1}&w_{2,2}\end{array}\right]\\&=x_{1,1}w_{0,0}+x_{1,2}w_{0,1}+x_{1,3}w_{0,2}+x_{2,1}w_{1,0}+x_{2,2}w_{1,1}+x_{2,3}w_{1,2}+x_{3,1}w_{2,0}+x_{3,2}w_{2,1}+x_{3,3}w_{2,2}\end{aligned}$$
Look first at the gradient for $x_{0,0}$; only $y_{0,0}$ uses it, so:
$$\frac{\partial L}{\partial x_{0,0}}=\frac{\partial L}{\partial y_{0,0}}\frac{\partial y_{0,0}}{\partial x_{0,0}}=\delta_{0,0}w_{0,0}$$
Now look at $x_{1,1}$, which all four outputs use, so
$$\begin{aligned}\frac{\partial L}{\partial x_{1,1}}&=\frac{\partial L}{\partial y_{0,0}}\frac{\partial y_{0,0}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{0,1}}\frac{\partial y_{0,1}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{1,0}}\frac{\partial y_{1,0}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{1,1}}\frac{\partial y_{1,1}}{\partial x_{1,1}}\\&=\delta_{0,0}w_{1,1}+\delta_{0,1}w_{1,0}+\delta_{1,0}w_{0,1}+\delta_{1,1}w_{0,0}\end{aligned}$$
The remaining positions are not derived here; they all follow the same pattern. The expression already looks interesting, very much like the forward convolution formula, so let us try writing it as one:
$$\frac{\partial L}{\partial x_{0,0}}=\left[\begin{array}{ccc} 0&0&0\\ 0&0&0\\ 0&0&\delta_{0,0}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{2,2}&w_{2,1}&w_{2,0}\\ w_{1,2}&w_{1,1}&w_{1,0}\\ w_{0,2}&w_{0,1}&w_{0,0}\end{array}\right]=\delta_{0,0}w_{0,0}$$
$$\frac{\partial L}{\partial x_{1,1}}=\left[\begin{array}{ccc} 0&0&0\\ 0&\delta_{0,0}&\delta_{0,1}\\ 0&\delta_{1,0}&\delta_{1,1}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{2,2}&w_{2,1}&w_{2,0}\\ w_{1,2}&w_{1,1}&w_{1,0}\\ w_{0,2}&w_{0,1}&w_{0,0}\end{array}\right]=\delta_{0,0}w_{1,1}+\delta_{0,1}w_{1,0}+\delta_{1,0}w_{0,1}+\delta_{1,1}w_{0,0}$$
So this really is just another 2D convolution: the kernel $w$ is rotated by 180 degrees and the $\delta$ map is padded with padding = 2. This gives the formula found in most blogs:
$$\frac{\partial L}{\partial x}=\delta\otimes w_{rot180}$$
Just keep in mind that padding is required here. A quick PyTorch check of this rule is sketched below.
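A small sketch checking the rule against PyTorch autograd for the stride-1, padding-0 case; sizes are made up, and torch.flip plays the role of the 180-degree rotation:

import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(1, 1, 5, 5, requires_grad=True)
w = torch.randn(1, 1, 3, 3)
y = F.conv2d(x, w)                       # stride 1, padding 0 -> 3x3 output
delta = torch.randn_like(y)              # pretend incoming gradient
y.backward(delta)                        # reference gradient from autograd

w_rot = torch.flip(w, dims=[2, 3])       # rotate the kernel by 180 degrees
grad_x = F.conv2d(delta, w_rot, padding=2)   # padding = kernel_size - 1 = 2
print(torch.allclose(grad_x, x.grad))    # True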

5.2 Now let us change the kernel size. With a $2\times 2$ kernel the formula itself should not break; what we mainly want to see is what happens to the padding.

Following the same procedure, we first compute a few outputs:
$$\begin{aligned} y_{0,0}&=\left[\begin{array}{cc} x_{0,0}&x_{0,1}\\ x_{1,0}&x_{1,1}\end{array}\right]\otimes\left[\begin{array}{cc} w_{0,0}&w_{0,1}\\ w_{1,0}&w_{1,1}\end{array}\right]=x_{0,0}w_{0,0}+x_{0,1}w_{0,1}+x_{1,0}w_{1,0}+x_{1,1}w_{1,1}\\ y_{0,1}&=\left[\begin{array}{cc} x_{0,1}&x_{0,2}\\ x_{1,1}&x_{1,2}\end{array}\right]\otimes\left[\begin{array}{cc} w_{0,0}&w_{0,1}\\ w_{1,0}&w_{1,1}\end{array}\right]=x_{0,1}w_{0,0}+x_{0,2}w_{0,1}+x_{1,1}w_{1,0}+x_{1,2}w_{1,1}\\ y_{1,0}&=\left[\begin{array}{cc} x_{1,0}&x_{1,1}\\ x_{2,0}&x_{2,1}\end{array}\right]\otimes\left[\begin{array}{cc} w_{0,0}&w_{0,1}\\ w_{1,0}&w_{1,1}\end{array}\right]=x_{1,0}w_{0,0}+x_{1,1}w_{0,1}+x_{2,0}w_{1,0}+x_{2,1}w_{1,1}\\ y_{1,1}&=\left[\begin{array}{cc} x_{1,1}&x_{1,2}\\ x_{2,1}&x_{2,2}\end{array}\right]\otimes\left[\begin{array}{cc} w_{0,0}&w_{0,1}\\ w_{1,0}&w_{1,1}\end{array}\right]=x_{1,1}w_{0,0}+x_{1,2}w_{0,1}+x_{2,1}w_{1,0}+x_{2,2}w_{1,1}\end{aligned}$$
Look first at the gradient for $x_{0,0}$; only $y_{0,0}$ uses it, so:
$$\frac{\partial L}{\partial x_{0,0}}=\frac{\partial L}{\partial y_{0,0}}\frac{\partial y_{0,0}}{\partial x_{0,0}}=\delta_{0,0}w_{0,0}$$
Now look at $x_{1,1}$, which all four outputs use, so
$$\begin{aligned}\frac{\partial L}{\partial x_{1,1}}&=\frac{\partial L}{\partial y_{0,0}}\frac{\partial y_{0,0}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{0,1}}\frac{\partial y_{0,1}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{1,0}}\frac{\partial y_{1,0}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{1,1}}\frac{\partial y_{1,1}}{\partial x_{1,1}}\\&=\delta_{0,0}w_{1,1}+\delta_{0,1}w_{1,0}+\delta_{1,0}w_{0,1}+\delta_{1,1}w_{0,0}\end{aligned}$$
The rest follow the same pattern, so again let us write it as a convolution:
$$\frac{\partial L}{\partial x_{0,0}}=\left[\begin{array}{cc} 0&0\\ 0&\delta_{0,0}\end{array}\right]\otimes\left[\begin{array}{cc} w_{1,1}&w_{1,0}\\ w_{0,1}&w_{0,0}\end{array}\right]=\delta_{0,0}w_{0,0}$$
$$\frac{\partial L}{\partial x_{1,1}}=\left[\begin{array}{cc}\delta_{0,0}&\delta_{0,1}\\ \delta_{1,0}&\delta_{1,1}\end{array}\right]\otimes\left[\begin{array}{cc} w_{1,1}&w_{1,0}\\ w_{0,1}&w_{0,0}\end{array}\right]=\delta_{0,0}w_{1,1}+\delta_{0,1}w_{1,0}+\delta_{1,0}w_{0,1}+\delta_{1,1}w_{0,0}$$
So this time the padding is 1. The conclusion so far: every time the kernel size increases or decreases by 1, the backward padding increases or decreases by 1 as well. So tentatively
$$padding\_backward=kernel\_size-C$$
where, for now, $C=1$.

5.3 Now let us try padding on the input feature map: kernel size $2\times 2$, with padding = 1.

Same procedure again; compute a few outputs first:
$$\begin{aligned} y_{0,0}&=\left[\begin{array}{cc} 0&0\\ 0&x_{0,0}\end{array}\right]\otimes\left[\begin{array}{cc} w_{0,0}&w_{0,1}\\ w_{1,0}&w_{1,1}\end{array}\right]=x_{0,0}w_{1,1}\\ y_{0,1}&=\left[\begin{array}{cc} 0&0\\ x_{0,0}&x_{0,1}\end{array}\right]\otimes\left[\begin{array}{cc} w_{0,0}&w_{0,1}\\ w_{1,0}&w_{1,1}\end{array}\right]=x_{0,0}w_{1,0}+x_{0,1}w_{1,1}\\ y_{1,0}&=\left[\begin{array}{cc} 0&x_{0,0}\\ 0&x_{1,0}\end{array}\right]\otimes\left[\begin{array}{cc} w_{0,0}&w_{0,1}\\ w_{1,0}&w_{1,1}\end{array}\right]=x_{0,0}w_{0,1}+x_{1,0}w_{1,1}\\ y_{1,1}&=\left[\begin{array}{cc} x_{0,0}&x_{0,1}\\ x_{1,0}&x_{1,1}\end{array}\right]\otimes\left[\begin{array}{cc} w_{0,0}&w_{0,1}\\ w_{1,0}&w_{1,1}\end{array}\right]=x_{0,0}w_{0,0}+x_{0,1}w_{0,1}+x_{1,0}w_{1,0}+x_{1,1}w_{1,1}\end{aligned}$$
Look first at $x_{0,0}$, which all four outputs use, so
$$\begin{aligned}\frac{\partial L}{\partial x_{0,0}}&=\frac{\partial L}{\partial y_{0,0}}\frac{\partial y_{0,0}}{\partial x_{0,0}}+\frac{\partial L}{\partial y_{0,1}}\frac{\partial y_{0,1}}{\partial x_{0,0}}+\frac{\partial L}{\partial y_{1,0}}\frac{\partial y_{1,0}}{\partial x_{0,0}}+\frac{\partial L}{\partial y_{1,1}}\frac{\partial y_{1,1}}{\partial x_{0,0}}\\&=\delta_{0,0}w_{1,1}+\delta_{0,1}w_{1,0}+\delta_{1,0}w_{0,1}+\delta_{1,1}w_{0,0}\end{aligned}$$
In fact, if we keep going, every input pixel is used by four outputs. Writing the computation as a convolution again:
$$\frac{\partial L}{\partial x_{0,0}}=\left[\begin{array}{cc}\delta_{0,0}&\delta_{0,1}\\ \delta_{1,0}&\delta_{1,1}\end{array}\right]\otimes\left[\begin{array}{cc} w_{1,1}&w_{1,0}\\ w_{0,1}&w_{0,0}\end{array}\right]=\delta_{0,0}w_{1,1}+\delta_{0,1}w_{1,0}+\delta_{1,0}w_{0,1}+\delta_{1,1}w_{0,0}$$
So this time the backward padding is 0. The conclusion: every time the forward padding increases or decreases by 1, the backward padding decreases or increases by 1 accordingly. So tentatively
$$padding\_backward=kernel\_size-padding\_forward-C$$
with, for now, $C=1$.

5.4 Now for the effect of the stride. Use a $3\times3$ kernel, padding=1 and stride 2 (in general the stride is smaller than the kernel size), with the dilation left at its default of 1.

First compute a few outputs:
$$\begin{aligned} y_{0,0}&=\left[\begin{array}{ccc} 0&0&0\\ 0&x_{0,0}&x_{0,1}\\ 0&x_{1,0}&x_{1,1}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0}&w_{0,1}&w_{0,2}\\ w_{1,0}&w_{1,1}&w_{1,2}\\ w_{2,0}&w_{2,1}&w_{2,2}\end{array}\right]=x_{0,0}w_{1,1}+x_{0,1}w_{1,2}+x_{1,0}w_{2,1}+x_{1,1}w_{2,2}\\ y_{0,1}&=\left[\begin{array}{ccc} 0&0&0\\ x_{0,1}&x_{0,2}&x_{0,3}\\ x_{1,1}&x_{1,2}&x_{1,3}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0}&w_{0,1}&w_{0,2}\\ w_{1,0}&w_{1,1}&w_{1,2}\\ w_{2,0}&w_{2,1}&w_{2,2}\end{array}\right]=x_{0,1}w_{1,0}+x_{0,2}w_{1,1}+x_{0,3}w_{1,2}+x_{1,1}w_{2,0}+x_{1,2}w_{2,1}+x_{1,3}w_{2,2}\\ y_{1,0}&=\left[\begin{array}{ccc} 0&x_{1,0}&x_{1,1}\\ 0&x_{2,0}&x_{2,1}\\ 0&x_{3,0}&x_{3,1}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0}&w_{0,1}&w_{0,2}\\ w_{1,0}&w_{1,1}&w_{1,2}\\ w_{2,0}&w_{2,1}&w_{2,2}\end{array}\right]=x_{1,0}w_{0,1}+x_{1,1}w_{0,2}+x_{2,0}w_{1,1}+x_{2,1}w_{1,2}+x_{3,0}w_{2,1}+x_{3,1}w_{2,2}\\ y_{1,1}&=\left[\begin{array}{ccc} x_{1,1}&x_{1,2}&x_{1,3}\\ x_{2,1}&x_{2,2}&x_{2,3}\\ x_{3,1}&x_{3,2}&x_{3,3}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0}&w_{0,1}&w_{0,2}\\ w_{1,0}&w_{1,1}&w_{1,2}\\ w_{2,0}&w_{2,1}&w_{2,2}\end{array}\right]=x_{1,1}w_{0,0}+x_{1,2}w_{0,1}+x_{1,3}w_{0,2}+x_{2,1}w_{1,0}+x_{2,2}w_{1,1}+x_{2,3}w_{1,2}+x_{3,1}w_{2,0}+x_{3,2}w_{2,1}+x_{3,3}w_{2,2}\end{aligned}$$
Look first at the gradient for $x_{0,0}$:
$$\frac{\partial L}{\partial x_{0,0}}=\frac{\partial L}{\partial y_{0,0}}\frac{\partial y_{0,0}}{\partial x_{0,0}}=\delta_{0,0}w_{1,1}$$
Now $x_{1,1}$, which all four outputs use:
$$\begin{aligned}\frac{\partial L}{\partial x_{1,1}}&=\frac{\partial L}{\partial y_{0,0}}\frac{\partial y_{0,0}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{0,1}}\frac{\partial y_{0,1}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{1,0}}\frac{\partial y_{1,0}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{1,1}}\frac{\partial y_{1,1}}{\partial x_{1,1}}\\&=\delta_{0,0}w_{2,2}+\delta_{0,1}w_{2,0}+\delta_{1,0}w_{0,2}+\delta_{1,1}w_{0,0}\end{aligned}$$
Computing this as a convolution again:
$$\frac{\partial L}{\partial x_{0,0}}=\left[\begin{array}{ccc} 0&0&0\\ 0&\delta_{0,0}&0\\ 0&0&0\end{array}\right]\otimes\left[\begin{array}{ccc} w_{2,2}&w_{2,1}&w_{2,0}\\ w_{1,2}&w_{1,1}&w_{1,0}\\ w_{0,2}&w_{0,1}&w_{0,0}\end{array}\right]=\delta_{0,0}w_{1,1}$$
$$\frac{\partial L}{\partial x_{1,1}}=\left[\begin{array}{ccc}\delta_{0,0}&0&\delta_{0,1}\\ 0&0&0\\ \delta_{1,0}&0&\delta_{1,1}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{2,2}&w_{2,1}&w_{2,0}\\ w_{1,2}&w_{1,1}&w_{1,0}\\ w_{0,2}&w_{0,1}&w_{0,0}\end{array}\right]=\delta_{0,0}w_{2,2}+\delta_{0,1}w_{2,0}+\delta_{1,0}w_{0,2}+\delta_{1,1}w_{0,0}$$
Does this layout look odd yet familiar? The $\delta$ map has undergone exactly the "dilation" operation we described in the forward pass, with zeros inserted between elements, i.e. dilation = 2, which is exactly the stride we chose; and the backward convolution itself now has stride 1, the same as the forward dilation. The padding also agrees with the earlier conclusion; here it is again 1.
So we get a new conclusion:
$$dilation\_backward\_featuremap=stride\_forward$$
Pay attention to what this backward dilation acts on: the incoming feature-map gradient, not the kernel. A small PyTorch sketch checking this rule is given below.
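Here is a sketch checking the rule for the configuration above (kernel 3, stride 2, padding 1): the incoming delta is zero-stuffed with the forward stride, the kernel is rotated by 180 degrees, and the backward padding follows the earlier formula. The sizes are chosen so that (H + 2*padding - k) divides evenly by the stride; the general case needs an extra output_padding-style correction, as conv_transpose does. All variable names are made up:

import torch
import torch.nn.functional as F

torch.manual_seed(0)
k, s, p = 3, 2, 1
x = torch.randn(1, 1, 5, 5, requires_grad=True)
w = torch.randn(1, 1, k, k)
y = F.conv2d(x, w, stride=s, padding=p)      # 3x3 output
delta = torch.randn_like(y)
y.backward(delta)                            # reference gradient from autograd

# dilation_backward_featuremap = stride_forward: insert s-1 zeros between delta elements
n = y.shape[2]
stuffed = torch.zeros(1, 1, s * (n - 1) + 1, s * (n - 1) + 1)
stuffed[0, 0, ::s, ::s] = delta[0, 0]

w_rot = torch.flip(w, dims=[2, 3])           # rotate the kernel by 180 degrees
grad_x = F.conv2d(stuffed, w_rot, padding=k - p - 1)   # padding_backward = k - p - 1
print(torch.allclose(grad_x, x.grad))        # True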

5.5 Now change the dilation instead of the stride: kernel size $3\times3$, stride 1, dilation set to 2, padding still 1.

Compute the outputs; this time we need a few more. With padding 1 and dilation 2, the first one, written out as the padded input window against the dilated kernel, is:
$$\begin{aligned} y_{0,0}&=\left[\begin{array}{ccccc} 0&0&0&0&0\\ 0&x_{0,0}&x_{0,1}&x_{0,2}&x_{0,3}\\ 0&x_{1,0}&x_{1,1}&x_{1,2}&x_{1,3}\\ 0&x_{2,0}&x_{2,1}&x_{2,2}&x_{2,3}\\ 0&x_{3,0}&x_{3,1}&x_{3,2}&x_{3,3}\end{array}\right]\otimes\left[\begin{array}{ccccc} w_{0,0}&0&w_{0,1}&0&w_{0,2}\\ 0&0&0&0&0\\ w_{1,0}&0&w_{1,1}&0&w_{1,2}\\ 0&0&0&0&0\\ w_{2,0}&0&w_{2,1}&0&w_{2,2}\end{array}\right]\\&=x_{1,1}w_{1,1}+x_{1,3}w_{1,2}+x_{3,1}w_{2,1}+x_{3,3}w_{2,2}\end{aligned}$$
The remaining outputs, computed in the same way, expand to:
$$\begin{aligned} y_{0,1}&=x_{1,0}w_{1,0}+x_{1,2}w_{1,1}+x_{1,4}w_{1,2}+x_{3,0}w_{2,0}+x_{3,2}w_{2,1}+x_{3,4}w_{2,2}\\ y_{0,2}&=x_{1,1}w_{1,0}+x_{1,3}w_{1,1}+x_{1,5}w_{1,2}+x_{3,1}w_{2,0}+x_{3,3}w_{2,1}+x_{3,5}w_{2,2}\\ y_{1,0}&=x_{0,1}w_{0,1}+x_{0,3}w_{0,2}+x_{2,1}w_{1,1}+x_{2,3}w_{1,2}+x_{4,1}w_{2,1}+x_{4,3}w_{2,2}\\ y_{1,1}&=x_{0,0}w_{0,0}+x_{0,2}w_{0,1}+x_{0,4}w_{0,2}+x_{2,0}w_{1,0}+x_{2,2}w_{1,1}+x_{2,4}w_{1,2}+x_{4,0}w_{2,0}+x_{4,2}w_{2,1}+x_{4,4}w_{2,2}\\ y_{1,2}&=x_{0,1}w_{0,0}+x_{0,3}w_{0,1}+x_{0,5}w_{0,2}+x_{2,1}w_{1,0}+x_{2,3}w_{1,1}+x_{2,5}w_{1,2}+x_{4,1}w_{2,0}+x_{4,3}w_{2,1}+x_{4,5}w_{2,2}\\ y_{2,0}&=x_{1,1}w_{0,1}+x_{1,3}w_{0,2}+x_{3,1}w_{1,1}+x_{3,3}w_{1,2}+x_{5,1}w_{2,1}+x_{5,3}w_{2,2}\\ y_{2,1}&=x_{1,0}w_{0,0}+x_{1,2}w_{0,1}+x_{1,4}w_{0,2}+x_{3,0}w_{1,0}+x_{3,2}w_{1,1}+x_{3,4}w_{1,2}+x_{5,0}w_{2,0}+x_{5,2}w_{2,1}+x_{5,4}w_{2,2}\\ y_{2,2}&=x_{1,1}w_{0,0}+x_{1,3}w_{0,1}+x_{1,5}w_{0,2}+x_{3,1}w_{1,0}+x_{3,3}w_{1,1}+x_{3,5}w_{1,2}+x_{5,1}w_{2,0}+x_{5,3}w_{2,1}+x_{5,5}w_{2,2}\end{aligned}$$
Look first at the gradient for $x_{0,0}$:
$$\frac{\partial L}{\partial x_{0,0}}=\frac{\partial L}{\partial y_{1,1}}\frac{\partial y_{1,1}}{\partial x_{0,0}}=\delta_{1,1}w_{0,0}$$
Now $x_{1,1}$:
$$\begin{aligned}\frac{\partial L}{\partial x_{1,1}}&=\frac{\partial L}{\partial y_{0,0}}\frac{\partial y_{0,0}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{0,2}}\frac{\partial y_{0,2}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{2,0}}\frac{\partial y_{2,0}}{\partial x_{1,1}}+\frac{\partial L}{\partial y_{2,2}}\frac{\partial y_{2,2}}{\partial x_{1,1}}\\&=\delta_{0,0}w_{1,1}+\delta_{0,2}w_{1,0}+\delta_{2,0}w_{0,1}+\delta_{2,2}w_{0,0}\end{aligned}$$
Computing this as a convolution again:
$$\frac{\partial L}{\partial x_{0,0}}=\left[\begin{array}{ccccc} 0&0&0&0&0\\ 0&0&0&0&0\\ 0&0&0&0&0\\ 0&0&0&\delta_{0,0}&\delta_{0,1}\\ 0&0&0&\delta_{1,0}&\delta_{1,1}\end{array}\right]\otimes\left[\begin{array}{ccccc} w_{2,2}&0&w_{2,1}&0&w_{2,0}\\ 0&0&0&0&0\\ w_{1,2}&0&w_{1,1}&0&w_{1,0}\\ 0&0&0&0&0\\ w_{0,2}&0&w_{0,1}&0&w_{0,0}\end{array}\right]=\delta_{1,1}w_{0,0}$$
$$\frac{\partial L}{\partial x_{1,1}}=\left[\begin{array}{ccccc} 0&0&0&0&0\\ 0&0&0&0&0\\ 0&0&\delta_{0,0}&\delta_{0,1}&\delta_{0,2}\\ 0&0&\delta_{1,0}&\delta_{1,1}&\delta_{1,2}\\ 0&0&\delta_{2,0}&\delta_{2,1}&\delta_{2,2}\end{array}\right]\otimes\left[\begin{array}{ccccc} w_{2,2}&0&w_{2,1}&0&w_{2,0}\\ 0&0&0&0&0\\ w_{1,2}&0&w_{1,1}&0&w_{1,0}\\ 0&0&0&0&0\\ w_{0,2}&0&w_{0,1}&0&w_{0,0}\end{array}\right]=\delta_{0,0}w_{1,1}+\delta_{0,2}w_{1,0}+\delta_{2,0}w_{0,1}+\delta_{2,2}w_{0,0}$$
Honestly, this is the first time I have worked through the case of dilation other than 1 myself; it turns out the dilation affects the padding.
So the new conclusion is
$$padding\_backward=kernel\_size-padding\_forward+\left(dilation-1\right)\times2-C$$

A partial summary (but not the end of this part):
$$\begin{aligned}padding\_backward&=kernel\_size-padding\_forward+\left(dilation\_forward\_kernel-1\right)\times2-C\\ dilation\_backward\_featuremap&=stride\_forward\\ dilation\_backward\_kernel&=dilation\_forward\_kernel\end{aligned}$$

5.6 Number of input channels

Back to the original starting point: kernel size $3\times3$, padding=0, stride 1, dilation left at 1. This time set the number of input channels to 2 and keep one output channel.

A word about subscripts: with two indices they are the 2D row and column coordinates; with three, the first is the input channel and the last two are the 2D coordinates.

We compute only one output:
$$\begin{aligned} y_{0,0}&=\left[\left[\begin{array}{ccc} x_{0,0,0}&x_{0,0,1}&x_{0,0,2}\\ x_{0,1,0}&x_{0,1,1}&x_{0,1,2}\\ x_{0,2,0}&x_{0,2,1}&x_{0,2,2}\end{array}\right]\left[\begin{array}{ccc} x_{1,0,0}&x_{1,0,1}&x_{1,0,2}\\ x_{1,1,0}&x_{1,1,1}&x_{1,1,2}\\ x_{1,2,0}&x_{1,2,1}&x_{1,2,2}\end{array}\right]\right]\otimes\left[\left[\begin{array}{ccc} w_{0,0,0}&w_{0,0,1}&w_{0,0,2}\\ w_{0,1,0}&w_{0,1,1}&w_{0,1,2}\\ w_{0,2,0}&w_{0,2,1}&w_{0,2,2}\end{array}\right]\left[\begin{array}{ccc} w_{1,0,0}&w_{1,0,1}&w_{1,0,2}\\ w_{1,1,0}&w_{1,1,1}&w_{1,1,2}\\ w_{1,2,0}&w_{1,2,1}&w_{1,2,2}\end{array}\right]\right]\\&=x_{0,0,0}w_{0,0,0}+x_{0,0,1}w_{0,0,1}+x_{0,0,2}w_{0,0,2}+x_{0,1,0}w_{0,1,0}+x_{0,1,1}w_{0,1,1}+x_{0,1,2}w_{0,1,2}+x_{0,2,0}w_{0,2,0}+x_{0,2,1}w_{0,2,1}+x_{0,2,2}w_{0,2,2}\\&\quad+x_{1,0,0}w_{1,0,0}+x_{1,0,1}w_{1,0,1}+x_{1,0,2}w_{1,0,2}+x_{1,1,0}w_{1,1,0}+x_{1,1,1}w_{1,1,1}+x_{1,1,2}w_{1,1,2}+x_{1,2,0}w_{1,2,0}+x_{1,2,1}w_{1,2,1}+x_{1,2,2}w_{1,2,2}\end{aligned}$$
From this expression it is clear that different channels do not interfere with one another when computing gradients. A couple of examples:
$$\frac{\partial y}{\partial x_{0,0,0}}=w_{0,0,0}\qquad\frac{\partial y}{\partial x_{1,0,0}}=w_{1,0,0}$$
So the new gradients are (the subscripts here denote the input channel):
$$\frac{\partial L}{\partial x_0}=\delta\otimes w_0\qquad\frac{\partial L}{\partial x_1}=\delta\otimes w_1$$
5.7 Number of output channels

Kernel size $3\times3$, padding=0, stride 1, dilation 1. This time one input channel and 2 output channels.

Subscripts: with two indices they are the 2D coordinates; with three, the first two are the 2D coordinates and the third is the output channel.

We compute just two outputs:
$$\begin{aligned} y_{0,0,0}&=\left[\begin{array}{ccc} x_{0,0}&x_{0,1}&x_{0,2}\\ x_{1,0}&x_{1,1}&x_{1,2}\\ x_{2,0}&x_{2,1}&x_{2,2}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0,0}&w_{0,1,0}&w_{0,2,0}\\ w_{1,0,0}&w_{1,1,0}&w_{1,2,0}\\ w_{2,0,0}&w_{2,1,0}&w_{2,2,0}\end{array}\right]\\&=x_{0,0}w_{0,0,0}+x_{0,1}w_{0,1,0}+x_{0,2}w_{0,2,0}+x_{1,0}w_{1,0,0}+x_{1,1}w_{1,1,0}+x_{1,2}w_{1,2,0}+x_{2,0}w_{2,0,0}+x_{2,1}w_{2,1,0}+x_{2,2}w_{2,2,0}\end{aligned}$$
$$\begin{aligned} y_{0,0,1}&=\left[\begin{array}{ccc} x_{0,0}&x_{0,1}&x_{0,2}\\ x_{1,0}&x_{1,1}&x_{1,2}\\ x_{2,0}&x_{2,1}&x_{2,2}\end{array}\right]\otimes\left[\begin{array}{ccc} w_{0,0,1}&w_{0,1,1}&w_{0,2,1}\\ w_{1,0,1}&w_{1,1,1}&w_{1,2,1}\\ w_{2,0,1}&w_{2,1,1}&w_{2,2,1}\end{array}\right]\\&=x_{0,0}w_{0,0,1}+x_{0,1}w_{0,1,1}+x_{0,2}w_{0,2,1}+x_{1,0}w_{1,0,1}+x_{1,1}w_{1,1,1}+x_{1,2}w_{1,2,1}+x_{2,0}w_{2,0,1}+x_{2,1}w_{2,1,1}+x_{2,2}w_{2,2,1}\end{aligned}$$
As argued before, $x_{0,0}$ appears only in these two expressions, so its gradient is:
$$\begin{aligned}\frac{\partial L}{\partial x_{0,0}}&=\frac{\partial L}{\partial y_{0,0,0}}\frac{\partial y_{0,0,0}}{\partial x_{0,0}}+\frac{\partial L}{\partial y_{0,0,1}}\frac{\partial y_{0,0,1}}{\partial x_{0,0}}\\&=\delta_{0,0,0}w_{0,0,0}+\delta_{0,0,1}w_{0,0,1}\end{aligned}$$
So both output channels take part in this computation: it becomes a per-channel convolution followed by a sum.

Does this look familiar? It is exactly the "three-dimensional" convolution from the forward pass: each 2D channel is convolved separately and the results are summed.

So the conclusion is
$$\frac{\partial L}{\partial x}=\delta\otimes w_{rot180}$$
the same expression as in the 2D case above, except that the tensors involved are now three-dimensional. A small sketch checking the multi-channel case is given below.
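A final sketch checking the multi-output-channel case against PyTorch autograd: each output channel's delta is convolved with its own 180-degree-rotated kernel and the results are summed over output channels. Sizes and names are made up for illustration:

import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(1, 1, 5, 5, requires_grad=True)
w = torch.randn(2, 1, 3, 3)                  # 2 output channels, 1 input channel
y = F.conv2d(x, w)                           # output shape (1, 2, 3, 3)
delta = torch.randn_like(y)
y.backward(delta)                            # reference gradient from autograd

w_rot = torch.flip(w, dims=[2, 3])           # rotate each kernel by 180 degrees
grad_x = sum(F.conv2d(delta[:, c:c + 1], w_rot[c:c + 1], padding=2)
             for c in range(2))              # per-output-channel convolution, then sum
print(torch.allclose(grad_x, x.grad))        # True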

That completes the whole chain of gradients flowing backward through the network; what remains is the derivative with respect to the convolutional layer's weights and bias, which uses exactly the same method as above, so I will leave it as an exercise.
