Take the deep NN of figure-1 as an example:
Figure-2 below shows the building blocks of this deep NN, generalized to a NN with L layers:
- Forward and backward propagation for a single example x:

Forward propagation for layer $l$:

Input $a^{[l-1]}$, output $a^{[l]}$, cached $z^{[l]}$:

$$z^{[l]} = W^{[l]} a^{[l-1]} + b^{[l]}$$
$$a^{[l]} = g^{[l]}(z^{[l]})$$

The initial input is the feature vector $x$, i.e. $a^{[0]} = x$.
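As a concrete illustration, here is a minimal NumPy sketch of one forward step; the function names (`forward_step`, `relu`, `sigmoid`) and the choice of activations are assumptions for the example, not something fixed by the figures:

```python
import numpy as np

def relu(z):
    # ReLU activation: g(z) = max(0, z)
    return np.maximum(0, z)

def sigmoid(z):
    # Sigmoid activation: g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def forward_step(a_prev, W, b, activation=relu):
    # One forward step for layer l:
    #   z[l] = W[l] a[l-1] + b[l]
    #   a[l] = g[l](z[l])
    # a_prev: (n[l-1], 1) column vector, W: (n[l], n[l-1]), b: (n[l], 1)
    z = W @ a_prev + b
    a = activation(z)
    cache = (a_prev, W, b, z)   # cached values needed by the backward pass
    return a, cache
```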
Backward propagation for layer $l$:

Input $da^{[l]}$, output $da^{[l-1]}$, $dW^{[l]}$, $db^{[l]}$:

$$dz^{[l]} = da^{[l]} * g^{[l]\prime}(z^{[l]})$$
$$dW^{[l]} = dz^{[l]} a^{[l-1]T}$$
$$db^{[l]} = dz^{[l]}$$
$$da^{[l-1]} = W^{[l]T} dz^{[l]}$$

The initial input is $da^{[L]} = -\frac{y}{a^{[L]}} + \frac{1-y}{1-a^{[L]}}$, the derivative of the logistic loss with respect to the output activation (assuming a sigmoid output unit).

A side note here is how to calculate $dz^{[l]}$ from the previously processed layer's $dz^{[l+1]}$:

$$dz^{[l]} = W^{[l+1]T} dz^{[l+1]} * g^{[l]\prime}(z^{[l]})$$
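A matching sketch of the backward step for a single example, reusing the cache layout from the forward sketch above (the names `backward_step` and `relu_grad` are again assumptions):

```python
import numpy as np

def relu_grad(z):
    # Derivative of ReLU: g'(z) = 1 where z > 0, else 0
    return (z > 0).astype(z.dtype)

def backward_step(da, cache, activation_grad=relu_grad):
    # One backward step for layer l, single example:
    #   dz[l]   = da[l] * g[l]'(z[l])
    #   dW[l]   = dz[l] a[l-1]^T
    #   db[l]   = dz[l]
    #   da[l-1] = W[l]^T dz[l]
    a_prev, W, b, z = cache
    dz = da * activation_grad(z)
    dW = dz @ a_prev.T
    db = dz
    da_prev = W.T @ dz
    return da_prev, dW, db
```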
- Forward and backward propagation for m examples X:

Forward propagation for layer $l$:

Input $A^{[l-1]}$, output $A^{[l]}$, cached $Z^{[l]}$:

$$Z^{[l]} = W^{[l]} A^{[l-1]} + b^{[l]}$$
$$A^{[l]} = g^{[l]}(Z^{[l]})$$

The initial input is $A^{[0]} = X$, where the m examples are stacked as columns: $X = [x^{(1)}\; x^{(2)}\; \cdots\; x^{(m)}]$.
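A hedged sketch of the vectorized forward pass over all L layers; the parameter layout (`params["W1"]`, `params["b1"]`, ...) and the ReLU-hidden-layers-plus-sigmoid-output architecture are assumptions for illustration:

```python
import numpy as np

def relu(Z):
    return np.maximum(0, Z)

def sigmoid(Z):
    return 1.0 / (1.0 + np.exp(-Z))

def forward_pass(X, params, L):
    # Vectorized forward propagation for m examples:
    #   Z[l] = W[l] A[l-1] + b[l],  A[l] = g[l](Z[l]),  A[0] = X
    # X: (n[0], m) matrix whose columns are the examples x(1) ... x(m)
    A = X
    caches = []
    for l in range(1, L + 1):
        W, b = params["W" + str(l)], params["b" + str(l)]
        Z = W @ A + b                    # broadcasting adds b to every column
        A_next = sigmoid(Z) if l == L else relu(Z)
        caches.append((A, W, b, Z))      # cache A[l-1], W[l], b[l], Z[l]
        A = A_next
    return A, caches
```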
Backward propagation for layer $l$:

Input $dA^{[l]}$, output $dA^{[l-1]}$, $dW^{[l]}$, $db^{[l]}$:

$$dZ^{[l]} = dA^{[l]} * g^{[l]\prime}(Z^{[l]})$$
$$dW^{[l]} = \frac{1}{m} dZ^{[l]} A^{[l-1]T}$$
$$db^{[l]} = \frac{1}{m} \sum_{i=1}^{m} dz^{[l](i)}$$
$$dA^{[l-1]} = W^{[l]T} dZ^{[l]}$$

The initial input is:

$$dA^{[L]} = \left[\, -\frac{y^{(1)}}{a^{[L](1)}} + \frac{1-y^{(1)}}{1-a^{[L](1)}} \;\; \cdots \;\; -\frac{y^{(m)}}{a^{[L](m)}} + \frac{1-y^{(m)}}{1-a^{[L](m)}} \,\right]$$
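Finally, a sketch of one vectorized backward step together with the initialization of $dA^{[L]}$ from the logistic loss; the cache layout matches the forward sketch above, and names such as `backward_step_vectorized` and `init_dA_L` are assumed for the example:

```python
import numpy as np

def relu_grad(Z):
    # Derivative of ReLU applied element-wise
    return (Z > 0).astype(Z.dtype)

def init_dA_L(AL, Y):
    # dA[L] = -Y/A[L] + (1 - Y)/(1 - A[L]), one column per example
    return -(Y / AL) + (1 - Y) / (1 - AL)

def backward_step_vectorized(dA, cache, activation_grad=relu_grad):
    # One vectorized backward step for layer l over m examples:
    #   dZ[l]   = dA[l] * g[l]'(Z[l])
    #   dW[l]   = (1/m) dZ[l] A[l-1]^T
    #   db[l]   = (1/m) sum over examples of dz[l](i)
    #   dA[l-1] = W[l]^T dZ[l]
    A_prev, W, b, Z = cache
    m = A_prev.shape[1]
    dZ = dA * activation_grad(Z)
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db
```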