Neural networks where the output from one layer is used as input to the next layer are called feedforward neural networks. This means there are no loops in the network - information is always fed forward, never fed back. If we did have loops, we'd end up with situations where the input to the σ function depended on its own output. That would be hard to make sense of, and so we don't allow such loops.
However, there are other models of artificial neural networks in which feedback loops are possible. These models are called recurrent neural networks.
In other words, networks with feedback loops are recurrent neural networks, while those without them are feedforward neural networks.
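The feedforward idea can be sketched in a few lines of code: each layer's activations are computed from the previous layer's output and nothing else, so information only ever flows forward. This is a minimal sketch, not code from the text; the network shape and weights below are made up for illustration.

```python
import math

def sigmoid(z):
    # The σ activation function from the text.
    return 1.0 / (1.0 + math.exp(-z))

def feedforward(layers, x):
    """Pass input x through each layer in turn: the output of one
    layer is the input to the next - there are no feedback loops."""
    activation = x
    for weights, biases in layers:
        activation = [
            sigmoid(sum(w * a for w, a in zip(row, activation)) + b)
            for row, b in zip(weights, biases)
        ]
    return activation

# Hypothetical 2-3-1 network with made-up weights and biases:
# one hidden layer of 3 neurons, then a single output neuron.
layers = [
    ([[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]], [0.0, 0.1, -0.1]),
    ([[0.7, -0.5, 0.2]], [0.05]),
]
output = feedforward(layers, [1.0, 0.5])
```

A recurrent network, by contrast, would feed some activations back into earlier layers on a later time step, so the loop-free pass above would no longer describe it.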