Andrew Ng's spoken English is very pleasant to listen to. --2020.02.10
Neural Networks
Origins: algorithms that try to mimic the brain.
The Neural Networks model has three kinds of layers: the input layer, the hidden layer (there may be more than one), and the output layer.
Found a page with a good explanation, mainly about the Neural Networks model:
https://www.jianshu.com/p/c133a5220841
The computation happens in the hidden layers, and results are transferred from one hidden layer to the next until the final result is produced.
The weights are simple but vital in this model; they decide the result that x1 XNOR x2 produces. After combining the models above, we get a three-layer model with layers x, a(1), a(2) correspondingly.
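A minimal NumPy sketch of this combination (not from the notes; the weight values are the classic ones used in the lecture, chosen with large magnitudes so the sigmoid saturates near 0 or 1):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hand-set weights (bias first): large magnitudes push sigmoid to ~0 or ~1.
theta_and = np.array([-30.0, 20.0, 20.0])   # a1 = x1 AND x2
theta_nor = np.array([ 10.0, -20.0, -20.0]) # a2 = (NOT x1) AND (NOT x2)
theta_or  = np.array([-10.0, 20.0, 20.0])   # output = a1 OR a2

def xnor(x1, x2):
    x = np.array([1.0, x1, x2])             # prepend the bias unit
    a1 = sigmoid(theta_and @ x)             # hidden layer a(2)
    a2 = sigmoid(theta_nor @ x)
    a = np.array([1.0, a1, a2])
    return sigmoid(theta_or @ a)            # output layer a(3)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, round(xnor(x1, x2)))      # XNOR truth table: 1, 0, 0, 1
```

XNOR is not linearly separable, so no single-layer model can compute it; stacking AND, NOR, and OR units is what makes the hidden layer necessary.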
Here is a specific example, where we have only one training example: (x, y).
We start by computing the cost function J(θ).
We need to find the proper θ (the weights) that minimize the cost function.
We apply forward propagation (前向传播) to compute the activations of all the neurons:
Then, to get the proper θ, we use an algorithm called the backpropagation algorithm (反向传播算法).
δ is the activation (激活值) minus the label from the training set, which is the "error". (This holds for the output layer; the hidden-layer δ terms are derived from it.) The smaller δ is, the better the model.
The backpropagation algorithm lets us compute the δ terms: we start by computing the δ terms for layer 4, and then layer 3.
There is no δ(1), because the first layer corresponds to the input layer, which is just x.
Finally, after computing δ(4), δ(3) and δ(2), and some surprisingly complicated calculation, we can get the partial derivative of the cost function J(θ):
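For reference, the formulas from the lecture can be written out as follows (4-layer network, sigmoid activation g, ignoring regularization):

```latex
\delta^{(4)} = a^{(4)} - y \\
\delta^{(3)} = (\Theta^{(3)})^{T}\delta^{(4)} \mathbin{.*} g'(z^{(3)}) \\
\delta^{(2)} = (\Theta^{(2)})^{T}\delta^{(3)} \mathbin{.*} g'(z^{(2)}) \\
g'(z^{(l)}) = a^{(l)} \mathbin{.*} (1 - a^{(l)}) \\
\frac{\partial}{\partial \Theta^{(l)}_{ij}} J(\Theta) = a^{(l)}_{j}\,\delta^{(l+1)}_{i}
```

So each hidden layer's error is the next layer's error pulled back through the weights, scaled by the sigmoid's derivative, and the gradient is just the product of an activation and the next layer's δ.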
So,here are all the steps:
Suppose we have m training examples.
For i = 1 to m:
    Set a(1) = x(i)
    Compute a(l) (forward propagation)
    Compute δ(l) (backpropagation)
    Accumulate the partial derivatives of the cost function J(θ)
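The whole loop above can be sketched end to end (my own illustration, not code from the course; the XNOR training data, the 2-2-1 architecture, the learning rate, and the epoch count are all hypothetical choices):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, thetas):
    """Returns the activations of every layer, bias units included."""
    a = np.concatenate(([1.0], x))
    acts = [a]
    for i, theta in enumerate(thetas):
        a = sigmoid(theta @ a)
        if i < len(thetas) - 1:
            a = np.concatenate(([1.0], a))
        acts.append(a)
    return acts

def cost(X, Y, thetas):
    """Unregularized cross-entropy cost J(theta)."""
    J = 0.0
    for x, y in zip(X, Y):
        h = forward(x, thetas)[-1]
        J -= np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))
    return J / len(X)

def train(X, Y, thetas, alpha=0.3, epochs=2000):
    """The steps from the notes: for each example, set a(1) = x(i),
    run forward propagation, compute the deltas, accumulate the
    partial derivatives, then take one gradient-descent step."""
    m = len(X)
    for _ in range(epochs):
        Deltas = [np.zeros_like(t) for t in thetas]
        for x, y in zip(X, Y):
            acts = forward(x, thetas)             # a(1) = x(i), then a(l)
            delta = acts[-1] - y                  # delta(L) = a(L) - y
            for l in range(len(thetas) - 1, -1, -1):
                Deltas[l] += np.outer(delta, acts[l])
                if l > 0:                         # there is no delta(1)
                    a = acts[l]
                    delta = ((thetas[l].T @ delta) * a * (1 - a))[1:]
        for l in range(len(thetas)):              # theta := theta - alpha*D
            thetas[l] -= alpha * Deltas[l] / m
    return thetas

# Hypothetical example: learn XNOR with a 2-2-1 network.
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[1], [0], [0], [1]], dtype=float)
thetas = [rng.standard_normal((2, 3)), rng.standard_normal((1, 3))]
before = cost(X, Y, thetas)
train(X, Y, thetas)
after = cost(X, Y, thetas)
print(round(before, 3), "->", round(after, 3))    # the cost should decrease
```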
In the end, we can use gradient descent (梯度下降) or one of the more advanced optimization algorithms to minimize J(θ).