Andrew Ng's Machine Learning: Neural Networks


Andrew Ng's English is very pleasant to listen to. --2020.02.10


Neural Networks

Origins: algorithms that try to mimic the brain.

A neural network model has three kinds of layers: an input layer, one or more hidden layers, and an output layer.

I found a website that explains this well, mainly covering the neural network model:

https://www.jianshu.com/p/c133a5220841

The computation happens in the hidden layers: results are transferred from one hidden layer to the next, until the final result is obtained.


The weights are simple but vital in this model; they decide the result that x1 XNOR x2 produces. After combining the models above, we get a three-layer model with x, a(1), a(2) correspondingly.
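As a concrete illustration, here is a minimal NumPy sketch of that three-layer XNOR network, assuming sigmoid units. The weight values (±20, with biases -30, 10, -10) are the hand-picked ones used for the AND, NOR, and OR units in the course; the function names are my own.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def xnor(x1, x2):
    x = np.array([1.0, x1, x2])             # prepend bias unit x0 = 1
    # Hidden layer: one unit computes x1 AND x2, the other (NOT x1) AND (NOT x2)
    a_and = sigmoid(np.dot([-30, 20, 20], x))
    a_nor = sigmoid(np.dot([10, -20, -20], x))
    # Output layer: OR of the two hidden units yields XNOR
    a2 = np.array([1.0, a_and, a_nor])      # prepend bias unit again
    return sigmoid(np.dot([-10, 20, 20], a2))

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(xnor(x1, x2)))  # prints 1 exactly when x1 == x2
```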

Here is a specific example, where we have only one training example: (x, y).

We start by computing the cost function J(θ).

We need to find the proper θ (the weights) that minimizes the cost function.
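For reference, the regularized cost function for a network with m training examples, K output units, and L layers has the following form in the course's notation, where h_Θ(x) is the network's output and λ the regularization parameter:

```latex
J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}
  \left[ y_k^{(i)} \log\big(h_\Theta(x^{(i)})\big)_k
       + \big(1 - y_k^{(i)}\big)\log\Big(1 - \big(h_\Theta(x^{(i)})\big)_k\Big) \right]
  + \frac{\lambda}{2m}\sum_{l=1}^{L-1}\sum_{i=1}^{s_l}\sum_{j=1}^{s_{l+1}}
    \big(\Theta_{ji}^{(l)}\big)^2
```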

We use forward propagation to compute the activations of all the neurons:
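Here is a minimal vectorized sketch of one forward pass, assuming sigmoid activations and a list of weight matrices where thetas[l] maps layer l+1 to layer l+2 and already includes the bias column; all names are my own:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagate(x, thetas):
    """Return the activations a(1), a(2), ..., a(L) for one example x."""
    a = x
    activations = [a]
    for theta in thetas:
        a = np.insert(a, 0, 1.0)   # prepend the bias unit a0 = 1
        z = theta @ a              # z(l+1) = Theta(l) * a(l)
        a = sigmoid(z)             # a(l+1) = g(z(l+1))
        activations.append(a)
    return activations
```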

Then, to get the proper θ, we use an algorithm called the backpropagation algorithm.

δ is the activation minus the value from the training set, which is the "error". The smaller δ is, the better the model.

The backpropagation algorithm lets us compute these δ terms efficiently. We start by computing the δ terms for layer 4 (the output layer), and then layer 3.

There is no δ(1) because the first layer corresponds to the input layer, which is just x.
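For this four-layer example, the δ terms take the following form, matching the lecture's formulas; ⊙ denotes element-wise multiplication, and for the sigmoid g'(z(l)) = a(l) ⊙ (1 - a(l)):

```latex
\delta^{(4)} = a^{(4)} - y, \qquad
\delta^{(3)} = (\Theta^{(3)})^{T}\,\delta^{(4)} \odot g'(z^{(3)}), \qquad
\delta^{(2)} = (\Theta^{(2)})^{T}\,\delta^{(3)} \odot g'(z^{(2)})
```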

Finally, after computing δ(4), δ(3), and δ(2), and some surprisingly complicated calculation, we can get the partial derivatives of the cost function J(θ):
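Ignoring regularization, those derivatives reduce to a simple product of activations and δ terms:

```latex
\frac{\partial}{\partial \Theta_{ij}^{(l)}} J(\Theta) = a_j^{(l)} \, \delta_i^{(l+1)}
```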

So, here are all the steps:

Suppose we have m training examples. For i = 1 to m:

    Set a(1) = x(i)

    Compute a(l) for each layer l with forward propagation

    Compute δ(l) for each layer l with backpropagation

Then compute the partial derivatives of the cost function J(θ), as in the sketch after this list.
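Putting the steps together, here is a minimal sketch of one gradient computation over the whole training set, assuming sigmoid activations and unregularized gradients. It reuses the forward_propagate helper sketched earlier; all other names are my own:

```python
import numpy as np

def sigmoid_gradient(a):
    # For the sigmoid, g'(z) = a * (1 - a), where a = g(z)
    return a * (1.0 - a)

def backprop_gradients(X, Y, thetas):
    """Average the gradients of J over m examples (regularization omitted)."""
    m = X.shape[0]
    grads = [np.zeros_like(t) for t in thetas]
    for i in range(m):
        # Forward pass: set a(1) = x(i), then compute every a(l)
        activations = forward_propagate(X[i], thetas)
        # Backward pass: delta(L) = a(L) - y(i)
        delta = activations[-1] - Y[i]
        for l in range(len(thetas) - 1, -1, -1):
            a = np.insert(activations[l], 0, 1.0)  # a(l) with bias unit
            grads[l] += np.outer(delta, a)         # dJ/dTheta(l) += delta(l+1) a(l)^T
            if l > 0:
                # delta(l) = (Theta(l)^T delta(l+1)) .* g'(z(l)); drop the bias row
                delta = (thetas[l].T @ delta)[1:] * sigmoid_gradient(activations[l])
    return [g / m for g in grads]
```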

In the end, we can use gradient descent or one of the advanced optimization algorithms to minimize J(θ).
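As a usage sketch, a plain batch gradient descent loop on top of the gradients above might look like this; the learning rate alpha and the iteration count are arbitrary choices of mine, and X, Y, thetas are assumed to be defined as before:

```python
alpha = 0.1                                        # learning rate (hypothetical choice)
for step in range(1000):
    grads = backprop_gradients(X, Y, thetas)
    # Gradient descent update: Theta(l) := Theta(l) - alpha * dJ/dTheta(l)
    thetas = [t - alpha * g for t, g in zip(thetas, grads)]
```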
