Basics of Neural Network Programming - Gradient descent on m examples

These are my notes from the Coursera class Neural Networks & Deep Learning by Andrew Ng, section "Gradient descent on m training examples". I'm sharing them here in the hope that they help.


Last class, you saw how to compute derivatives and implement gradient descent with respect to just one training example for logistic regression. Now, we'll do it for m training examples.

To recap, the cost function of logistic regression is:

J(w,b)=\frac{1}{m}\sum_{i=1}^{m}\mathcal{L}(a^{(i)},y^{(i)})

And,

a^{(i)}=\hat{y}^{(i)}=\sigma (z^{(i)})=\sigma (w^{T}x^{(i)}+b)

From calculus, since the overall cost is an average of per-example losses, each derivative of the cost is the average of the corresponding per-example derivatives:

\frac{\partial}{\partial w_{1}}J(w,b)=\frac{1}{m}\sum_{i=1}^{m}\frac{\partial}{\partial w_{1}}\mathcal{L}(a^{(i)},y^{(i)})

\frac{\partial}{\partial w_{2}}J(w,b)=\frac{1}{m}\sum_{i=1}^{m}\frac{\partial}{\partial w_{2}}\mathcal{L}(a^{(i)},y^{(i)})

\frac{\partial}{\partial b}J(w,b)=\frac{1}{m}\sum_{i=1}^{m}\frac{\partial}{\partial b}\mathcal{L}(a^{(i)},y^{(i)})

So, in code, the computation above can be implemented as shown in figure-1:

Figure-1

Note that figure-1 implements just one step of gradient descent, so you have to repeat it many times in order to take multiple steps of gradient descent.
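Since figure-1 isn't reproduced here, below is a minimal Python sketch of that single step, assuming n = 2 features as in the lecture; the function name, the argument layout (X of shape (2, m), Y of shape (m,)), and the learning rate alpha are my own choices, not part of the original figure.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_step(w1, w2, b, X, Y, alpha=0.01):
    """One step of gradient descent on m examples with n = 2 features.
    X has shape (2, m); Y has shape (m,)."""
    m = X.shape[1]
    J = 0.0
    dw1, dw2, db = 0.0, 0.0, 0.0
    for i in range(m):                       # loop over the m training examples
        z = w1 * X[0, i] + w2 * X[1, i] + b  # z^(i) = w^T x^(i) + b
        a = sigmoid(z)                       # a^(i) = sigma(z^(i))
        J += -(Y[i] * np.log(a) + (1 - Y[i]) * np.log(1 - a))
        dz = a - Y[i]                        # dL/dz for this example
        dw1 += X[0, i] * dz                  # accumulate dL/dw1
        dw2 += X[1, i] * dz                  # accumulate dL/dw2
        db += dz                             # accumulate dL/db
    J /= m
    dw1 /= m; dw2 /= m; db /= m              # average over the m examples
    w1 -= alpha * dw1                        # gradient-descent update
    w2 -= alpha * dw2
    b -= alpha * db
    return w1, w2, b, J
```

Calling this function repeatedly, feeding the updated w1, w2, b back in each time, takes the multiple steps of gradient descent described above.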

It turns out there are two weaknesses in this calculation: you need to write two for loops. The first is over the m training examples; the second is over all the features. Having explicit for loops in your code makes the algorithm run less efficiently. In the deep learning era, we are moving to bigger and bigger data sets, so being able to implement your algorithms without explicit for loops is really important and helps you scale to much larger data sets. The technique of vectorization allows you to get rid of these explicit for loops in your code, as in the sketch below.
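As a preview of where vectorization leads (the vectorized formulas themselves are covered in later lectures), here is a sketch of the same single step with both for loops replaced by matrix operations; the shapes assumed are w of shape (n, 1), X of shape (n, m), and Y of shape (1, m).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def vectorized_step(w, b, X, Y, alpha=0.01):
    """One vectorized gradient-descent step with no explicit for loops.
    w: (n, 1), X: (n, m), Y: (1, m)."""
    m = X.shape[1]
    Z = np.dot(w.T, X) + b                   # (1, m): z for every example at once
    A = sigmoid(Z)                           # (1, m): all activations
    J = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    dZ = A - Y                               # (1, m): per-example dL/dz
    dw = np.dot(X, dZ.T) / m                 # (n, 1): averaged gradient for all weights
    db = np.sum(dZ) / m                      # scalar: averaged gradient for b
    w = w - alpha * dw
    b = b - alpha * db
    return w, b, J
```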

<end>
