These are my notes from studying the course *Neural Networks & Deep Learning* by Andrew Ng. I'm sharing them with everyone and hope they help.
Let's try to get an intuitive understanding of calculus and derivatives.
Figure-1 is the plot of the function f(a) = 3a. Let's look at a few points:
- If a = 2, then f(a) = 6.
Let's bump a up a little bit, so now a = 2.001, then f(a) = 6.003.
So, the slope or derivative of f(a) at the point a = 2 is the height over the width of the little green triangle, 0.003 / 0.001, which is 3.
- If a = 5, then f(a) = 15.
Let's bump a up a little bit, so now a = 5.001, then f(a) = 15.003.
So, the slope or derivative of f(a) at the point a = 5 is also 0.003 / 0.001 = 3.
So the slope of f(a) at a point a can be defined as:

slope = (f(a + 0.001) − f(a)) / 0.001 = 3

It means that if I nudge a to the right a little bit, I expect f(a) to go up by 3 times the amount I just nudged a.
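The nudging experiment above can be sketched in a few lines of Python, assuming (as in the text) the linear function f(a) = 3a:

```python
def f(a):
    # The linear function plotted in Figure-1
    return 3 * a

nudge = 0.001  # the small bump used in the examples above

for a in (2.0, 5.0):
    # Rise over run of the "little green triangle"
    slope = (f(a + nudge) - f(a)) / nudge
    print(f"a = {a}: slope = {slope:.4f}")  # prints 3.0000 at both points
```

Because f is linear, the measured slope is 3 no matter which point a we pick.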
Note that, in the above examples, we nudged the variable a by 0.001. The formal mathematical definition of the derivative takes the nudge ε to be infinitesimally small:

f'(a) = lim(ε→0) (f(a + ε) − f(a)) / ε

Also noted as:

df(a)/da or (d/da) f(a)
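The limit definition can be checked numerically: as ε shrinks, the finite-difference quotient settles on the derivative. A minimal sketch (the helper name `derivative_estimate` is mine, not from the course):

```python
def f(a):
    # Same linear function as above: f(a) = 3a
    return 3 * a

def derivative_estimate(f, a, eps):
    # Finite-difference quotient (f(a + eps) - f(a)) / eps,
    # which approaches f'(a) as eps -> 0
    return (f(a + eps) - f(a)) / eps

# Shrinking eps mimics taking the limit in the formal definition
for eps in (0.1, 0.001, 0.00001):
    print(f"eps = {eps}: estimate = {derivative_estimate(f, 2.0, eps):.6f}")
```

For a linear function every estimate is already exactly 3; for a curved function (say f(a) = a²) the estimates would only converge to the true derivative as ε shrinks.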