Basics of Neural Network Programming - Computation Graph

This post uses a simple example to explain the role of the computation graph in neural networks, particularly in the forward propagation and backward propagation steps. The computation graph breaks the computation into three steps and presents them graphically, which makes the process easier to follow. In the forward pass, the output is computed from left to right; in the backward pass, gradients and derivatives are computed along a right-to-left path. Computation graphs are especially useful for optimizing a particular output variable, such as the cost function J.

These are my notes from studying section 2.7, Computation Graph, of the course Neural Networks & Deep Learning by Andrew Ng. I'm sharing them here in the hope that they help.


The computations of a neural network are organized in terms of a forward propagation step, in which we compute the output of the network, followed by a backward propagation step, in which we compute gradients or derivatives. The computation graph explains why the computations are organized this way.

Let's use a simpler example than logistic regression to explain how a computation graph works.

Suppose:

J(a, b, c) = 3(a + bc)

Computing this function actually involves 3 distinct steps:

u = bc
v = a + u
J = 3v

We can take these 3 steps and draw them in a computation graph as below:

In addition, we've shown a concrete example in the computation graph by setting the variables a, b, c to specific values (the lecture uses a = 5, b = 3, c = 2, which gives u = 6, v = 11, and J = 33).
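As a quick check of the three steps above, here is a minimal Python sketch of the left-to-right (forward) pass; the function name `forward` and the example values a = 5, b = 3, c = 2 are just for illustration.

```python
# Minimal sketch of the forward (left-to-right) pass for J(a, b, c) = 3(a + bc).

def forward(a, b, c):
    u = b * c   # step 1: u = bc
    v = a + u   # step 2: v = a + u
    J = 3 * v   # step 3: J = 3v
    return u, v, J

print(forward(5, 3, 2))  # (6, 11, 33)
```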

The computation graph comes in handy when there is some special output variable, such as J in this case, that you want to optimize. In the case of logistic regression, J is the cost function we try to optimize. In this little example, a left-to-right pass computes the value of J. In the next couple of classes, we'll see that in order to compute derivatives, there will be a right-to-left pass, going in the opposite direction of the red arrows.
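To preview what that right-to-left pass looks like, here is a minimal sketch that applies the chain rule backward through the same graph; the variable names such as dJ_dv are illustrative, not from the lecture.

```python
# Minimal sketch of the backward (right-to-left) pass for J = 3(a + bc).
# Each line applies the chain rule one node further to the left; for this
# particular graph no forward intermediates are needed, only b and c.

def backward(a, b, c):
    # a is accepted to mirror J(a, b, c), though the derivatives below
    # happen not to depend on it
    dJ_dv = 3.0          # J = 3v     =>  dJ/dv = 3
    dJ_du = dJ_dv * 1.0  # v = a + u  =>  dv/du = 1
    dJ_da = dJ_dv * 1.0  # v = a + u  =>  dv/da = 1
    dJ_db = dJ_du * c    # u = bc     =>  du/db = c
    dJ_dc = dJ_du * b    # u = bc     =>  du/dc = b
    return dJ_da, dJ_db, dJ_dc

print(backward(5, 3, 2))  # (3.0, 6.0, 9.0)
```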

<end>
