[Deep Learning] Course 1, Week 4 Quiz - Key concepts on Deep Neural Networks

This post works through key concepts about deep neural networks from the deep learning course: the role of the "cache" in forward and backward propagation, what counts as a hyperparameter, and how the number of layers affects the features a network can compute. It also discusses whether vectorization can remove the explicit loop over the layers, how model parameters are initialized, why the activation function matters in both the forward and backward passes, and the rules governing the dimensions of the weight matrices in the network.
  1. What is the "cache" used for in our implementation of forward propagation and backward propagation?

    •  It is used to cache the intermediate values of the cost function during training.
    •  We use it to pass variables computed during forward propagation to the corresponding backward propagation step. It contains useful values for backward propagation to compute derivatives.
    •  It is used to keep track of the hyperparameters that we are searching over, to speed up computation.
    •  We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward propagation to compute activations.

    the "cache" records values from the forward propagation units and sends it to the backward propagation units because it is needed to compute the chain rule derivatives.

  2. Among the following, which ones are "hyperparameters"? (Check all that apply.) 
