What is the "cache" used for in our implementation of forward propagation and backward propagation?
- It is used to cache the intermediate values of the cost function during training.
- We use it to pass variables computed during forward propagation to the corresponding backward propagation step. It contains useful values for backward propagation to compute derivatives.
- It is used to keep track of the hyperparameters that we are searching over, to speed up computation.
- We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward propagation to compute activations.
The "cache" records values from the forward-propagation units and sends them to the corresponding backward-propagation units, where they are needed to compute the chain-rule derivatives.
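As a minimal sketch of this idea (function names and shapes here are illustrative assumptions, not the course's exact code), the forward pass of one linear layer can store its inputs in a cache tuple, and the backward pass can read them back to apply the chain rule:

```python
import numpy as np

def linear_forward(A_prev, W, b):
    """Forward step Z = W A_prev + b; cache the inputs needed by backprop."""
    Z = W @ A_prev + b
    cache = (A_prev, W, b)  # saved for the backward pass
    return Z, cache

def linear_backward(dZ, cache):
    """Backward step: use the cached forward values in the chain rule."""
    A_prev, W, b = cache
    m = A_prev.shape[1]                      # number of examples
    dW = (dZ @ A_prev.T) / m                 # dL/dW needs the cached A_prev
    db = dZ.sum(axis=1, keepdims=True) / m   # dL/db
    dA_prev = W.T @ dZ                       # dL/dA_prev needs the cached W
    return dA_prev, dW, db

# Tiny usage example: 4 examples, 3 input units, 2 output units
np.random.seed(0)
A_prev = np.random.randn(3, 4)
W = np.random.randn(2, 3)
b = np.zeros((2, 1))
Z, cache = linear_forward(A_prev, W, b)
dZ = np.ones_like(Z)                         # stand-in upstream gradient
dA_prev, dW, db = linear_backward(dZ, cache)
```

Without the cache, `linear_backward` would have no access to `A_prev` and `W`, which is exactly why forward propagation must pass these values along.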
Among the following, which ones are "hyperparameters"? (Check all that apply.)
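The distinction the question is probing can be sketched in code (variable names here are illustrative assumptions): hyperparameters such as the learning rate, the number of iterations, the number of layers L, and the hidden-unit counts n^[l] are chosen by the practitioner before training, while parameters W^[l] and b^[l] are initialized and then learned from data.

```python
import numpy as np

# Hyperparameters: set before training, not learned from data
learning_rate = 0.01
num_iterations = 1000
layer_dims = [4, 5, 3, 1]  # encodes L and the hidden units n^[l] per layer

# Parameters: initialized here, then updated by gradient descent
np.random.seed(1)
params = {}
for l in range(1, len(layer_dims)):
    params[f"W{l}"] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
    params[f"b{l}"] = np.zeros((layer_dims[l], 1))
```

The rule of thumb: anything that controls the learning process (and thereby determines the final values of W and b) is a hyperparameter; W and b themselves are parameters.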
[Deep Learning] Course 1, Week 4 Quiz - Key concepts on Deep Neural Networks
This post covers key concepts of deep neural networks, including the role of the cache in forward and backward propagation, the definition of hyperparameters, and how the number of layers affects feature computation. It explains how vectorization avoids explicit loops across layers, discusses how model parameters are initialized, and describes the importance of activation functions in forward and backward propagation as well as the dimension rules for weight matrices in the network.