【Andrew Ng Deep Learning】01_week4_quiz Key concepts on Deep Neural Networks

(1)What is the “cache” used for in our implementation of forward propagation and backward propagation?
[A]It is used to keep track of the hyperparameters that we are searching over, to speed up computation.
[B]We use it to pass variables computed during forward propagation to the corresponding backward propagation step. It contains useful values for backward propagation to compute derivatives.
[C]We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward propagation to compute activation.
[D]It is used to cache the intermediate values of the cost function during training.

Answer: B
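As a minimal numpy sketch (not the course's exact helper code), the forward step for one layer can return a cache of its inputs, which the matching backward step later consumes:

```python
import numpy as np

def linear_forward(A_prev, W, b):
    """Linear part of a layer's forward step; caches inputs for backprop."""
    Z = W @ A_prev + b
    cache = (A_prev, W, b)   # values the backward step will need
    return Z, cache

def linear_backward(dZ, cache):
    """Computes gradients using the values cached during forward prop."""
    A_prev, W, b = cache
    m = A_prev.shape[1]      # number of training examples
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db
```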

(2)Among the following, which ones are “hyperparameters”? (Check all that apply.)
[A]learning rate $\alpha$
[B]number of iterations
[C]size of the hidden layers $n^{[l]}$
[D]bias vectors $b^{[l]}$
[E]number of layers $L$ in the neural network
[F]weight matrices $W^{[l]}$
[G]activation values $a^{[l]}$

Answer: A, B, C, E
Explanation: Model parameters are adjusted as driven by the data during training; hyperparameters are not learned from the data, but are set by hand before or during training.

(3)Which of the following statements is true?
[A]The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.
[B]The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.

Answer: A

(4)Vectorization allows you to compute forward propagation in an $L$-layer neural network without an explicit for-loop (or any other explicit iterative loop) over the layers $l = 1, 2, \dots, L$. True/False?
Answer: False
Explanation: Vectorization removes the explicit loop over training examples within each layer's computation, but an explicit for-loop over the layers $l = 1, \dots, L$ is still needed, because each layer's activations depend on the previous layer's output.
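A minimal sketch of why the layer loop remains (assuming a `parameters` dict keyed `"W1"`, `"b1"`, ..., and ReLU activations throughout for brevity):

```python
import numpy as np

def forward_propagation(X, parameters, L):
    """Vectorized over all m examples at once, yet still looping over layers."""
    A = X
    for l in range(1, L + 1):        # explicit loop over layers is unavoidable
        W = parameters["W" + str(l)]
        b = parameters["b" + str(l)]
        Z = W @ A + b                # no loop over examples needed
        A = np.maximum(0, Z)         # ReLU (the output layer may differ)
    return A
```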

(5)Assume we store the values for $n^{[l]}$ in an array called layer_dims, as follows: layer_dims = [n_x, 4, 3, 2, 1]. So layer 1 has four hidden units, layer 2 has three hidden units, and so on. Which of the following for-loops will allow you to initialize the parameters for the model?
[A]–[D] (The four answer options were code screenshots that did not survive extraction; see the reconstructed loop after the answer below.)
Answer: D
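The option screenshots are gone, but the correct loop is a standard reconstruction consistent with answer [D], not the original screenshot: each $W^{[l]}$ gets shape `(layer_dims[l], layer_dims[l-1])` and each $b^{[l]}$ gets shape `(layer_dims[l], 1)`. The value `n_x` below is an illustrative input size, not specified by the quiz:

```python
import numpy as np

n_x = 12288                      # illustrative input size, e.g. a 64x64x3 image
layer_dims = [n_x, 4, 3, 2, 1]

parameters = {}
for l in range(1, len(layer_dims)):
    parameters["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
    parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
```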

(6)Consider the following neural network.
(Figure: the network diagram did not survive extraction; per the answer below, the network has three hidden layers plus an output layer.)
How many layers does this network have?
[A]The number of layers $L$ is 4. The number of hidden layers is 3.
[B]The number of layers $L$ is 3. The number of hidden layers is 3.
[C]The number of layers $L$ is 4. The number of hidden layers is 4.
[D]The number of layers $L$ is 5. The number of hidden layers is 4.

Answer: A
Explanation: The output layer is not counted as a hidden layer, and by convention the input layer is not counted in $L$ at all, so $L$ = 3 hidden layers + 1 output layer = 4.

(7)During forward propagation, in the forward function for a layer $l$ you need to know what the activation function in that layer is (sigmoid, tanh, ReLU, etc.). During backpropagation, the corresponding backward function also needs to know what the activation function for layer $l$ is, since the gradient depends on it. True/False?
Answer: True
Explanation: Different activation functions have different derivatives, so the backward step must apply the derivative of the activation actually used in the forward step.
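A sketch of such a backward helper (assuming $Z$ was cached during the forward pass), branching on the activation because each has a different derivative:

```python
import numpy as np

def activation_backward(dA, Z, activation):
    """dZ = dA * g'(Z); the derivative g' depends on the activation used."""
    if activation == "sigmoid":
        s = 1 / (1 + np.exp(-Z))
        return dA * s * (1 - s)             # sigmoid'(z) = s(z) * (1 - s(z))
    if activation == "tanh":
        return dA * (1 - np.tanh(Z) ** 2)   # tanh'(z) = 1 - tanh(z)^2
    if activation == "relu":
        return dA * (Z > 0)                 # relu'(z) = 1 for z > 0, else 0
    raise ValueError(f"unknown activation: {activation}")
```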

(8)There are certain functions with the following properties:
To compute the function using a shallow network circuit, you will need a large network (where we measure size by the number of logic gates in the network), but to compute it using a deep network circuit, you need only an exponentially smaller network. True/False?

Answer: True
Explanation: See the lecture “Why deep representations?”. For example, the parity (XOR) of $n$ input bits can be computed with a tree of $O(n)$ XOR gates of depth $O(\log n)$, while a shallow (two-level) circuit needs exponentially many gates to cover all $2^n$ input patterns.

(9)Consider the following 2-hidden-layer neural network:
(Figure: the network diagram did not survive extraction; from the answers below it has $n^{[0]} = 4$ input features, hidden layers with $n^{[1]} = 4$ and $n^{[2]} = 3$ units, and a single output unit.)
Which of the following statements are True? (Check all that apply.)
[A]$W^{[1]}$ will have shape $(4,4)$
[B]$b^{[1]}$ will have shape $(4,1)$
[C]$W^{[1]}$ will have shape $(3,4)$
[D]$b^{[1]}$ will have shape $(3,1)$
[E]$W^{[2]}$ will have shape $(3,4)$
[F]$b^{[2]}$ will have shape $(1,1)$
[G]$W^{[2]}$ will have shape $(3,1)$
[H]$b^{[2]}$ will have shape $(3,1)$
[I]$W^{[3]}$ will have shape $(3,1)$
[J]$b^{[3]}$ will have shape $(1,1)$
[K]$W^{[3]}$ will have shape $(1,3)$
[L]$b^{[3]}$ will have shape $(3,1)$

Answer: A, B, E, H, J, K

(10)Whereas the previous question used a specific network, in the general case what is the dimension of $W^{[l]}$, the weight matrix associated with layer $l$?
[A]$W^{[l]}$ has shape $(n^{[l-1]}, n^{[l]})$
[B]$W^{[l]}$ has shape $(n^{[l+1]}, n^{[l]})$
[C]$W^{[l]}$ has shape $(n^{[l]}, n^{[l-1]})$
[D]$W^{[l]}$ has shape $(n^{[l]}, n^{[l+1]})$
Answer: C
Explanation: Since $z^{[l]} = W^{[l]} a^{[l-1]} + b^{[l]}$, where $a^{[l-1]}$ has $n^{[l-1]}$ entries and $z^{[l]}$ has $n^{[l]}$ entries, $W^{[l]}$ must have shape $(n^{[l]}, n^{[l-1]})$.
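A quick numpy check of this rule against the question-9 network (layer sizes $n = [4, 4, 3, 1]$, taken from that question's answers):

```python
import numpy as np

n = [4, 4, 3, 1]                          # n[l] for l = 0..3
for l in range(1, len(n)):
    W = np.random.randn(n[l], n[l - 1])   # shape (n^[l], n^[l-1])
    b = np.zeros((n[l], 1))               # shape (n^[l], 1)
    print(f"W{l}: {W.shape}  b{l}: {b.shape}")
# prints: W1: (4, 4)  b1: (4, 1)
#         W2: (3, 4)  b2: (3, 1)
#         W3: (1, 3)  b3: (1, 1)
```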
