This article covers:
Coursera, Andrew Ng's deep learning specialization, Course 1: Neural Networks and Deep Learning,
Week 4: Deep Neural Networks,
Programming assignment: Building your deep neural network: Step by Step.
Mistake notebook
My answer:
parameters['W' + str(l)] = np.random.randn(L[l], L[l-1]) * 0.01
parameters['b' + str(l)] = np.zeros((L[l]),1)
Correct:
parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l-1])*0.01
parameters['b' + str(l)] = np.zeros( (layer_dims[l], 1) )
Reflection:
Index into layer_dims (the array of layer sizes), not L (the loop bound). Also, np.zeros takes the shape as a single tuple: np.zeros((layer_dims[l], 1)), not np.zeros((layer_dims[l]), 1).
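Putting the correction together, a runnable sketch of the initialization step (the function name and skeleton follow the assignment; the random seed is an assumption for reproducibility):

```python
import numpy as np

def initialize_parameters_deep(layer_dims):
    """Initialize W and b for an L-layer network.

    layer_dims -- list of layer sizes, e.g. [5, 4, 3] for a 5-input,
    one-hidden-layer (4 units), 3-output network.
    """
    np.random.seed(3)  # assumed seed, only for reproducible runs
    parameters = {}
    L = len(layer_dims)  # number of layers including the input layer

    for l in range(1, L):
        # index layer_dims (the size array), not L (the loop bound)
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l-1]) * 0.01
        # the shape is ONE tuple: (layer_dims[l], 1)
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))

    return parameters
```

Checking the shapes makes the fix visible: W2 is (3, 4) and b2 is (3, 1), which np.zeros((layer_dims[l]), 1) would not produce.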
My answer:
# Implement [LINEAR -> RELU]*(L-1). Add "cache" to the "caches" list.
for l in range(1, L):
    A_prev = A
    ### START CODE HERE ### (≈ 2 lines of code)
    A, cache = linear_activation_forward(A_prev, W, b, activation = "relu")
    None
    ### END CODE HERE ###
# Implement LINEAR -> SIGMOID. Add "cache" to the "caches" list.
### START CODE HERE ### (≈ 2 lines of code)
AL, cache = linear_activation_forward(A_prev, W, b, activation = "sigmoid")
None
Correct:
# Implement [LINEAR -> RELU]*(L-1). Add "cache" to the "caches" list.
for l in range(1, L):