Andrew Ng Deep Learning
Average article quality score: 91
Andrew Ng Deep Learning tests and projects
Shelton_Peng
Andrew Ng Deep Learning, Week 4: Multiple-Choice Questions
1. What is the “cache” used for in our implementation of forward propagation and backward propagation? A. We use it to pass variables computed during backward propagation to the corresponding forward propagation step. It contains useful values for forward pro… (Original, 2022-01-16 14:26:50 · 934 views · 0 comments)
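As a minimal sketch of the idea behind this question, assuming a single dense layer with a generic activation `g`: the forward pass stores the values it computed in a cache so the backward pass can reuse them instead of recomputing. The function and variable names below are illustrative, not taken from the assignment.

```python
import numpy as np

def linear_activation_forward(A_prev, W, b, g):
    """Forward step: Z = W A_prev + b, A = g(Z); cache what backprop will need."""
    Z = W @ A_prev + b
    A = g(Z)
    cache = (A_prev, W, b, Z)   # reused by the matching backward step
    return A, cache

def linear_activation_backward(dA, cache, g_prime):
    """Backward step: pull the cached forward values and apply the chain rule."""
    A_prev, W, b, Z = cache
    m = A_prev.shape[1]
    dZ = dA * g_prime(Z)                          # through the activation
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ                            # passed to the previous layer
    return dA_prev, dW, db
```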
Andrew Ng Deep Learning, Week 3: Programming Exercise Planar_data_classification_with_one_hidden_layer
Contents: Exercise 1; Exercise 2 layer_sizes; Exercise 3 initialize_parameters; Exercise 4 forward_propagation; Exercise 5 compute_cost; Exercise 6 backward_propagation; Exercise 7 update_parameters; Exercise 8 nn_model; Exercise 9 predict. Exercise 1: How many training exam… (Original, 2022-01-13 10:35:40 · 713 views · 0 comments)
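As a rough companion to the exercise list above, here is a hedged sketch of the first two helpers (layer_sizes and initialize_parameters) for a one-hidden-layer network. The hidden size of 4, the 0.01 weight scale, and the seed are assumptions; the actual notebook may differ.

```python
import numpy as np

def layer_sizes(X, Y, n_h=4):
    """Return (input size, hidden size, output size) for a one-hidden-layer net."""
    n_x = X.shape[0]   # number of input features
    n_y = Y.shape[0]   # number of output units
    return n_x, n_h, n_y

def initialize_parameters(n_x, n_h, n_y, seed=2):
    """Small random weights, zero biases (common convention; scale is an assumption)."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.standard_normal((n_h, n_x)) * 0.01,
        "b1": np.zeros((n_h, 1)),
        "W2": rng.standard_normal((n_y, n_h)) * 0.01,
        "b2": np.zeros((n_y, 1)),
    }
```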
Andrew Ng Deep Learning, Week 3: Multiple-Choice Questions
1. Which of the following are true? (Check all that apply.) A. $X$ is a matrix in which each row is one training example. B. $a^{[2]}_4$ is the activation output by the $4^{th}$ neuron of the $2^{nd}$ layer. C. $a^{[2](12)}$ denot… (Original, 2022-01-12 10:56:12 · 642 views · 0 comments)
Andrew Ng Deep Learning, Week 3: Derivation of the Backpropagation Formulas for a Two-Layer Neural Network
While studying this part, I could not see how $dz^{[1]} = W^{[2]T} dz^{[2]} * g^{[1]\prime}(z^{[1]})$ is derived. It is really just a straightforward application of the chain rule; another issue was that at the time I had not paid attention to $x$ and $a$. The derivation goes as follows: $dz^{[1]} = \frac{dL}{dz^{[1]}} = \frac{dL}{dz^{[2]}} \cdot \frac{dz^{[2]}}{dz^{[1]}} = \frac{dL}{dz^{[2]}} \cdot \frac{dz^{[2]}}{da^{[1]}} \cdot \frac{da^{[1]}}{dz^{[1]}}$… (Original, 2022-01-03 21:49:40 · 517 views · 0 comments)
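For reference, here is how that truncated chain is usually finished. Assuming the standard two-layer setup from the course, $z^{[2]} = W^{[2]} a^{[1]} + b^{[2]}$ and $a^{[1]} = g^{[1]}(z^{[1]})$, so (this is the standard derivation, not quoted from the post):

$$
dz^{[1]} = \frac{\partial L}{\partial z^{[2]}} \cdot \frac{\partial z^{[2]}}{\partial a^{[1]}} \cdot \frac{\partial a^{[1]}}{\partial z^{[1]}}
= W^{[2]T} dz^{[2]} * g^{[1]\prime}\left(z^{[1]}\right),
$$

where the transpose on $W^{[2]}$ makes the matrix dimensions line up and $*$ denotes the element-wise product.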
Andrew Ng Deep Learning, Week 2: Programming Exercise Logistic regression with neural networks
目录Exercise 1Exercise 2Exercise 3 sigmoidExercise 4 initialize_with_zerosExercise 5 propagateExercise 6 optimizeExercise 7 predictExercise 8 modelExercise 1Find the values for:m_train (number of training examples)m_test (number of test examples).原创 2022-01-01 14:26:12 · 988 阅读 · 0 评论 -
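A hedged sketch of the propagate step named above (one forward/backward pass of logistic regression, returning the cost and the gradients). The exact signature and return format in the assignment may differ; the names here are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def propagate(w, b, X, Y):
    """w: (n_x, 1) weights, b: scalar bias, X: (n_x, m) inputs, Y: (1, m) labels."""
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                                   # predictions, shape (1, m)
    cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m
    dw = X @ (A - Y).T / m                                     # gradient w.r.t. w
    db = np.sum(A - Y) / m                                     # gradient w.r.t. b
    return {"dw": dw, "db": db}, float(cost)
```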
Andrew Ng Deep Learning, Week 2: 10 Multiple-Choice Questions
1. What does a neuron compute? A. A neuron computes a function g that scales the input x linearly (Wx + b). B. A neuron computes an activation function followed by a linear function (z = Wx + b). C. A neuron computes a linear function (z = Wx + b) followed by an… (Original, 2021-12-31 11:25:06 · 397 views · 0 comments)
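For intuition only, a tiny sketch of the order of operations this question is probing: a linear step z = Wx + b followed by an activation. The choice of sigmoid here is an assumption.

```python
import numpy as np

def neuron(x, W, b):
    """A single neuron: linear function z = Wx + b, then an activation g(z)."""
    z = W @ x + b                         # linear part
    return 1.0 / (1.0 + np.exp(-z))       # activation part (sigmoid assumed)
```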
Andrew Ng Deep Learning, Week 2: NumPy Exercises
Contents: Exercise 1; Exercise 2 basic_sigmoid; Exercise 3 sigmoid; Exercise 4 sigmoid_derivative; Exercise 5 image2vector; Exercise 6 normalize_rows; Exercise 7 softmax; Exercise 8 L1; Exercise 9 L2. Exercise 1: Set test to “Hello World” in the cell below to print “He… (Original, 2021-12-31 10:28:09 · 344 views · 0 comments)
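A hedged sketch of two of the helpers named above (sigmoid_derivative and softmax), written with NumPy broadcasting. The row-wise softmax convention and the max-shift stabilization are assumptions about the intended solution.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the sigmoid: s * (1 - s)."""
    s = sigmoid(x)
    return s * (1 - s)

def softmax(x):
    """Row-wise softmax of a 2-D array, shifted by the row max for numerical stability."""
    x_shift = x - np.max(x, axis=1, keepdims=True)
    e = np.exp(x_shift)
    return e / np.sum(e, axis=1, keepdims=True)
```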
Andrew Ng Deep Learning, Week 1: 10 Multiple-Choice Questions
1. What does the analogy “AI is the new electricity” refer to? A. Similar to electricity starting about 100 years ago, AI is transforming multiple industries. B. Through the “smart grid”, AI is delivering a new wave of electricity. C. AI is powering personal… (Original, 2021-12-28 10:20:21 · 658 views · 0 comments)