Test - Week 3 Quiz
1. Which of the following are true? (Check all that apply)
result:null
2. The tanh activation is not always better than the sigmoid activation function for hidden units because the mean of its output is closer to zero, and so it centers the data, making learning complex for the next layer. True/False?
A. True
B. False
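A quick numpy check of the centering claim (a minimal sketch, not part of the original quiz): on zero-mean inputs, tanh outputs stay roughly zero-mean, while sigmoid outputs center around 0.5.

import numpy as np

np.random.seed(0)
x = np.random.randn(10000)               # zero-mean inputs
print(np.tanh(x).mean())                 # ~0: tanh centers its output
print((1 / (1 + np.exp(-x))).mean())     # ~0.5: sigmoid does not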
3. Which of the following is a correct vectorized implementation of forward propagation for layer 2?
result:B
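The answer options are not reproduced here, but the standard vectorized form for layer 2 is Z[2] = W[2]A[1] + b[2] followed by A[2] = g[2](Z[2]). A minimal numpy sketch (the layer sizes and the sigmoid choice for g[2] are assumptions for illustration):

import numpy as np

n_1, n_2, m = 4, 1, 5                 # assumed layer sizes and batch size
A1 = np.random.randn(n_1, m)          # activations from layer 1, shape (n_1, m)
W2 = np.random.randn(n_2, n_1) * 0.01
b2 = np.zeros((n_2, 1))

Z2 = np.dot(W2, A1) + b2              # Z[2] = W[2] A[1] + b[2]
A2 = 1 / (1 + np.exp(-Z2))            # A[2] = g[2](Z[2]), sigmoid assumed here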
4. You are building a binary classifier for recognizing cucumbers (y=1) vs. watermelons (y=0). Which one of these activation functions would you recommend using for the output layer?
A. tanh
B. sigmoid
C. ReLU
D. Leaky ReLU
result:null
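Sigmoid fits a binary output layer because it maps any real value into (0, 1), which can be read as the probability that y = 1. A minimal sketch (the example values are illustrative):

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

z_out = np.array([-2.0, 0.0, 3.0])    # example output-layer pre-activations
p = sigmoid(z_out)                    # estimated P(y=1 | x), i.e. cucumber
y_hat = (p > 0.5).astype(int)         # threshold at 0.5 for the class label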
5. Consider the following code:
import numpy as np

A = np.random.randn(4, 3)
B = np.sum(A, axis=1, keepdims=True)
What will B.shape be? (If you're not sure, feel free to run this in Python to find out.)
A.(1,3)
B.(3,)
C.(4,1)
D.(4,)
result:C
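For reference: axis=1 sums across the 3 columns, and keepdims=True keeps the summed axis as a size-1 dimension, so the (4, 3) array collapses to (4, 1) rather than (4,). A quick check:

import numpy as np

A = np.random.randn(4, 3)
print(np.sum(A, axis=1, keepdims=True).shape)   # (4, 1)
print(np.sum(A, axis=1, keepdims=False).shape)  # (4,) without keepdims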
6. Suppose you have built a neural network with one hidden layer and tanh as the activation function for the hidden layer. Which of the following is the best option to initialize the weights?
A. Initialize all weights to a single number chosen randomly.
B. Initialize all weights to 0.
C. Initialize the weights to small random numbers.
D. Initialize the weights to large random numbers.
result:null
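Small random initialization (option C) is the standard recommendation for tanh hidden units: it breaks symmetry while keeping pre-activations in tanh's high-gradient region. A minimal sketch (the layer sizes and the 0.01 scale are common but arbitrary choices):

import numpy as np

n_x, n_h = 3, 4                        # assumed input and hidden sizes
W1 = np.random.randn(n_h, n_x) * 0.01  # small random values break symmetry
b1 = np.zeros((n_h, 1))                # biases can safely start at zero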
7. Using linear activation functions in the hidden layers of a multilayer neural network is equivalent to using a single layer. True/False
A. True
B. False
result:null
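This is true because two linear layers compose into one linear map: W2(W1 x + b1) + b2 = (W2 W1) x + (W2 b1 + b2). A numeric check (the shapes are illustrative):

import numpy as np

np.random.seed(1)
x = np.random.randn(3, 1)
W1, b1 = np.random.randn(4, 3), np.random.randn(4, 1)
W2, b2 = np.random.randn(2, 4), np.random.randn(2, 1)

two_layers = W2 @ (W1 @ x + b1) + b2
one_layer = (W2 @ W1) @ x + (W2 @ b1 + b2)  # one equivalent linear layer
print(np.allclose(two_layers, one_layer))   # True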
8. Which of the following is true about the ReLU activation function?
A. They cause several problems in practice because they have no derivative at 0. That is why leaky ReLU was invented.
B. They are the go-to option when you don't know what activation function to choose for hidden layers.
C. They are only used in the case of regression problems, such as predicting house prices.
D. They are increasingly being replaced by tanh in most cases.
result:null
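For reference, ReLU and leaky ReLU differ only in how they treat negative inputs. A minimal sketch (the 0.01 slope is a common but arbitrary choice):

import numpy as np

def relu(z):
    return np.maximum(0, z)               # zero out negative inputs

def leaky_relu(z, slope=0.01):
    return np.where(z > 0, z, slope * z)  # small slope keeps gradients alive

z = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(z))        # [0.  0.  0.  1.5]
print(leaky_relu(z))  # [-0.02  -0.005  0.     1.5  ]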
9. Consider the following 1-hidden-layer neural network:
Which of the following statements are true? (Check all that apply)
result:null
10. Consider the following 1-hidden-layer neural network:
result:null