[Andrew Ng Deep Learning] 01_week2_quiz Neural Network Basics

(1)What does a neuron compute?
[A] A neuron computes an activation function followed by a linear function (z = Wx + b).
[B] A neuron computes the mean of all features before applying the output to an activation function.
[C] A neuron computes a linear function (z = Wx + b) followed by an activation function.
[D] A neuron computes a function g that scales the input x linearly (Wx + b).

Answer: C
Explanation: A neuron first computes a linear function z = Wx + b, then applies an activation function a = g(z).
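
A minimal numpy sketch (not part of the quiz) of what a single neuron computes; the shapes and the choice of sigmoid here are illustrative assumptions:

import numpy as np

def sigmoid(z):
    # element-wise sigmoid activation
    return 1 / (1 + np.exp(-z))

W = np.random.randn(1, 3)   # weights for 3 input features, 1 neuron (illustrative)
x = np.random.randn(3, 1)   # one input example
b = np.random.randn(1, 1)   # bias

z = np.dot(W, x) + b        # linear function first
a = sigmoid(z)              # then the activation applied to z
print(a.shape)              # (1, 1)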

(2)Which of these is the “Logistic Loss”?
[A] $L^{(i)}(\hat{y}^{(i)}, y^{(i)}) = |y^{(i)} - \hat{y}^{(i)}|^2$
[B] $L^{(i)}(\hat{y}^{(i)}, y^{(i)}) = \max(0, y^{(i)} - \hat{y}^{(i)})$
[C] $L^{(i)}(\hat{y}^{(i)}, y^{(i)}) = |y^{(i)} - \hat{y}^{(i)}|$
[D] $L^{(i)}(\hat{y}^{(i)}, y^{(i)}) = -\left(y^{(i)}\log(\hat{y}^{(i)}) + (1 - y^{(i)})\log(1 - \hat{y}^{(i)})\right)$

Answer: D
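
A small sketch (illustrative, assuming predictions strictly between 0 and 1) that evaluates the logistic loss from option D for a single example:

import numpy as np

def logistic_loss(y_hat, y):
    # -(y*log(y_hat) + (1-y)*log(1-y_hat)) for one example
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

print(logistic_loss(0.9, 1))   # small loss: confident and correct
print(logistic_loss(0.9, 0))   # large loss: confident and wrong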

(3)Suppose img is a (32,32,3) array, representing a 32x32 image with 3 color channels red, green and blue. How do you reshape this into a column vector?
[A] x = img.reshape((32*32*3,1))
[B] x = img.reshape((1,32*32*3))
[C] x = img.reshape((3,32*32))
[D] x = img.reshape((32*32,3))

Answer: A
Explanation: Note that the question asks for a column vector, so the result must have shape (32*32*3, 1) = (3072, 1).
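
A quick check of option A, using a random array in place of a real image:

import numpy as np

img = np.random.randn(32, 32, 3)     # stand-in for a 32x32 RGB image
x = img.reshape((32 * 32 * 3, 1))    # flatten into a column vector
print(x.shape)                       # (3072, 1)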

(4)Consider the two following random arrays “a” and “b”:

a = np.random.randn(2,3) # a.shape = (2,3)
b = np.random.randn(2,1) # b.shape = (2,1)
c = a+b

What will be the shape of “c”?
[A] c.shape = (2,1)
[B] c.shape = (3,2)
[C] c.shape = (2,3)
[D] The computation cannot happen because the sizes don’t match. It’s going to be “Error”!

Answer: C
Explanation: Broadcasting: b, with shape (2,1), is broadcast across the three columns of a, so c.shape = (2,3).
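
A quick verification sketch of the broadcasting result:

import numpy as np

a = np.random.randn(2, 3)
b = np.random.randn(2, 1)
c = a + b          # b is broadcast across the 3 columns of a
print(c.shape)     # (2, 3)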

(5)Consider the two following random arrays “a” and “b”:

a = np.random.randn(4,3) # a.shape = (4,3)
b = np.random.randn(3,2) # b.shape = (3,2)
c = a*b

What will be the shape of “c”?
[A] c.shape = (3,3)
[B] c.shape = (4,2)
[C] c.shape = (4,3)
[D] The computation cannot happen because the sizes don’t match. It’s going to be “Error”!

Answer: D
Explanation: For ndarrays, the * operator performs element-wise multiplication, while matmul performs matrix multiplication. Shapes (4,3) and (3,2) cannot be broadcast together for an element-wise product, so this raises an error.
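
A sketch confirming that the element-wise product fails for these shapes, while the matrix product would succeed:

import numpy as np

a = np.random.randn(4, 3)
b = np.random.randn(3, 2)

try:
    c = a * b                  # element-wise: (4,3) and (3,2) do not broadcast
except ValueError as e:
    print("Error:", e)

print(np.dot(a, b).shape)      # matrix product is fine: (4, 2)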

(6)Suppose you have $n_x$ input features per example. Recall that $X = [x^{(1)}, x^{(2)}, \dots, x^{(m)}]$. What is the dimension of X?
[A] $(1, m)$
[B] $(m, 1)$
[C] $(n_x, m)$
[D] $(m, n_x)$

Answer: C
Explanation: Each example x^(i) is a column vector of shape (n_x, 1); stacking m of them as columns gives X the dimension (n_x, m).
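
A minimal sketch, using illustrative sizes n_x = 5 and m = 10, of stacking examples as columns:

import numpy as np

n_x, m = 5, 10                                          # illustrative sizes
examples = [np.random.randn(n_x, 1) for _ in range(m)]  # m column vectors x^(i)
X = np.hstack(examples)                                 # each x^(i) becomes one column of X
print(X.shape)                                          # (5, 10), i.e. (n_x, m)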

(7)Recall that “np.dot(a,b)” performs a matrix multiplication on a and b, whereas “a*b” performs an element-wise multiplication.
Consider the two following random arrays “a” and “b” .

a = np.random.randn(12288,150) # a.shape = (12288,150)
b = np.random.randn(150,45) # b.shape = (150,45)
c = np.dot(a,b)

What will be the shape of “c”?
[A] c.shape = (150,150)
[B] c.shape = (12288,150)
[C] c.shape = (12288,45)
[D] The computation cannot happen because the sizes don’t match. It’s going to be “Error”!

Answer: C
Explanation: The matrix product of a (12288,150) matrix and a (150,45) matrix has shape (12288,45).
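
A quick shape check of the matrix product:

import numpy as np

a = np.random.randn(12288, 150)
b = np.random.randn(150, 45)
c = np.dot(a, b)       # (12288,150) x (150,45) -> (12288,45)
print(c.shape)         # (12288, 45)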

(8)Consider the following code snippet:

# a.shape = (3,4)
# b.shape = (4,1)

for i in range(3):
    for j in range(4):
        c[i][j]=a[i][j]+b[j]

How do you vectorize this?
[A] c = a.T + b.T
[B] c = a.T + b
[C] c = a + b.T
[D] c = a + b

Answer: C
Explanation: b.T has shape (1,4) and broadcasts across the rows of a (shape (3,4)), which reproduces the double loop.
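
A sketch comparing the explicit loop with the vectorized version from option C, using random arrays for illustration:

import numpy as np

a = np.random.randn(3, 4)
b = np.random.randn(4, 1)

# loop version from the question
c_loop = np.zeros((3, 4))
for i in range(3):
    for j in range(4):
        c_loop[i][j] = a[i][j] + b[j]

# vectorized version: b.T has shape (1,4) and broadcasts over the rows of a
c_vec = a + b.T
print(np.allclose(c_loop, c_vec))   # True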

(9)Consider the following code:

a = np.random.randn(3,3) 
b = np.random.randn(3,1) 
c = a*b

What will be c? (If you’re not sure, feel free to run this in Python to find out.)
[A] This will invoke broadcasting, so b is copied three times to become (3,3), and * is an element-wise product, so c.shape will be (3,3).
[B] This will invoke broadcasting, so b is copied three times to become (3,3), and * invokes a matrix multiplication operation of two 3x3 matrices, so c.shape will be (3,3).
[C] This will multiply a 3x3 matrix a with a 3x1 vector, thus resulting in a 3x1 vector. That is, c.shape = (3,1).
[D] It will lead to an error since you cannot use “*” to operate on these two matrices. You need to instead use np.dot(a,b).

Answer: A
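
A sketch verifying option A: the broadcast element-wise product equals multiplying a by b copied across three columns:

import numpy as np

a = np.random.randn(3, 3)
b = np.random.randn(3, 1)

c = a * b                                       # b broadcasts to (3,3); * stays element-wise
print(c.shape)                                  # (3, 3)
print(np.allclose(c, a * np.tile(b, (1, 3))))   # True: same as copying b three times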

(10)Consider the following computation graph.
(Computation graph with intermediate nodes u = a*b, v = a*c, w = b + c, and output J = u + v - w.)
What is the output J?
[A] $J = (c-1)*(b+a)$
[B] $J = (a-1)*(b+c)$
[C] $J = a*b + b*c + a*c$
[D] $J = (b-1)*(c+a)$

Answer: B
Explanation:
$$
\begin{aligned}
J &= u + v - w \\
  &= a*b + a*c - (b + c) \\
  &= a*(b + c) - (b + c) \\
  &= (a-1)*(b + c)
\end{aligned}
$$
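
A small numeric check of the derivation, with illustrative values for a, b, and c:

a, b, c = 2.0, 3.0, 4.0      # illustrative values

u = a * b                    # u = a*b
v = a * c                    # v = a*c
w = b + c                    # w = b+c
J = u + v - w                # output of the graph

print(J)                     # 7.0
print((a - 1) * (b + c))     # 7.0, matches option B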
