Andrew Ng's Introduction to Deep Learning: Week 2 Quiz

Before You Start

1. This post is based on an article by CSDN blogger He Kuan (何宽) and is intended for personal study only. The answers have been pulled out into a separate section at the end so you can test yourself first.
Reference: https://blog.csdn.net/u013733326/article/details/79865858
2. Week 2 has two parts: a quiz and a programming assignment.

I. Questions

1. What does a neuron compute?

[ ] A neuron computes an activation function followed by a linear function (z = Wx + b)

[ ] A neuron computes a linear function (z = Wx + b) followed by an activation function

[ ] A neuron computes a function g that scales the input x linearly (Wx + b)

[ ] A neuron computes the mean of all features before applying the output to an activation function

2. Which of these is the “Logistic Loss”?
The figure for this question was not found; omitted.
Note: We are using a cross-entropy loss function.

3. Suppose img is a (32,32,3) array, representing a 32x32 image with 3 color channels red, green and blue. How do you reshape this into a column vector?

4. Consider the following two random arrays “a” and “b”:

a = np.random.randn(2, 3) # a.shape = (2, 3)
b = np.random.randn(2, 1) # b.shape = (2, 1)
c = a + b

What will be the shape of “c”?

5. Consider the following two random arrays “a” and “b”:

a = np.random.randn(4, 3) # a.shape = (4, 3)
b = np.random.randn(3, 2) # b.shape = (3, 2)
c = a * b

What will be the shape of “c”?

6. Suppose you have n_x input features per example. Recall that X = [x^(1), x^(2), …, x^(m)]. What is the dimension of X?

7. Recall that np.dot(a, b) performs a matrix multiplication on a and b, whereas a * b performs an element-wise multiplication.

Consider the following two random arrays “a” and “b”:

a = np.random.randn(12288, 150) # a.shape = (12288, 150)
b = np.random.randn(150, 45) # b.shape = (150, 45)
c = np.dot(a, b)

What will be the shape of “c”?

8. Consider the following code snippet:

# a.shape = (3,4)
# b.shape = (4,1)

for i in range(3):
  for j in range(4):
    c[i][j] = a[i][j] + b[j]

How do you vectorize this?

9. Consider the following code:

a = np.random.randn(3, 3)
b = np.random.randn(3, 1)
c = a * b

What will be c?

10. Consider the following computation graph.

Note: The computation graph was not found; the derivation is given directly:

J = u + v - w
  = a * b + a * c - (b + c)
  = a * (b + c) - (b + c)
  = (a - 1) * (b + c)

II. Answers

1.
[x] A neuron computes a linear function (z = Wx + b) followed by an activation function

Note: The output of a neuron is a = g(Wx + b), where g is the activation function (sigmoid, tanh, ReLU, …).
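
As a quick illustration of that order of operations, here is a minimal sketch (the layer size, seed, and values below are made up for illustration): the linear step runs first, then the activation is applied to its result.

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

np.random.seed(0)
W = np.random.randn(1, 3)  # weights for a single neuron with 3 inputs
b = 0.5                    # bias
x = np.random.randn(3, 1)  # one input example

z = np.dot(W, x) + b       # 1) linear function
a = sigmoid(z)             # 2) activation function
print(a.shape)             # (1, 1)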

2.
The figure was not found, omitted. The logistic loss for a single example is L(y_hat, y) = -(y * log(y_hat) + (1 - y) * log(1 - y_hat)).
Note: We are using a cross-entropy loss function.
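
A small numeric check (the prediction values are arbitrary): the loss is near zero when a confident prediction matches the label, and large when it does not.

import numpy as np

def logistic_loss(y_hat, y):
    # Cross-entropy loss for a single example.
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

print(logistic_loss(0.9, 1))  # ~0.105: confident and correct
print(logistic_loss(0.9, 0))  # ~2.303: confident and wrong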

3.

x = img.reshape((32 * 32 * 3, 1))
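
A quick shape check, using a random array as a stand-in for the image:

import numpy as np

img = np.random.randn(32, 32, 3)   # dummy image
x = img.reshape((32 * 32 * 3, 1))
print(x.shape)                     # (3072, 1)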

4.
b (a column vector) is copied 3 times by broadcasting so that it can be added to each column of a. Therefore, c.shape = (2, 3).
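
Running the snippet confirms the broadcast shape:

import numpy as np

a = np.random.randn(2, 3)
b = np.random.randn(2, 1)
c = a + b        # b is broadcast across the 3 columns of a
print(c.shape)   # (2, 3)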

5.
The "*" operator performs element-wise multiplication, which requires the two arrays to have broadcast-compatible shapes; (4, 3) and (3, 2) are not, so this raises an error.
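
Catching the exception shows the failure explicitly:

import numpy as np

a = np.random.randn(4, 3)
b = np.random.randn(3, 2)
try:
    c = a * b   # (4, 3) and (3, 2) are not broadcast-compatible
except ValueError as e:
    print(e)    # operands could not be broadcast together ...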

6.
X.shape = (n_x, m)

Note: A simple way to validate this is to plug l = 1 into the formula Z^(l) = W^(l) A^(l-1), where A^(0) = X; then:

X.shape = (n_x, m)
W^(1).shape = (n^(1), n_x)
Z^(1).shape = (n^(1), m)
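
A small sketch with made-up sizes (5 features, 4 examples) showing how stacking the column vectors x^(i) produces an (n_x, m) matrix:

import numpy as np

n_x, m = 5, 4
xs = [np.random.randn(n_x, 1) for _ in range(m)]  # m column vectors x^(i)
X = np.hstack(xs)                                 # stack them side by side
print(X.shape)                                    # (5, 4), i.e. (n_x, m)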

7.
c.shape = (12288, 45); this is a plain matrix multiplication example.
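
The inner dimensions (150) cancel, leaving the outer ones:

import numpy as np

a = np.random.randn(12288, 150)
b = np.random.randn(150, 45)
c = np.dot(a, b)   # (12288, 150) x (150, 45) -> (12288, 45)
print(c.shape)     # (12288, 45)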

8.
c = a + b.T (b.T has shape (1, 4), which broadcasts across the rows of a).
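
A sketch comparing the loop from the question with the vectorized form. Two small deviations so the snippet runs on its own: c is preallocated here, and b[j, 0] is used in the loop to extract a scalar.

import numpy as np

a = np.random.randn(3, 4)
b = np.random.randn(4, 1)

# Loop version from the question.
c_loop = np.zeros((3, 4))
for i in range(3):
    for j in range(4):
        c_loop[i, j] = a[i, j] + b[j, 0]

# Vectorized version: b.T has shape (1, 4) and broadcasts over the rows of a.
c_vec = a + b.T
print(np.allclose(c_loop, c_vec))  # True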

9.
This uses broadcasting: b is copied 3 times into shape (3, 3) and then multiplied element-wise with a. Therefore, c.shape = (3, 3).
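
Verifying the broadcast and one column of the result:

import numpy as np

a = np.random.randn(3, 3)
b = np.random.randn(3, 1)
c = a * b       # b is broadcast to (3, 3), then multiplied element-wise
print(c.shape)  # (3, 3)
print(np.allclose(c[:, 0], a[:, 0] * b[:, 0]))  # True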

10.

J = u + v - w
  = a * b + a * c - (b + c)
  = a * (b + c) - (b + c)
  = (a - 1) * (b + c)

Answer: (a - 1) * (b + c).
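
A numeric spot-check of the algebra with arbitrary values for a, b and c:

a, b, c = 2.0, 3.0, 4.0
u, v, w = a * b, a * c, b + c
J = u + v - w
print(J)                  # 7.0
print((a - 1) * (b + c))  # 7.0, matching the simplified form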
