Andrew Ng Machine Learning, Week 4 Quiz Answers: Neural Networks: Representation

1.

Which of the following statements are true? Check all that apply.

Any logical function over binary-valued (0 or 1) inputs x_1 and x_2 can be (approximately) represented using some neural network.

A two-layer neural network (one input layer, one output layer; no hidden layer) can represent the XOR function.

The activation values of the hidden units in a neural network, with the sigmoid activation function applied at every layer, are always in the range (0, 1).

Suppose you have a multi-class classification problem with three classes, trained with a 3-layer network. Let a_1^{(3)} = (h_Θ(x))_1 be the activation of the first output unit, and similarly a_2^{(3)} = (h_Θ(x))_2 and a_3^{(3)} = (h_Θ(x))_3. Then for any input x, it must be the case that a_1^{(3)} + a_2^{(3)} + a_3^{(3)} = 1.
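To see why sigmoid activations always lie in (0, 1) while the three sigmoid output activations need not sum to 1, here is a small numpy sketch (the weights and input are made up, not from the quiz):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
Theta = rng.normal(size=(3, 4))      # made-up weights: 3 output units, 3 inputs + bias
x = np.array([1.0, 0.2, -1.5, 3.0])  # input vector with bias unit x_0 = 1

a3 = sigmoid(Theta @ x)              # the three output activations
print(a3)        # every entry is strictly between 0 and 1
print(a3.sum())  # in general not equal to 1: the sigmoids are independent
```

Unlike a softmax output layer, per-unit sigmoids impose no constraint tying the outputs together, which is why the sum-to-1 statement is false.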

1 point

2.

Consider the following neural network, which takes two binary-valued inputs x_1, x_2 ∈ {0, 1} and outputs h_Θ(x). Which of the following logical functions does it (approximately) compute?

NAND (meaning "NOT AND")

AND

OR

XOR (exclusive OR)
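The network figure is missing from this dump, but as a reminder of how a single sigmoid unit can realize such a gate, here is the AND example from the lectures (bias weight −30, input weights +20 each); the weight values are from the course's illustration, not from this quiz's figure:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def and_gate(x1, x2):
    # lecture-style AND unit: fires only when both inputs are 1
    # (z = -30 + 20*x1 + 20*x2 is +10 only for x1 = x2 = 1)
    return sigmoid(-30 + 20 * x1 + 20 * x2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(and_gate(x1, x2)))
```

Flipping the signs of the weights (e.g. bias +10, inputs −20) gives NAND by the same construction, while XOR cannot be computed by any single unit.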

1 point

3.

Consider the neural network given below. Which of the following equations correctly computes the activation a_1^{(3)}? Note: g(z) is the sigmoid activation function.

a_1^{(3)} = g(Θ^{(2)}_{1,0} a_0^{(2)} + Θ^{(2)}_{1,1} a_1^{(2)} + Θ^{(2)}_{1,2} a_2^{(2)})

a_1^{(3)} = g(Θ^{(2)}_{1,0} a_0^{(1)} + Θ^{(2)}_{1,1} a_1^{(1)} + Θ^{(2)}_{1,2} a_2^{(1)})

a_1^{(3)} = g(Θ^{(1)}_{1,0} a_0^{(2)} + Θ^{(1)}_{1,1} a_1^{(2)} + Θ^{(1)}_{1,2} a_2^{(2)})

a_1^{(3)} = g(Θ^{(2)}_{2,0} a_0^{(2)} + Θ^{(2)}_{2,1} a_1^{(2)} + Θ^{(2)}_{2,2} a_2^{(2)})
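The index convention is the key: Θ^{(2)} maps layer 2 to layer 3, and its row i holds the weights feeding unit i of layer 3. A small numpy sketch with made-up values (the quiz's figure is not reproduced here):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Theta2 plays the role of Θ^{(2)}: row i = weights into unit i of layer 3
Theta2 = np.array([[0.2, -1.0, 0.5],
                   [1.1, 0.3, -0.4]])  # made-up values
a2 = np.array([1.0, 0.7, 0.1])         # layer-2 activations; a2[0] = 1 is the bias unit

# a_1^{(3)} = g(Θ^{(2)}_{1,0} a_0^{(2)} + Θ^{(2)}_{1,1} a_1^{(2)} + Θ^{(2)}_{1,2} a_2^{(2)})
a3_1 = sigmoid(Theta2[0] @ a2)

# the full layer at once gives the same first component
a3 = sigmoid(Theta2 @ a2)
print(np.isclose(a3_1, a3[0]))
```

The wrong options in the quiz either use the previous layer's activations a^{(1)}, the wrong weight matrix Θ^{(1)}, or row 2 of Θ^{(2)}, which computes a_2^{(3)} instead.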

1 point

4.

You have the following neural network:

You'd like to compute the activations of the hidden layer a^{(2)} ∈ ℝ³. One way to do so is the following Octave code:

You want to have a vectorized implementation of this (i.e., one that does not use for loops). Which of the following implementations correctly compute a^{(2)}? Check all that apply.

a2 = sigmoid(Theta1 * x);

a2 = sigmoid(x * Theta1);

a2 = sigmoid(Theta2 * x);

z = sigmoid(x); a2 = Theta1 * z;
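Translating the Octave into numpy (with made-up Theta1 and x, since the figure's values are not reproduced here), the loop over hidden units and the vectorized matrix-vector product agree:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

Theta1 = np.array([[0.1, -0.2, 0.3],   # made-up 3x3 weights:
                   [0.4, 0.5, -0.6],   # 3 hidden units, 2 inputs + bias
                   [-0.7, 0.8, 0.9]])
x = np.array([1.0, 2.0, -1.0])         # x[0] = 1 is the bias unit

# loop version (what the quiz's for-loop Octave code computes)
a2_loop = np.empty(3)
for i in range(3):
    a2_loop[i] = sigmoid(Theta1[i] @ x)

# vectorized version: a2 = sigmoid(Theta1 * x) in Octave
a2_vec = sigmoid(Theta1 @ x)

print(np.allclose(a2_loop, a2_vec))
```

The other options fail: x * Theta1 has the wrong operand order (a dimension mismatch for column vectors), Theta2 belongs to the next layer, and applying sigmoid before the weight multiplication computes something else entirely.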

1 point

5.

You are using the neural network pictured below and have learned the parameters Θ^{(1)} = [1 0.5 1.9; 1 1.2 2.7] (used to compute a^{(2)}) and Θ^{(2)} = [1 0.2 1.7] (used to compute a^{(3)} as a function of a^{(2)}). Suppose you swap the parameters for the first hidden layer between its two units, so Θ^{(1)} = [1 1.2 2.7; 1 0.5 1.9], and also swap the output layer, so Θ^{(2)} = [1 1.7 0.2]. How will this change the value of the output h_Θ(x)?

It will stay the same.

It will increase.

It will decrease.

Insufficient information to tell: it may increase or decrease.
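The swap is a pure relabeling of the hidden units, so the output is unchanged. A sketch using the quiz's weight values with a made-up input x (the figure's actual input is not reproduced here):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def h(Theta1, Theta2, x):
    a2 = sigmoid(Theta1 @ x)             # two hidden-unit activations
    a2 = np.concatenate(([1.0], a2))     # prepend the bias unit a_0^{(2)} = 1
    return sigmoid(Theta2 @ a2)          # scalar output h_Θ(x)

x = np.array([1.0, 0.3, -0.8])           # bias + two made-up inputs

Theta1 = np.array([[1.0, 0.5, 1.9],      # row i = weights into hidden unit i
                   [1.0, 1.2, 2.7]])
Theta2 = np.array([1.0, 0.2, 1.7])       # bias weight, then one weight per hidden unit

Theta1_swap = Theta1[[1, 0]]             # swap the two hidden units
Theta2_swap = Theta2[[0, 2, 1]]          # swap the matching output weights

print(h(Theta1, Theta2, x), h(Theta1_swap, Theta2_swap, x))
```

Because the output-layer weights are permuted in exactly the same way as the hidden units, every product Θ^{(2)}_j · a_j^{(2)} is preserved and h_Θ(x) stays the same.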
