Machine Learning wrong-answer notes, Week 4: Neural Networks: Representation

Question 1

Which of the following statements are true? Check all that apply.

Suppose you have a multi-class classification problem with three classes, trained with a 3 layer network. Let $a_1^{(3)} = (h_\Theta(x))_1$ be the activation of the first output unit, and similarly $a_2^{(3)} = (h_\Theta(x))_2$ and $a_3^{(3)} = (h_\Theta(x))_3$. Then for any input $x$, it must be the case that $a_1^{(3)} + a_2^{(3)} + a_3^{(3)} = 1$.

Any logical function over binary-valued (0 or 1) inputs $x_1$ and $x_2$ can be (approximately) represented using some neural network.

The activation values of the hidden units in a neural network, with the sigmoid activation function applied at every layer, are always in the range (0, 1).

A two layer (one input layer, one output layer; no hidden layer) neural network can represent the XOR function.


The first time I picked B and D... wrong; the correct answer is B and C. (A is false because the three output units are separate sigmoid units, so their activations need not sum to 1; D is false because a network with no hidden layer can only represent linearly separable functions, and XOR is not linearly separable.)
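As a reminder of why B is true and D is false, here is a minimal Octave sketch. The weights (-30, 20, 20 and so on) are the standard illustrative values from the lectures, not anything taken from this quiz: single sigmoid units implement AND, OR, and NAND, and composing them (i.e., adding a hidden layer) yields XOR.

sigmoid = @(z) 1 ./ (1 + exp(-z));

AND  = @(x1, x2) sigmoid(-30 + 20*x1 + 20*x2);   % single-unit AND
OR   = @(x1, x2) sigmoid(-10 + 20*x1 + 20*x2);   % single-unit OR
NAND = @(x1, x2) sigmoid( 30 - 20*x1 - 20*x2);   % single-unit NAND

% XOR(x1, x2) = AND(OR(x1, x2), NAND(x1, x2)) -- this needs the extra layer.
XOR  = @(x1, x2) AND(OR(x1, x2), NAND(x1, x2));

for x1 = 0:1
  for x2 = 0:1
    printf('x1=%d x2=%d  AND=%.2f  XOR=%.2f\n', x1, x2, AND(x1, x2), XOR(x1, x2));
  end
end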


Question 2

Consider the following neural network which takes two binary-valued inputs $x_1, x_2 \in \{0, 1\}$ and outputs $h_\Theta(x)$. Which of the following logical functions does it (approximately) compute?

OR

AND

NAND (meaning "NOT AND")

XOR (exclusive OR)


The values of Theta were not visible, so I just guessed B at random... and got it wrong.
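Once the weights are visible, the cleanest way to identify the function is to evaluate the unit on all four binary inputs and read off the truth table. A small sketch, with a hypothetical $\Theta = [-30\ 20\ 20]$ (not the values from the quiz figure; this particular choice gives AND):

sigmoid = @(z) 1 ./ (1 + exp(-z));
Theta = [-30 20 20];                    % hypothetical weights, not from the quiz
for x1 = 0:1
  for x2 = 0:1
    h = sigmoid(Theta * [1; x1; x2]);   % prepend the bias unit x0 = 1
    printf('x1=%d x2=%d  h=%.2f\n', x1, x2, h);
  end
end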


Question 3

Consider the neural network given below. Which of the following equations correctly computes the activation $a_1^{(3)}$? Note: $g(z)$ is the sigmoid activation function.

The activation $a_1^{(3)}$ is not present in this network.


The answer is A.
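For reference, assuming the usual layer numbering (a hidden layer 2 with activations $a_1^{(2)}, a_2^{(2)}$ plus a bias unit $a_0^{(2)} = 1$), option A corresponds to the standard forward-propagation rule:

$$a_1^{(3)} = g\left(\Theta_{1,0}^{(2)} a_0^{(2)} + \Theta_{1,1}^{(2)} a_1^{(2)} + \Theta_{1,2}^{(2)} a_2^{(2)}\right)$$

that is, the activation of a unit in layer 3 is the sigmoid of a weighted sum of the layer-2 activations, never of the raw inputs or of other layer-3 values.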

Question 4

You have the following neural network:

You'd like to compute the activations of the hidden layer $a^{(2)}$. One way to do so is the following Octave code:
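A sketch of the loop-based computation being referred to, assuming three hidden units (the original snippet is not reproduced above):

for i = 1:3
  a2(i) = sigmoid (Theta1(i, :) * x);   % one hidden-unit activation per iteration
end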

You want to have a vectorized implementation of this (i.e., one that does not use for loops). Which of the following implementations correctly compute $a^{(2)}$? Check all that apply.

z = Theta1 * x; a2 = sigmoid (z);

a2 = sigmoid (x * Theta1);

a2 = sigmoid (Theta2 * x);

z = sigmoid(x); a2 = sigmoid (Theta1 * z);


The answer is A.
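A quick numerical check that option A reproduces the loop (Theta1 and x below are made-up values, purely for illustration). The other options fail either on dimensions (x * Theta1 does not conform) or on meaning (Theta2 belongs to the next layer, and applying sigmoid to x first is not what the loop computes):

sigmoid = @(z) 1 ./ (1 + exp(-z));
Theta1 = [1 -2 0.5; 0 1 -1; 2 0 1];      % hypothetical 3x3 parameter matrix
x = [1; 0.7; -0.3];                      % hypothetical input, x(1) is the bias unit

for i = 1:3
  a2_loop(i) = sigmoid(Theta1(i, :) * x);
end
a2_vec = sigmoid(Theta1 * x);            % option A
disp(max(abs(a2_loop(:) - a2_vec(:))))   % prints 0: both give the same a2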


Question 5

You are using the neural network pictured below and have learned the parameters $\Theta^{(1)}$ (used to compute $a^{(2)}$) and $\Theta^{(2)}$ (used to compute $h_\Theta(x)$ as a function of $a^{(2)}$). Suppose you swap the parameters for the first hidden layer between its two units, so that the two rows of $\Theta^{(1)}$ are exchanged, and also swap the output layer so that the corresponding weights $\Theta_{1,1}^{(2)}$ and $\Theta_{1,2}^{(2)}$ are exchanged. How will this change the value of the output $h_\Theta(x)$?


It will stay the same.

It will increase.

It will decrease.

Insufficient information to tell: it may increase or decrease.


The answer is A.
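Swapping the two rows of $\Theta^{(1)}$ together with the matching output weights in $\Theta^{(2)}$ only relabels the hidden units, so the output cannot change. A small Octave sketch with made-up parameters (2 inputs plus bias, 2 hidden units plus bias, 1 output):

sigmoid = @(z) 1 ./ (1 + exp(-z));
Theta1 = [0.5 -1 2; -0.3 0.8 1];         % hypothetical 2x3 first-layer parameters
Theta2 = [1 -2 0.7];                     % hypothetical 1x3: [bias, hidden 1, hidden 2]
x = [1; 0.4; -0.9];                      % hypothetical input, x(1) is the bias unit

h = @(T1, T2) sigmoid(T2 * [1; sigmoid(T1 * x)]);   % forward propagation

Theta1_swapped = Theta1([2 1], :);       % exchange the two hidden units
Theta2_swapped = Theta2(:, [1 3 2]);     % exchange the matching output weights
printf('before: %.6f  after: %.6f\n', h(Theta1, Theta2), h(Theta1_swapped, Theta2_swapped));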


