Machine Learning week 4 quiz: Neural Networks: Representation

Original post, 2015-11-18 20:51:02

Neural Networks: Representation

5 questions

1. 

Which of the following statements are true? Check all that apply.

A two-layer (one input layer, one output layer; no hidden layer) neural network can represent the XOR function.

The activation values of the hidden units in a neural network, with the sigmoid activation function applied at every layer, are always in the range (0, 1).

Suppose you have a multi-class classification problem with three classes, trained with a 3 layer network. Let a_1^{(3)} = (h_Θ(x))_1 be the activation of the first output unit, and similarly a_2^{(3)} = (h_Θ(x))_2 and a_3^{(3)} = (h_Θ(x))_3. Then for any input x, it must be the case that a_1^{(3)} + a_2^{(3)} + a_3^{(3)} = 1.

Any logical function over binary-valued (0 or 1) inputs x1 and x2 can be (approximately) represented using some neural network.
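The last option can be made concrete. With large enough weights a single sigmoid unit saturates to nearly 0 or 1 and behaves like a logic gate, and stacking one hidden layer then yields XOR, which a network with no hidden layer cannot represent. A minimal Python sketch (the AND and OR weights are the standard ones from the lectures; the NAND weights are an analogous choice of my own):

```python
import math

def g(z):
    """Sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-z))

# Single sigmoid units that approximate logic gates on binary inputs.
def AND(x1, x2):  return g(-30 + 20*x1 + 20*x2)
def OR(x1, x2):   return g(-10 + 20*x1 + 20*x2)
def NAND(x1, x2): return g( 30 - 20*x1 - 20*x2)

# XOR needs a hidden layer: x1 XOR x2 = (x1 OR x2) AND (x1 NAND x2)
def XOR(x1, x2):
    return AND(OR(x1, x2), NAND(x1, x2))

for x1 in (0, 1):
    for x2 in (0, 1):
        # prints the (approximate) XOR truth table
        print(x1, x2, round(XOR(x1, x2)))
```

The same composition trick works for any truth table over binary inputs, which is why the statement holds.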

2. 

Consider the following neural network, which takes two binary-valued inputs x1, x2 ∈ {0, 1} and outputs h_Θ(x). Which of the following logical functions does it (approximately) compute?

OR

AND

NAND (meaning "NOT AND")

XOR (exclusive OR)

3. 

Consider the neural network given below. Which of the following equations correctly computes the activation a_1^{(3)}? Note: g(z) is the sigmoid activation function.

a_1^{(3)} = g(Θ_{1,0}^{(2)} a_0^{(2)} + Θ_{1,1}^{(2)} a_1^{(2)} + Θ_{1,2}^{(2)} a_2^{(2)})

a_1^{(3)} = g(Θ_{1,0}^{(2)} a_0^{(1)} + Θ_{1,1}^{(2)} a_1^{(1)} + Θ_{1,2}^{(2)} a_2^{(1)})

a_1^{(3)} = g(Θ_{1,0}^{(1)} a_0^{(2)} + Θ_{1,1}^{(1)} a_1^{(2)} + Θ_{1,2}^{(1)} a_2^{(2)})

a_1^{(3)} = g(Θ_{2,0}^{(2)} a_0^{(2)} + Θ_{2,1}^{(2)} a_1^{(2)} + Θ_{2,2}^{(2)} a_2^{(2)})
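Whatever the missing figure shows, the indexing convention is the useful thing here: a_1^{(3)} is computed from the previous layer's activations a^{(2)} (including the bias unit a_0^{(2)} = 1), weighted by row 1 of Θ^{(2)}. A small Python sketch; the Θ values and activations below are made-up placeholders, not taken from the figure:

```python
import math

def g(z):
    """Sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-z))

# Row 1 of Theta^(2): [Theta^(2)_{1,0}, Theta^(2)_{1,1}, Theta^(2)_{1,2}]
Theta2_row1 = [0.5, -1.2, 2.0]   # placeholder values (assumption)
a2 = [1.0, 0.7, 0.3]             # a^(2)_0 is the bias unit, fixed at 1

# z = sum over j of Theta^(2)_{1,j} * a^(2)_j, then apply the sigmoid
z = sum(t * a for t, a in zip(Theta2_row1, a2))
a3_1 = g(z)
print(z, a3_1)
```

Note that the superscript on Θ matches the layer the weights map *from*, which is what distinguishes the four options above.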

4. 

You have the following neural network:

You'd like to compute the activations of the hidden layer a^{(2)} ∈ R^3. One way to do so is the following Octave code:

You want to have a vectorized implementation of this (i.e., one that does not use for loops). Which of the following implementations correctly compute a^{(2)}? Check all that apply.

z = Theta1 * x; a2 = sigmoid (z);

a2 = sigmoid (x * Theta1);

a2 = sigmoid (Theta2 * x);

z = sigmoid(x); a2 = sigmoid (Theta1 * z);
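The vectorization being asked about is the matrix-vector product replacing the double loop over units and weights. The quiz uses Octave; the sketch below checks the same equivalence in NumPy, with made-up dimensions and random values (both are assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed sizes: 2 inputs plus a bias unit, 3 hidden units.
rng = np.random.default_rng(0)
Theta1 = rng.standard_normal((3, 3))   # maps [1; x1; x2] to z in R^3
x = np.array([1.0, 0.5, -0.5])         # input with bias x0 = 1 prepended

# Loop version: one weighted sum per hidden unit.
a2_loop = np.empty(3)
for i in range(3):
    z_i = 0.0
    for j in range(3):
        z_i += Theta1[i, j] * x[j]
    a2_loop[i] = sigmoid(z_i)

# Vectorized version: z = Theta1 * x; a2 = sigmoid(z)
a2_vec = sigmoid(Theta1 @ x)

print(np.allclose(a2_loop, a2_vec))
```

Right-multiplying Θ^{(1)} by the column vector x computes every unit's weighted sum at once, which is why the shape of the product (matrix times column vector, not the other way around) decides which options are correct.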

5. 

You are using the neural network pictured below and have learned the parameters Θ^{(1)} = [1 1.5 3.7; 1 5.1 2.3] (used to compute a^{(2)}) and Θ^{(2)} = [1 0.6 0.8] (used to compute a^{(3)} as a function of a^{(2)}). Suppose you swap the parameters for the first hidden layer between its two units, so Θ^{(1)} = [1 5.1 2.3; 1 1.5 3.7], and also swap the output layer, so Θ^{(2)} = [1 0.8 0.6]. How will this change the value of the output h_Θ(x)?

It will stay the same.

It will increase.

It will decrease.

Insufficient information to tell: it may increase or decrease.
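This swap can be checked numerically. The sketch below assumes the parameter values as I read them off the question (Θ^{(1)} = [1 1.5 3.7; 1 5.1 2.3], Θ^{(2)} = [1 0.6 0.8]) and an arbitrary input; swapping the two hidden units (rows of Θ^{(1)}) together with the matching non-bias entries of Θ^{(2)} merely relabels the hidden units, so the computed function is unchanged:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def h(Theta1, Theta2, x):
    # Forward propagation with a bias unit prepended at each layer.
    a2 = sigmoid(Theta1 @ np.concatenate(([1.0], x)))
    return sigmoid(Theta2 @ np.concatenate(([1.0], a2)))

Theta1 = np.array([[1.0, 1.5, 3.7],
                   [1.0, 5.1, 2.3]])
Theta2 = np.array([1.0, 0.6, 0.8])

# Swap the two hidden units: the rows of Theta1, and the
# corresponding non-bias weights of Theta2 (bias entry stays put).
Theta1_swapped = Theta1[[1, 0], :]
Theta2_swapped = Theta2[[0, 2, 1]]

x = np.array([0.3, -1.2])  # arbitrary test input (assumption)
print(h(Theta1, Theta2, x), h(Theta1_swapped, Theta2_swapped, x))
```

The two printed values coincide for any x, since each hidden activation is still multiplied by the same output weight as before the swap.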

Copyright notice: this article is the blogger's original work and may not be reproduced without the blogger's permission.
