Machine Learning week 4 quiz: Neural Networks: Representation

Original post, 2015-11-18 20:51:02

Neural Networks: Representation

5 questions


Which of the following statements are true? Check all that apply.

A two layer (one input layer, one output layer; no hidden layer) neural network can represent the XOR function.

The activation values of the hidden units in a neural network, with the sigmoid activation function applied at every layer, are always in the range (0, 1).

Suppose you have a multi-class classification problem with three classes, trained with a 3 layer network. Let a^(3)_1 = (h_Θ(x))_1 be the activation of the first output unit, and similarly a^(3)_2 = (h_Θ(x))_2 and a^(3)_3 = (h_Θ(x))_3. Then for any input x, it must be the case that a^(3)_1 + a^(3)_2 + a^(3)_3 = 1.

Any logical function over binary-valued (0 or 1) inputs x1 and x2 can be (approximately) represented using some neural network.
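To see why the last statement holds, note that a single sigmoid unit with sufficiently large weights already implements AND, OR, and NAND over binary inputs (XOR is the exception that needs a hidden layer). A minimal NumPy sketch using the standard textbook weight choices, which are not taken from the quiz figures:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unit(theta, x1, x2):
    # Single sigmoid neuron with a bias input x0 = 1.
    return sigmoid(np.dot(theta, [1.0, x1, x2]))

AND_theta  = np.array([-30.0,  20.0,  20.0])  # fires only when x1 = x2 = 1
OR_theta   = np.array([-10.0,  20.0,  20.0])  # fires when either input is 1
NAND_theta = np.array([ 30.0, -20.0, -20.0])  # negation of AND

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2,
              round(unit(AND_theta, x1, x2)),
              round(unit(OR_theta, x1, x2)),
              round(unit(NAND_theta, x1, x2)))
```

With weights of magnitude 20–30, the sigmoid output saturates so close to 0 or 1 that rounding recovers the exact truth table.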


Consider the following neural network which takes two binary-valued inputs x1, x2 ∈ {0, 1} and outputs h_Θ(x). Which of the following logical functions does it (approximately) compute?



NAND (meaning "NOT AND")

XOR (exclusive OR)
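For contrast with the other options: XOR cannot be computed by a single sigmoid unit, but one hidden layer suffices, since x1 XOR x2 = (x1 OR x2) AND (x1 NAND x2). A sketch using the same textbook ±20/±10/±30 weights as above (not the weights in the quiz figure):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def xor_net(x1, x2):
    # Hidden layer: one OR unit and one NAND unit.
    a_or   = sigmoid(-10.0 + 20.0 * x1 + 20.0 * x2)
    a_nand = sigmoid( 30.0 - 20.0 * x1 - 20.0 * x2)
    # Output layer: AND of the two hidden activations.
    return sigmoid(-30.0 + 20.0 * a_or + 20.0 * a_nand)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(xor_net(x1, x2)))
```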


Consider the neural network given below. Which of the following equations correctly computes the activation a^(3)_1? Note: g(z) is the sigmoid activation function.






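The answer options above come from figures, but the general shape of the computation is forward propagation: a^(3)_1 = g(Θ^(2)_{1,0}·a^(2)_0 + Θ^(2)_{1,1}·a^(2)_1 + …), with the bias unit a^(2)_0 = 1. A sketch with made-up activations and parameters (the quiz figure's values are not reproduced here):

```python
import numpy as np

def g(z):
    # Sigmoid activation function.
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical layer-2 activations; a2[0] is the bias unit a(2)_0 = 1.
a2 = np.array([1.0, 0.7, 0.3])
# Hypothetical row of Theta(2) feeding output unit 1.
Theta2_row1 = np.array([0.5, -1.2, 2.0])

# a(3)_1 = g( Theta(2)_{1,0}*a(2)_0 + Theta(2)_{1,1}*a(2)_1 + Theta(2)_{1,2}*a(2)_2 )
a3_1 = g(Theta2_row1 @ a2)
print(a3_1)
```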
You have the following neural network:

You'd like to compute the activations of the hidden layer a^(2) ∈ R^3. One way to do so is the following Octave code:

You want to have a vectorized implementation of this (i.e., one that does not use for loops). Which of the following implementations correctly compute a^(2)? Check all that apply.

z = Theta1 * x; a2 = sigmoid (z);

a2 = sigmoid (x * Theta1);

a2 = sigmoid (Theta2 * x);

z = sigmoid(x); a2 = sigmoid (Theta1 * z);
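The correct options replace the element-by-element loop with a single matrix-vector product. An equivalent NumPy sketch comparing the loop to the vectorized form z = Theta1 * x; a2 = sigmoid(z), with arbitrary shapes and random values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
Theta1 = rng.normal(size=(3, 4))  # maps 4 inputs (incl. bias) to 3 hidden units
x = rng.normal(size=4)
x[0] = 1.0                        # bias term x0 = 1

# Loop version: one hidden unit at a time.
a2_loop = np.empty(3)
for i in range(3):
    a2_loop[i] = sigmoid(Theta1[i, :] @ x)

# Vectorized version: z = Theta1 * x; a2 = sigmoid(z)
z = Theta1 @ x
a2_vec = sigmoid(z)

print(np.allclose(a2_loop, a2_vec))  # the two versions agree
```

Note that sigmoid must be applied only once, after the linear step; the option that applies it to x first computes something different.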


You are using the neural network pictured below and have learned the parameters Θ^(1) = [] (used to compute a^(2)) and Θ^(2) = [1 0.6 0.8] (used to compute a^(3) as a function of a^(2)). Suppose you swap the parameters for the first hidden layer between its two units so Θ^(1) = [] and also swap the output layer so Θ^(2) = [1 0.8 0.6]. How will this change the value of the output h_Θ(x)?

It will stay the same.

It will increase.

It will decrease.

Insufficient information to tell: it may increase or decrease.
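The swap described above is a pure relabeling of the hidden units: swapping the rows of Θ^(1) and the corresponding non-bias entries of Θ^(2) leaves h_Θ(x) unchanged. A sketch with made-up Θ^(1) values (the figure's Θ^(1) is not reproduced in the text) and the quiz's Θ^(2) = [1 0.6 0.8]:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(Theta1, Theta2, x):
    # Prepend bias units before each linear step.
    a2 = sigmoid(Theta1 @ np.concatenate(([1.0], x)))
    return sigmoid(Theta2 @ np.concatenate(([1.0], a2)))

x = np.array([0.4, -1.3])
Theta1 = np.array([[ 0.1, 2.0, -1.0],   # row for hidden unit 1 (made up)
                   [-0.5, 0.3,  1.7]])  # row for hidden unit 2 (made up)
Theta2 = np.array([1.0, 0.6, 0.8])      # [bias, weight for a2_1, weight for a2_2]

h_original = forward(Theta1, Theta2, x)

# Swap the two hidden units: swap the rows of Theta1 and, correspondingly,
# the non-bias entries of Theta2.
Theta1_swapped = Theta1[[1, 0], :]
Theta2_swapped = np.array([1.0, 0.8, 0.6])

h_swapped = forward(Theta1_swapped, Theta2_swapped, x)
print(np.isclose(h_original, h_swapped))  # the output is unchanged
```

Each hidden activation still meets the same output weight, so the dot product in the output layer is identical term by term.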
