Coursera Machine Learning Week 4 - Neural Networks

Original post, 2016-08-31 13:59:48

Neural Networks: Representation

1. Which of the following statements are true? Check all that apply.

  • (OK) In a neural network with many layers, we think of each successive layer as being able to use the earlier layers as features, so as to be able to compute increasingly complex functions.
  • (OK) If a neural network is overfitting the data, one solution would be to increase the regularization parameter λ.
  • If a neural network is overfitting the data, one solution would be to decrease the regularization parameter λ.
  • Suppose you have a multi-class classification problem with three classes, trained with a 3-layer network. Let a_1^(3) = (hΘ(x))_1 be the activation of the first output unit, and similarly a_2^(3) = (hΘ(x))_2 and a_3^(3) = (hΘ(x))_3. Then for any input x, it must be the case that a_1^(3) + a_2^(3) + a_3^(3) = 1.
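The last option fails because the output units use independent sigmoid activations, not a softmax, so the three outputs need not sum to 1. A quick NumPy check with hypothetical pre-activations:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical pre-activations z for the three sigmoid output units
z = np.array([2.0, 1.0, -1.0])
a3 = sigmoid(z)

# Each output lies in (0, 1), but the sum is unconstrained
print(a3.sum())  # ≈ 1.88, not 1
```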

2. Consider the following neural network which takes two binary-valued inputs x1,x2∈{0,1} and outputs hΘ(x). Which of the following logical functions does it (approximately) compute?

+1 --(-20)--\
x1 --(+30)---( g )---> hΘ(x)
x2 --(+30)--/
  • (OK) OR
  • AND
  • NAND (meaning “NOT AND”)
  • XOR (exclusive OR)
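The answer can be verified by evaluating the network at all four inputs. A small NumPy sketch, with the weights read off the diagram and `sigmoid` the usual logistic function:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights from the diagram: bias -20, then +30 for each input
theta = np.array([-20.0, 30.0, 30.0])

def network(x1, x2):
    return sigmoid(theta @ np.array([1.0, x1, x2]))

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(network(x1, x2)))  # truth table of x1 OR x2
```

Only (0, 0) gives sigmoid(-20) ≈ 0; every other input drives the pre-activation to at least +10, so the output saturates near 1 — exactly OR.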

3. Consider the neural network given below. Which of the following equations correctly computes the activation a_1^(3)? Note: g(z) is the sigmoid activation function.

       +1     +1     +1
       x1     ()     ()     ()
       x2     ()     ()
layer  1      2      3      4
  • (OK) a_1^(3) = g(Θ_{1,0}^{(2)} a_0^(2) + Θ_{1,1}^{(2)} a_1^(2) + Θ_{1,2}^{(2)} a_2^(2))
  • a_1^(3) = g(Θ_{1,0}^{(1)} a_0^(1) + Θ_{1,1}^{(1)} a_1^(1) + Θ_{1,2}^{(1)} a_2^(1))
  • a_1^(3) = g(Θ_{1,0}^{(1)} a_0^(2) + Θ_{1,1}^{(1)} a_1^(2) + Θ_{1,2}^{(1)} a_2^(2))
  • The activation a_1^(3) is not present in this network.
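For intuition, forward propagation can be sketched layer by layer. The weights below are hypothetical random values; the point is only that a_1^(3) is computed from a^(2) and Θ^(2), as in the correct option:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = np.array([0.5, -1.2])            # hypothetical input
Theta1 = rng.normal(size=(2, 3))     # maps layer 1 -> layer 2
Theta2 = rng.normal(size=(2, 3))     # maps layer 2 -> layer 3

a1 = np.concatenate(([1.0], x))                     # a^(1) with bias unit a_0
a2 = np.concatenate(([1.0], sigmoid(Theta1 @ a1)))  # a^(2) with bias unit
a3 = sigmoid(Theta2 @ a2)                           # a3[0] is a_1^(3)
```

The superscript on Θ always matches the layer whose activations it consumes: layer 3 is produced from a^(2) via Θ^(2), never from a^(1) or Θ^(1) directly.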

4. You have the following neural network:

       +1     ()
       x1     ()     ()     hΘ(x)
       x2     ()
layer  1      2      3

You’d like to compute the activations of the hidden layer a^(2) ∈ ℝ^3. One way to do so is the following Octave code:

% Theta1 is Theta with superscript "(1)" from lecture
% ie, the matrix of parameters for the mapping from layer 1 (input) to layer 2
% Theta1 has size 3x3
% Assume 'sigmoid' is a built-in function to compute 1 / (1 + exp(-z))

a2 = zeros (3, 1);
for i = 1:3
  for j = 1:3
    a2(i) = a2(i) + x(j) * Theta1(i, j);
  end
  a2(i) = sigmoid (a2(i));
end

You want to have a vectorized implementation of this (i.e., one that does not use for loops). Which of the following implementations correctly compute a^(2)? Check all that apply.

  • (OK) a2 = sigmoid (Theta1 * x);
  • a2 = sigmoid (x * Theta1);
  • a2 = sigmoid (Theta2 * x);
  • z = sigmoid(x); a2 = Theta1 * z;
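The loop and the vectorized expression can be checked against each other. A NumPy sketch with hypothetical random parameters, mirroring the Octave loop above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
Theta1 = rng.normal(size=(3, 3))                 # 3 hidden units, 3 inputs incl. bias
x = np.concatenate(([1.0], rng.normal(size=2)))  # input with bias unit

# Loop version, mirroring the Octave code
a2_loop = np.zeros(3)
for i in range(3):
    for j in range(3):
        a2_loop[i] += x[j] * Theta1[i, j]
    a2_loop[i] = sigmoid(a2_loop[i])

# Vectorized version: the double loop is exactly a matrix-vector product
a2_vec = sigmoid(Theta1 @ x)
```

The inner loop accumulates the dot product of row i of Theta1 with x, which is what `Theta1 * x` computes for all rows at once; `x * Theta1` has the wrong shapes, and applying the sigmoid before the linear map computes something else entirely.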

5. You are using the neural network pictured below and have learned the parameters Θ^(1) = [1 1 2.4; 1 1.7 3.2] (used to compute a^(2)) and Θ^(2) = [1 0.3 -1.2] (used to compute a^(3) as a function of a^(2)). Suppose you swap the parameters for the first hidden layer between its two units, so Θ^(1) = [1 1.7 3.2; 1 1 2.4], and also swap the output layer, so Θ^(2) = [1 -1.2 0.3]. How will this change the value of the output hΘ(x)?

       +1     ()
       x1     ()     ()     hΘ(x)
       x2     ()
layer  1      2      3
  • (OK) It will stay the same.
  • It will increase.
  • It will decrease.
  • Insufficient information to tell: it may increase or decrease.
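The invariance can be confirmed numerically: swapping the two hidden units permutes the rows of Θ^(1) and the matching columns of Θ^(2), so hΘ(x) is unchanged. A NumPy sketch using the question's parameters and a hypothetical input:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(Theta1, Theta2, x):
    a1 = np.concatenate(([1.0], x))
    a2 = np.concatenate(([1.0], sigmoid(Theta1 @ a1)))
    return sigmoid(Theta2 @ a2)

Theta1 = np.array([[1.0, 1.0, 2.4],
                   [1.0, 1.7, 3.2]])
Theta2 = np.array([1.0, 0.3, -1.2])

# Swap hidden units: swap rows of Theta1 and the matching columns of Theta2
Theta1_sw = Theta1[[1, 0], :]
Theta2_sw = Theta2[[0, 2, 1]]

x = np.array([0.8, -0.5])   # hypothetical input
h_original = forward(Theta1, Theta2, x)
h_swapped = forward(Theta1_sw, Theta2_sw, x)
```

Each hidden unit still receives its old incoming weights and feeds the output through its old outgoing weight, so the sum at the output unit is identical term by term.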

Programming Assignment: Multi-class Classification and Neural Networks

lrCostFunction.m

T = theta;
T(1) = 0;                                % do not regularize the bias term
S = sigmoid(X * theta);
J = ( (-y' * log(S)) - ((1 - y') * log(1 - S)) ) / m ...
    + lambda / (2 * m) * sum(T .^ 2);
grad = (X' * (S - y)) / m + lambda / m * T;   % column vector, same size as theta
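As a sanity check, the cost and gradient above can be translated to NumPy and verified against a numerical gradient on hypothetical random data:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost(theta, X, y, lam):
    """Regularized logistic regression cost and gradient (bias unregularized)."""
    m = len(y)
    t = theta.copy()
    t[0] = 0.0                      # do not regularize the bias term
    s = sigmoid(X @ theta)
    J = (-y @ np.log(s) - (1 - y) @ np.log(1 - s)) / m + lam / (2 * m) * (t @ t)
    grad = X.T @ (s - y) / m + lam / m * t
    return J, grad

rng = np.random.default_rng(2)
X = np.hstack([np.ones((5, 1)), rng.normal(size=(5, 2))])  # hypothetical data
y = np.array([0.0, 1.0, 1.0, 0.0, 1.0])
theta = rng.normal(size=3)
J, grad = lr_cost(theta, X, y, lam=1.0)

# Central-difference numerical gradient for comparison
eps = 1e-6
num = np.array([(lr_cost(theta + eps * e, X, y, 1.0)[0] -
                 lr_cost(theta - eps * e, X, y, 1.0)[0]) / (2 * eps)
                for e in np.eye(3)])
```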

oneVsAll.m

initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 50);
for c = 1:num_labels
  % Train one regularized classifier per class, with y == c as the positive class
  [theta] = ...
    fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
           initial_theta, options);
  all_theta(c, :) = theta';
end
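The same one-vs-all loop can be sketched in NumPy, with plain gradient descent standing in for fmincg (a simplification; fmincg is a conjugate-gradient optimizer). The toy dataset is hypothetical:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_vs_all(X, y, num_labels, lam=0.1, lr=0.5, iters=2000):
    """Fit one regularized logistic regression per class by gradient descent."""
    m, n = X.shape
    all_theta = np.zeros((num_labels, n))
    for c in range(num_labels):
        theta = np.zeros(n)
        yc = (y == c).astype(float)      # one-vs-all labels for class c
        for _ in range(iters):
            t = theta.copy()
            t[0] = 0.0                   # bias term unregularized
            grad = X.T @ (sigmoid(X @ theta) - yc) / m + lam / m * t
            theta -= lr * grad
        all_theta[c] = theta
    return all_theta

# Tiny separable toy set: class determined by the sign of the feature
X = np.array([[1.0, -2.0], [1.0, -1.5], [1.0, 1.5], [1.0, 2.0]])
y = np.array([0, 0, 1, 1])
all_theta = train_one_vs_all(X, y, num_labels=2)

# Prediction mirrors predictOneVsAll: pick the most confident classifier
p = np.argmax(sigmoid(X @ all_theta.T), axis=1)
```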

predictOneVsAll.m

% p is the 1-based index of the most probable classifier, i.e. the class label
[max_value, p] = max(sigmoid(X * all_theta'), [], 2);

predict.m

a1 = [ones(m, 1) X];                        % input layer with bias column
a2 = [ones(m, 1) sigmoid(a1 * Theta1')];    % hidden layer activations with bias
[max_value, p] = max(sigmoid(a2 * Theta2'), [], 2);   % predicted class per row
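The same forward propagation reads naturally in NumPy. The parameter shapes below are hypothetical, and argmax is 0-based where Octave's max returns a 1-based index:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(Theta1, Theta2, X):
    """Vectorized forward propagation over all m examples at once."""
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])                   # add bias column
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ Theta1.T)])
    h = sigmoid(a2 @ Theta2.T)                             # one row per example
    return np.argmax(h, axis=1)   # 0-based class index (Octave's max is 1-based)

rng = np.random.default_rng(3)
Theta1 = rng.normal(size=(4, 3))   # hypothetical: 4 hidden units, 2 inputs + bias
Theta2 = rng.normal(size=(3, 5))   # hypothetical: 3 classes, 4 hidden + bias
X = rng.normal(size=(6, 2))
p = predict(Theta1, Theta2, X)
```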

