Neural Networks

1. Problem:

In the previous part of this exercise, you implemented multi-class logistic regression to recognize handwritten digits. However, logistic regression cannot form more complex hypotheses as it is only a linear classifier.

In this part of the exercise, you will implement a neural network to recognize handwritten digits using the same training set as before. The neural network will be able to represent complex models that form non-linear hypotheses. For this week, you will be using parameters from a neural network that we have already trained. Your goal is to implement the feedforward propagation algorithm to use our weights for prediction. In next week's exercise, you will write the backpropagation algorithm for learning the neural network parameters.

 

2. Model representation:

Our neural network is shown in Figure 2. It has 3 layers: an input layer, a hidden layer and an output layer. Recall that our inputs are pixel values of digit images. Since the images are of size 20×20, this gives us 400 input layer units (excluding the extra bias unit which always outputs +1). As before, the training data will be loaded into the variables X and y.
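
As a minimal sketch of the setup (assuming the standard exercise data and weight files, ex3data1.mat and ex3weights.mat, which this write-up does not name explicitly), loading the training set and the pre-trained parameters and checking their dimensions looks like this:

load('ex3data1.mat');   % provides X (5000 x 400) and y (5000 x 1)
load('ex3weights.mat'); % provides the pre-trained Theta1 and Theta2

m = size(X, 1);  % number of training examples (5000)

% Theta1 maps the 400 input units plus a bias unit to the hidden layer,
% Theta2 maps the hidden units plus a bias unit to the 10 output units.
fprintf('Theta1 is %d x %d\n', size(Theta1, 1), size(Theta1, 2)); % 25 x 401
fprintf('Theta2 is %d x %d\n', size(Theta2, 1), size(Theta2, 2)); % 10 x 26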



3. Feedforward Propagation and Prediction

You should implement the feedforward computation that computes h_θ(x^(i)) for every example i and returns the associated predictions. Similar to the one-vs-all classification strategy, the prediction from the neural network will be the label that has the largest output (h_θ(x))_k.
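Concretely, for a single example x (a row of X) and sigmoid activation g, the feedforward pass is: a^(1) = [1; x], z^(2) = Θ^(1) a^(1), a^(2) = [1; g(z^(2))], z^(3) = Θ^(2) a^(2), and h_θ(x) = a^(3) = g(z^(3)). The predicted label is the index k that maximizes (h_θ(x))_k.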

The corresponding MATLAB code is as follows:

function p = predict(Theta1, Theta2, X)
%PREDICT Predict the label of an input given a trained neural network
%   p = PREDICT(Theta1, Theta2, X) outputs the predicted label of X given the
%   trained weights of a neural network (Theta1, Theta2)
 
% Useful values
m = size(X, 1);
num_labels = size(Theta2, 1);
 
% You need to return the following variables correctly
p = zeros(size(X, 1), 1);
 
% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned neural network. You should set p to a
%               vector containing labels between 1 to num_labels.
%
% Hint: The max function might come in useful. In particular, the max
%       function can also return the index of the max element, for more
%       information see 'help max'. If your examples are in rows, then, you
%       can use max(A, [], 2) to obtain the max for each row.
%
% Add the bias column to the inputs: X is 5000x400, Theta1 is 25x401,
% Theta2 is 10x26.
X = [ones(size(X, 1), 1) X];
% Hidden layer activations: 5000x25
out1 = sigmoid(X * Theta1');
% Add the bias column to the hidden layer activations: 5000x26
out1 = [ones(size(X, 1), 1) out1];
% Output layer activations: 5000x10
out2 = sigmoid(out1 * Theta2');
% Predict the label with the largest output for each example
[value, p] = max(out2, [], 2);
 
end
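
Once predict is implemented, a quick sanity check of the training-set accuracy (assuming X, y, Theta1 and Theta2 are loaded as in the sketch above) might look like:

pred = predict(Theta1, Theta2, X);
fprintf('Training Set Accuracy: %f\n', mean(double(pred == y)) * 100);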

You should see that the accuracy is about 97.5%. After that, an interactive sequence will launch displaying images from the training set one at a time, while the console prints out the predicted label for the displayed image. To stop the image sequence, press Ctrl-C.

The corresponding MATLAB code is as follows:

rp = randperm(m);  % visit the training examples in random order
 
for i = 1:m
    % Display the current example as an image
    fprintf('\nDisplaying Example Image\n');
    displayData(X(rp(i), :));
 
    % Predict its label with the trained network
    pred = predict(Theta1, Theta2, X(rp(i),:));
    fprintf('\nNeural Network Prediction: %d (digit %d)\n', pred, mod(pred, 10));
    
    % Pause
    fprintf('Program paused. Press enter to continue.\n');
    pause;
end
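
Note that in this dataset the digit 0 is stored with the label 10, which is why the printout applies mod(pred, 10) to recover the digit shown in the image.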

Results: each randomly chosen training image is displayed and the network's predicted digit is printed to the console, with an overall training-set accuracy of about 97.5%.