UFLDL Exercise: Softmax Regression

Today I worked through the Softmax Regression section of UFLDL and completed the Exercise step by step. The tutorial is excellent and I learned a lot from it; thanks, Professor Ng. The code is posted here for the record.
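
For reference, the cost and gradient that softmaxCost.m computes are the ones from the tutorial, with m examples, k classes, and the indicator function 1{·}:

\[ J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{j=1}^{k} 1\{y^{(i)} = j\} \log \frac{e^{\theta_j^\top x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^\top x^{(i)}}} + \frac{\lambda}{2} \sum_{i,j} \theta_{ij}^2 \]

\[ \nabla_{\theta_j} J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ x^{(i)} \left( 1\{y^{(i)} = j\} - p(y^{(i)} = j \mid x^{(i)}; \theta) \right) \right] + \lambda \theta_j \]

One numerical detail: since \( e^{\theta_j^\top x - c} / \sum_l e^{\theta_l^\top x - c} = e^{\theta_j^\top x} / \sum_l e^{\theta_l^\top x} \) for any constant c, the code below subtracts the per-column maximum before exponentiating; this prevents overflow without changing the softmax output.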

softmaxCost.m

function [cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, data, labels)

% numClasses - the number of classes 
% inputSize - the size N of the input vector
% lambda - weight decay parameter
% data - the N x M input matrix, where each column data(:, i) corresponds to
%        a single training example
% labels - an M x 1 vector containing the labels for the input data
%

% Unroll the parameters from theta
theta = reshape(theta, numClasses, inputSize);  % numClasses x inputSize (10 x 784)

numCases = size(data, 2);   % number of examples (60000)

% Binary indicator matrix, numClasses x numCases: groundTruth(c, i) = 1 iff labels(i) == c
groundTruth = full(sparse(labels, 1:numCases, 1));  % 10 x 60000
cost = 0;

thetagrad = zeros(numClasses, inputSize);

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute the cost and gradient for softmax regression.
%                You need to compute thetagrad and cost.
%                The groundTruth matrix might come in handy.

thetaTx = theta * data;   % numClasses x numCases (10 x 60000)
% Subtract the per-column max before exponentiating to prevent overflow;
% the softmax output is invariant to this shift.
thetaTx = bsxfun(@minus, thetaTx, max(thetaTx, [], 1));
eThetaTx = exp(thetaTx);
hypothesis = bsxfun(@rdivide, eThetaTx, sum(eThetaTx, 1));  % column-wise softmax

% Average cross-entropy over the examples, plus the weight decay term.
cost = -sum(sum(log(hypothesis) .* groundTruth)) / numCases;
cost = cost + lambda / 2 * sum(sum(theta .^ 2));

% Gradient: -(1/m) * (indicator - softmax) * X', plus weight decay.
thetagrad = -(groundTruth - hypothesis) * data' ./ numCases + lambda .* theta;  % 10 x 784

% ------------------------------------------------------------------
% Unroll the gradient matrices into a vector for minFunc
grad = thetagrad(:);
end
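
Before training, it is worth checking the analytic gradient numerically. A minimal sketch, assuming computeNumericalGradient.m from the earlier sparse autoencoder exercise is on the path (this mirrors the debug mode in softmaxExercise.m; the sizes and data here are illustrative):

% Gradient check on a small random problem (debug sizes, not MNIST).
inputSize  = 8;                                  % small input dimension for a fast check
numClasses = 10;
lambda     = 1e-4;
inputData  = randn(inputSize, 100);              % 100 random examples
labels     = randi(numClasses, 100, 1);          % random labels in 1..numClasses
theta      = 0.005 * randn(numClasses * inputSize, 1);

[cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, inputData, labels);

% computeNumericalGradient is assumed to be the central-difference
% implementation from the earlier UFLDL exercise.
numGrad = computeNumericalGradient(@(t) softmaxCost(t, numClasses, ...
    inputSize, lambda, inputData, labels), theta);

% Should be tiny (around 1e-9) if the analytic gradient is correct.
diff = norm(numGrad - grad) / norm(numGrad + grad);
disp(diff);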

softmaxPredict.m

function [pred] = softmaxPredict(softmaxModel, data)

% softmaxModel - model trained using softmaxTrain
% data - the N x M input matrix, where each column data(:, i) corresponds to
%        a single test example
%
% Your code should produce the prediction vector
% pred, where pred(i) is argmax_c P(y = c | x(i)).

% Unroll the parameters from theta
theta = softmaxModel.optTheta;  % this provides a numClasses x inputSize matrix
pred = zeros(1, size(data, 2));  % 1 x 10000

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute pred using theta assuming that the labels start 
%                from 1.
% Note: the softmax normalization is monotonic, so argmax over theta * data
% alone would give the same predictions; the full computation is kept for clarity.
thetaTx = theta * data;   % numClasses x numCases (10 x 10000)
% Subtract the per-column max before exponentiating to prevent overflow.
thetaTx = bsxfun(@minus, thetaTx, max(thetaTx, [], 1));
eThetaTx = exp(thetaTx);
hypothesis = bsxfun(@rdivide, eThetaTx, sum(eThetaTx, 1));  % column-wise softmax

% Predicted class = row index of the largest probability in each column.
[~, pred] = max(hypothesis, [], 1);

% ---------------------------------------------------------------------

end
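
For completeness, here is how the two functions fit together, following the flow of softmaxExercise.m. A sketch, assuming images/labels and testImages/testLabels have been loaded by the MNIST helpers in the starter code (with label 0 remapped to 10), and that softmaxTrain is the minFunc wrapper the starter code provides:

inputSize  = 28 * 28;        % MNIST images are 28 x 28
numClasses = 10;
lambda     = 1e-4;           % weight decay value used in the exercise

% Train the model, then predict on the test set and score accuracy.
options = struct;
options.maxIter = 100;
softmaxModel = softmaxTrain(inputSize, numClasses, lambda, images, labels, options);

pred = softmaxPredict(softmaxModel, testImages);
acc  = mean(testLabels(:) == pred(:));
fprintf('Accuracy: %0.3f%%\n', acc * 100);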

softmaxExercise.m needs hardly any modification. Running it produces:
Iteration FunEvals Step Length Function Val Opt Cond
1 4 1.54361e+00 1.29057e+00 4.37649e+01
2 5 1.00000e+00 7.82370e-01 3.04250e+01
3 6 1.00000e+00 6.48177e-01 1.60402e+01
4 7 1.00000e+00 5.88466e-01 1.03033e+01
5 8 1.00000e+00 5.23323e-01 6.15521e+00
6 9 1.00000e+00 4.87197e-01 7.82056e+00
7 10 1.00000e+00 4.60377e-01 6.56151e+00
8 11 1.00000e+00 4.41660e-01 4.49433e+00
9 12 1.00000e+00 4.18627e-01 3.15190e+00
10 13 1.00000e+00 4.01340e-01 5.03764e+00
11 14 1.00000e+00 3.85184e-01 3.41922e+00
12 15 1.00000e+00 3.72779e-01 3.23279e+00
13 16 1.00000e+00 3.57478e-01 3.37257e+00
14 17 1.00000e+00 3.49141e-01 3.58792e+00
15 18 1.00000e+00 3.39864e-01 1.79471e+00
16 19 1.00000e+00 3.33988e-01 1.85238e+00
17 20 1.00000e+00 3.28561e-01 2.03870e+00
18 21 1.00000e+00 3.19317e-01 1.85553e+00
19 23 4.27212e-01 3.15478e-01 2.42549e+00
20 24 1.00000e+00 3.11168e-01 1.24567e+00
21 25 1.00000e+00 3.08696e-01 9.21270e-01
22 26 1.00000e+00 3.06569e-01 1.17821e+00
23 27 1.00000e+00 3.03571e-01 1.36010e+00
24 28 1.00000e+00 2.99396e-01 1.16500e+00
25 29 1.00000e+00 2.94996e-01 9.01524e-01
26 30 1.00000e+00 2.92395e-01 1.16334e+00
27 31 1.00000e+00 2.90392e-01 7.86387e-01
28 32 1.00000e+00 2.89011e-01 6.62383e-01
29 33 1.00000e+00 2.87114e-01 6.92430e-01
30 34 1.00000e+00 2.85611e-01 7.98752e-01
31 35 1.00000e+00 2.84138e-01 1.07541e+00
32 36 1.00000e+00 2.82942e-01 6.47151e-01
33 37 1.00000e+00 2.81866e-01 5.51706e-01
34 38 1.00000e+00 2.81274e-01 6.50994e-01
35 39 1.00000e+00 2.80108e-01 6.96425e-01
36 40 1.00000e+00 2.78613e-01 5.93658e-01
37 41 1.00000e+00 2.76816e-01 7.37413e-01
38 42 1.00000e+00 2.75813e-01 3.69235e-01
39 43 1.00000e+00 2.75426e-01 3.89808e-01
40 44 1.00000e+00 2.74699e-01 3.31635e-01
41 45 1.00000e+00 2.73897e-01 2.99459e-01
42 46 1.00000e+00 2.73466e-01 5.34627e-01
43 47 1.00000e+00 2.72708e-01 2.62014e-01
44 48 1.00000e+00 2.72330e-01 2.22565e-01
45 49 1.00000e+00 2.71927e-01 2.39448e-01
46 50 1.00000e+00 2.71244e-01 2.26701e-01
47 52 4.83755e-01 2.70887e-01 3.68758e-01
48 53 1.00000e+00 2.70384e-01 2.03495e-01
49 54 1.00000e+00 2.70114e-01 1.60001e-01
50 55 1.00000e+00 2.69825e-01 1.62002e-01
51 56 1.00000e+00 2.69500e-01 2.13095e-01
52 57 1.00000e+00 2.69217e-01 1.68744e-01
53 58 1.00000e+00 2.69026e-01 1.27266e-01
54 59 1.00000e+00 2.68775e-01 1.23489e-01
55 60 1.00000e+00 2.68621e-01 1.80024e-01
56 61 1.00000e+00 2.68476e-01 1.35393e-01
57 62 1.00000e+00 2.68308e-01 9.73147e-02
58 63 1.00000e+00 2.68170e-01 1.06733e-01
59 64 1.00000e+00 2.68051e-01 1.23121e-01
60 65 1.00000e+00 2.67956e-01 8.41721e-02
61 66 1.00000e+00 2.67881e-01 7.32114e-02
62 67 1.00000e+00 2.67817e-01 8.40687e-02
63 68 1.00000e+00 2.67738e-01 8.64128e-02
64 69 1.00000e+00 2.67662e-01 7.15700e-02
65 70 1.00000e+00 2.67604e-01 6.43530e-02
66 71 1.00000e+00 2.67563e-01 5.83816e-02
67 72 1.00000e+00 2.67523e-01 5.98268e-02
68 73 1.00000e+00 2.67483e-01 5.59753e-02
69 74 1.00000e+00 2.67454e-01 4.53813e-02
70 75 1.00000e+00 2.67423e-01 4.31462e-02
71 76 1.00000e+00 2.67393e-01 6.30492e-02
72 77 1.00000e+00 2.67368e-01 3.73010e-02
73 78 1.00000e+00 2.67352e-01 3.31077e-02
74 79 1.00000e+00 2.67334e-01 3.31655e-02
75 80 1.00000e+00 2.67319e-01 5.36011e-02
76 81 1.00000e+00 2.67306e-01 2.79618e-02
77 82 1.00000e+00 2.67297e-01 2.69696e-02
78 83 1.00000e+00 2.67289e-01 2.70824e-02
79 84 1.00000e+00 2.67275e-01 2.44458e-02
80 85 1.00000e+00 2.67267e-01 4.22630e-02
81 86 1.00000e+00 2.67260e-01 1.93842e-02
82 87 1.00000e+00 2.67257e-01 2.10086e-02
83 88 1.00000e+00 2.67252e-01 2.07542e-02
84 89 1.00000e+00 2.67246e-01 2.66203e-02
85 90 1.00000e+00 2.67243e-01 2.74736e-02
86 91 1.00000e+00 2.67240e-01 1.23968e-02
87 92 1.00000e+00 2.67238e-01 1.09012e-02
88 93 1.00000e+00 2.67236e-01 1.18840e-02
89 94 1.00000e+00 2.67232e-01 1.19009e-02
90 95 1.00000e+00 2.67231e-01 3.64096e-02
91 96 1.00000e+00 2.67228e-01 1.29305e-02
92 97 1.00000e+00 2.67227e-01 7.89632e-03
93 98 1.00000e+00 2.67226e-01 7.74716e-03
94 99 1.00000e+00 2.67225e-01 7.47274e-03
95 100 1.00000e+00 2.67223e-01 1.05767e-02
96 101 1.00000e+00 2.67223e-01 1.43786e-02
97 102 1.00000e+00 2.67222e-01 4.92490e-03
98 103 1.00000e+00 2.67221e-01 4.34420e-03
99 104 1.00000e+00 2.67221e-01 4.96777e-03
100 105 1.00000e+00 2.67220e-01 5.05543e-03
Exceeded Maximum Number of Iterations
Accuracy: 92.640%
The accuracy matches the reference answer exactly. Not bad!
