Machine Learning -- (Week 4) One-vs-All Classification and Neural Network Prediction


One-vs-All Classification



Built-in Functions

Randomly sampling examples

randperm(n)

Returns a row vector containing a random permutation of the integers 1:n.
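
For example (the exact ordering will differ from run to run, since the permutation is random):

randperm(5)     % e.g. ans = 3 1 5 2 4 -- some random ordering of 1..5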

fmincg

Minimizes a function using the nonlinear conjugate gradient method. Note that fmincg is not a built-in Octave function; it is supplied with the course exercise files. Example usage:

 [theta] = fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)),initial_theta, options);
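
For concreteness, here is a minimal sketch of one complete call as used in this exercise (it assumes X, y, lambda, a class index c and initial_theta are already defined, with X already containing the bias column so its width matches initial_theta; the same pattern appears inside oneVsAll below):

options = optimset('GradObj', 'on', 'MaxIter', 50);     % use the analytic gradient, cap at 50 iterations
costFun = @(t) lrCostFunction(t, X, (y == c), lambda);  % wrap the cost so fmincg sees a function of theta only
theta = fmincg(costFun, initial_theta, options);        % returns the optimized parameter vector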


User-Defined Functions

Cost function

lrCostFunction() 

function [J, grad] = lrCostFunction(theta, X, y, lambda)
% Compute the regularized logistic regression cost J and gradient grad.
% The bias parameter theta(1) is excluded from the regularization term.

m = length(y);                 % number of training examples

h = sigmoid(X * theta);        % hypothesis for all examples at once

% Cross-entropy cost plus L2 penalty on theta(2:end)
J = (1/m) * (-y' * log(h) - (1 - y') * log(1 - h));
J = J + lambda/(2*m) * (sum(theta.^2) - theta(1)^2);

% Gradient: unregularized term, then add the penalty for every theta except theta(1)
grad = (1/m) * X' * (h - y);
grad = grad + (lambda/m) * theta;
grad(1) = grad(1) - (lambda/m) * theta(1);

grad = grad(:);                % return the gradient as a column vector
end
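
For reference, the cost and gradient computed above are the following (writing h_theta(x) for sigmoid(theta' * x); the bias parameter theta(1), i.e. theta_0, is excluded from regularization):

J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\Big[-y^{(i)}\log h_\theta(x^{(i)}) - \big(1-y^{(i)}\big)\log\big(1-h_\theta(x^{(i)})\big)\Big] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

\frac{\partial J}{\partial \theta_0} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)x_0^{(i)}, \qquad \frac{\partial J}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)x_j^{(i)} + \frac{\lambda}{m}\theta_j \quad (j \geq 1)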


One-vs-all training function (the minimization is done by fmincg, i.e. conjugate gradient, rather than plain gradient descent)

function [all_theta] = oneVsAll(X, y, num_labels, lambda)
% Train num_labels regularized logistic regression classifiers, one per class.
% Row c of all_theta holds the parameters of the classifier for class c.

m = size(X, 1);                       % number of examples
n = size(X, 2);                       % number of features

all_theta = zeros(num_labels, n + 1);

X = [ones(m, 1) X];                   % prepend the bias column so X matches the n+1 parameters

initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 50);
for c = 1:num_labels
    % Treat class c as the positive class (y == c) and every other class as negative
    [theta] = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), initial_theta, options);
    all_theta(c, :) = theta';
end

end
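
Step 6 of the script further below also calls predictOneVsAll, which this post does not list. A minimal sketch of what such a function can look like, assuming the all_theta layout produced by oneVsAll above (one row of parameters per class, with the bias column added to X as in training):

function p = predictOneVsAll(all_theta, X)
% Predict a label in 1..num_labels for every row of X by picking the class
% whose one-vs-all classifier produces the largest score.
m = size(X, 1);
X = [ones(m, 1) X];                             % add the bias column, as in training
[c, p] = max(sigmoid(X * all_theta'), [], 2);   % row-wise argmax = predicted class
end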


The main flow consists of steps 1, 2, 5 and 6 below.

Steps 3 and 4 are checks rather than part of the main pipeline: step 3 visualizes a random sample of the training data, and step 4 verifies that lrCostFunction works as intended by comparing its output against a small test case with known results.


1. Set the parameters

input_layer_size  = 400;  % 20x20 Input Images of Digits
num_labels = 10;          % 10 labels, from 1 to 10

2. Load the data

load('ex3data1.mat'); % training data stored in arrays X, y

m = size(X, 1);
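
A quick look at what load put into the workspace (a sketch; the 400 columns come from the unrolled 20x20 images, and the digit "0" is stored as label 10, as noted in the second part of this post):

size(X)         % m x 400: one unrolled 20x20 grayscale image per row
size(y)         % m x 1:   labels in 1..10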

3. Randomly select 100 examples and display them

rand_indices = randperm(m);
sel = X(rand_indices(1:100), :);

displayData(sel);

4. Test the cost function lrCostFunction

% Test case for lrCostFunction
fprintf('\nTesting lrCostFunction() with regularization');

theta_t = [-2; -1; 1; 2];
X_t = [ones(5,1) reshape(1:15,5,3)/10];
y_t = ([1;0;1;0;1] >= 0.5);
lambda_t = 3;
[J grad] = lrCostFunction(theta_t, X_t, y_t, lambda_t);
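
If the reference values for this test case are not at hand, the analytic gradient can also be verified with a centered finite-difference estimate. A sketch, reusing the test inputs defined just above:

epsilon = 1e-4;
num_grad = zeros(size(theta_t));
for j = 1:numel(theta_t)
    e = zeros(size(theta_t));
    e(j) = epsilon;
    num_grad(j) = (lrCostFunction(theta_t + e, X_t, y_t, lambda_t) - ...
                   lrCostFunction(theta_t - e, X_t, y_t, lambda_t)) / (2*epsilon);
end
fprintf('Cost at test theta: %f\n', J);
disp([grad num_grad]);   % the two columns should agree to several decimal places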

5. Train the one-vs-all classifiers (compute all_theta)

lambda = 0.1;
[all_theta] = oneVsAll(X, y, num_labels, lambda);


6. Compare the predictions with the true labels to obtain the classifier's training accuracy

pred = predictOneVsAll(all_theta, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);


Prediction with a Neural Network


Built-in Functions

[c, p] = max(a3, [], 2);

max(A, [], 2) takes the maximum along the second dimension, i.e. of each row: the first output holds each row's maximum and the second output holds the column index at which it occurs. For a vector input, max returns the largest element and the index of its first occurrence:

[x, ix] = max ([1, 3, 5, 2, 5])

    ⇒  x = 5
        ix = 3
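
A small matrix example of the row-wise form used by the prediction function below:

A = [0.1 0.7 0.2;
     0.6 0.3 0.1];
[c, p] = max(A, [], 2)
% c = [0.7; 0.6]   the maximum of each row
% p = [2; 1]       the column index where each maximum occurs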


User-Defined Functions

Prediction function

function p = predict(Theta1, Theta2, X)
% Forward-propagate X through a trained three-layer network and return
% the predicted label (1..num_labels) for each example.

m = size(X, 1);
num_labels = size(Theta2, 1);

p = zeros(size(X, 1), 1);

X = [ones(m, 1) X];                 % add the bias unit to the input layer

a2 = sigmoid(X * Theta1');          % hidden layer activations
a2 = [ones(size(a2, 1), 1) a2];     % add the bias unit to the hidden layer

a3 = sigmoid(a2 * Theta2');         % output layer: one score per class

[c, p] = max(a3, [], 2);            % predicted label = index of the largest output
end
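
For reference, the forward propagation performed by predict, written for a single example x (g is the sigmoid function, and a 1 is prepended at each layer for the bias unit):

a^{(1)} = x, \qquad a^{(2)} = g\!\left(\Theta^{(1)} \begin{bmatrix} 1 \\ a^{(1)} \end{bmatrix}\right), \qquad a^{(3)} = g\!\left(\Theta^{(2)} \begin{bmatrix} 1 \\ a^{(2)} \end{bmatrix}\right), \qquad p = \arg\max_{k} a^{(3)}_k

The code computes the same thing for all examples at once, which is why the weight matrices appear transposed (sigmoid(X * Theta1')).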


1. Set the parameters

input_layer_size  = 400;  % 20x20 Input Images of Digits
hidden_layer_size = 25;   % 25 hidden units
num_labels = 10;          % 10 labels, from 1 to 10   

                          % (note that we have mapped "0" to label 10)

2. Load the data and display a random subset of it

fprintf('Loading and Visualizing Data ...\n')
load('ex3data1.mat');
m = size(X, 1);
% Randomly select 100 data points to display
sel = randperm(size(X, 1));
sel = sel(1:100);
displayData(X(sel, :));

fprintf('Program paused. Press enter to continue.\n');

3. Load the pre-trained network weights Theta1 (input layer to hidden layer) and Theta2 (hidden layer to output layer)

% Load the weights into variables Theta1 and Theta2
load('ex3weights.mat');
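
As a quick sanity check (a sketch; the expected sizes follow directly from the layer sizes set in step 1):

size(Theta1)    % expected 25 x 401:  hidden_layer_size x (input_layer_size + 1)
size(Theta2)    % expected 10 x 26:   num_labels x (hidden_layer_size + 1)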

4. Predict on the entire training set

pred = predict(Theta1, Theta2, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);

5. Display handwritten digit images one at a time, in random order, and print the network's prediction for each

rp = randperm(m);

for i = 1:m
    % Display 
    fprintf('\nDisplaying Example Image\n');
    displayData(X(rp(i), :));
    pred = predict(Theta1, Theta2, X(rp(i),:));
    fprintf('\nNeural Network Prediction: %d (digit %d)\n', pred, mod(pred, 10));  
    % Pause with quit option
    s = input('Paused - press enter to continue, q to exit:','s');
    if s == 'q'
      break
    end
end


