Andrew Ng's Machine Learning Programming Assignment 13: dataset3Params (Selecting the Optimal Parameters)

function [C, sigma] = dataset3Params(X, y, Xval, yval)
%DATASET3PARAMS returns your choice of C and sigma for Part 3 of the exercise
%where you select the optimal (C, sigma) learning parameters to use for SVM
%with RBF kernel
%   [C, sigma] = DATASET3PARAMS(X, y, Xval, yval) returns your choice of C and 
%   sigma. You should complete this function to return the optimal C and 
%   sigma based on a cross-validation set.
%

% You need to return the following variables correctly.
C = 1;
sigma = 0.3;

% ====================== YOUR CODE HERE ======================
% Instructions: Fill in this function to return the optimal C and sigma
%               learning parameters found using the cross validation set.
%               You can use svmPredict to predict the labels on the cross
%               validation set. For example, 
%                   predictions = svmPredict(model, Xval);
%               will return the predictions on the cross validation set.
%
%  Note: You can compute the prediction error using 
%        mean(double(predictions ~= yval))
%

% Candidate values for both C and sigma, as suggested in the exercise.
values = [0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30];
bestError = Inf;

% Grid search: train an RBF-kernel SVM for every (C, sigma) pair and keep
% the pair with the lowest misclassification rate on the cross-validation set.
for i = 1:length(values)
	for j = 1:length(values)
		cTemp = values(i);
		sTemp = values(j);
		model = svmTrain(X, y, cTemp, @(x1, x2) gaussianKernel(x1, x2, sTemp));
		predictions = svmPredict(model, Xval);
		predError = mean(double(predictions ~= yval));
		if predError < bestError
			C = cTemp;
			sigma = sTemp;
			bestError = predError;
		end
	end
end




% =========================================================================

end
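
A minimal usage sketch follows, showing how the function might be called from the exercise script. It assumes the exercise's ex6data3.mat file (which provides X, y, Xval, and yval) and that the assignment's svmTrain, svmPredict, and gaussianKernel helpers are on the path; these names come from the course materials, not from the listing above.

% Usage sketch (assumes ex6data3.mat and the exercise's helper functions).
load('ex6data3.mat');            % provides X, y, Xval, yval

[C, sigma] = dataset3Params(X, y, Xval, yval);
fprintf('Selected C = %g, sigma = %g\n', C, sigma);

% Retrain with the selected parameters and report the cross-validation error.
model = svmTrain(X, y, C, @(x1, x2) gaussianKernel(x1, x2, sigma));
predictions = svmPredict(model, Xval);
fprintf('Cross-validation error: %f\n', mean(double(predictions ~= yval)));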
