To keep the course running smoothly, I am only posting the ungraded parts; I hope everyone understands. Mainly, I felt some of the code others posted was not quite right, so I want to offer one possible approach. Criticism and corrections are welcome.
%% ======= Optional (ungraded) exercise: Computing test set error =======
% Pick the lambda whose training and cross-validation errors are closest,
% then evaluate the resulting model on the test set.
% (Note: a more common rule is to simply pick the lambda that minimizes
% error_val alone; both give a reasonable lambda here.)
[~, index] = min(abs(error_train - error_val)); % index of smallest |train - val| gap
lambda = lambda_vec(index);
theta = trainLinearReg(X_poly, y, lambda);
% Evaluate the cost with lambda = 0 so the reported test error is unregularized
[error_test, grad] = linearRegCostFunction(X_poly_test, ytest, theta, 0);
fprintf('error_test: %f\n', error_test);

fprintf('Program paused. Press enter to continue.\n');
pause;
%% === Optional (ungraded) exercise: Plotting learning curves with randomly selected examples ===
lambda = 0.01;
m = size(X, 1);
error_train = zeros(m, 1);
error_val = zeros(m, 1);
repeat = 50; % number of random trials to average over

for i = 1:repeat
    for j = 1:m
        % Select j random examples (without replacement) from the training set
        seq = randperm(m, j);
        X_poly_rand = X_poly(seq, :);
        y_rand = y(seq);

        % Select j random examples from the cross-validation set
        % (sample over the validation set's own size, not m, since the
        % two sets may contain different numbers of examples)
        seq_val = randperm(size(X_poly_val, 1), j);
        X_poly_val_rand = X_poly_val(seq_val, :);
        yval_rand = yval(seq_val);

        theta = trainLinearReg(X_poly_rand, y_rand, lambda);

        % Errors are computed with lambda = 0 (no regularization term)
        [J, ~] = linearRegCostFunction(X_poly_rand, y_rand, theta, 0);
        [Jval, ~] = linearRegCostFunction(X_poly_val_rand, yval_rand, theta, 0);
        error_train(j) = error_train(j) + J;
        error_val(j) = error_val(j) + Jval;
    end
end

% Average the errors over all random trials
error_train = error_train / repeat;
error_val = error_val / repeat;
plot(1:m, error_train, 1:m, error_val);
title(sprintf('Polynomial Regression Learning Curve (lambda = %f)', lambda));
legend('Train', 'Cross Validation')
xlabel('Number of training examples')
ylabel('Error')
axis([0 13 0 100])
These two optional exercises are just more elaborate versions of the earlier, simpler steps; the reasoning is the same, applying the same formulas and principles.