Ex05 [coursera] Machine learning - Stanford University - Andrew Ng

Regularized Linear Regression and Bias vs. Variance

Contents

Regularized Linear Regression and Bias vs. Variance

ex5.m

linearRegCostFunction.m

trainLinearReg.m

learningCurve.m

polyFeatures.m

featureNormalize.m

plotFit.m

validationCurve.m

Console Output

Figure Output


ex5.m

%% Machine Learning Online Class
%  Exercise 5 | Regularized Linear Regression and Bias-Variance
%
%  Instructions
%  ------------
% 
%  This file contains code that helps you get started on the
%  exercise. You will need to complete the following functions:
%
%     linearRegCostFunction.m
%     learningCurve.m
%     validationCurve.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear ; close all; clc

%% =========== Part 1: Loading and Visualizing Data =============
%  We start the exercise by first loading and visualizing the dataset. 
%  The following code will load the dataset into your environment and plot
%  the data.
%

% Load Training Data
fprintf('Loading and Visualizing Data ...\n')

% Load from ex5data1: 
% You will have X, y, Xval, yval, Xtest, ytest in your environment
load ('ex5data1.mat');

% m = Number of examples
m = size(X, 1);

% Plot training data
plot(X, y, 'rx', 'MarkerSize', 10, 'LineWidth', 1.5);
xlabel('Change in water level (x)');
ylabel('Water flowing out of the dam (y)');

fprintf('Program paused. Press enter to continue.\n');
pause;

%% =========== Part 2: Regularized Linear Regression Cost =============
%  You should now implement the cost function for regularized linear 
%  regression. 
%

theta = [1 ; 1];
J = linearRegCostFunction([ones(m, 1) X], y, theta, 1);

fprintf(['Cost at theta = [1 ; 1]: %f '...
         '\n(this value should be about 303.993192)\n'], J);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% =========== Part 3: Regularized Linear Regression Gradient =============
%  You should now implement the gradient for regularized linear 
%  regression.
%

theta = [1 ; 1];
[J, grad] = linearRegCostFunction([ones(m, 1) X], y, theta, 1);

fprintf(['Gradient at theta = [1 ; 1]:  [%f; %f] '...
         '\n(this value should be about [-15.303016; 598.250744])\n'], ...
         grad(1), grad(2));

fprintf('Program paused. Press enter to continue.\n');
pause;


%% =========== Part 4: Train Linear Regression =============
%  Once you have implemented the cost and gradient correctly, the
%  trainLinearReg function will use your cost function to train 
%  regularized linear regression.
% 
%  Write Up Note: The data is non-linear, so this will not give a great 
%                 fit.
%

%  Train linear regression with lambda = 0
lambda = 0;
[theta] = trainLinearReg([ones(m, 1) X], y, lambda);

%  Plot fit over the data
plot(X, y, 'rx', 'MarkerSize', 10, 'LineWidth', 1.5);
xlabel('Change in water level (x)');
ylabel('Water flowing out of the dam (y)');
hold on;
plot(X, [ones(m, 1) X]*theta, '--', 'LineWidth', 2)
hold off;

fprintf('Program paused. Press enter to continue.\n');
pause;


%% =========== Part 5: Learning Curve for Linear Regression =============
%  Next, you should implement the learningCurve function. 
%
%  Write Up Note: Since the model is underfitting the data, we expect to
%                 see a graph with "high bias" -- Figure 3 in ex5.pdf 
%

lambda = 0;
[error_train, error_val] = ...
    learningCurve([ones(m, 1) X], y, ...
                  [ones(size(Xval, 1), 1) Xval], yval, ...
                  lambda);

plot(1:m, error_train, 1:m, error_val);
title('Learning curve for linear regression')
legend('Train', 'Cross Validation')
xlabel('Number of training examples')
ylabel('Error')
axis([0 13 0 150])

fprintf('# Training Examples\tTrain Error\tCross Validation Error\n');
for i = 1:m
    fprintf('  \t%d\t\t%f\t%f\n', i, error_train(i), error_val(i));
end

fprintf('Program paused. Press enter to continue.\n');
pause;

%% =========== Part 6: Feature Mapping for Polynomial Regression =============
%  One solution to this is to use polynomial regression. You should now
%  complete polyFeatures to map each example into its powers
%

p = 8;

% Map X onto Polynomial Features and Normalize
X_poly = polyFeatures(X, p);
[X_poly, mu, sigma] = featureNormalize(X_poly);  % Normalize
X_poly = [ones(m, 1), X_poly];                   % Add Ones

% Map X_poly_test and normalize (using mu and sigma)
X_poly_test = polyFeatures(Xtest, p);
X_poly_test = bsxfun(@minus, X_poly_test, mu);
X_poly_test = bsxfun(@rdivide, X_poly_test, sigma);
X_poly_test = [ones(size(X_poly_test, 1), 1), X_poly_test];         % Add Ones

% Map X_poly_val and normalize (using mu and sigma)
X_poly_val = polyFeatures(Xval, p);
X_poly_val = bsxfun(@minus, X_poly_val, mu);
X_poly_val = bsxfun(@rdivide, X_poly_val, sigma);
X_poly_val = [ones(size(X_poly_val, 1), 1), X_poly_val];           % Add Ones

fprintf('Normalized Training Example 1:\n');
fprintf('  %f  \n', X_poly(1, :));

fprintf('\nProgram paused. Press enter to continue.\n');
pause;



%% =========== Part 7: Learning Curve for Polynomial Regression =============
%  Now, you will get to experiment with polynomial regression with multiple
%  values of lambda. The code below runs polynomial regression with 
%  lambda = 0. You should try running the code with different values of
%  lambda to see how the fit and learning curve change.
%

lambda = 0;   % overfit
%lambda = 1;   % just right
%lambda = 100; % underfit
[theta] = trainLinearReg(X_poly, y, lambda);

% Plot training data and fit
figure(1);
plot(X, y, 'rx', 'MarkerSize', 10, 'LineWidth', 1.5);
plotFit(min(X), max(X), mu, sigma, theta, p);
xlabel('Change in water level (x)');
ylabel('Water flowing out of the dam (y)');
title (sprintf('Polynomial Regression Fit (lambda = %f)', lambda));

figure(2);
[error_train, error_val] = ...
    learningCurve(X_poly, y, X_poly_val, yval, lambda);
plot(1:m, error_train, 1:m, error_val);

title(sprintf('Polynomial Regression Learning Curve (lambda = %f)', lambda));
xlabel('Number of training examples')
ylabel('Error')
axis([0 13 0 100])
legend('Train', 'Cross Validation')

fprintf('Polynomial Regression (lambda = %f)\n\n', lambda);
fprintf('# Training Examples\tTrain Error\tCross Validation Error\n');
for i = 1:m
    fprintf('  \t%d\t\t%f\t%f\n', i, error_train(i), error_val(i));
end

fprintf('Program paused. Press enter to continue.\n');
pause;



%% =========== Part 8: Validation for Selecting Lambda =============
%  You will now implement validationCurve to test various values of 
%  lambda on a validation set. You will then use this to select the
%  "best" lambda value.
%

[lambda_vec, error_train, error_val] = ...
    validationCurve(X_poly, y, X_poly_val, yval);

close all;
plot(lambda_vec, error_train, lambda_vec, error_val);
legend('Train', 'Cross Validation');
xlabel('lambda');
ylabel('Error');

fprintf('lambda\t\tTrain Error\tValidation Error\n');
for i = 1:length(lambda_vec)
	fprintf(' %f\t%f\t%f\n', ...
            lambda_vec(i), error_train(i), error_val(i));
end

fprintf('Program paused. Press enter to continue.\n');
pause;



%% =========== Part 9: Computing test set error =============
%  Compute the test error using the best value of λ you found.
%  In our cross validation, we obtained a test error of 3.8599 for λ = 3.
best_lambda = 3;
theta = trainLinearReg(X_poly, y, best_lambda);
[error_test, grad] = linearRegCostFunction(X_poly_test, ytest, theta, 0);
fprintf('Test Set Error with the best lambda = 3: %f\n', error_test);

fprintf('Program paused. Press enter to continue.\n');
pause;


%% =========== Part 10: Plotting learning curves with randomly selected examples =============
%  To determine the training error and cross validation error for i examples, 
%  you should first randomly select i examples from the training set 
%  and i examples from the cross validation set.
%  
%  The above steps should then be repeated multiple times (say 50) 
%  and the averaged error should be used to determine 
%  the training error and cross validation error for i examples.
%  
%  Plot the learning curve obtained for polynomial regression with λ = 0.01

lambda = 0.01;
m = size(X_poly, 1);   % Number of training examples
error_train = zeros(m, 1);
error_val   = zeros(m, 1);
m_val = size(X_poly_val, 1);   % Number of cross-validation examples
k_MAX = 50;                    % Number of random repetitions to average over
for i = 1:m
  error_train_sum = 0;
  error_val_sum = 0;
  for k = 1:k_MAX
    % Randomly select i distinct examples from the training set and,
    % independently, i distinct examples from the cross-validation set.
    train_idx = randperm(m);
    train_idx = train_idx(1:i);
    val_idx = randperm(m_val);
    val_idx = val_idx(1:i);
    Xpoly_train = X_poly(train_idx, :);
    ypoly_train = y(train_idx);
    Xpoly_val   = X_poly_val(val_idx, :);
    ypoly_val   = yval(val_idx);
    % Train on the random subset, then measure both errors with lambda = 0
    % (regularization is used only for training, not for evaluating error).
    theta = trainLinearReg(Xpoly_train, ypoly_train, lambda);
    error_train_sum = error_train_sum + linearRegCostFunction(Xpoly_train, ypoly_train, theta, 0);
    error_val_sum   = error_val_sum   + linearRegCostFunction(Xpoly_val, ypoly_val, theta, 0);
  end
  error_train(i) = error_train_sum / k_MAX;
  error_val(i)   = error_val_sum / k_MAX;
end

plot(1:m, error_train, 1:m, error_val);
title(sprintf('Polynomial Regression Learning Curve (lambda = %f)', lambda));
legend('Train', 'Cross Validation');
xlabel('Number of training examples');
ylabel('Error');
axis([0 13 0 100]);

fprintf('Polynomial Regression (lambda = %f)\n', lambda);
fprintf('# Training Examples\tTrain Error\tCross Validation Error\n');
for i = 1:m
    fprintf('  \t%d\t\t%f\t%f\n', i, error_train(i), error_val(i));
end

fprintf('Program paused. Press enter to continue.\n');
pause;


linearRegCostFunction.m

function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
%LINEARREGCOSTFUNCTION Compute cost and gradient for regularized linear 
%regression with multiple variables
%   [J, grad] = LINEARREGCOSTFUNCTION(X, y, theta, lambda) computes the 
%   cost of using theta as the parameter for linear regression to fit the 
%   data points in X and y. Returns the cost in J and the gradient in grad

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost and gradient of regularized linear 
%               regression for a particular choice of theta.
%
%               You should set J to the cost and grad to the gradient.
%
h = X * theta;
Cost = ((h-y) .^ 2)/2;
J = sum(Cost)/m + lambda/(2*m) * sum(theta(2:end) .^ 2);

grad = X'*(h-y) / m;
grad(2:end) = grad(2:end) + lambda/m * theta(2:end);
% =========================================================================

grad = grad(:);

end
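
For reference, the code above implements the regularized linear regression cost and gradient from ex5.pdf, where the bias term \theta_0 is not regularized:

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2

\frac{\partial J}{\partial \theta_0} = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_0^{(i)}

\frac{\partial J}{\partial \theta_j} = \left( \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)} \right) + \frac{\lambda}{m} \theta_j \qquad \text{for } j \ge 1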

trainLinearReg.m

function [theta] = trainLinearReg(X, y, lambda)
%TRAINLINEARREG Trains linear regression given a dataset (X, y) and a
%regularization parameter lambda
%   [theta] = TRAINLINEARREG (X, y, lambda) trains linear regression using
%   the dataset (X, y) and regularization parameter lambda. Returns the
%   trained parameters theta.
%

% Initialize Theta
initial_theta = zeros(size(X, 2), 1); 

% Create "short hand" for the cost function to be minimized
costFunction = @(t) linearRegCostFunction(X, y, t, lambda);

% Now, costFunction is a function that takes in only one argument
options = optimset('MaxIter', 200, 'GradObj', 'on');

% Minimize using fmincg
theta = fmincg(costFunction, initial_theta, options);

end
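
fmincg.m is provided with the exercise files. If it is not on the path, Octave's built-in fminunc accepts the same one-argument cost-function handle; a minimal sketch under that assumption (convergence behaviour may differ slightly from fmincg):

% Alternative: minimize the same costFunction with fminunc instead of fmincg.
% Assumes costFunction returns [J, grad] as in linearRegCostFunction.
options = optimset('MaxIter', 200, 'GradObj', 'on');
theta = fminunc(costFunction, initial_theta, options);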

learningCurve.m

function [error_train, error_val] = ...
    learningCurve(X, y, Xval, yval, lambda)
%LEARNINGCURVE Generates the train and cross validation set errors needed 
%to plot a learning curve
%   [error_train, error_val] = ...
%       LEARNINGCURVE(X, y, Xval, yval, lambda) returns the train and
%       cross validation set errors for a learning curve. In particular, 
%       it returns two vectors of the same length - error_train and 
%       error_val. Then, error_train(i) contains the training error for
%       i examples (and similarly for error_val(i)).
%
%   In this function, you will compute the train and test errors for
%   dataset sizes from 1 up to m. In practice, when working with larger
%   datasets, you might want to do this in larger intervals.
%

% Number of training examples
m = size(X, 1);

% You need to return these values correctly
error_train = zeros(m, 1);
error_val   = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Fill in this function to return training errors in 
%               error_train and the cross validation errors in error_val. 
%               i.e., error_train(i) and 
%               error_val(i) should give you the errors
%               obtained after training on i examples.
%
% Note: You should evaluate the training error on the first i training
%       examples (i.e., X(1:i, :) and y(1:i)).
%
%       For the cross-validation error, you should instead evaluate on
%       the _entire_ cross validation set (Xval and yval).
%
% Note: If you are using your cost function (linearRegCostFunction)
%       to compute the training and cross validation error, you should 
%       call the function with the lambda argument set to 0. 
%       Do note that you will still need to use lambda when running
%       the training to obtain the theta parameters.
%
% Hint: You can loop over the examples with the following:
%
%       for i = 1:m
%           % Compute train/cross validation errors using training examples 
%           % X(1:i, :) and y(1:i), storing the result in 
%           % error_train(i) and error_val(i)
%           ....
%           
%       end
%

% ---------------------- Sample Solution ----------------------
for i = 1:m
  Xtrain = X(1:i,:);
  ytrain = y(1:i);
  theta = trainLinearReg(Xtrain, ytrain, lambda);
  [error_train(i), grad_train] = linearRegCostFunction(Xtrain, ytrain, theta, 0);
  [error_val(i), grad_val] = linearRegCostFunction(Xval, yval, theta, 0);
end
% -------------------------------------------------------------

% =========================================================================

end
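
The errors returned above are the unregularized squared-error costs from ex5.pdf: theta is trained on the first i examples with the given lambda, and then both errors are evaluated with lambda = 0,

J_{train}(\theta) = \frac{1}{2i} \sum_{k=1}^{i} \left( h_\theta(x^{(k)}) - y^{(k)} \right)^2, \qquad J_{cv}(\theta) = \frac{1}{2 m_{cv}} \sum_{k=1}^{m_{cv}} \left( h_\theta(x_{cv}^{(k)}) - y_{cv}^{(k)} \right)^2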

polyFeatures.m

function [X_poly] = polyFeatures(X, p)
%POLYFEATURES Maps X (1D vector) into the p-th power
%   [X_poly] = POLYFEATURES(X, p) takes a data matrix X (size m x 1) and
%   maps each example into its polynomial features where
%   X_poly(i, :) = [X(i) X(i).^2 X(i).^3 ...  X(i).^p];
%


% You need to return the following variables correctly.
X_poly = zeros(numel(X), p);

% ====================== YOUR CODE HERE ======================
% Instructions: Given a vector X, return a matrix X_poly where the p-th 
%               column of X contains the values of X to the p-th power.
%
% 
for i = 1:p
  X_poly(:,i) = X .^ i;
end
% =========================================================================

end
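
The loop above can also be written as a single vectorized expression; for a column vector X this produces the same m x p matrix (a sketch using bsxfun, the same broadcasting helper used elsewhere in the exercise):

% Equivalent vectorized mapping: column i holds X raised to the i-th power.
X_poly = bsxfun(@power, X(:), 1:p);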

featureNormalize.m

function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X 
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

mu = mean(X);
X_norm = bsxfun(@minus, X, mu);

sigma = std(X_norm);
X_norm = bsxfun(@rdivide, X_norm, sigma);


% ============================================================

end
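
Each feature column is standardized as

x_{norm}^{(j)} = \frac{x^{(j)} - \mu_j}{\sigma_j}

and, as done in Part 6 of ex5.m, the same mu and sigma computed on the training set must be reused when normalizing the validation and test sets.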

plotFit.m

function plotFit(min_x, max_x, mu, sigma, theta, p)
%PLOTFIT Plots a learned polynomial regression fit over an existing figure.
%Also works with linear regression.
%   PLOTFIT(min_x, max_x, mu, sigma, theta, p) plots the learned polynomial
%   fit with power p and feature normalization (mu, sigma).

% Hold on to the current figure
hold on;

% We plot a range slightly bigger than the min and max values to get
% an idea of how the fit will vary outside the range of the data points
x = (min_x - 15: 0.05 : max_x + 25)';

% Map the X values 
X_poly = polyFeatures(x, p);
X_poly = bsxfun(@minus, X_poly, mu);
X_poly = bsxfun(@rdivide, X_poly, sigma);

% Add ones
X_poly = [ones(size(x, 1), 1) X_poly];

% Plot
plot(x, X_poly * theta, '--', 'LineWidth', 2)

% Hold off to the current figure
hold off

end

validationCurve.m

function [lambda_vec, error_train, error_val] = ...
    validationCurve(X, y, Xval, yval)
%VALIDATIONCURVE Generate the train and validation errors needed to
%plot a validation curve that we can use to select lambda
%   [lambda_vec, error_train, error_val] = ...
%       VALIDATIONCURVE(X, y, Xval, yval) returns the train
%       and validation errors (in error_train, error_val)
%       for different values of lambda. You are given the training set (X,
%       y) and validation set (Xval, yval).
%

% Selected values of lambda (you should not change this)
lambda_vec = [0 0.001 0.003 0.01 0.03 0.1 0.3 1 3 10]';

% You need to return these variables correctly.
error_train = zeros(length(lambda_vec), 1);
error_val = zeros(length(lambda_vec), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Fill in this function to return training errors in 
%               error_train and the validation errors in error_val. The 
%               vector lambda_vec contains the different lambda parameters 
%               to use for each calculation of the errors, i.e, 
%               error_train(i), and error_val(i) should give 
%               you the errors obtained after training with 
%               lambda = lambda_vec(i)
%
% Note: You can loop over lambda_vec with the following:
%
%       for i = 1:length(lambda_vec)
%           lambda = lambda_vec(i);
%           % Compute train / val errors when training linear 
%           % regression with regularization parameter lambda
%           % You should store the result in error_train(i)
%           % and error_val(i)
%           ....
%           
%       end
%
%
for i = 1:length(lambda_vec)
  lambda = lambda_vec(i);
  theta = trainLinearReg(X, y, lambda);
  error_train(i) = linearRegCostFunction(X, y, theta, 0);
  error_val(i) = linearRegCostFunction(Xval, yval, theta, 0);
end
% =========================================================================

end
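
Once the curve is computed, the lambda with the lowest cross-validation error can be read off directly from the returned vectors; a small sketch (the best_lambda variable name is only illustrative):

% Select the lambda that minimizes the cross-validation error.
[~, best_idx] = min(error_val);
best_lambda = lambda_vec(best_idx);
fprintf('Lambda with lowest cross-validation error: %f\n', best_lambda);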

Console Output

Loading and Visualizing Data ...
Program paused. Press enter to continue.
Cost at theta = [1 ; 1]: 303.993192
(this value should be about 303.993192)
Program paused. Press enter to continue.
Gradient at theta = [1 ; 1]:  [-15.303016; 598.250744]
(this value should be about [-15.303016; 598.250744])
Program paused. Press enter to continue.
Iteration     2 | Cost: 2.237391e+01
Program paused. Press enter to continue.

Iteration     3 | Cost: 9.860761e-32
Iteration     2 | Cost: 3.286595e+00
Iteration    25 | Cost: 2.842678e+00
Iteration    23 | Cost: 1.315405e+01
Iteration    34 | Cost: 1.944396e+01
Iteration    16 | Cost: 2.009852e+01
Iteration    30 | Cost: 1.817286e+01
Iteration    17 | Cost: 2.260941e+01
Iteration    19 | Cost: 2.326146e+01
Iteration    15 | Cost: 2.431725e+01
Iteration     2 | Cost: 2.237391e+01
# Training Examples     Train Error     Cross Validation Error
        1               0.000000        205.121096
        2               0.000000        110.300366
        3               3.286595        45.010231
        4               2.842678        48.368911
        5               13.154049       35.865165
        6               19.443963       33.829962
        7               20.098522       31.970986
        8               18.172859       30.862446
        9               22.609405       31.135998
        10              23.261462       28.936207
        11              24.317250       29.551432
        12              22.373906       29.433818
Program paused. Press enter to continue.
Normalized Training Example 1:
  1.000000
  -0.362141
  -0.755087
  0.182226
  -0.706190
  0.306618
  -0.590878
  0.344516
  -0.508481

Program paused. Press enter to continue.
Iteration   200 | Cost: 7.639825e-02
Iteration    26 | Cost: 3.286920e-32
Iteration     9 | Cost: 7.220851e-28
Iteration   200 | Cost: 2.028920e-13
Iteration   200 | Cost: 6.053344e-03
Iteration   200 | Cost: 1.840989e-03
Iteration   200 | Cost: 7.073889e-02
Iteration   200 | Cost: 1.873619e-01
Iteration   200 | Cost: 1.557148e-01
Iteration   200 | Cost: 1.362932e-01
Iteration   200 | Cost: 7.639825e-02
Polynomial Regression (lambda = 0.000000)

# Training Examples     Train Error     Cross Validation Error
        1               0.000000        160.721900
        2               0.000000        160.121510
        3               0.000000        61.754825
        4               0.000000        61.928895
        5               0.000000        6.597876
        6               0.006053        9.376517
        7               0.001841        21.847744
        8               0.070739        6.594021
        9               0.187362        8.072575
        10              0.155715        8.873879
        11              0.136293        9.040019
        12              0.076398        9.658082
Program paused. Press enter to continue.
Iteration   200 | Cost: 7.639825e-02
Iteration   200 | Cost: 1.865599e-01
Iteration   200 | Cost: 2.516039e-01
Iteration   200 | Cost: 3.851189e-01
Iteration   200 | Cost: 6.692751e-01
Iteration   177 | Cost: 1.443470e+00
Iteration   109 | Cost: 3.101591e+00
Iteration    83 | Cost: 7.268148e+00
Iteration    39 | Cost: 1.586769e+01
Iteration    23 | Cost: 3.337220e+01
lambda          Train Error     Validation Error
 0.000000       0.076398        9.658082
 0.001000       0.147398        18.849457
 0.003000       0.182183        19.003150
 0.010000       0.222505        17.186615
 0.030000       0.281936        12.829690
 0.100000       0.459318        7.587014
 0.300000       0.921760        4.636833
 1.000000       2.076188        4.260625
 3.000000       4.901351        3.822907
 10.000000      16.092213       9.945508
Program paused. Press enter to continue.
Iteration    39 | Cost: 1.586769e+01
Test Set Error with the best lambda = 3: 3.859888
Program paused. Press enter to continue.
Polynomial Regression (lambda = 0.010000)
# Training Examples     Train Error     Cross Validation Error
        1               0.000000        100.491884
        2               0.017126        70.173877
        3               0.029552        34.983737
        4               0.031472        31.786113
        5               0.033951        20.581176
        6               0.041781        14.221169
        7               0.054377        13.410276
        8               0.055222        14.072239
        9               0.060536        13.877678
        10              0.081491        11.361104
        11              0.082341        11.376847
        12              0.090672        9.459007
Program paused. Press enter to continue.

Figure Output

    Data
    Linear Fit
    Linear regression learning curve
    Polynomial fit, lambda=0 (overfit)
    Polynomial learning curve, lambda=0 (overfit)
    Polynomial fit, lambda=1 (just right)
    Polynomial learning curve, lambda=1 (just right)
    Polynomial fit, lambda=100 (underfit)
    Polynomial learning curve, lambda=100 (underfit)
    Selecting lambda using a cross validation set
    Learning curve with randomly selected examples
