Stanford Machine Learning Week 2 Assignment: Linear Regression

Plotting the Data

data = load('ex1data1.txt');       % read comma separated data
X = data(:, 1); y = data(:, 2);
m = length(y);                     % number of training examples
plot(X, y, 'rx', 'MarkerSize', 10); % plot the data; 'rx' draws each point as a red cross, 'MarkerSize', 10 sets the cross size

ylabel('Profit in $10,000s');            % set the y-axis label
xlabel('Population of City in 10,000s'); % set the x-axis label
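
Before the cost can be computed, a column of ones is prepended to X for the intercept term, and the parameters are initialized. A minimal sketch of that setup, following the standard ex1.m driver (the iteration count and learning rate are the assignment's suggested values):

X = [ones(m, 1), data(:, 1)];  % add intercept column x0 = 1
theta = zeros(2, 1);           % initialize fitting parameters to zero
iterations = 1500;             % gradient descent settings from ex1.m
alpha = 0.01;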

Computing the cost J(θ)
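
Here J(θ) = (1/(2m)) Σ_{i=1..m} (h_θ(x^(i)) - y^(i))^2 with hypothesis h_θ(x) = θᵀx, which the vectorized code below computes in one pass: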

m = length(y);                               % number of training examples
J = 1 / (2 * m) * sum((X * theta - y) .^ 2); % squared-error cost J(theta)
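
As a quick sanity check (assuming the snippet above is the body of the assignment's computeCost.m), the expected initial cost on ex1data1.txt is roughly 32.07:

J = computeCost(X, y, zeros(2, 1));  % should print a value near 32.07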

Gradient descent
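
Each step applies the update rule θ_j := θ_j - (α/m) Σ_{i=1..m} (h_θ(x^(i)) - y^(i)) x_j^(i), updating all parameters simultaneously: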

t1 = theta(1) - alpha / m * sum(X * theta - y);             % gradient step for theta(1); X(:,1) is all ones
t2 = theta(2) - alpha / m * sum((X * theta - y) .* X(:,2)); % gradient step for theta(2)
theta = [t1; t2];                                           % simultaneous update
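
Because X(:,1) is the all-ones intercept column, the two updates above collapse into a single vectorized line; an equivalent sketch:

theta = theta - alpha / m * X' * (X * theta - y);  % one simultaneous update for all parameters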

Feature Normalization

X_norm = X;                       % work on a copy of X
n = size(X_norm, 2);              % number of features
mu = zeros(1, n);
sigma = zeros(1, n);
for i = 1:n
    mu(i) = mean(X_norm(:, i));   % per-feature mean
    sigma(i) = std(X_norm(:, i)); % per-feature standard deviation
    X_norm(:, i) = (X_norm(:, i) - mu(i)) / sigma(i);
end
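
The loop can also be written without indexing, since mean and std operate column-wise; a sketch that relies on implicit broadcasting (Octave, or MATLAB R2016b and later):

mu = mean(X_norm);               % 1 x n row of per-feature means
sigma = std(X_norm);             % 1 x n row of per-feature standard deviations
X_norm = (X_norm - mu) ./ sigma; % broadcast across all rows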

Gradient Descent (multiple variables)

m = length(y);                   % number of training examples
J_history = zeros(num_iters, 1);
n = size(X, 2);                  % number of parameters (including intercept)
tmp = zeros(n, 1);

for iter = 1:num_iters
    for i = 1:n
        % gradient step for each parameter, all computed from the current theta
        tmp(i) = theta(i) - alpha / m * (X * theta - y)' * X(:, i);
    end
    theta = tmp;                 % simultaneous update of all parameters
    J_history(iter) = computeCostMulti(X, y, theta); % record cost to check convergence
end
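
Assuming the loop above is the body of the assignment's gradientDescentMulti.m, the returned J_history can be plotted to verify that the chosen learning rate makes J decrease on every iteration:

[theta, J_history] = gradientDescentMulti(X, y, zeros(size(X, 2), 1), alpha, num_iters);
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');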

Normal Equations

theta = pinv(X' * X) * X' * y;   % closed-form solution: theta = (X'X)^(-1) X'y
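
Unlike gradient descent, the normal equation needs no feature scaling and no iterations, so a prediction can use the raw features directly. For the assignment's example of a 1650-square-foot, 3-bedroom house:

price = [1, 1650, 3] * theta;    % prepend 1 for the intercept, then predict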