Andrew Ng Machine Learning ex5

linearRegCostFunction

h = X * theta; % the linear-regression hypothesis; unlike logistic regression, no sigmoid is applied

theta(1, :) = 0; % identical to costFunctionReg.m from ex2 except for this line:
				 % theta may be a matrix rather than a vector (to cover multi-class settings),
				 % so we zero its first row to keep the bias terms out of the regularization.
grad = (X' * (h - y) + lambda * theta) / m;                    % regularized gradient
J = (sum((h - y) .^ 2) + lambda * sum(theta .^ 2)) / (2 * m);  % regularized squared-error cost
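
As a quick sanity check, the function can be evaluated on a tiny made-up dataset (the numbers below are illustrative, not from ex5):

% Hypothetical sanity check with made-up data:
X = [ones(3, 1), (1:3)'];   % 3 examples: a bias column plus one feature
y = [2; 4; 6];
theta = [1; 1];
[J, grad] = linearRegCostFunction(X, y, theta, 1);
% Here h - y = [0; -1; -2], so the squared-error term is 5, and the
% regularization term is theta(2)^2 = 1, giving J = (5 + 1) / (2 * 3) = 1.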

learningCurve

%We train m times, once per training-set size i, filling error_train(i) and error_val(i) to trace J as a function of m on the training set and the validation set.
for i = 1:m
  x = X(1:i, :);   % first i training examples
  y_ = y(1:i);
  theta = trainLinearReg(x, y_, lambda);
  error_train(i) = linearRegCostFunction(x, y_, theta, 0);    % lambda = 0: errors are unregularized
  error_val(i) = linearRegCostFunction(Xval, yval, theta, 0); % validation error uses the full set
end
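
Once both error vectors are filled in, the learning curve can be plotted; ex5.m already contains code along these lines (a minimal sketch):

% Sketch: training vs. validation error as the training set grows
plot(1:m, error_train, 1:m, error_val);
legend('Train', 'Cross Validation');
xlabel('Number of training examples');
ylabel('Error');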

validationCurve

%We train once per lambda to trace J as a function of lambda on the training set and the validation set.
for i = 1:length(lambda_vec)
  theta = trainLinearReg(X, y, lambda_vec(i));
  error_train(i) = linearRegCostFunction(X, y, theta, 0);     % lambda = 0: errors are unregularized
  error_val(i) = linearRegCostFunction(Xval, yval, theta, 0);
end
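
The resulting curves make it easy to read off a good lambda; a minimal sketch (lambda_vec is the grid of values defined in validationCurve.m):

% Sketch: pick the lambda with the lowest validation error
plot(lambda_vec, error_train, lambda_vec, error_val);
legend('Train', 'Cross Validation');
xlabel('lambda');
ylabel('Error');
[~, idx] = min(error_val);
best_lambda = lambda_vec(idx);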

polyFeatures

for i = 1:p
  X_poly(:, i) = X .^ i;   % the i-th column holds the i-th power of each example
end
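
In ex5.m the mapped features are then normalized and given a bias column before training; a sketch of that pipeline (featureNormalize is provided by the assignment, and p = 8 there if I recall correctly):

% Sketch: map, normalize, and add the bias column as ex5.m does
p = 8;
X_poly = polyFeatures(X, p);                     % [x, x.^2, ..., x.^p] per example
[X_poly, mu, sigma] = featureNormalize(X_poly);  % powers of x differ wildly in scale
X_poly = [ones(size(X_poly, 1), 1), X_poly];     % prepend the bias column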

These are the implementations I came up with for the programming assignments of Andrew Ng's Machine Learning course. If you have a better approach, feel free to discuss it in the comments.

This is only part of the code; the full code is available at

https://download.csdn.net/download/ti_an_di/10590380
