Week 1: Linear Regression
I've recently been working through the classic MOOC Machine Learning by Andrew Ng (course link: MACHINE LEARNING).
As I progress, I'm posting the key code from the assignments here for discussion and exchange.
- computeCost
J = sum((X*theta - y).^2) / (2*m);   % squared-error cost J(theta)
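A NumPy translation sketch of this cost function (the function and variable names here are my own, not the assignment's):

```python
import numpy as np

def compute_cost(X, y, theta):
    # J(theta) = 1/(2m) * sum((X*theta - y)^2)
    m = len(y)
    residual = X @ theta - y
    return residual @ residual / (2 * m)

# Toy data: y = 1 + 2x fits exactly, so the cost at theta = [1, 2] is 0.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # first column is the intercept
y = np.array([3.0, 5.0, 7.0])
print(compute_cost(X, y, np.array([1.0, 2.0])))  # 0.0
print(compute_cost(X, y, np.array([0.0, 0.0])))  # (9+25+49)/6 ≈ 13.8333
```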
- gradientDescent
theta = theta - (alpha/m)*(X.'*(X*theta - y));   % simultaneous update of all theta_j
J_history(iter) = computeCost(X, y, theta);      % record cost to check convergence
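The same update loop as a NumPy sketch (the toy data and learning rate are my own choices for illustration):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    m = len(y)
    J_history = []
    for _ in range(num_iters):
        # Vectorized simultaneous update: theta -= (alpha/m) * X'(X*theta - y)
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        J_history.append(((X @ theta - y) ** 2).sum() / (2 * m))
    return theta, J_history

# Toy data generated by y = 1 + 2x, so theta should approach [1, 2].
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])
theta, J_history = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
print(theta)  # approaches [1, 2]
```

If the cost in `J_history` ever increases, the learning rate `alpha` is too large.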
- featureNormalize
mu = mean(X, 1);      % column-wise mean of each feature
sigma = std(X);       % column-wise (sample) standard deviation
% replicate mu and sigma across all rows, then center and scale each feature
X_norm = (X - ones(size(X, 1), 1) * mu) ./ (ones(size(X, 1), 1) * sigma);
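The `ones(...) * mu` outer product replicates the row vectors across rows; NumPy's broadcasting does this implicitly, so a translation sketch (my own names; `ddof=1` matches Octave's sample standard deviation) is shorter:

```python
import numpy as np

def feature_normalize(X):
    # Center each column at zero mean and scale to unit standard deviation.
    mu = X.mean(axis=0)
    sigma = X.std(axis=0, ddof=1)  # ddof=1 matches Octave's std (sample std)
    return (X - mu) / sigma, mu, sigma  # broadcasting replaces the ones(...) trick

X = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 4.0]])
X_norm, mu, sigma = feature_normalize(X)
print(X_norm.mean(axis=0))            # ~[0, 0]
print(X_norm.std(axis=0, ddof=1))     # [1, 1]
```

Keep `mu` and `sigma`: new inputs must be normalized with the training-set statistics before prediction.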
- computeCostMulti
J = (X*theta - y).' * (X*theta - y) / (2*m);   % vectorized quadratic form, equal to the sum-of-squares version
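This quadratic form is algebraically identical to the elementwise `sum((X*theta - y).^2)` in computeCost; a quick NumPy check on random data (my own sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
y = rng.normal(size=5)
theta = rng.normal(size=3)
m = len(y)

r = X @ theta - y
J_sum = (r ** 2).sum() / (2 * m)   # computeCost form: sum of squared residuals
J_quad = (r @ r) / (2 * m)         # computeCostMulti form: r' * r
print(np.isclose(J_sum, J_quad))   # True
```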
- gradientDescentMulti
theta = theta - (alpha/m)*(X.'*(X*theta - y));   % identical update: vectorization covers any number of features
J_history(iter) = computeCostMulti(X, y, theta);
- normalEqn
theta = pinv(X.'*X) * X.'*y;   % pinv stays well-behaved when X.'*X is singular (e.g. redundant features)
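The closed-form normal equation solves for theta in one shot, with no learning rate, iterations, or feature scaling. A NumPy sketch on the same toy data as above (names are my own):

```python
import numpy as np

def normal_eqn(X, y):
    # theta = pinv(X'X) X'y; the pseudoinverse tolerates a singular X'X
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Toy data generated by y = 1 + 2x
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])
print(normal_eqn(X, y))  # ≈ [1, 2], the exact fit
```

The trade-off versus gradient descent: computing the pseudoinverse is roughly O(n^3) in the number of features, so the normal equation suits small n while gradient descent scales to large n.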
—— Please credit this source when reposting.