# Coursera Machine Learning ex2


1 Logistic Regression

1.2 Implementation

1.2.1 Warm-up exercise: sigmoid function

```matlab
% sigmoid.m: g(z) = 1 / (1 + e^(-z)), applied element-wise
g = 1 ./ (1 + exp(-z));
```
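The same function can be sketched in pure Python for a quick sanity check (illustrative only, not part of the exercise code):

```python
import math

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

# Sanity checks: g(0) = 0.5, and g saturates toward 0 and 1.
print(sigmoid(0))                              # 0.5
print(sigmoid(10) > 0.99, sigmoid(-10) < 0.01)
```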

1.2.2 Cost function and gradient

```matlab
% costFunction.m: vectorized cost and gradient for logistic regression
Hx = sigmoid(X * theta);                             % m x 1 vector of hypotheses
J = 1/m * (-y' * log(Hx) - (1 - y') * log(1 - Hx));  % cross-entropy cost
grad = 1/m * ((Hx - y)' * X);                        % gradient, one entry per theta(j)
```
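To see what the vectorized Octave expressions compute, here is an unvectorized pure-Python sketch of the same cost and gradient (the toy data below is made up for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_and_grad(theta, X, y):
    """Unregularized logistic-regression cost J(theta) and its gradient.

    X: list of feature rows (first column assumed to be the intercept 1),
    y: list of 0/1 labels, theta: parameter list as wide as each row of X.
    """
    m = len(y)
    h = [sigmoid(sum(t * x for t, x in zip(theta, row))) for row in X]
    J = sum(-yi * math.log(hi) - (1 - yi) * math.log(1 - hi)
            for yi, hi in zip(y, h)) / m
    grad = [sum((hi - yi) * row[j] for hi, yi, row in zip(h, y, X)) / m
            for j in range(len(theta))]
    return J, grad

# Toy data: at theta = 0, h = 0.5 everywhere, so J = ln 2 ≈ 0.6931.
X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
y = [0, 0, 1]
J, grad = cost_and_grad([0.0, 0.0], X, y)
print(round(J, 4))  # 0.6931
```

The `J = ln 2` check at `theta = 0` is a handy way to validate any implementation, since every hypothesis is exactly 0.5 there.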

1.2.3 Learning parameters using fminunc

```matlab
%  Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
```
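`fminunc` is an Octave/MATLAB built-in. Outside Octave, plain batch gradient descent on the same cost is a crude stand-in; the sketch below uses a hypothetical fixed step size and iteration count rather than fminunc's line search and convergence tests:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gradient_descent(X, y, alpha=0.1, iters=5000):
    """Minimize the logistic cost by batch gradient descent.

    A rough substitute for fminunc: fixed step size `alpha`, fixed
    iteration budget, no convergence check.
    """
    m, n = len(X), len(X[0])
    theta = [0.0] * n
    for _ in range(iters):
        h = [sigmoid(sum(t * x for t, x in zip(theta, row))) for row in X]
        grad = [sum((hi - yi) * row[j] for hi, yi, row in zip(h, y, X)) / m
                for j in range(n)]
        theta = [t - alpha * g for t, g in zip(theta, grad)]
    return theta

# Made-up separable data: x >= 2 labeled 1, so the fitted theta should put
# the decision boundary between x = 1 and x = 2.
X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [0, 0, 1, 1]
theta = gradient_descent(X, y)
print(theta)
```

Note that on perfectly separable data the unregularized optimum is at infinity, so `theta` keeps growing with more iterations; the decision boundary still settles quickly.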

1.2.4 Evaluating logistic regression

```matlab
% predict.m: classify as 1 when h(x) >= 0.5, else 0
Hx = sigmoid(X * theta);
for iter = 1:m
    if Hx(iter) >= 0.5
        p(iter) = 1;
    else
        p(iter) = 0;
    end;
end;
% Equivalent vectorized form: p = (Hx >= 0.5);
```
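The same thresholding step, sketched in Python with hypothetical fitted parameters (chosen here so the boundary sits at x = 1.5; they are not values from the exercise):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(theta, X, threshold=0.5):
    """Label each row 1 when the predicted probability is >= threshold."""
    return [1 if sigmoid(sum(t * x for t, x in zip(theta, row))) >= threshold
            else 0 for row in X]

# Hypothetical parameters: h(x) = g(-3 + 2x), boundary at x = 1.5.
theta = [-3.0, 2.0]
X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
print(predict(theta, X))  # [0, 0, 1, 1]
```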

2 Regularized logistic regression

```matlab
% costFunctionReg.m: regularized cost and gradient
% Note: theta(1), the intercept term, is not regularized.
Hx = sigmoid(X * theta);
J = 1/m * (-y' * log(Hx) - (1 - y') * log(1 - Hx)) ...
    + lambda/(2*m) * (theta(2:end)' * theta(2:end));

grad = 1/m * ((Hx - y)' * X) + lambda/m * theta';
grad(1) = grad(1) - lambda/m * theta(1);   % undo the penalty on the intercept
```
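A pure-Python sketch of the regularized cost and gradient, again on made-up toy data, makes the intercept exclusion explicit:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def reg_cost_and_grad(theta, X, y, lam):
    """Regularized cost and gradient; theta[0] (intercept) is not penalized."""
    m = len(y)
    h = [sigmoid(sum(t * x for t, x in zip(theta, row))) for row in X]
    J = sum(-yi * math.log(hi) - (1 - yi) * math.log(1 - hi)
            for yi, hi in zip(y, h)) / m
    J += lam / (2 * m) * sum(t * t for t in theta[1:])   # skip theta[0]
    grad = [sum((hi - yi) * row[j] for hi, yi, row in zip(h, y, X)) / m
            for j in range(len(theta))]
    for j in range(1, len(theta)):   # j = 0 stays unregularized
        grad[j] += lam / m * theta[j]
    return J, grad

X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
y = [0, 0, 1]
# With lambda = 0 this reduces to the unregularized cost (ln 2 at theta = 0).
J0, _ = reg_cost_and_grad([0.0, 0.0], X, y, 0.0)
print(round(J0, 4))  # 0.6931
```

Setting `lam = 0` and comparing against the unregularized implementation is a quick regression test for the penalty term.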
