Andrew Ng's Machine Learning on Coursera (II)

Logistic Regression

1. Classification / Hypothesis Representation

In the previous class we talked about linear regression; today we move into a new field: classification.
First we have tumor sizes, and we need to predict whether each tumor is malignant or benign.
  
We fit a linear regression line and define a threshold of 0.5: if the hypothesis value is greater than 0.5 we predict malignant, otherwise we predict benign.

But what if the training set gains an extreme example, say a tumor far larger than the rest? The fitted line tilts, and the 0.5 threshold no longer separates the classes correctly.

Apparently, linear regression is not a good classifier.

So we have to do something: a new model appears!

The so-called sigmoid function and the logistic function are the same thing:

g(z) = 1 / (1 + e^(-z)), and the hypothesis is h_θ(x) = g(θ^T x).

The graph of g is an S-shaped curve with 0 < g(z) < 1; when z >= 0, g(z) >= 0.5, and when z < 0, g(z) < 0.5.
 
From the picture above, we know that given x and θ, the hypothesis value h_θ(x) is interpreted as the probability P(y = 1 | x; θ), so P(y = 0 | x; θ) + P(y = 1 | x; θ) = 1.
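The sigmoid hypothesis described above can be sketched in a few lines of NumPy (a minimal illustration, not from the course materials; the function names are mine):

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z)); output is always in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, x):
    """h_theta(x) = g(theta^T x), read as P(y = 1 | x; theta)."""
    return sigmoid(np.dot(theta, x))

# g(0) = 0.5 is exactly the decision threshold; large |z| pushes g toward 0 or 1.
print(sigmoid(0.0))    # 0.5
print(sigmoid(10.0))   # very close to 1
print(sigmoid(-10.0))  # very close to 0
```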

2. Decision Boundary

The decision boundary is the line (or surface) where h_θ(x) = 0.5, i.e. where θ^T x = 0; it separates the region where we predict y = 1 from the region where we predict y = 0.
For example, when we have h_θ(x) = g(θ0 + θ1·x1 + θ2·x2) and the parameters θ = [-3, 1, 1]^T:
predict y = 1 if -3 + x1 + x2 >= 0
predict y = 0 if -3 + x1 + x2 < 0
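The θ = [-3, 1, 1]^T example above translates directly into code (a small sketch of mine using that example's numbers):

```python
import numpy as np

theta = np.array([-3.0, 1.0, 1.0])  # [theta0, theta1, theta2] from the example

def predict(x1, x2):
    """Predict y = 1 exactly when theta^T x = -3 + x1 + x2 >= 0."""
    z = theta[0] + theta[1] * x1 + theta[2] * x2
    return 1 if z >= 0 else 0

print(predict(1, 1))  # -3 + 1 + 1 = -1 < 0  -> predicts 0
print(predict(2, 2))  # -3 + 2 + 2 =  1 >= 0 -> predicts 1
```

The boundary itself is the straight line x1 + x2 = 3; points on it get h_θ(x) = 0.5 and fall into the y = 1 side by convention.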
 
Apart from linear boundaries, we can also get non-linear boundaries by adding polynomial features. For example, h_θ(x) = g(θ0 + θ1·x1 + θ2·x2 + θ3·x1² + θ4·x2²) with θ = [-1, 0, 0, 1, 1]^T predicts y = 1 when x1² + x2² >= 1, so the decision boundary is the circle x1² + x2² = 1.

3. Cost Function

For a logistic regression model we have h_θ(x) = 1 / (1 + e^(-θ^T x)), so the cost for a single example is:

Cost(h_θ(x), y) = -log(h_θ(x))      if y = 1
Cost(h_θ(x), y) = -log(1 - h_θ(x))  if y = 0

Since y is always 0 or 1, we can rewrite the cost function into a single expression:

Cost(h_θ(x), y) = -y·log(h_θ(x)) - (1 - y)·log(1 - h_θ(x))

and the overall cost over m training examples is

J(θ) = -(1/m) · Σᵢ [ y⁽ⁱ⁾·log(h_θ(x⁽ⁱ⁾)) + (1 - y⁽ⁱ⁾)·log(1 - h_θ(x⁽ⁱ⁾)) ]
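The combined cost J(θ) is straightforward to vectorize; here is a minimal sketch (my own helper names, assuming X already carries a leading column of ones for the intercept):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    """J(theta) = -(1/m) * sum(y*log(h) + (1-y)*log(1-h))."""
    m = len(y)
    h = sigmoid(X @ theta)
    return -(1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))

X = np.array([[1.0, 0.0], [1.0, 2.0]])  # first column is the intercept term
y = np.array([0.0, 1.0])
print(cost(np.zeros(2), X, y))  # with theta = 0, h = 0.5 everywhere, so J = log(2) ≈ 0.693
```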



And we use gradient descent to minimize the cost function:

Repeat { θ_j := θ_j - α · ∂J(θ)/∂θ_j }  (simultaneously update all θ_j)

Working out the partial derivative, we have:

θ_j := θ_j - α · (1/m) · Σᵢ (h_θ(x⁽ⁱ⁾) - y⁽ⁱ⁾) · x_j⁽ⁱ⁾

Do you notice that this update rule has the same form as the one we derived in the week 2 linear regression class? The difference hides inside h_θ(x): it is now the sigmoid of θ^T x rather than θ^T x itself.
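The update rule above can be implemented in a few vectorized lines (a sketch of mine on a tiny made-up dataset; the fixed α and iteration count are arbitrary choices, not from the lecture):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.5, iters=5000):
    """Repeatedly apply theta_j := theta_j - alpha * (1/m) * sum((h - y) * x_j),
    updating every theta_j simultaneously via one vectorized step."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        h = sigmoid(X @ theta)               # only this line differs from linear regression
        theta -= alpha * (X.T @ (h - y)) / m
    return theta

# Toy data: intercept column of ones plus one feature (tumor size, say).
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
theta = gradient_descent(X, y)
preds = (sigmoid(X @ theta) >= 0.5).astype(float)
print(preds)  # matches y: [0. 0. 1. 1.]
```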

4. Advanced Optimization


Besides gradient descent, there are more sophisticated optimization algorithms such as Conjugate Gradient, BFGS, and L-BFGS. They often converge faster and do not require manually picking a learning rate α, at the cost of being more complex.
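The course demonstrates these solvers with Octave's fminunc; a rough Python analogue (my assumption, not part of the lecture) is scipy.optimize.minimize with method="BFGS", handing it a function that returns both the cost and its gradient:

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_grad(theta, X, y):
    """Return J(theta) and its gradient together, fminunc-style."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = -(1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))
    grad = (X.T @ (h - y)) / m
    return J, grad

# Small made-up, non-separable toy set (intercept column plus one feature).
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 1.0, 0.0, 1.0])

res = minimize(cost_and_grad, np.zeros(2), args=(X, y), method="BFGS", jac=True)
print(res.x)  # fitted theta; note we never had to choose a learning rate alpha
```

Setting jac=True tells the solver that the objective returns (cost, gradient) as a pair, mirroring how fminunc consumes a costFunction in the course.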



