Andrew Ng Machine Learning, Week 3
Preface
NetEase Cloud Classroom (bilingual subtitles, smooth playback): https://study.163.com/course/courseMain.htm?courseId=1004570029
Coursera: https://www.coursera.org/learn/machine-learning
I am a beginner: I first watch the lectures on NetEase Cloud Classroom, then do the assignments on Coursera. I started this blog to keep notes; all images quoted in these posts are screenshots from the course.
Logistic Regression
1. Classification: Model Description
Tips: simplify the possible outcomes of the problem into discrete labels 0, 1, 2, 3, …, n.
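The hypothesis used for classification can be sketched as follows (a minimal NumPy sketch; the function names `sigmoid` and `hypothesis` are mine, not from the course code):

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z)); maps any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, x):
    """h_theta(x) = g(theta^T x), read as the estimated P(y = 1 | x; theta)."""
    return sigmoid(np.dot(theta, x))
```

Because the output lies strictly between 0 and 1, it can be thresholded at 0.5 to produce a 0/1 label.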
2. Decision Boundary
Tips: the line (or surface) that separates two or more values of y; it determines which y value each region of the input space maps to.
3. Cost Function
Tips: the point of this function is to assign a cost to each value of the hypothesis. With hθ(x) on the x-axis and the cost on the y-axis: when y = 1, the cost is 0 at hθ(x) = 1, meaning the prediction is exactly what we want; but as hθ(x) → 0 while y = 1, the cost grows without bound, because the model assigned near-zero probability to the true label. An unbounded cost makes J(θ) very large, so such hypotheses are ruled out. The y = 0 case mirrors this.
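The per-example cost described above can be written down directly (a sketch; `cost` is my own name for it):

```python
import numpy as np

def cost(h_val, y):
    """Per-example logistic cost: -log(h) when y = 1, -log(1 - h) when y = 0.
    Near-certain wrong predictions (h -> 0 with y = 1) blow up toward infinity."""
    return -y * np.log(h_val) - (1 - y) * np.log(1 - h_val)
```

For example, a confident correct prediction costs almost nothing, while `cost(1e-9, 1)` is enormous, which is exactly the penalty behavior the lecture plots show.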
4. Gradient Descent
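One batch gradient-descent update for logistic regression can be sketched like this (my own helper names; X is the m×n design matrix with a leading column of ones):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(theta, X, y, alpha):
    """One batch update: theta := theta - (alpha/m) * X^T (g(X theta) - y).
    Identical in form to the linear-regression update, but h is the sigmoid."""
    m = len(y)
    h = sigmoid(X @ theta)
    return theta - (alpha / m) * (X.T @ (h - y))
```

Repeating this step until J(θ) stops decreasing gives the fitted parameters.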
5. One-vs-All (Multiclass Classification)
Tips: as with the decision boundary, pick one class as the positive class and treat all the remaining classes together as the negative class; repeat this for every class, then predict the class whose classifier gives the most confident result.
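The prediction side of one-vs-all can be sketched as follows (assuming each class already has a trained parameter vector; the names are mine):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_one_vs_all(thetas, x):
    """Run each class's binary classifier on x and return the index of the
    class whose h_theta(x) is highest, i.e. the most confident classifier."""
    scores = [sigmoid(np.dot(t, x)) for t in thetas]
    return int(np.argmax(scores))
```

Training simply fits one ordinary logistic regression per class with that class relabeled 1 and everything else relabeled 0.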
6. Overfitting
Tips: with too many parameters the hypothesis fits the current training set but does not generalize sensibly. The remedies are to selectively drop some features, or to keep all the features and regularize the parameters.
7. Cost Function (with Regularization)
Tips: regularize every parameter so that each θⱼ shrinks, or becomes nearly 0 (which effectively removes that feature). λ must be chosen carefully: if it is too large, almost all parameters are driven toward 0 and hθ(x) degenerates into nearly a flat line, i.e. underfitting.
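Assuming the linear-regression form of the regularized cost introduced here, it can be sketched as (names are mine; by convention θ₀ is not regularized):

```python
import numpy as np

def regularized_cost(theta, X, y, lam):
    """J(theta) = 1/(2m) * sum((h - y)^2) + lam/(2m) * sum(theta_j^2, j >= 1).
    The intercept theta_0 is excluded from the penalty term."""
    m = len(y)
    h = X @ theta
    return ((h - y) @ (h - y)) / (2 * m) + lam * np.sum(theta[1:] ** 2) / (2 * m)
```

A larger λ raises the cost of large parameters, which is exactly the shrinkage pressure described above.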
8. Regularized Linear Regression
1. Gradient Descent
Tips: compared with ordinary gradient descent, each update gains an extra α·(λ/m)·θⱼ shrinkage term.
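That extra term can be made concrete in a sketch (my own names; note θⱼ is first multiplied by (1 − α·λ/m) before the usual gradient step, and θ₀ is exempt):

```python
import numpy as np

def reg_gradient_step(theta, X, y, alpha, lam):
    """Regularized linear-regression update:
    theta_j := theta_j * (1 - alpha*lam/m) - (alpha/m) * sum((h - y) * x_j), j >= 1;
    theta_0 gets the unregularized update."""
    m = len(y)
    grad = (X.T @ (X @ theta - y)) / m
    reg = (lam / m) * theta
    reg[0] = 0.0  # the intercept is not shrunk
    return theta - alpha * (grad + reg)
```

With λ = 0 this reduces exactly to the ordinary update.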
2. Normal Equation
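The regularized normal equation solves for θ in closed form; a sketch (my own function name) of θ = (XᵀX + λL)⁻¹Xᵀy, where L is the identity with its (0, 0) entry zeroed so θ₀ is not regularized:

```python
import numpy as np

def reg_normal_equation(X, y, lam):
    """Closed-form regularized solution. The lam * L term also makes the
    matrix invertible even when X^T X alone is singular (e.g. m <= n)."""
    n = X.shape[1]
    L = np.eye(n)
    L[0, 0] = 0.0
    return np.linalg.solve(X.T @ X + lam * L, X.T @ y)
```

Using `np.linalg.solve` instead of an explicit inverse is the standard, numerically safer choice.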
9. Regularized Logistic Regression
1. Cost Function
Tips: compared with the ordinary cost function, there is an extra regularization term at the end.
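Putting the logistic cost and the extra penalty term together gives this sketch (names are mine; θ₀ again excluded from the penalty):

```python
import numpy as np

def reg_logistic_cost(theta, X, y, lam):
    """J(theta) = -(1/m) * sum(y*log(h) + (1-y)*log(1-h))
                  + lam/(2m) * sum(theta_j^2, j >= 1), with h = g(X theta)."""
    m = len(y)
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))
    data_term = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    return data_term + lam * np.sum(theta[1:] ** 2) / (2 * m)
```

Only the final term differs from the unregularized logistic cost.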
2. Gradient Descent
Tips: the update rule looks just like the linear-regression one, but hθ(x) is different (the sigmoid of θᵀx), so it is essentially a different algorithm.
2. Quiz
1.Question 1
Suppose that you have trained a logistic regression classifier, and it outputs on a new example x a prediction hθ(x) = 0.7. This means (check all that apply):
Answer: BD
2.Question 2
Suppose you have the following training set, and fit a logistic regression classifier hθ(x)=g(θ0+θ1x1+θ2x2). Which of the following are true? Check all that apply.
Answer: AB (the more parameters, the better hθ can fit the training data)
3.Question 3
Answer: AD
4.Question 4
Answer: AC (for option D: a convex function already guarantees the global minimum can be found; the point of the advanced optimization algorithms is that no learning rate has to be chosen, and they are usually faster)
5.Question 5
Suppose you train a logistic classifier hθ(x)=g(θ0+θ1x1+θ2x2). Suppose θ0=−6, θ1=1, θ2=0. Which of the following figures represents the decision boundary found by your classifier?
Answer: B (−6 + x1 = 0, so the boundary is the line x1 = 6)
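The reasoning for this answer can be checked with a tiny sketch (my own function; g(z) ≥ 0.5 exactly when z ≥ 0, so only the sign of θᵀx matters):

```python
def predict(x1, x2):
    """h(x) = g(-6 + 1*x1 + 0*x2): predict y = 1 exactly when -6 + x1 >= 0,
    i.e. x1 >= 6. x2 has no effect, so the boundary is the vertical line x1 = 6."""
    return 1 if -6 + 1 * x1 + 0 * x2 >= 0 else 0
```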
6.Question 6
You are training a classification model with logistic regression. Which of the following statements are true? Check all that apply.
Answer: C
Tips: for A, adding a feature makes the model fit the training set better, not necessarily new examples; for B and D, regularization applied with a poorly chosen λ can hurt performance on both the training set and new examples.
7.Question 7
Suppose you ran logistic regression twice, once with λ=0, and once with λ=1. One of the times,
Answer: B
Tips: adding λ shrinks θ (too large a λ drives every parameter toward 0).
8.Question 8
Which of the following statements about regularization are true? Check all that apply.
Answer: A
Tips: for A and B, see Question 6.
9.Question 9
In which one of the following figures do you think the hypothesis has overfit the training set?
Answer: A
10.Question 10
In which one of the following figures do you think the hypothesis has underfit the training set?
Answer: A