Homework_Week10_Coursera【Machine Learning】AndrewNg, Large Scale Machine Learning
- Question 1: Suppose you are training a logistic regression classifier using stochastic gradient descent
- Question 2: Which of the following statements about stochastic gradient descent are true? Check all that apply.
- Question 3: Which of the following statements about online learning are true? Check all that apply.
- Question 4: Assuming that you have a very large training set, which of the following algorithms do you think can be parallelized using map-reduce and splitting the training set across different machines? Check all that apply.
- Question 5: Which of the following statements about map-reduce are true? Check all that apply.
- Answers
Question 1: Suppose you are training a logistic regression classifier using stochastic gradient descent
Explanation
You are training a logistic regression classifier with stochastic gradient descent and find that the cost slowly increases over time. Which option would help?
A says this is not a problem. Wrong.
B says try a larger learning rate. Wrong.
C says try a smaller learning rate. Likely correct: a learning rate that is too large can cause the cost to keep growing.
D says try averaging the cost in the plot over a larger number of examples.
The detailed explanation (with the figure) appeared in an earlier quiz, so it is not repeated here.
Answer
C
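Option C's point can be made concrete by running stochastic gradient descent for logistic regression with a small and with an absurdly large learning rate and comparing the average cost per pass. This is a minimal sketch on toy data; the function names, data, and learning-rate values are illustrative, not from the quiz.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sgd_logistic(X, y, alpha, epochs=5, seed=0):
    """Stochastic gradient descent for logistic regression.

    Returns the average per-example cost for each pass over the data,
    so the effect of the learning rate alpha is visible.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    theta = np.zeros(n)
    avg_costs = []
    for _ in range(epochs):
        order = rng.permutation(m)          # shuffle before each pass
        costs = []
        for i in order:
            h = sigmoid(X[i] @ theta)
            hc = np.clip(h, 1e-12, 1 - 1e-12)   # avoid log(0)
            costs.append(-(y[i] * np.log(hc) + (1 - y[i]) * np.log(1 - hc)))
            theta -= alpha * (h - y[i]) * X[i]  # update on one example
        avg_costs.append(float(np.mean(costs)))
    return avg_costs

# toy, roughly separable data (illustrative)
rng = np.random.default_rng(1)
X = np.c_[np.ones(200), rng.normal(size=(200, 2))]
y = (X[:, 1] + X[:, 2] > 0).astype(float)

small = sgd_logistic(X, y, alpha=0.05)   # cost settles down
large = sgd_logistic(X, y, alpha=50.0)   # cost blows up / oscillates
```

With the small learning rate the averaged cost ends up well below its starting value; with the huge one the parameters jump around and the averaged cost stays large, which is exactly the symptom described in the question.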
Question 2: Which of the following statements about stochastic gradient descent are true? Check all that apply.
Explanation
Which statements about stochastic gradient descent are true?
A: To make sure stochastic gradient descent is converging, compute the cost function J after every iteration and verify that it is decreasing. Not necessary: because of the randomness, the cost will sometimes go up on an individual example; it only needs to decrease on the whole.
B: You can use numerical gradient computations to check whether your implementation of stochastic gradient descent has a bug. Correct.
C: If you use stochastic gradient descent to train a linear regression classifier, the cost function is guaranteed to decrease after every iteration. Clearly not guaranteed, for the same reason as A.
D: Before running stochastic gradient descent, you should randomly shuffle the training set. Correct.
Answer
BD
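Option B refers to gradient checking: compare the analytic gradient your update rule uses against a two-sided finite-difference estimate of the same single-example cost. A minimal sketch (the example values are arbitrary, not from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_one(theta, x, y):
    """Logistic cost of a single example — the quantity one SGD step descends."""
    h = np.clip(sigmoid(x @ theta), 1e-12, 1 - 1e-12)
    return -(y * np.log(h) + (1 - y) * np.log(1 - h))

def analytic_grad(theta, x, y):
    """The gradient the SGD update actually uses: (h - y) * x."""
    return (sigmoid(x @ theta) - y) * x

def numeric_grad(theta, x, y, eps=1e-5):
    """Two-sided finite differences, one coordinate at a time."""
    g = np.zeros_like(theta)
    for j in range(theta.size):
        e = np.zeros_like(theta)
        e[j] = eps
        g[j] = (cost_one(theta + e, x, y) - cost_one(theta - e, x, y)) / (2 * eps)
    return g

theta = np.array([0.3, -0.2, 0.5])
x = np.array([1.0, 2.0, -1.0])
y = 1.0
diff = np.max(np.abs(analytic_grad(theta, x, y) - numeric_grad(theta, x, y)))
```

If `diff` is tiny (on the order of `eps**2`), the analytic gradient — and hence the SGD update — is almost certainly implemented correctly.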
Question 3: Which of the following statements about online learning are true? Check all that apply.
Explanation
A: Wrong. Every learning algorithm still requires choosing a learning rate.
B: Correct. Once a new example has been used for learning, it can be discarded.
C: A disadvantage of online learning is that it needs a large amount of computer memory to store all the examples we have seen. It does not, so this is wrong.
D: In the lecture's version of online learning, we repeatedly get a training example, take one step of stochastic gradient descent using just that example, and then move on to the next. Correct.
Answer
BD
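Options B and D together describe the online-learning loop: one SGD step per incoming example, after which the example is thrown away. A sketch of that loop for logistic regression, with a generator standing in for the stream of users (all names and the toy stream are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def online_learner(stream, alpha=0.1, n_features=3):
    """Online logistic regression: consume each (x, y) exactly once."""
    theta = np.zeros(n_features)
    for x, y in stream:                  # examples arrive one at a time
        h = sigmoid(x @ theta)
        theta -= alpha * (h - y) * x     # one SGD step on this example
        # nothing is stored: the example is discarded after the update,
        # so memory use does not grow with the number of examples seen
    return theta

rng = np.random.default_rng(0)

def make_stream(n):
    """Toy stream whose true decision boundary is x1 + x2 > 0."""
    for _ in range(n):
        x = np.r_[1.0, rng.normal(size=2)]
        yield x, float(x[1] + x[2] > 0)

theta = online_learner(make_stream(2000))
```

After a few thousand examples the learned weights point in the direction of the true boundary, even though no example was ever revisited.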
Question 4: Assuming that you have a very large training set, which of the following algorithms do you think can be parallelized using map-reduce and splitting the training set across different machines? Check all that apply.
Explanation
If you have a very large training set, which algorithms can be parallelized with map-reduce by splitting the training set across different machines?
A: Online learning, where single examples (x, y) arrive one at a time. No.
B: Logistic regression trained with stochastic gradient descent. No.
C: Linear regression trained with batch gradient descent.
D: A neural network trained with batch gradient descent.
Map-reduce needs a full training set that can be split up, with the per-example computations independent of one another, so the batch methods qualify.
Answer
CD
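The batch-gradient case can be sketched directly: each "machine" computes the sum of per-example gradients over its chunk (map), and a master adds the partial sums and takes the update step (reduce). A minimal single-process simulation for linear regression; the chunking and names are illustrative, not a distributed implementation.

```python
import numpy as np

def partial_gradient(theta, X_chunk, y_chunk):
    """Map step: one machine's contribution — the SUM (not mean) of
    per-example gradients over its chunk of the training set."""
    err = X_chunk @ theta - y_chunk
    return X_chunk.T @ err

def mapreduce_gd_step(theta, chunks, alpha, m):
    """Reduce step: the master adds the partial sums, then updates."""
    total = sum(partial_gradient(theta, Xc, yc) for Xc, yc in chunks)
    return theta - alpha * total / m

rng = np.random.default_rng(0)
X = np.c_[np.ones(400), rng.normal(size=(400, 1))]
y = X @ np.array([2.0, -3.0]) + 0.01 * rng.normal(size=400)

# "split the training set across different machines" — here, 4 chunks
chunks = list(zip(np.array_split(X, 4), np.array_split(y, 4)))

theta = np.zeros(2)
for _ in range(500):
    theta = mapreduce_gd_step(theta, chunks, alpha=0.1, m=len(y))
```

Because the total gradient is just a sum over examples, the chunked computation is mathematically identical to single-machine batch gradient descent, which is exactly why batch methods parallelize this way and a strictly sequential SGD/online update does not.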
Question 5: Which of the following statements about map-reduce are true? Check all that apply.
Explanation
A: When using map-reduce with gradient descent, we use a single master machine to accumulate the results from each of the other machines in order to compute the parameter update for that iteration. Correct.
B: Also correct, because running map-reduce locally on a single multi-core machine is faster: no time is spent transferring data over the network.
C says that if you run it on n computers, … Wrong.
D: If you have only one computer with a single core, map-reduce is unlikely to help much. Correct.
Answer
ABD
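Option B's setting — the same map-reduce pattern on one multi-core machine, with the cores as the "machines" and no network transfer at all — can be sketched with a thread pool standing in for the cores. This is only an illustration of the pattern (the partial quantity and names are assumptions), not a claim about actual speedup:

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def partial_sum_sq_error(args):
    """Map step run by one 'core': reduce its chunk to a partial sum."""
    X_chunk, y_chunk, theta = args
    err = X_chunk @ theta - y_chunk
    return float(err @ err)

rng = np.random.default_rng(0)
X = np.c_[np.ones(100), rng.normal(size=(100, 1))]
y = rng.normal(size=100)
theta = np.array([0.5, 1.0])

# split the data across 4 "cores"; each returns a partial result
chunks = [(Xc, yc, theta)
          for Xc, yc in zip(np.array_split(X, 4), np.array_split(y, 4))]
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum_sq_error, chunks))  # master accumulates

serial = float(((X @ theta - y) ** 2).sum())  # single-pass reference
```

The accumulated parallel result matches the serial computation, which is the invariant map-reduce relies on; statement C fails precisely because the coordination and transfer overhead means the wall-clock speedup is less than n even though the answers agree.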