Machine Learning Week 3 Quiz: Regularization

Original post · 2015-11-17 20:50:28

Regularization

5 questions

1. 

You are training a classification model with logistic regression. Which of the following statements are true? Check all that apply.

Adding many new features to the model helps prevent overfitting on the training set.

Introducing regularization to the model always results in equal or better performance on the training set.

Adding a new feature to the model always results in equal or better performance on the training set.

Introducing regularization to the model always results in equal or better performance on examples not in the training set.
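
For reference, the regularized cost function for logistic regression from the course lectures (not part of the quiz text itself) is:

\[
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right] + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^{2}
\]

Because the penalty term pulls θ₁, …, θₙ toward zero, minimizing J(θ) is no longer the same as minimizing the training error alone; that trade-off is the key fact behind the statements above about performance on the training set.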

2. 

Suppose you ran logistic regression twice, once with λ=0, and once with λ=1. One of the times, you got parameters θ = [23.4, 37.9], and the other time you got θ = [1.03, 0.28]. However, you forgot which value of λ corresponds to which value of θ. Which one do you think corresponds to λ=1?

θ = [1.03, 0.28]

θ = [23.4, 37.9]
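
A minimal sketch of the idea behind this question, assuming scikit-learn (not part of the course material; its C parameter is roughly 1/λ): the regularized fit is the one with the smaller parameters.

```python
# Minimal sketch (assumes scikit-learn; its C parameter is roughly 1/lambda):
# regularization penalizes large parameters, so the theta learned with
# lambda = 1 is the smaller one.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (2.0 * X[:, 0] + 3.0 * X[:, 1] > 0).astype(int)  # linearly separable labels

for label, C in (("lambda ~ 0", 1e6), ("lambda = 1", 1.0)):
    clf = LogisticRegression(C=C, max_iter=10000).fit(X, y)
    print(f"{label:11s} theta = {np.round(clf.coef_.ravel(), 2)}")
# The nearly unregularized fit drives theta to large values, analogous to
# [23.4, 37.9]; the lambda = 1 fit keeps theta small, analogous to [1.03, 0.28].
```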

3. 

Which of the following statements about regularization are true? Check all that apply.

Because logistic regression outputs values 0 ≤ hθ(x) ≤ 1, its range of output values can only be "shrunk" slightly by regularization anyway, so regularization is generally not helpful for it.

Using a very large value of λ cannot hurt the performance of your hypothesis; the only reason we do not set λ to be too large is to avoid numerical problems.

Using too large a value of λ can cause your hypothesis to overfit the data; this can be avoided by reducing λ.

Consider a classification problem. Adding regularization may cause your classifier to incorrectly classify some training examples (which it had correctly classified when not using regularization, i.e. when λ=0).
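
To see the effect of a very large λ concretely, here is a small sketch (again assuming scikit-learn, with C roughly 1/λ): an enormous λ drives the parameters toward zero, so the hypothesis underfits and starts misclassifying training examples it handled easily without regularization.

```python
# Minimal sketch (assumes scikit-learn; C is roughly 1/lambda): a huge lambda
# shrinks the parameters toward zero, so the classifier underfits and
# misclassifies training examples that the unregularized fit gets right.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] - 0.5 * X[:, 1] > 0).astype(int)

for label, C in (("lambda ~ 0 ", 1e6), ("huge lambda", 1e-6)):
    clf = LogisticRegression(C=C, max_iter=10000).fit(X, y)
    print(f"{label} training accuracy = {clf.score(X, y):.2f} "
          f"theta = {np.round(clf.coef_.ravel(), 3)}")
# With C = 1e-6 (lambda enormous) theta is pushed toward zero and training
# accuracy collapses toward chance level -- underfitting, not overfitting.
```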

4. 

In which one of the following figures do you think the hypothesis has overfit the training set?

(Four figures offered as answer choices; images not reproduced here.)

5. 

In which one of the following figures do you think the hypothesis has underfit the training set?

(Four figures offered as answer choices; images not reproduced here.)
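
Although the figures themselves are missing, the pattern they illustrate is easy to reproduce. The sketch below, a toy setup I am assuming rather than the quiz's own data, fits polynomials of different degrees with NumPy: a very flexible hypothesis drives training error toward zero (overfitting), while a very rigid one leaves large training error (underfitting).

```python
# Minimal sketch (toy data assumed, not from the quiz): polynomial fits of
# increasing degree illustrate underfitting vs. overfitting on a training set.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 12)
y = np.sin(2.0 * np.pi * x) + rng.normal(scale=0.2, size=x.shape)

for degree in (1, 3, 9):
    coeffs = np.polyfit(x, y, degree)          # least-squares polynomial fit
    mse = np.mean((y - np.polyval(coeffs, x)) ** 2)
    print(f"degree = {degree}  training MSE = {mse:.4f}")
# Degree 1 underfits (large training error, nearly straight-line hypothesis),
# degree 3 follows the underlying trend, and degree 9 chases the noise and
# overfits (training error near zero).
```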

