Andrew Ng Deep Learning Course 2 Notes, Week 1: Practical Aspects of Deep Learning (Setting up your ML Application)
- 1.1 Train/dev/test sets
- 1.2 Bias/Variance
- 1.3 Basic "recipe" for machine learning
- 1.4 L2 regularization
- 1.5 Why regularization reduces overfitting
- 1.6 Dropout regularization
- 1.7 Understanding dropout
- 1.8 Other regularization methods
- 1.9 Normalizing inputs
- 1.10 Vanishing/exploding gradients
- 1.11 Weight initialization for deep networks
- 1.12 Numerical approximation of gradients
- 1.13 Gradient Checking
- 1.14 Gradient Checking implementation notes
1.1 Train/dev/test sets
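From the lecture: with modest data a 60/20/20 split is typical, but at million-example scale the dev and test sets can be a much smaller fraction (e.g. 98/1/1), and the dev and test sets should come from the same distribution. A minimal sketch of such a split, assuming X and y are NumPy arrays with examples along the first axis (the function name and default fractions are illustrative, not from the notes):

```python
import numpy as np

def split_dataset(X, y, dev_frac=0.01, test_frac=0.01, seed=0):
    """Shuffle and split into train/dev/test sets.

    With very large datasets the lecture suggests far smaller dev/test
    fractions (e.g. 98/1/1) than the traditional 60/20/20 split.
    """
    m = X.shape[0]
    idx = np.random.default_rng(seed).permutation(m)
    n_dev, n_test = int(m * dev_frac), int(m * test_frac)
    dev_idx = idx[:n_dev]
    test_idx = idx[n_dev:n_dev + n_test]
    train_idx = idx[n_dev + n_test:]
    return ((X[train_idx], y[train_idx]),
            (X[dev_idx], y[dev_idx]),
            (X[test_idx], y[test_idx]))
```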
1.2 Bias/Variance
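The lecture diagnoses bias and variance by comparing training-set error with dev-set error, assuming human-level (Bayes) error is close to 0%; its canonical example:

| Train error | Dev error | Diagnosis |
| --- | --- | --- |
| 1% | 11% | high variance (overfitting) |
| 15% | 16% | high bias (underfitting) |
| 15% | 30% | high bias and high variance |
| 0.5% | 1% | low bias, low variance |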
1.3 Basic "recipe" for machine learning
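The recipe from the lecture, roughly as a decision loop:

- High bias (poor training-set performance)? Try a bigger network, training longer, or another architecture; repeat until training error is acceptable.
- Then, high variance (poor dev-set performance)? Get more data, add regularization, or try another architecture; afterwards re-check bias.
- In the deep learning era, a bigger network (for bias) and more data (for variance) can often fix one problem without worsening the other, so the classical bias/variance trade-off matters less than it once did.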
1.4 L2 regularization
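For an L-layer network, the lecture adds the Frobenius norms of the weight matrices to the cost:

$$J(W,b) = \frac{1}{m}\sum_{i=1}^{m}\mathcal{L}\big(\hat{y}^{(i)}, y^{(i)}\big) + \frac{\lambda}{2m}\sum_{l=1}^{L}\big\lVert W^{[l]}\big\rVert_F^2$$

Backprop then picks up an extra term, $dW^{[l]} = (\text{term from backprop}) + \frac{\lambda}{m}W^{[l]}$, so each gradient-descent step first shrinks the weights by a factor $\big(1 - \frac{\alpha\lambda}{m}\big)$, which is why L2 regularization is also called "weight decay":

$$W^{[l]} := \Big(1 - \frac{\alpha\lambda}{m}\Big)W^{[l]} - \alpha\,(\text{term from backprop})$$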
1.5 Why regularization reduces overfitting
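Two intuitions from the lecture: a large $\lambda$ pushes the weights toward zero, effectively pruning hidden units toward a simpler network; and with tanh activations, small weights keep each pre-activation in the nearly linear region of tanh around 0, so every layer behaves almost linearly and the network cannot fit very complicated decision boundaries:

$$\lambda \uparrow \;\Rightarrow\; \lVert W^{[l]}\rVert \downarrow \;\Rightarrow\; z^{[l]} = W^{[l]}a^{[l-1]} + b^{[l]} \text{ stays small} \;\Rightarrow\; \tanh\big(z^{[l]}\big) \approx z^{[l]}$$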
1.6 Dropout regularization
- Inverted dropout (sketched below)
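A minimal sketch of inverted dropout for layer 3, following the lecture's formulation (keep_prob = 0.8 and the layer shape are illustrative). Dividing by keep_prob keeps the expected value of a3 unchanged, so no extra scaling is needed at test time:

```python
import numpy as np

a3 = np.random.randn(50, 1)     # example activations of layer 3 (50 units, 1 example)
keep_prob = 0.8                 # probability that a unit is kept

d3 = np.random.rand(*a3.shape) < keep_prob  # boolean dropout mask for layer 3
a3 = a3 * d3                                # zero out the dropped units
a3 = a3 / keep_prob                         # "inverted" step: rescale so E[a3] is unchanged
```

At test time no mask is applied, and thanks to the rescaling no compensation is needed there either.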
1.7 Understanding dropout
- Dropout is a regularization technique that helps prevent overfitting: since any input unit can be dropped, the network cannot rely on any single feature and must spread out its weights, shrinking them much as L2 regularization does
- keep_prob can differ per layer (lower for large layers more prone to overfitting); no dropout is applied at test time
1.8 Other regularization methods
- Data augmentation
- Early stopping (a sketch follows this list)
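A minimal early-stopping sketch: training halts once dev-set error stops improving for `patience` epochs, stopping at a point where the weights still have a mid-size norm. The helpers `train_one_epoch` and `dev_error` are hypothetical, caller-supplied functions, not anything from the notes:

```python
def early_stopping(params, train_one_epoch, dev_error, max_epochs=100, patience=5):
    """Stop when dev error has not improved for `patience` epochs."""
    best_err, best_params, wait = float("inf"), params, 0
    for _ in range(max_epochs):
        params = train_one_epoch(params)   # hypothetical: one pass of gradient descent
        err = dev_error(params)            # hypothetical: current error on the dev set
        if err < best_err:
            best_err, best_params, wait = err, params, 0  # still improving
        else:
            wait += 1
            if wait >= patience:           # dev error has stopped improving
                break
    return best_params                     # roll back to the best dev-set weights
```

The lecture's caveat: early stopping couples "optimize J" and "don't overfit" into a single knob, whereas L2 regularization keeps the two concerns orthogonal.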
1.9 Normalizing inputs
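Features on very different scales give an elongated cost surface and slow gradient descent, so the lecture normalizes: subtract the mean, divide by the standard deviation, and reuse the same μ and σ on the test set so both sets go through the identical transform. A sketch, assuming X has shape (n_features, m) as in the course's convention:

```python
import numpy as np

def normalize(X_train, X_test, eps=1e-8):
    """Zero-mean, unit-variance scaling; X has shape (n_features, m)."""
    mu = X_train.mean(axis=1, keepdims=True)    # per-feature mean
    sigma = X_train.std(axis=1, keepdims=True)  # per-feature standard deviation
    X_train = (X_train - mu) / (sigma + eps)
    X_test = (X_test - mu) / (sigma + eps)      # same mu/sigma as the training set
    return X_train, X_test
```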
1.10 Vanishing/exploding gradients
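The lecture's illustration: in a deep network with linear activations and $b = 0$, the output is a product of the weight matrices, so activations (and likewise gradients) scale exponentially with depth $L$:

$$\hat{y} = W^{[L]}W^{[L-1]}\cdots W^{[1]}x$$

If each $W^{[l]} = 1.5\,I$, the activations grow like $1.5^{\,L-1}$ (explosion); if each $W^{[l]} = 0.5\,I$, they shrink like $0.5^{\,L-1}$ (vanishing).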
1.11 Weight initialization for deep networks
- scale the weights so that z = Σᵢ wᵢxᵢ stays on the order of 1: set Var(wᵢ) = 1/n (Xavier, for tanh) or 2/n (He, for ReLU), where n is the number of inputs to the layer; see the sketch below
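A sketch of the variance-scaling initializations mentioned in the lecture (the helper name is illustrative):

```python
import numpy as np

def init_layer(n_in, n_out, activation="relu", seed=0):
    """Variance-scaled weight initialization for one layer."""
    rng = np.random.default_rng(seed)
    if activation == "relu":
        scale = np.sqrt(2.0 / n_in)   # He initialization: Var(w_i) = 2/n
    else:
        scale = np.sqrt(1.0 / n_in)   # Xavier initialization (tanh): Var(w_i) = 1/n
    W = rng.standard_normal((n_out, n_in)) * scale
    b = np.zeros((n_out, 1))          # biases can safely start at zero
    return W, b
```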
1.12 Numerical approximation of gradients
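The two-sided difference $\frac{f(\theta+\varepsilon) - f(\theta-\varepsilon)}{2\varepsilon}$ has error $O(\varepsilon^2)$, versus $O(\varepsilon)$ for the one-sided version, so it is preferred for gradient checking. The lecture's example with $f(\theta) = \theta^3$ at $\theta = 1$, $\varepsilon = 0.01$:

```python
f = lambda theta: theta ** 3
theta, eps = 1.0, 0.01

two_sided = (f(theta + eps) - f(theta - eps)) / (2 * eps)  # 3.0001 (error 1e-4)
one_sided = (f(theta + eps) - f(theta)) / eps              # 3.0301 (error 3e-2)
print(two_sided, one_sided)  # true derivative: 3 * theta**2 = 3
```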
1.13 Gradient Checking
- helps find bugs in implementations of backpropagation (see the sketch below)
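A sketch of gradient checking over a flattened parameter vector θ: perturb one component at a time with the two-sided difference and compare against the analytic gradient by relative distance. The quadratic J used in the demo at the bottom is a stand-in, purely for illustration:

```python
import numpy as np

def grad_check(J, grad, theta, eps=1e-7):
    """Compare analytic grad(theta) against two-sided numerical gradients."""
    d_approx = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += eps
        minus[i] -= eps
        d_approx[i] = (J(plus) - J(minus)) / (2 * eps)
    d = grad(theta)
    diff = np.linalg.norm(d_approx - d) / (np.linalg.norm(d_approx) + np.linalg.norm(d))
    return diff  # ~1e-7 is great, ~1e-5 is suspicious, ~1e-3 almost certainly a bug

# Illustrative check on J(theta) = ||theta||^2, whose gradient is 2 * theta.
theta = np.random.randn(5)
print(grad_check(lambda t: np.sum(t ** 2), lambda t: 2 * t, theta))
```

1.14 Gradient Checking implementation notes
- use gradient checking only to debug, not during training (it is far too slow)
- if the check fails, inspect which components of θ disagree to localize the bug
- remember to include the regularization term in J when checking
- gradient checking does not work with dropout (set keep_prob = 1.0 to check, then turn dropout back on)
- run at random initialization, and perhaps again after some training, in case a bug only appears once w and b move away from their small initial values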