
machine learning blog index

This series contains selected study notes from the courses Machine Learning Foundations and Machine Learning Techniques taught by Hsuan-Tien Lin at National Taiwan University. Machine learning basics: Machine Learning Notes - Nonlinear Transformation, Machine Learning Notes - Hazard of Overfitting, Machine Learning Notes - Regularization, Machine Learning Notes - Validation...

2018-06-16 08:21:17

Reads: 25

Comments: 0

deep learning blog index

Course1 Week2-foundation of neural network Week3-one hidden layer neural network Week4-deep neural network Course2 ...

2018-06-15 22:49:04

Reads: 19

Comments: 0

Course4-week4-face recognition and neural style transfer

1 - what is face recognition? This week covers a couple of important special applications of ConvNets; we will start with face recognition and th...

2018-06-11 22:19:17

Reads: 12

Comments: 0

Course4-week3-object detection

1 - object localization In order to build up to object detection, we first learn about object localization. Image classification: the algori...

2018-06-11 22:04:20

Reads: 13

Comments: 0

Course4-week2-case studies

case studies 1 - why look at case studies? How to put together the basic building blocks, such as CONV layers, POOL layers, and FC layers, to form effectiv...

2018-06-09 11:05:39

Reads: 10

Comments: 0

Course4-week1-convolutional neural network

1 - Computer vision Computer vision problems: image recognition, object detection, style transfer. One of the challenges of computer vision...

2018-06-09 10:52:44

Reads: 62

Comments: 0

Course3 - machine learning strategy 2

1 - carrying out error analysis If the learning algorithm is not yet at the performance of a human, then manually examining the mistakes that the algori...

2018-06-08 17:56:03

Reads: 126

Comments: 0

Course3 - machine learning strategy 1

introduction to ML strategy 1 - why ML strategy? How to structure a machine learning project — that is, machine learning strategy. What is m...

2018-06-08 17:44:16

Reads: 14

Comments: 0

Course2-week3-hyperparameterTuning - BatchNormalization - Framework

hyperparameter tuning 1 - tuning process How to systematically organize the hyperparameter tuning process? Hyperparameters: learning rate α...
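The tuning process this post refers to samples hyperparameters at random rather than on a grid, and for a learning rate the usual trick is to sample on a log scale. A minimal sketch of that idea (the range [1e-4, 1] is illustrative, not from the course):

```python
import numpy as np

np.random.seed(0)

# Sample learning rates uniformly on a LOG scale between 1e-4 and 1e0:
# draw the exponent r uniformly, then set alpha = 10 ** r.
# Sampling alpha itself uniformly would waste almost all draws near 1.
r = np.random.uniform(-4, 0, size=5)
alphas = 10.0 ** r
print(alphas)
```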

2018-06-08 16:45:36

Reads: 17

Comments: 0

Course2-week2-optimization algorithm

optimization algorithms 1 - mini-batch gradient descent Vectorization allows you to efficiently compute on m examples. But if m is large then it c...
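The mini-batch idea from this post can be sketched in a few lines of NumPy: shuffle the m examples each epoch, then take gradient steps on slices of size `batch_size`. The least-squares objective below is my stand-in for illustration, not the course's cost function.

```python
import numpy as np

np.random.seed(0)
m, batch_size, lr = 1000, 64, 0.1
X = np.random.randn(m, 2)
w_true = np.array([3.0, -2.0])
y = X @ w_true + 0.01 * np.random.randn(m)

w = np.zeros(2)
for epoch in range(20):
    perm = np.random.permutation(m)               # reshuffle once per epoch
    for start in range(0, m, batch_size):
        idx = perm[start:start + batch_size]      # one mini-batch of indices
        Xb, yb = X[idx], y[idx]
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)  # MSE gradient on the batch
        w -= lr * grad

print(w)  # should land close to w_true
```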

2018-06-08 16:37:43

Reads: 15

Comments: 0

Course2-week1-setting up your ML application

setting up your ML application 1 - train/dev/test set This week we’ll learn the practical aspects of how to make your neural network work well, ra...

2018-06-08 16:20:37

Reads: 34

Comments: 0

Course1-week4-deep neural network

4.1 - deep L-layer neural network We have seen forward propagation and backward propagation in the context of a neural network with a single hidden ...
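Forward propagation through a deep L-layer network, as this post outlines, is just a loop of affine transforms and activations. A minimal sketch — the layer sizes and the ReLU-then-sigmoid choice are my illustrative assumptions:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

np.random.seed(1)
layer_dims = [4, 5, 3, 1]  # input size, two hidden layers, scalar output

# W[l] has shape (n_l, n_{l-1}); b[l] has shape (n_l, 1)
params = {}
for l in range(1, len(layer_dims)):
    params["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
    params["b" + str(l)] = np.zeros((layer_dims[l], 1))

def forward(X, params, L):
    A = X
    for l in range(1, L):                              # hidden layers: ReLU
        A = relu(params["W" + str(l)] @ A + params["b" + str(l)])
    Z = params["W" + str(L)] @ A + params["b" + str(L)]
    return sigmoid(Z)                                  # sigmoid output layer

X = np.random.randn(4, 3)  # 3 examples, stacked as columns
print(forward(X, params, L=3).shape)
```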

2018-06-08 15:58:23

Reads: 17

Comments: 0

Course1-week3-one hidden layer neural network

3.1 - neural networks overview Some new notation has been introduced: we’ll use a superscript square bracket [1] to refer to the layer of the neural network...

2018-06-08 15:48:48

Reads: 14

Comments: 0

Course1-week2-foundation of neural network

Week 2 Basics of Neural Network Programming 2.1 binary classification m training examples: (x(1), y(1)), ⋯, (x(m), y(m))...

2018-06-08 15:37:03

Reads: 13

Comments: 0

Brief notes on learning Keras

Keras: a Python-based deep learning library. Keras is a high-level neural network API, written in pure Python and running on top of the TensorFlow, Theano, and CNTK backends. Keras supports Python 2.7-3.6. 1 - Some basic concepts 1.1 - Symbolic computation K...

2018-05-06 15:43:29

Reads: 70

Comments: 0

Brief notes on lambda function usage

lambda functions: lambda defines an anonymous function; its syntax is lambda parameters: expression. Typical usage: import numpy as np sigmoid = lambda x: 1./(1.+np.exp(-x)) sigmoid(np.array([-10, 0, 10])...
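The sigmoid lambda quoted in the excerpt runs as-is once the import is in place; a self-contained version:

```python
import numpy as np

# Anonymous function: lambda parameters: expression
sigmoid = lambda x: 1. / (1. + np.exp(-x))

print(sigmoid(np.array([-10.0, 0.0, 10.0])))  # ≈ [0.0000454, 0.5, 0.9999546]
```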

2018-01-27 11:04:02

Reads: 288

Comments: 0

About plt.cm.Spectral

Understanding cmap = plt.cm.Spectral %matplotlib inline import numpy as np import matplotlib.pyplot as plt np.random.seed(1) # generate the same random numbers each run X = np.random.rand...
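The excerpt's `plt.cm.Spectral` is a colormap: it maps a scalar in [0, 1] to an RGBA color, and passing it as `cmap` colors scatter points by their class label. A runnable sketch along the excerpt's lines (the two-class labels are my illustrative data; the Agg backend is used so it runs headless):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; replaces %matplotlib inline in a script
import matplotlib.pyplot as plt

np.random.seed(1)                    # generate the same random numbers each run
X = np.random.rand(2, 100)           # 100 random 2-D points
y = (X[0] + X[1] > 1).astype(int)    # two classes split by a diagonal line

# Each point's label y selects a color from the Spectral colormap.
plt.scatter(X[0], X[1], c=y, cmap=plt.cm.Spectral)
plt.savefig("spectral_demo.png")

print(plt.cm.Spectral(0.5))  # RGBA tuple at the middle of the colormap
```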

2018-01-27 10:46:21

Reads: 1177

Comments: 0

The ID3 decision tree algorithm and its implementation

0. Information theory The channel model and the meaning of information. Information theory is the theory of the nature of information and the laws governing its transmission. Channel model: source (sender) -> channel -> sink (receiver). 1. Communication is the process of transmitting information in an environment with random interference. 2. The sink's prior uncertainty about the source: before communication, the sink cannot know the exact state of the source; 3. ...
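The prior uncertainty described above is quantified by Shannon entropy, the quantity ID3 uses to choose splits. A minimal computation (the yes/no label distribution is my example, not from the post):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H = -sum(p * log2(p)) over the empirical label distribution."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

print(entropy(["yes", "yes", "no", "no"]))  # 1.0 bit for a 50/50 split
print(entropy(["yes", "yes", "yes", "yes"]))  # 0.0: no uncertainty
```

ID3 picks, at each node, the attribute whose split yields the largest drop in this entropy (the information gain).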

2018-01-12 21:32:27

Reads: 2177

Comments: 0

Machine Learning Notes - Validation

Regularization can be used to prevent overfitting. A supervised machine learning problem can be summarized as: minimize the error while regularizing the parameters. Minimizing the error fits the model to the training data, while regularizing the parameters keeps the model from overfitting the training data. Concretely, regularization means we do not focus solely on minimizing Ein...
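The validation idea behind these notes — hold out part of the training data to estimate out-of-sample error and use that estimate to pick a model — can be sketched as follows; selecting a polynomial degree is my illustrative choice of model family, not the course's:

```python
import numpy as np

np.random.seed(0)
X = np.random.uniform(-1, 1, 60)
y = np.sin(np.pi * X) + 0.1 * np.random.randn(60)

# Hold out the last 20 points as a validation set.
X_tr, y_tr = X[:40], y[:40]
X_val, y_val = X[40:], y[40:]

def val_error(degree):
    coeffs = np.polyfit(X_tr, y_tr, degree)   # fit on the training part only
    pred = np.polyval(coeffs, X_val)
    return np.mean((pred - y_val) ** 2)       # MSE on the held-out part

errors = {d: val_error(d) for d in range(1, 10)}
best = min(errors, key=errors.get)            # model with the lowest validation error
print(best, errors[best])
```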

2018-01-03 19:49:40

Reads: 6066

Comments: 0

Machine Learning Notes - Regularization

Regularized Hypothesis Set As noted in the previous post, the greatest danger in machine learning is overfitting. Overfitting can occur when the model is too complex, the data is limited, the data is noisy, or the target function is very complex. Regularization can be viewed as one way to combat overfitting. ...

2018-01-03 19:46:23

Reads: 6091

Comments: 0
