Studying Andrew Ng's Machine Learning Course
zhangjl2022
#“Machine Learning”(Andrew Ng)#Week 4_2: Model Representation (Neural Networks)
1、Model Representation 1: When we apply a neural network, how should we represent our hypothesis or model? Since neural networks mimic the neurons in the brain, how do the brain's neurons actually work? Original post, 2017-02-16.
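As a minimal sketch of the idea (in Python rather than the course's Octave): a single artificial neuron computes a weighted sum of its inputs and passes it through the sigmoid activation, a = g(θᵀx). The weights below are hypothetical, chosen only to illustrate the computation.

```python
import math

def sigmoid(z):
    """Logistic activation g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron(theta, x):
    """One artificial neuron: weighted sum of inputs (x[0] is the bias
    unit, fixed at 1) passed through the sigmoid activation."""
    z = sum(t * xi for t, xi in zip(theta, x))
    return sigmoid(z)

# Hypothetical weights: with inputs x = [1, x1, x2], this neuron's
# output is close to 1 only when 20*x1 outweighs the -10 bias term.
print(neuron([-10.0, 20.0, 0.0], [1.0, 1.0, 0.0]))  # close to 1
print(neuron([-10.0, 20.0, 0.0], [1.0, 0.0, 0.0]))  # close to 0
```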
#“Machine Learning”(Andrew Ng)#Week 4_1: Neural Networks
Neural Networks: Neural networks are a model inspired by how the brain works. They are widely used today in many applications, for example when your phone interprets and understands your voice commands. Original post, 2017-02-15.
#“Machine Learning”(Andrew Ng)#Week 2_2:Normal Equation
Specifically, the linear regression algorithm we have used so far is gradient descent: to minimize the cost function J(θ), we run an iterative algorithm that takes many steps to converge to the global minimum. In contrast, the normal equation method provides an analytical way to solve for θ, computing its optimal value directly in a single step. Original post, 2017-01-25.
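A quick sketch of the normal equation θ = (XᵀX)⁻¹Xᵀy in Python with NumPy (the course uses Octave; the tiny training set below is made up):

```python
import numpy as np

# Training set: each row of X starts with the intercept term x0 = 1.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])  # y = 2 * x1, so theta should be [0, 2]

# Normal equation: theta = (X^T X)^{-1} X^T y, solved in one step.
# np.linalg.solve is preferred over computing the inverse explicitly.
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # ≈ [0. 2.]
```

No learning rate and no iterations: the optimum comes out directly, at the cost of solving an n×n linear system.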
#“Machine Learning”(Andrew Ng)#Week 2_1:Multivariate Linear Regression
1、Multiple Features (Variables): a new and more effective form of linear regression that works with multiple variables (we now have more information available to predict an outcome). n: the number of features. x^(i): the input features of the i-th training example (x^(2), for instance, is a four-dimensional vector in that example; more generally it is an n-dimensional vector). x^(i)_j: the value of feature j in the i-th training example. Original post, 2017-01-23.
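The notation can be made concrete with a toy training set (hypothetical numbers, n = 2 features):

```python
# Training set with n = 2 features; row i holds the feature vector
# x^(i) (the intercept term x0 = 1 is omitted here for clarity).
X = [
    [2104.0, 5.0],   # x^(1)
    [1416.0, 3.0],   # x^(2)
    [1534.0, 3.0],   # x^(3)
]

m = len(X)           # number of training examples
n = len(X[0])        # number of features

# x^(2)_1: feature 1 of training example 2. The course notation is
# 1-indexed while Python lists are 0-indexed, hence the -1 offsets.
x_2_1 = X[2 - 1][1 - 1]
print(m, n, x_2_1)   # 3 2 1416.0
```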
#“Machine Learning”(Andrew Ng)#Week 1_2:Gradient Descent
Parameter Learning: Gradient Descent. (So we have our hypothesis function and we have a way of measuring how well it fits the data. Now we need to estimate the parameters in the hypothesis function; that's where gradient descent comes in.) Original post, 2017-01-21.
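A rough sketch of batch gradient descent for univariate linear regression, h(x) = θ₀ + θ₁x (Python instead of the course's Octave; the data and learning rate are hypothetical):

```python
# Toy training set: y = 2x exactly, so the optimum is theta0=0, theta1=2.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]
m = len(xs)
alpha = 0.1                    # learning rate (hypothetical choice)

theta0, theta1 = 0.0, 0.0
for _ in range(1000):
    # Simultaneous update: compute both gradients before stepping.
    grad0 = sum(theta0 + theta1 * x - y for x, y in zip(xs, ys)) / m
    grad1 = sum((theta0 + theta1 * x - y) * x for x, y in zip(xs, ys)) / m
    theta0 -= alpha * grad0
    theta1 -= alpha * grad1

print(theta0, theta1)  # theta0 approaches 0, theta1 approaches 2
```

Note the contrast with the normal equation covered in Week 2: many small steps toward the minimum rather than one analytical solve.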
#“Machine Learning”(Andrew Ng)#Week 1_2: Cost Function
Model and Cost Function: Cost Function. (We can measure the accuracy of our hypothesis function by using a cost function.) Original post, 2017-01-20.
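For linear regression the cost function is the mean squared error, J(θ₀, θ₁) = 1/(2m) · Σᵢ (h(x⁽ⁱ⁾) − y⁽ⁱ⁾)². A small sketch with made-up data:

```python
def cost(theta0, theta1, xs, ys):
    """Squared-error cost J(theta0, theta1) = 1/(2m) * sum of
    (h(x^(i)) - y^(i))^2, where h(x) = theta0 + theta1 * x."""
    m = len(xs)
    return sum((theta0 + theta1 * x - y) ** 2
               for x, y in zip(xs, ys)) / (2 * m)

xs, ys = [1.0, 2.0, 3.0], [1.0, 2.0, 3.0]
print(cost(0.0, 1.0, xs, ys))   # 0.0 -- the line y = x fits exactly
print(cost(0.0, 0.5, xs, ys))   # ≈ 0.583 -- cost grows as the fit worsens
```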
#“Machine Learning”(Andrew Ng)#Week 1_1: Introduction/Supervised/Unsupervised/Linear regression
Welcome to Machine Learning! 1、AI: artificial intelligence. 2、Many scientists think the best way to make progress on this is through learning algorithms called neural networks, which mimic how the human brain works. Original post, 2017-01-20.
#“Machine Learning”(Andrew Ng)#Week 3_3:Multiclass Classification One-vs-all
1、Multiclass Classification: One-vs-all. How can we use logistic regression to solve multiclass classification problems? Specifically, through a classification algorithm called "one-vs-all". What is a multiclass classification problem? Original post, 2017-02-02.
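The prediction step of one-vs-all can be sketched as follows: train one binary logistic classifier per class, then pick the class whose classifier is most confident. The parameter vectors below are hypothetical stand-ins for trained models.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def h(theta, x):
    """Logistic hypothesis h_theta(x) = g(theta^T x); x[0] is the bias unit."""
    return sigmoid(sum(t * xi for t, xi in zip(theta, x)))

def predict_one_vs_all(thetas, x):
    """One classifier per class: predict argmax_k of h_{theta_k}(x)."""
    scores = [h(theta, x) for theta in thetas]
    return max(range(len(scores)), key=lambda k: scores[k])

# Hypothetical trained parameters for 3 classes over x = [1, x1, x2]:
thetas = [
    [-1.0,  2.0, -1.0],   # classifier for class 0
    [-1.0, -1.0,  2.0],   # classifier for class 1
    [ 1.0, -1.0, -1.0],   # classifier for class 2
]
print(predict_one_vs_all(thetas, [1.0, 3.0, 0.0]))  # class 0 wins here
```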
#“Machine Learning”(Andrew Ng)#Week 3_2:Logistic Regression Model
1、Cost Function: The question I want to discuss is, given this training set, how do we choose, or fit, the parameters θ? Original post, 2017-02-01.
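The logistic regression cost that answers this question is the cross-entropy loss, J(θ) = −1/m · Σᵢ [y⁽ⁱ⁾ log h(x⁽ⁱ⁾) + (1 − y⁽ⁱ⁾) log(1 − h(x⁽ⁱ⁾))]. A sketch on a made-up toy set:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost(theta, X, y):
    """Logistic regression cost:
    J(theta) = -1/m * sum_i [y_i*log(h_i) + (1-y_i)*log(1-h_i)]."""
    m = len(X)
    total = 0.0
    for xi, yi in zip(X, y):
        h = sigmoid(sum(t * xj for t, xj in zip(theta, xi)))
        total += yi * math.log(h) + (1 - yi) * math.log(1 - h)
    return -total / m

# Toy set: x = [1, x1]; the labels separate cleanly around x1 = 0,
# so a steeper decision function incurs lower cost.
X = [[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]]
y = [0, 0, 1, 1]
print(cost([0.0, 5.0], X, y))   # small: confident, correct predictions
print(cost([0.0, 0.5], X, y))   # larger: hesitant predictions near 0.5
```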
#“Machine Learning”(Andrew Ng)#Week 3_1:Classification and Representation
1、Classification: To attempt classification, one method is to use linear regression and map all predictions greater than 0.5 to 1 and all less than 0.5 to 0. However, this method doesn't work well. Original post, 2017-02-01.
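One way to see why the thresholded linear fit is fragile (a sketch with made-up numbers, echoing the course's intuition): a single extreme positive example can tilt the fitted line enough to flip a previously correct prediction.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit in one variable; returns (intercept, slope)."""
    m = len(xs)
    mean_x, mean_y = sum(xs) / m, sum(ys) / m
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return mean_y - slope * mean_x, slope

xs = [1.0, 2.0, 3.0, 4.0]
ys = [0.0, 0.0, 1.0, 1.0]          # binary labels: 1 vs 0
b0, b1 = fit_line(xs, ys)
pred_before = b0 + b1 * 3.0         # comfortably above the 0.5 threshold

# Add one extreme positive example far to the right; the line flattens
# and the same point x = 3 can fall below 0.5, a misclassification.
b0, b1 = fit_line(xs + [20.0], ys + [1.0])
pred_after = b0 + b1 * 3.0
print(pred_before, pred_after)
```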
#“Machine Learning”(Andrew Ng)#Week 2_2:Octave/Matlab Tutorial
Basic Operations. Original post, 2017-01-30.
#“Machine Learning”(Andrew Ng)#Week 1_3:Linear Algebra Review
Linear Algebra Review: 1、Matrices and Vectors: notation / dimensions / referring to specific elements / 1-indexing vs. 0-indexing. 2、Addition and Scalar Multiplication: matrix addition and subtraction (element-wise) / the same-dimensions rule / scalar multiplication of a matrix. 3、Matrix-Vector Multiplication: multiplying a matrix by a vector / the dimension-matching rule. Original post, 2017-01-21.
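The matrix-vector product and its dimension-matching rule can be sketched in a few lines of Python (the 3×2 example below is made up): an m×n matrix times an n-vector yields an m-vector, entry i being the dot product of row i with the vector.

```python
def mat_vec(A, x):
    """Multiply an m-by-n matrix A by an n-vector x:
    result[i] = sum_j A[i][j] * x[j].
    The inner dimension of A must match the length of x."""
    n = len(x)
    assert all(len(row) == n for row in A), "dimension mismatch"
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

A = [[1.0, 3.0],
     [4.0, 0.0],
     [2.0, 1.0]]          # a 3x2 matrix
x = [1.0, 5.0]            # a 2-vector
print(mat_vec(A, x))      # [16.0, 4.0, 7.0], a 3-vector
```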
#“Machine Learning”(Andrew Ng)#Week 4_3:Examples and Intuition I
1、Examples and Intuition 1: In this example I have drawn only two positive examples and two negative examples; you can think of it as a simplified version of a more complex learning problem in which we might have a cluster of positive examples in the upper right and a cluster of negative examples, drawn as circles, in the lower left, and we want to learn a non-linear decision boundary to separate them. Original post, 2017-02-18.
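A concrete instance of this intuition is the course's logical-AND neuron: with hand-picked weights −30, 20, 20, a single sigmoid unit outputs near 1 only when both binary inputs are 1, one building block toward non-linear decision boundaries.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def and_neuron(x1, x2):
    """Single neuron computing logical AND:
    h(x1, x2) = g(-30 + 20*x1 + 20*x2). The sum exceeds 0 only when
    both inputs are 1, so only then does the output saturate near 1."""
    return sigmoid(-30.0 + 20.0 * x1 + 20.0 * x2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(and_neuron(x1, x2)))
# Rounded outputs reproduce the AND truth table: 0, 0, 0, 1.
```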
#“Machine Learning”(Andrew Ng)#Week 3_4:Solving the Problem of Overfitting
1、The Problem of Overfitting: What is overfitting? We illustrate it with a set of figures. Underfitting: another term for this problem is high bias. The two descriptions are roughly equivalent; they mean the model simply does not fit the training data well. If we fit a straight line to the training data, the algorithm has a strong preconception, i.e. a very large bias. Original post, 2017-02-02.
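The course's remedy for overfitting is regularization: add a penalty λ·Σⱼ≥₁θⱼ² to the cost so that large parameter values become expensive. A minimal sketch for linear regression (toy data is made up; by convention the intercept θ₀ is not penalized):

```python
def regularized_cost(theta, X, y, lam):
    """Linear regression cost with L2 regularization:
    J = 1/(2m) * [sum_i (h(x^(i)) - y^(i))^2 + lambda * sum_{j>=1} theta_j^2].
    theta[0], the intercept, is conventionally left unpenalized."""
    m = len(X)
    sq_err = sum(
        (sum(t * xj for t, xj in zip(theta, xi)) - yi) ** 2
        for xi, yi in zip(X, y)
    )
    penalty = lam * sum(t ** 2 for t in theta[1:])
    return (sq_err + penalty) / (2 * m)

X = [[1.0, 1.0], [1.0, 2.0]]     # rows start with the intercept term x0 = 1
y = [2.0, 4.0]
theta = [0.0, 2.0]               # fits this data exactly
print(regularized_cost(theta, X, y, 0.0))   # 0.0: no penalty, perfect fit
print(regularized_cost(theta, X, y, 1.0))   # 1.0: the penalty charges theta1^2
```

Larger λ pushes the optimizer toward smaller weights, trading a little training-set fit for a smoother, less overfit hypothesis.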