Machine Learning
今天你DEBUG了吗
[Machine Learning] Linear Regression and Logistic Regression (with code)
1. Differences and connections between linear regression and logistic regression: (1) both are generalized linear models; (2) the classic linear model's optimization objective is least squares, while logistic regression's is the likelihood function; (3) linear regression predicts over the entire real line with uniform sensitivity, whereas classification requires outputs in [0, 1]. Logistic regression narrows the prediction range, constraining predictions to [0, 1], so for this class of problems it is more robust than linear regression; put another way, a plain linear model cannot realize the sigmoid's nonlinear form, and the sigmoid handles 0/1 classification easily. 2. Code: building a linear regression model… Original · 2021-05-09
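The least-squares objective in point (2) can be sketched numerically. The synthetic data and variable names below are illustrative assumptions, not the post's own code:

```python
import numpy as np

# Fit y = w*x + b by ordinary least squares on a small synthetic dataset
# (true slope 3, true intercept 2, plus a little Gaussian noise).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 3.0 * x + 2.0 + rng.normal(0.0, 0.1, size=x.shape)

# Design matrix [x, 1]; lstsq minimizes the squared-error objective.
A = np.column_stack([x, np.ones_like(x)])
w, b = np.linalg.lstsq(A, y, rcond=None)[0]
print(w, b)  # close to 3 and 2
```

Logistic regression swaps this squared-error objective for the log-likelihood of sigmoid(w*x + b), which is what keeps its predictions inside (0, 1).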
[Machine Learning] 16 Application Example: Photo OCR
Contents: 16.1 Problem Description and Pipeline · 16.2 Sliding Windows · 16.3 Getting Lots of Data and Artificial Data. What a photo OCR application does is recognize the text in a given image… Original · 2020-07-22
[Machine Learning] 15 Large Scale Machine Learning
Contents: 15.1 Learning With Large Datasets · 15.2 Stochastic Gradient Descent · 15.3 Mini-Batch Gradient Descent · 15.4 Stochastic Gradient Descent Convergence · 15.5 Online Learning · 15.6 Map Reduce and Dat… Original · 2020-07-22
[Machine Learning] 14 Recommender Systems
Contents: 14.1 Problem Formulation · 14.2 Content Based Recommendations · 14.3 Collaborative Filtering · 14.4 Vectorization: Low Rank Matrix Factorization · 14.5 Implementational Detail: Mean Normalization. Original · 2020-07-21
[Machine Learning] 13 Anomaly Detection
Contents: 13.1 Problem Motivation · 13.2 Gaussian Distribution · 13.3 Algorithm · 13.4 Developing and Evaluating an Anomaly Detection System · 13.5 Anomaly Detection vs. Supervised Learning · 13.6 Choosing What Features to Use · 13.7 Multivariate… Original · 2020-07-21
[Machine Learning] 12 Dimensionality Reduction
Contents: 12.1 Motivation I: Data Compression. The second kind of unsupervised learning problem is called dimensionality reduction. It enables data compression, which not only lets us use less memory or disk space but also speeds up our learning algorithms. First, though, let us discuss what dimensionality reduction is. As a vivid example, suppose we have collected a dataset with many, many features… Original · 2020-07-20
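The compression idea can be sketched with PCA: 2-D points that lie near a line are projected down to 1-D with almost no information lost. The data, seed, and names below are illustrative assumptions, not the post's code:

```python
import numpy as np

# 100 points lying near the line y = 2x, with a little noise.
rng = np.random.default_rng(1)
t = rng.normal(size=100)
X = np.column_stack([t, 2.0 * t]) + rng.normal(0.0, 0.01, size=(100, 2))

Xc = X - X.mean(axis=0)                       # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
z = Xc @ Vt[0]                                # 1-D compressed representation
X_rec = np.outer(z, Vt[0]) + X.mean(axis=0)   # reconstruct back in 2-D

err = np.mean(np.sum((X - X_rec) ** 2, axis=1))
print(err)  # tiny reconstruction error: the 1-D code keeps almost everything
```

Halving the storage here costs only the small perpendicular noise, which is exactly the data-compression motivation described above.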
[Machine Learning] 11 Clustering
Contents: 11.1 Unsupervised Learning Introduction · 11.2 K-Means Algorithm · 11.3 Optimization Objective · 11.4 Random Initialization · 11.5 Choosing the Number of Clusters · Clustering references. In typical supervised learning, the training set is labeled… Original · 2020-07-19
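The K-means algorithm named in the contents alternates an assignment step and an update step. A minimal sketch on two well-separated blobs; the data, seed, and deterministic initialization are illustrative (proper random initialization is what section 11.4 covers):

```python
import numpy as np

# Two obvious blobs: one near (0, 0), one near (5, 5).
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(5.0, 0.3, (50, 2))])

centroids = np.array([X[0], X[50]])  # one starting point per blob (for determinism)
for _ in range(10):
    # Assignment step: each point joins its nearest centroid.
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # Update step: each centroid moves to the mean of its assigned points.
    centroids = np.array([X[labels == k].mean(axis=0) for k in range(2)])

print(centroids.round(1))  # close to [0, 0] and [5, 5]
```

Both steps monotonically decrease the distortion objective of section 11.3, which is why the loop converges.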
[Machine Learning] 10 Support Vector Machines
Contents: 10.1 Optimization Objective. In supervised learning, many learning algorithms perform very similarly, so what often matters is not whether you choose algorithm A or algorithm B, but how you apply them: performance on large amounts of data usually depends on your skill, for example in which features you design for the learning algorithm and how you choose the regularization parameter… Original · 2020-07-18
[Machine Learning] 9 Machine Learning System Design
Contents: 9.1 Prioritizing What to Work On · 9.2 Error Analysis · 9.3 Error Metrics for Skewed Classes · 9.4 Trading Off Precision and Recall · 9.5 Data For Machine Learning. Original · 2020-07-15
[Machine Learning] 8 Advice for Applying Machine Learning
Contents: 8.1 Introduction · 8.2 Evaluating a Hypothesis · 8.3 Model Selection and Train/Validation/Test Sets · 8.4 Diagnosing Bias vs. Variance · 8.5 Regularization and Bias/Variance · 8.6 Learning Curves · 8.7 Summary. Original · 2020-07-14
[Machine Learning] 7 Neural Networks: Learning
7.1 Cost Function. Suppose a neural network has m training examples, each consisting of an input x and an output signal y; L denotes the number of layers, s_l the number of neurons in layer l, and s_L the number of units in the final (output) layer. Neural-network classification falls into two cases: binary classification, where y = 0 or 1 indicates the class, and K-class classification, where s_L =… Original · 2020-07-12
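With that notation, the cost summed over m examples and the s_L output units can be sketched as a cross-entropy; the helper name and the tiny example values below are illustrative assumptions, not the post's code:

```python
import numpy as np

def nn_cost(h, y):
    """Unregularized neural-network cost: cross-entropy averaged over
    m examples, summed over all output units (h, y are m x K arrays)."""
    m = y.shape[0]
    return -np.sum(y * np.log(h) + (1 - y) * np.log(1 - h)) / m

y = np.array([[1, 0], [0, 1]])            # 2 examples, K = 2 classes (one-hot)
h = np.array([[0.9, 0.1], [0.2, 0.8]])    # network outputs in (0, 1)
print(nn_cost(h, y))                      # small, since both predictions are good
```

The regularized version adds (lambda / 2m) times the sum of squared weights, excluding the bias terms.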
[Machine Learning] 6 Neural Networks: Representation
Contents: 6.1 Non-linear Hypotheses · 6.2 Neurons and the Brain. Linear regression and logistic regression share a drawback: when there are too many features, the computational load becomes very heavy. Suppose we want to train a model to recognize visual objects (for example, to recognize whether a picture shows… Original · 2020-07-11
[Machine Learning] 5 Regularization
Contents: 5.1 The Problem of Overfitting · 5.2 Cost Function · 5.3 Regularized Linear Regression · 5.4 Regularized Logistic Regression. Case 1: the first model is linear and underfits, unable to fit the training set well; the… Original · 2020-07-08
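The regularized linear regression of section 5.3 has a closed form in which a larger penalty shrinks the weights, countering overfitting. A sketch with illustrative data and names (not the post's code):

```python
import numpy as np

# 30 examples, 5 features; only the first feature actually matters.
rng = np.random.default_rng(3)
X = rng.normal(size=(30, 5))
y = X @ np.array([4.0, 0.0, 0.0, 0.0, 0.0]) + rng.normal(0.0, 0.1, 30)

def ridge(X, y, lam):
    """Closed-form L2-regularized least squares: (X^T X + lam*I)^-1 X^T y."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

w_small = ridge(X, y, 0.01)
w_large = ridge(X, y, 100.0)
print(np.linalg.norm(w_small), np.linalg.norm(w_large))  # larger lambda => smaller weights
```

Too small a lambda leaves the overfitting of case 3 untouched; too large a lambda drives the weights toward zero and recreates the underfitting of case 1.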
[Machine Learning] 4 Logistic Regression
Contents: 4.1 Classification · 4.2 Hypothesis Representation · 4.3 Decision Boundary · 4.4 Cost Function · 4.5 Advanced Optimization. In classification problems, the variable y to be predicted is discrete; this kind of learning algorithm is called logistic regression… Original · 2020-07-08
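The hypothesis of section 4.2 passes a linear score through the sigmoid, which maps any real number into (0, 1); the decision boundary of section 4.3 is where that output crosses 0.5. A minimal sketch:

```python
import numpy as np

def sigmoid(z):
    """Map a real-valued score to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))         # 0.5: exactly on the decision boundary
print(sigmoid(6.0))         # saturates toward 1 for large positive scores
print(sigmoid(-6.0))        # saturates toward 0 for large negative scores
```

Predicting y = 1 whenever sigmoid(z) >= 0.5 is the same as predicting y = 1 whenever the linear score z >= 0.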
[Machine Learning] 3 Linear Regression with Multiple Variables
Contents: 3.1 Multiple Features · 3.2 Gradient Descent for Multiple Variables · 3.2.1 Gradient Descent in Practice I: Feature Scaling · 3.2.2 Gradient Descent in Practice II: Learning Rate · 3.… Original · 2020-07-07
[Machine Learning] 2 Linear Regression with One Variable
Contents: 2.1 Case: Housing Prices · 2.2 Cost Function · 2.3 Gradient Descent · 2.3.1 Case: Cost Function of Gradient Descent · 2.3.2 Gradient Descent for Linear Regression. Original · 2020-07-06
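Sections 2.2 and 2.3 combine into a short loop: repeatedly step the parameters of h(x) = w*x + b opposite the gradient of the squared-error cost. The data and learning rate below are illustrative assumptions:

```python
# Single-variable linear regression by batch gradient descent.
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]          # noiseless data: exactly y = 2x + 1
w, b, alpha, m = 0.0, 0.0, 0.05, len(x)

for _ in range(5000):
    # Partial derivatives of J(w, b) = (1/2m) * sum((w*x + b - y)^2).
    dw = sum((w * xi + b - yi) * xi for xi, yi in zip(x, y)) / m
    db = sum((w * xi + b - yi) for xi, yi in zip(x, y)) / m
    # Simultaneous update, as the gradient-descent section requires.
    w, b = w - alpha * dw, b - alpha * db

print(w, b)  # converges near w = 2, b = 1
```

Too large an alpha makes this loop diverge; too small an alpha makes it converge slowly, which is the trade-off the learning-rate discussion in chapter 3 addresses.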
[Machine Learning] 1 Overview
Contents: 1.1 Learning Map · 1.2 Supervised Learning · 1.2.1 Case 1: Housing Price Prediction · 1.2.2 Case 2: Breast Cancer (malignant, benign) · 1.3 Unsupervised Learning · 1.3.1 Case: Cocktail Party Problem Algorithm · 1.4 Semi-supervised Learning · 1.5 Trans… Original · 2020-07-06