ML-Stanford-Andrew Ng
Quebradawill
Following PRML and CV; hoping to exchange ideas.
Stanford ML - Lecture 1 - Linear regression with one variable
Model representation · Cost function · Cost function intuition I · Cost function intuition II · Gradient descent: start with some $\theta_0, \theta_1$; keep changing $\theta_0, \theta_1$ to reduce $J(\theta_0, \theta_1)$ … (Original, 2013-03-07 19:57:45)
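A minimal Python (numpy) sketch of that update rule, assuming the lecture's squared-error cost; the toy data and learning rate are illustrative, not from the notes:

import numpy as np

def gradient_descent(x, y, alpha=0.1, iters=1000):
    theta0, theta1 = 0.0, 0.0              # start with some theta_0, theta_1
    m = len(x)
    for _ in range(iters):
        h = theta0 + theta1 * x            # hypothesis h(x) = theta_0 + theta_1 * x
        grad0 = (h - y).sum() / m          # dJ/dtheta_0
        grad1 = ((h - y) * x).sum() / m    # dJ/dtheta_1
        theta0 -= alpha * grad0            # simultaneously update theta_0, theta_1
        theta1 -= alpha * grad1            # to reduce J(theta_0, theta_1)
    return theta0, theta1

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
print(gradient_descent(x, y))              # approaches (0, 2) for y = 2x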
Stanford ML - Lecture 11 - Large scale machine learning
1. Learning with large datasets — "It's not who has the best algorithm that wins, it's who has the most data." 2. Stochastic gradient descent vs. batch gradient descent: repeat … (Original, 2013-03-21 19:12:05)
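A hedged sketch of the stochastic update for linear regression, which steps on one example at a time instead of summing over all m examples as batch gradient descent does; the function name and defaults are mine:

import numpy as np

def sgd(X, y, alpha=0.01, epochs=5):
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(epochs):
        for i in np.random.permutation(m):   # randomly shuffle the dataset
            # gradient of the cost of the single example (x_i, y_i)
            grad = (X[i] @ theta - y[i]) * X[i]
            theta -= alpha * grad
    return theta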
Stanford ML - Lecture 9 - Clustering
1. Unsupervised learning introduction — supervised learning, training set: $\{(x^{(1)}, y^{(1)}), \ldots, (x^{(m)}, y^{(m)})\}$; unsupervised learning, training set: $\{x^{(1)}, \ldots, x^{(m)}\}$. 2. K-means algorithm: randomly select K cluster centroids; repeat: for every … (Original, 2013-03-21 09:15:08)
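A compact sketch of the two alternating K-means steps, assuming numpy and that no cluster ends up empty; a study aid, not a production implementation:

import numpy as np

def kmeans(X, K, iters=10):
    # randomly select K training examples as the initial cluster centroids
    centroids = X[np.random.choice(len(X), K, replace=False)]
    for _ in range(iters):
        # cluster assignment: for every example, find the closest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each centroid to the mean of the points assigned to it
        centroids = np.array([X[labels == k].mean(axis=0) for k in range(K)])
    return labels, centroids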
Stanford ML - Lecture 8 - Support Vector Machines
1. Optimization objective — from logistic regression to the support vector machine. 2. Large margin intuition. 3. The mathematics behind large margin classification (optional). (Original, 2013-03-17 22:50:52)
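A sketch of the SVM objective as the lecture frames it — hinge-style costs cost1/cost0 in place of logistic regression's log terms, weighted by C, plus regularization; written in numpy with illustrative names:

import numpy as np

def svm_cost(theta, X, y, C=1.0):
    z = X @ theta
    cost1 = np.maximum(0, 1 - z)              # cost when y = 1
    cost0 = np.maximum(0, 1 + z)              # cost when y = 0
    data_term = C * np.sum(y * cost1 + (1 - y) * cost0)
    reg_term = 0.5 * np.sum(theta[1:] ** 2)   # theta_0 is not regularized
    return data_term + reg_term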
Stanford ML - Lecture 10 - Dimensionality Reduction
1. Motivation I: Data compression — reduce data from 2D to 1D. 2. Motivation II: Data visualization. 3. Principal Component Analysis problem formulation — reduce from 2D to 1D: find a direction onto which to project the data so as to minimize the projection error … (Original, 2013-03-21 10:47:07)
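A minimal PCA sketch via the SVD of the covariance matrix, assuming the data has already been mean-normalized; the name U_reduce follows the course's convention, the rest is mine:

import numpy as np

def pca(X, k):
    m = X.shape[0]
    Sigma = X.T @ X / m                 # covariance matrix
    U, S, _ = np.linalg.svd(Sigma)      # [U, S, V] = svd(Sigma)
    U_reduce = U[:, :k]                 # the k directions to project onto
    Z = X @ U_reduce                    # reduce from n dimensions to k
    return Z, U_reduce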
Stanford ML - Lecture 5 - Neural Networks: Learning
1. Cost function — neural network (classification): binary classification, 1 output unit; multi-class classification (K classes), K output units. Cost function … (Original, 2013-03-16 18:08:22)
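A sketch of that cost — cross-entropy summed over all K output units plus weight regularization — assuming the network outputs H were already computed by forward propagation; the argument shapes are my convention:

import numpy as np

def nn_cost(H, Y, thetas, lam=0.0):
    # H: (m, K) outputs; Y: (m, K) one-hot labels (K = 1 for binary)
    m = Y.shape[0]
    J = -np.sum(Y * np.log(H) + (1 - Y) * np.log(1 - H)) / m
    # regularize every weight matrix, skipping the bias column
    J += lam / (2 * m) * sum(np.sum(T[:, 1:] ** 2) for T in thetas)
    return J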
Stanford ML - Lecture 6 - Advice for applying machine learning
1. Deciding what to try next — debugging a learning algorithm: suppose you have implemented regularized linear regression to predict housing prices; when you test your hypothesis on a new set of houses, … (Original, 2013-03-17 20:24:29)
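Evaluating on data the model was never trained on is the starting point for these diagnostics; a small hold-out split sketch (the function name and default fraction are mine):

import numpy as np

def split_data(X, y, test_frac=0.3, seed=0):
    # hold out a test set so generalization error can be measured
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    cut = int(len(X) * (1 - test_frac))
    train, test = idx[:cut], idx[cut:]
    return X[train], y[train], X[test], y[test]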
Stanford ML - Lecture 7 - Machine learning system design
1. Prioritizing what to work on: spam classification example — collect lots of data; develop sophisticated features based on email routing information; develop sophisticated features for the message body … (Original, 2013-03-17 22:01:22)
Stanford ML - Lecture 4 - Neural Networks: Representation
1. Non-linear hypotheses — why introduce non-linear hypotheses? High-dimensional data; non-linear hypotheses. 2. Neurons and the brain — neural network origins: algorithms that try to mimic the brain … (Original, 2013-03-13 22:33:54)
Stanford ML - Lecture 3 - Logistic regression
1. Classification. 2. Hypothesis representation — logistic regression model: $h_\theta(x) = g(\theta^T x)$, where $g(z) = \frac{1}{1 + e^{-z}}$; this function is called the sigmoid function or logistic function. Interpretation of hypothesis output: estimated probability that $y = 1$ on input $x$ … (Original, 2013-03-10 21:59:36)
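The hypothesis in a few lines of numpy; a direct transcription of the formulas above, with illustrative function names:

import numpy as np

def sigmoid(z):
    # logistic (sigmoid) function g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, x):
    # h_theta(x) = g(theta^T x): estimated probability that y = 1 on input x
    return sigmoid(theta @ x)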
Stanford ML - Lecture 2 - Linear regression with multiple variables
Multiple features — for convenience of notation, define $x_0 = 1$; the new hypothesis is $h_\theta(x) = \theta^T x$. Gradient descent for multiple variables — the new algorithm is $\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} (h_\theta(x^{(i)}) - y^{(i)}) x_j^{(i)}$. Gradient descent in practice I: Feature scaling … (Original, 2013-03-08 22:35:15)
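A vectorized numpy sketch of that algorithm, together with the feature scaling the next section recommends; the helper names and defaults are mine:

import numpy as np

def scale_features(X):
    # feature scaling / mean normalization: x_j := (x_j - mu_j) / s_j
    return (X - X.mean(axis=0)) / X.std(axis=0)

def gradient_descent(X, y, alpha=0.1, iters=500):
    m, n = X.shape
    Xb = np.hstack([np.ones((m, 1)), X])   # define x_0 = 1
    theta = np.zeros(n + 1)                # h_theta(x) = theta^T x
    for _ in range(iters):
        theta -= alpha * Xb.T @ (Xb @ theta - y) / m
    return theta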
Stanford ML Course
1. Linear regression, logistic regression, and general regression. The linear regression function (if …); the loss function, also called the error function; gradient descent. The biggest problem with gradient descent is that the solution may be a local minimum, which depends on the choice of the starting point. Steps: first assign an initial value to $\theta$ — it can be random, or $\theta$ can be the all-zeros vector; then change the value of $\theta$ so that $J(\theta)$ decreases along the gradient descent direction. Since what is obtained is a local minimum, … (Original, 2013-03-26 19:12:44)
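A generic Python sketch of exactly those steps, plus the common remedy for the local-minimum problem described above — restart from several initial points and keep the best; every name here is illustrative:

import numpy as np

def descend(grad, theta_init, alpha=0.1, iters=200):
    theta = theta_init.copy()            # initial value: random or all zeros
    for _ in range(iters):
        theta -= alpha * grad(theta)     # step along the negative gradient
    return theta

def descend_restarts(J, grad, n, restarts=10):
    # the solution depends on the starting point, so try several and
    # keep the theta that achieves the lowest J
    starts = [np.zeros(n)] + [np.random.randn(n) for _ in range(restarts - 1)]
    return min((descend(grad, t0) for t0 in starts), key=J)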