Machine Learning - Andrew Ng
塔希提
Happy research.
【Machine Learning】【Andrew Ng】- Quiz2(Week 6)
1. You are working on a spam classification system using regularized logistic regression. "Spam" is the positive class (y = 1) and "not spam" is the negative class (y = 0). You have trained your classifier… [Original · 2017-12-31 18:24:51 · 10053 views · 2 comments]
【Machine Learning】【Andrew Ng】- Course Summary
Main topics: 1. Supervised learning: linear regression, logistic regression, neural networks, SVMs. 2. Unsupervised learning: K-means, PCA, anomaly detection. 3. Special applications/special topics: recommender sys… [Original · 2018-01-08 15:56:00 · 517 views · 0 comments]
【Machine Learning】【Andrew Ng】- notes(Week 1: Introduction)
Definition: Tom Mitchell provides a more modern definition: "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at… [Reposted · 2018-07-09 14:20:04 · 213 views · 0 comments]
【Machine Learning】【Andrew Ng】- notes(Week 1: Model and Cost Function)
Model Representation: To describe the supervised learning problem slightly more formally, our goal is, given a training set, to learn a function h : X → Y so that h(x) is a "good" predictor for the c… [Reposted · 2018-07-09 15:34:24 · 297 views · 0 comments]
【Machine Learning】【Andrew Ng】- notes(Week 1: Parameter Learning)
Gradient Descent: Imagine that we graph our hypothesis function based on its fields θ_0 and θ_1 (actually we are graphing the cost function as a function of the parameter estimates… [Reposted · 2018-07-09 16:13:25 · 198 views · 0 comments]
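A minimal Octave sketch of one gradient-descent step on J(θ_0, θ_1), assuming a linear hypothesis h(x) = θ_0 + θ_1·x, an m×1 data vector X with targets y, and a learning rate alpha (all names illustrative):
    m = length(y);
    h = theta0 + theta1 * X;                             % current predictions
    temp0 = theta0 - alpha * (1/m) * sum(h - y);         % step along dJ/dθ_0
    temp1 = theta1 - alpha * (1/m) * sum((h - y) .* X);  % step along dJ/dθ_1
    theta0 = temp0;  theta1 = temp1;                     % simultaneous update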
【Machine Learning】【Andrew Ng】- notes(Week 2: Multivariate Linear Regression)
Multiple Features: Linear regression with multiple variables is also known as "multivariate linear regression". We now introduce notation for equations where we can have any number of input variables… [Reposted · 2018-07-09 16:47:41 · 198 views · 0 comments]
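A sketch of the vectorized notation this note introduces, with a toy feature matrix and an illustrative (not learned) parameter vector:
    features = [2104 3; 1600 3; 2400 4];  % toy data: house size, bedrooms
    m = size(features, 1);                % number of training examples
    X = [ones(m, 1), features];           % prepend the x_0 = 1 bias column
    theta = [100; 0.2; -5];               % illustrative parameters, (n+1)×1
    h = X * theta;                        % h(x) = θᵀx for every example at once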
【Machine Learning】【Andrew Ng】- notes(Week 2: Computing Parameters Analytically)
Normal Equation: Gradient descent gives one way of minimizing J. Let's discuss a second way of doing so, this time performing the minimization explicitly and without resorting to an iterative algorith… [Reposted · 2018-07-09 17:02:29 · 178 views · 0 comments]
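The closed-form solution the excerpt refers to is a one-liner in Octave (pinv rather than inv, so the expression still works when X'X is non-invertible):
    theta = pinv(X' * X) * X' * y;   % normal equation: θ = (XᵀX)⁻¹ Xᵀ y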
【Machine Learning】【Andrew Ng】- notes(Week 3: Classification and Representation)
Classification: To attempt classification, one method is to use linear regression and map all predictions greater than 0.5 as a 1 and all less than 0.5 as a 0. However, this method doesn't work well b… [Reposted · 2018-07-09 17:30:33 · 172 views · 0 comments]
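The fix this note builds toward is the sigmoid (logistic) function, which squashes any real input into (0, 1); a minimal Octave sketch:
    function g = sigmoid(z)
      g = 1 ./ (1 + exp(-z));   % elementwise, so z may be a vector or matrix
    end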
【Machine Learning】【Andrew Ng】- notes(Week 3: Logistic Regression Model)
Cost Function: We cannot use the same cost function that we use for linear regression because the Logistic Function will cause the output to be wavy, causing many local optima. In other words, it will… [Reposted · 2018-07-09 17:50:57 · 358 views · 0 comments]
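A sketch of the convex log-loss cost that replaces the squared error, assuming X already carries the bias column and y ∈ {0,1}:
    h = 1 ./ (1 + exp(-(X * theta)));                    % sigmoid predictions
    J = -(1/m) * (y' * log(h) + (1 - y)' * log(1 - h));  % logistic cost J(θ)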
【Machine Learning】【Andrew Ng】- notes(Week 3: Multiclass Classification)
Now we will approach the classification of data when we have more than two categories. Instead of y = {0,1} we will expand our definition so that y = {0,1…n}. Since y = {0,1…n}, we divide our problem… [Reposted · 2018-07-09 17:55:09 · 173 views · 0 comments]
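A sketch of the one-vs-all prediction step this approach leads to, assuming all_theta holds one row of learned parameters per class (illustrative names):
    probs = 1 ./ (1 + exp(-(X * all_theta')));  % m×K matrix of class confidences
    [~, pred] = max(probs, [], 2);              % pick the most confident class per example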
【Machine Learning】【Andrew Ng】- notes(Week 3: Solving the Problem of Overfitting)
The Problem of Overfitting: Consider the problem of predicting y from x ∈ R. The leftmost figure below shows the result of fitting y = θ_0 + θ_1x to a dataset. We see that the data d… [Reposted · 2018-07-10 09:51:12 · 297 views · 0 comments]
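For reference, the regularized linear-regression cost the course introduces to combat overfitting (by convention θ_0 is not penalized):
    J(\theta) = \frac{1}{2m}\Big[\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)^2 + \lambda\sum_{j=1}^{n}\theta_j^2\Big]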
【Machine Learning】【Andrew Ng】- Programming Assignment (Week 9)
1. Estimate Gaussian Parameters
    mu = (1/m) * (sum(X))';       % per-feature mean, n×1
    sigma2 = zeros(size(mu));     % initialize the variance accumulator
    for i = 1:m
      sigma2 = sigma2 + (1/m) * ((X(i,:)') - mu).^2;
    end
2. Select Threshold
    • tp is the number of true positives: the ground truth label say…
[Original · 2018-01-08 15:48:23 · 550 views · 0 comments]
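A vectorized equivalent of the variance loop above (a sketch; X is m×n and mu is the n×1 mean just computed):
    sigma2 = (1/m) * sum((X - repmat(mu', m, 1)).^2)';   % n×1 per-feature variance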
【Machine Learning】【Andrew Ng】- Quiz2(Week 11)
1. Suppose you are running a sliding window detector to find text in images. Your input images are 1000x1000 pixels. You will run your sliding window detector at two scales, 10x10 and 20x20 (i.e., you… [Original · 2018-01-08 15:28:37 · 2887 views · 2 comments]
【Machine Learning】【Andrew Ng】- Quiz(Week 10)
1. Suppose you are training a logistic regression classifier using stochastic gradient descent. You find that the cost (say, cost(θ, (x(i), y(i))), averaged over the last 500 examples), plotted as a functi… [Original · 2018-01-08 15:06:25 · 16169 views · 0 comments]
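For context, one stochastic-gradient-descent update for logistic regression uses a single example (x(i), y(i)) rather than the whole training set; a sketch with illustrative names:
    h = 1 / (1 + exp(-(theta' * x_i)));       % scalar prediction for this one example
    theta = theta - alpha * (h - y_i) * x_i;  % update from this single example only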
【Machine Learning】【Andrew Ng】- Quiz(Week 7)
1. Suppose you have trained an SVM classifier with a Gaussian kernel, and it learned the following decision boundary on the training set: You suspect that the SVM is underfitting your dataset. Should you… [Original · 2018-01-02 21:35:33 · 2969 views · 1 comment]
【Machine Learning】【Andrew Ng】- Programming Assignment (Week 7)
1. Gaussian Kernel
    sim = exp(-sum((x1 - x2).^2) / (2 * sigma^2));   % RBF similarity between x1 and x2
2. Parameters (C, sigma) for Dataset 3
    C_matrix = [0.01 0.03 0.1 0.3 1 3 10 30];   % candidate values to grid-search
    sigma_matrix = C_matrix;
    err_min = 100000;                           % best validation error so far
    for i = 1:8
      C = …
[Original · 2018-01-02 23:03:56 · 530 views · 0 comments]
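A sketch of how the truncated grid search presumably continues, assuming the exercise's svmTrain/svmPredict/gaussianKernel helpers and a cross-validation set (Xval, yval):
    for i = 1:8
      C = C_matrix(i);
      for j = 1:8
        sigma = sigma_matrix(j);
        model = svmTrain(X, y, C, @(x1, x2) gaussianKernel(x1, x2, sigma));
        err = mean(double(svmPredict(model, Xval) ~= yval));  % validation error
        if err < err_min
          err_min = err;  C_best = C;  sigma_best = sigma;    % keep the best pair
        end
      end
    end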
【Machine Learning】【Andrew Ng】- Programming Assignment (Week 5)
Training a neural network: randomly initialize weights. For gradient descent and advanced optimization methods we need an initial value for Theta. Why initializing everything to zero is bad: after each update, parameters corresponding t… [Original · 2017-12-25 09:24:06 · 430 views · 0 comments]
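A sketch of the symmetry-breaking random initialization the note describes, with illustrative layer sizes hidden_units and input_units:
    epsilon_init = 0.12;   % small range keeps initial weights near zero
    Theta1 = rand(hidden_units, input_units + 1) * 2 * epsilon_init - epsilon_init;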
【Machine Learning】【Andrew Ng】- Quiz(Week 6)
You train a learning algorithm, and find that it has unacceptably high error on the test set. You plot the learning curve, and obtain the figure below. Is the algorithm suffering from high bias, high v… [Original · 2017-12-28 10:52:09 · 9085 views · 2 comments]
【Machine Learning】【Andrew Ng】- Programming Assignment (Week 6)
1. Regularized linear regression cost function
    J = 1/(2*m) * sum((X*theta - y).^2) + lambda/(2*m) * sum(theta(2:end).^2);
2. Regularized linear regression gradient
    % gradient of cost function
    grad = 1/m * …
[Original · 2017-12-29 10:48:03 · 645 views · 0 comments]
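A sketch of the regularized gradient the excerpt truncates (the bias term theta(1) is not regularized):
    grad = (1/m) * (X' * (X*theta - y));                    % unregularized gradient
    grad(2:end) = grad(2:end) + (lambda/m) * theta(2:end);  % penalize all but θ_0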
【Machine Learning】【Andrew Ng】- Quiz1(Week 8)
1. For which of the following tasks might K-means clustering be a suitable algorithm? Select all that apply. A. Given a database of information about your users, automatically group them into differen… [Original · 2018-01-04 20:00:08 · 7420 views · 0 comments]
【Machine Learning】【Andrew Ng】- Quiz2(Week 8)
1. Consider the following 2D dataset: Which of the following figures correspond to possible values that PCA may return for u(1) (the first eigenvector / first principal component)? Check all that ap… [Original · 2018-01-04 20:52:33 · 6629 views · 1 comment]
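A sketch of how the course computes u(1), assuming X has already been feature-normalized:
    Sigma = (1/m) * (X' * X);   % n×n covariance matrix
    [U, S, V] = svd(Sigma);     % columns of U are the principal components
    u1 = U(:, 1);               % first principal component u(1)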
【Machine Learning】【Andrew Ng】- Programming Assignment (Week 8)
1. Find Closest Centroids
    m = size(X,1);
    for i = 1:m
      d_min = 10000;                                  % large sentinel distance
      for j = 1:K
        d_temp = sum((X(i,:) - centroids(j,:)).^2);   % squared distance to centroid j
        if d_temp < d_min
          d_min = d_temp;
          idx(i) = j…
[Original · 2018-01-04 21:02:21 · 735 views · 0 comments]
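The excerpt cuts off mid-loop; a sketch of how the assignment step presumably closes (Inf is a safer sentinel than a hard-coded 10000):
    for i = 1:m
      d_min = Inf;
      for j = 1:K
        d_temp = sum((X(i,:) - centroids(j,:)).^2);
        if d_temp < d_min
          d_min = d_temp;
          idx(i) = j;   % remember the closest centroid seen so far
        end
      end
    end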
【Machine Learning】【Andrew Ng】- Quiz1(Week 9)
1. For which of the following problems would anomaly detection be a suitable algorithm? A. In a computer chip fabrication plant, identify microchips that might be defective. B. From a large set of ho… [Original · 2018-01-07 21:14:59 · 6490 views · 5 comments]
【Machine Learning】【Andrew Ng】- Quiz2(Week 9)
1. Suppose you run a bookstore, and have ratings (1 to 5 stars) of books. Your collaborative filtering algorithm has learned a parameter vector theta(j) for user j, and a feature vector x(i) for each b… [Original · 2018-01-07 21:38:28 · 29650 views · 1 comment]
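In this setup the predicted rating of book i by user j is the inner product of the two learned vectors; a one-line sketch with illustrative names theta_j and x_i:
    rating = theta_j' * x_i;   % theta(j)ᵀ x(i), both column vectors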
【Machine Learning】【Andrew Ng】- notes(Week 4: Neural Network: Representation)
Model Representation I: Let's examine how we will represent a hypothesis function using neural networks. At a very simple level, neurons are basically computational units that take inputs (dendrites)… [Reposted · 2018-07-10 11:25:14 · 298 views · 0 comments]
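A sketch of the forward propagation this representation leads to, assuming one hidden layer with learned weight matrices Theta1 and Theta2 (illustrative names):
    a1 = [1; x];                          % input activations plus bias unit
    z2 = Theta1 * a1;                     % weighted inputs to the hidden layer
    a2 = [1; 1 ./ (1 + exp(-z2))];        % sigmoid activations plus bias unit
    h  = 1 ./ (1 + exp(-(Theta2 * a2)));  % network output h_Θ(x)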