Machine Learning
糕比嘎嘎
Coursera Machine Learning - Week 2 Programming Assignment: Linear Regression
On a Mac, the Octave installed via Homebrew is version 5.1.0, which has a pitfall: the pause() function does not respond to key presses (see https://www.mobibrw.com/2019/18501). For now the only way to run ex1.m is to comment out each pause; one by one, which is a bit annoying; I hope 5.2.0 ships soon. 1 warmUpExercise.m Write the code as the handout instructs and it generates a 5×5… Original · 2019-09-02 15:12:46 · 804 views · 1 comment
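The preview cuts off at the 5×5 matrix: warmUpExercise.m in this assignment returns a 5×5 identity matrix (`eye(5)` in Octave). A minimal NumPy sketch of the same idea (function name is illustrative):

```python
import numpy as np

def warm_up_exercise():
    # The Octave version is simply: A = eye(5)
    return np.eye(5)
```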
Coursera Machine Learning - Week 3 Programming Assignment: Logistic Regression
1.1 plotData.m Just copy the code given in the handout: function plotData(X, y) %PLOTDATA Plots the data points X and y into a new figure % PLOTDATA(x,y) plots the data points with + for the positive examples % and o f… Original · 2019-09-02 17:12:02 · 568 views · 0 comments
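The plotData.m snippet above marks positive examples with + and negative ones with o. A matplotlib sketch of the equivalent (assuming y is a 0/1 label vector; names illustrative):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt

def plot_data(X, y):
    # '+' for positive examples, 'o' for negative, as in the Octave plotData
    pos = y == 1
    neg = y == 0
    plt.plot(X[pos, 0], X[pos, 1], 'k+', linewidth=2, markersize=7)
    plt.plot(X[neg, 0], X[neg, 1], 'ko', markerfacecolor='y', markersize=7)
```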
Coursera Machine Learning - Week 4 Programming Assignment: Multi-class Classification and Neural Networks
1.3.3 lrCostFunction.m Same as last week's costFunctionReg.m: $J(\theta)=\frac{1}{m}\sum_{i=1}^m\left[-y^{(i)}\log(h_\theta(x^{(i)}))-(1-y^{(i)})\log(1-h_\theta(x^{(i)}))\right]+\frac{\lambda}{2m}\sum_{j=1}^n\theta_j^2$… Original · 2019-09-02 18:12:25 · 420 views · 0 comments
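The regularized logistic cost above (note the regularization sum starts at j = 1, so the bias term θ₀ is not penalized) can be sketched in NumPy as follows; names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost(theta, X, y, lam):
    # Regularized logistic-regression cost; theta[0] is excluded
    # from the penalty term, matching the sum over j = 1..n.
    m = y.size
    h = sigmoid(X @ theta)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m
    J += lam / (2 * m) * np.sum(theta[1:] ** 2)
    return J
```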
Coursera Machine Learning - Week 5 Programming Assignment: Neural Network Learning
1.4 nnCostFunction.m Forward propagation: $J(\theta)=\frac{1}{m}\sum_{i=1}^m\sum_{k=1}^K\left[-y_k^{(i)}\log((h_\theta(x^{(i)}))_k)-(1-y_k^{(i)})\log(1-(h_\theta(x^{(i)}))_k)\right]+\frac{\lambda}{2m}\left[\sum_{j=1}^{25}\sum_{k=1}^{400}(\theta_{j,k}^{(1)})^2+\sum_{j=1}^{10}\sum_{k=1}^{25}(\theta_{j,k}^{(2)})^2\right]$… Original · 2019-09-04 13:19:13 · 554 views · 0 comments
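A NumPy sketch of the forward-propagation cost above. The fixed sum limits (25, 400, 10) reflect the assignment's 400-25-10 network; the sketch below generalizes to arbitrary Theta shapes and, as in the formula, regularizes everything except the bias columns (names illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nn_cost(Theta1, Theta2, X, y, num_labels, lam):
    # Forward propagation through the 3-layer network,
    # then the regularized cross-entropy cost.
    m = X.shape[0]
    A1 = np.hstack([np.ones((m, 1)), X])                 # add bias unit
    A2 = np.hstack([np.ones((m, 1)), sigmoid(A1 @ Theta1.T)])
    H = sigmoid(A2 @ Theta2.T)                           # m x K hypothesis
    Y = np.eye(num_labels)[y.ravel()]                    # one-hot labels
    J = np.sum(-Y * np.log(H) - (1 - Y) * np.log(1 - H)) / m
    # Bias columns Theta[:, 0] are not regularized
    J += lam / (2 * m) * (np.sum(Theta1[:, 1:] ** 2) + np.sum(Theta2[:, 1:] ** 2))
    return J
```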
Coursera Machine Learning - Week 6 Programming Assignment: Regularized Linear Regression and Bias/Variance
1.2 linearRegCostFunction.m $J(\theta)=\frac{1}{2m}\left(\sum_{i=1}^m(h_\theta(x^{(i)})-y^{(i)})^2\right)+\frac{\lambda}{2m}\left(\sum_{j=1}^n\theta_j^2\right)$… Original · 2019-09-04 16:08:00 · 316 views · 0 comments
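A NumPy sketch of the regularized linear-regression cost above (θ₀ again left out of the penalty; names illustrative):

```python
import numpy as np

def linear_reg_cost(theta, X, y, lam):
    # Squared-error cost plus L2 penalty over theta[1:]
    m = y.size
    err = X @ theta - y
    return (err @ err) / (2 * m) + lam / (2 * m) * np.sum(theta[1:] ** 2)
```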
Coursera Machine Learning - Week 7 Programming Assignment: Support Vector Machines
1.2.1 gaussianKernel.m function sim = gaussianKernel(x1, x2, sigma) %RBFKERNEL returns a radial basis function kernel between x1 and x2 % sim = gaussianKernel(x1, x2) returns a gaussian kernel betw… Original · 2019-09-05 12:38:54 · 842 views · 0 comments
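The RBF kernel this function computes is sim = exp(-‖x1 − x2‖² / (2σ²)); a minimal NumPy sketch (names illustrative):

```python
import numpy as np

def gaussian_kernel(x1, x2, sigma):
    # Gaussian (RBF) similarity between two feature vectors
    diff = np.asarray(x1, dtype=float) - np.asarray(x2, dtype=float)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))
```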
Coursera Machine Learning - Week 8 Programming Assignment: K-Means Clustering and PCA
1.1.1 findClosestCentroids.m function idx = findClosestCentroids(X, centroids) %FINDCLOSESTCENTROIDS computes the centroid memberships for every example % idx = FINDCLOSESTCENTROIDS (X, centroids) r… Original · 2019-09-05 18:37:07 · 334 views · 0 comments
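findClosestCentroids assigns each example to its nearest centroid by Euclidean distance. A vectorized NumPy sketch (0-based indices here, whereas the Octave version returns 1-based indices; names illustrative):

```python
import numpy as np

def find_closest_centroids(X, centroids):
    # Pairwise distances via broadcasting: (m, 1, n) - (1, K, n) -> (m, K)
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return np.argmin(d, axis=1)  # index of the nearest centroid per example
```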
Coursera Machine Learning - Week 9 Programming Assignment: Anomaly Detection and Recommender Systems
1.2 estimateGaussian.m $\mu_i=\frac{1}{m}\sum_{j=1}^m x_i^{(j)}$, $\sigma_i^2=\frac{1}{m}\sum_{j=1}^m(x_i^{(j)}-\mu_i)^2$… Original · 2019-09-07 14:47:51 · 357 views · 0 comments
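The estimates above are the per-feature mean and the 1/m (population, not 1/(m−1)) variance; a minimal NumPy sketch (names illustrative):

```python
import numpy as np

def estimate_gaussian(X):
    # Per-feature mean and population variance (divide by m, not m - 1),
    # matching the 1/m factor in the formulas above
    mu = X.mean(axis=0)
    sigma2 = ((X - mu) ** 2).mean(axis=0)  # same as X.var(axis=0)
    return mu, sigma2
```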