Octave
jkwRay
[Machine Learning][Octave] Gradient Descent Practice
After watching Andrew Ng's Machine Learning lessons on Coursera, I wanted to practice gradient descent by myself, so I generated some data first: m = 10; % number of data points; n = 1; % features; alpha = 0.01; % learning…
Original · 2017-08-29 21:32:11 · 444 views · 0 comments
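The excerpt stops right after the setup (m data points, one feature, learning rate alpha). A minimal sketch of batch gradient descent for one-feature linear regression, matching that setup — the original post uses Octave; this Python version and its generated data are illustrative stand-ins:

```python
# Sketch of batch gradient descent for h(x) = theta0 + theta1*x,
# following the excerpt's setup (m data points, 1 feature, alpha = 0.01).

def gradient_descent(xs, ys, alpha=0.01, iters=20000):
    m = len(xs)                      # number of data points
    theta0, theta1 = 0.0, 0.0
    for _ in range(iters):
        # Prediction errors of the current hypothesis on each example.
        errs = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
        # Simultaneous update of both parameters.
        grad0 = sum(errs) / m
        grad1 = sum(e * x for e, x in zip(errs, xs)) / m
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Noiseless data generated from y = 2x + 1 (a stand-in for the post's data).
xs = [float(i) for i in range(10)]
ys = [2 * x + 1 for x in xs]
t0, t1 = gradient_descent(xs, ys)
print(round(t0, 2), round(t1, 2))   # converges to 1.0 2.0
```

With noiseless data the parameters converge to the generating values, which makes it easy to check that the update rule is correct.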
[Machine Learning][Linear Regression] Feature Scaling
Introduction: When I use gradient descent to fit an h(x) close to 'x^2 + 2*x + 1', I find that alpha needs to be very small (around 0.000001), otherwise the parameters fail to converge, s…
Original · 2017-08-29 23:21:26 · 321 views · 0 comments
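The standard fix for the tiny-alpha problem the excerpt describes is feature scaling: when features like x and x^2 differ by orders of magnitude, rescaling each to roughly [-1, 1] lets gradient descent use a much larger learning rate. A minimal sketch of mean normalization (the post is in Octave; this Python version is illustrative):

```python
# Mean normalization: (value - mean) / (max - min), so each scaled
# feature has mean 0 and lies within roughly [-1, 1].

def scale_feature(values):
    """Return (scaled values, mean, range) for one feature column."""
    mu = sum(values) / len(values)
    span = max(values) - min(values)
    return [(v - mu) / span for v in values], mu, span

# x^2 as a feature (as when fitting something like x^2 + 2x + 1):
xs = list(range(1, 11))
x2s = [x * x for x in xs]          # ranges from 1 to 100
scaled, mu, span = scale_feature(x2s)
print(mu, span)                    # 38.5 99
print(round(min(scaled), 3), round(max(scaled), 3))
```

The mean and range must be saved so the same transformation can be applied to new inputs at prediction time.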
[Machine Learning][Octave] Logistic Regression Practice
After finishing Andrew Ng's lessons, I used his exercises to test logistic regression. First, I wrote a sigmoid function: function g = sigmoid(z) g = 1 ./ (1 + exp(-z)); end. And then I wrote the…
Original · 2017-08-30 16:09:42 · 299 views · 0 comments
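The excerpt's sigmoid (in Octave: g = 1 ./ (1 + exp(-z))) maps any real z into (0, 1), which is what lets the logistic hypothesis h(x) = sigmoid(theta' * x) be read as a probability. The same function in Python, for illustration:

```python
import math

# Sigmoid (logistic) function: maps any real z into the open interval (0, 1).
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0))                 # 0.5 -- the decision boundary
print(round(sigmoid(6), 3))       # large positive z -> close to 1
print(round(sigmoid(-6), 3))      # large negative z -> close to 0
```

Classification then thresholds h(x) at 0.5, i.e. predicts the positive class exactly when theta' * x >= 0.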
[Machine Learning][Octave] Multi-class Classification
Multi-class classification builds on binary classification: for each class, set its labels in y to 'true' and all other labels to 'false', and train a binary classifier for it. Then we have clas…
Original · 2017-09-02 10:28:31 · 613 views · 0 comments
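The one-vs-all prediction step the excerpt describes can be sketched as follows — assuming each class already has a trained parameter vector from its binary classifier; the class labels and theta values here are made-up toy data, and the original post works in Octave:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_one_vs_all(thetas, x):
    """Pick the class whose binary classifier is most confident.
    thetas: {class_label: [theta_0, theta_1, ...]}; x: feature list."""
    features = [1.0] + list(x)            # prepend the bias term
    scores = {
        label: sigmoid(sum(t * f for t, f in zip(theta, features)))
        for label, theta in thetas.items()
    }
    return max(scores, key=scores.get)

# Toy parameters: class A fires on large x1, class B on large x2.
thetas = {"A": [-1.0, 2.0, 0.0], "B": [-1.0, 0.0, 2.0]}
print(predict_one_vs_all(thetas, [3.0, 0.0]))   # A
print(predict_one_vs_all(thetas, [0.0, 3.0]))   # B
```

Training is just K runs of ordinary binary logistic regression, one per class, on relabeled copies of y; only the prediction step above is new.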