[Machine Learning][Octave] Logistic Regression Practice

After finishing Andrew Ng's lessons, I used his exercises to practice logistic regression.
First, I wrote the sigmoid function.

function g = sigmoid(z)
% compute the sigmoid element-wise; z can be a scalar, vector, or matrix
g = 1 ./ (1 + exp(-z));
end
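
A quick sanity check I ran on my own (the test values are mine, not part of the exercise): sigmoid(0) should be exactly 0.5, and large positive or negative inputs should approach 1 and 0.

% sanity checks for sigmoid
sigmoid(0)              % ans = 0.5000
sigmoid([-100 0 100])   % approximately [0, 0.5, 1]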

And then I wrote costFunction.
Note that X is an m*n matrix and y is an m*1 vector. (At first there were some bugs in my program; after displaying the sizes of many variables, I found the reason: the expressions I had derived did not fit the shapes of the given X and y.)
As before, the code is vectorized instead of using for loops.

function [J, grad] = costFunction(theta, X, y)
m = length(y); % number of training examples

h = sigmoid(X * theta); % hypothesis values, an m*1 vector

% vectorized cross-entropy cost
J = (1/m) * (-y' * log(h) - (1 - y)' * log(1 - h));

% vectorized gradient, same shape as theta
grad = (1/m) * X' * (h - y);
end
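
One check that works regardless of the data set: with theta all zeros, every hypothesis value is sigmoid(0) = 0.5, so the cost must be -log(0.5) ≈ 0.6931. Here is a sketch of that check, assuming X already has the column of ones prepended as in ex2.m:

% with zero theta, h = 0.5 everywhere, so J = -log(0.5)
test_theta = zeros(size(X, 2), 1);
[J, grad] = costFunction(test_theta, X, y);
fprintf('Cost at initial theta: %f\n', J); % expect about 0.693147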

At last, ex2.m works. It is also worth writing down the code that calls fminunc.

initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 400);

%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost 

[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
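
With the optimal theta returned by fminunc, predictions come from thresholding the hypothesis at 0.5. A minimal sketch of that step (the exercise implements it in predict.m):

% predict 1 when sigmoid(X*theta) >= 0.5, i.e. when X*theta >= 0
p = sigmoid(X * theta) >= 0.5;
fprintf('Train accuracy: %f\n', mean(double(p == y)) * 100);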

And here is the explanation from his instructions.

To specify the actual function we are minimizing, we use a “short-hand” for specifying functions with the @(t) ( costFunction(t, X, y) ) . This creates a function, with argument t, which calls your costFunction. This allows us to wrap the costFunction for use with fminunc.

I'm still confused by the statement '@(t)(costFunction(t, X, y))'; I don't know what the '@' and 't' mean. So here is a problem that needs to be solved (a small experiment is sketched below).
Mark here!
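
As a first step toward figuring this out, here is a small experiment with Octave's anonymous-function syntax (based on my reading of the Octave documentation, so treat it as a sketch):

% '@(t) expr' builds an anonymous function whose argument is t;
% other names (here a and b) are captured from the workspace.
a = 2; b = 3;
f = @(t) a * t + b;   % f is a function handle
f(10)                 % ans = 23

% the same idea fixes X and y so fminunc only has to vary theta:
g = @(t) costFunction(t, X, y);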
