Avoiding echoed results when calling a custom function in Matlab/Octave

Ending a statement with a semicolon suppresses the echo in the command window, so the call below prints nothing:

[cost, grad] = costFunction(initial_theta, X, y);

costFunction.m:
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

h = sigmoid(X*theta);                  % hypothesis h(x), an m x 1 vector
J = -(y'*log(h) + (1-y)'*log(1-h))/m;  % vectorized cross-entropy cost
grad = (X'*(h-y))/m;                   % gradient, same dimensions as theta

% =============================================================
end
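costFunction relies on a sigmoid helper that is not shown above. A minimal sigmoid.m, assuming the element-wise definition used in the exercise template:

```matlab
function g = sigmoid(z)
%SIGMOID Compute the sigmoid function element-wise
%   g = SIGMOID(z) returns 1 ./ (1 + exp(-z)) for scalar, vector, or matrix z.
g = 1 ./ (1 + exp(-z));
end
```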
% If the same statement is called without a trailing semicolon, the IDE echoes the result:
g =
   0.50000
   0.50000
   0.50000
   ...
   0.50000
(all 100 echoed entries are 0.50000; the repeats are truncated here)
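The column of 0.50000 values is expected: with initial_theta set to a zero vector, X*theta is all zeros, and sigmoid(0) = 0.5 for every training example. Whether that column is echoed at all is controlled purely by the trailing semicolon. A self-contained sketch (variable names are illustrative):

```matlab
g = 1 ./ (1 + exp(-zeros(5, 1)))   % no semicolon: Octave echoes the 0.50000 column
g = 1 ./ (1 + exp(-zeros(5, 1)));  % trailing semicolon: assignment happens silently
disp(g(1));                        % explicit display when output is actually wanted
```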