- **[cost, grad] = costFunctionReg(initial_theta, X, y, lambda);**
Calling the costFunctionReg function always fails with the error `A(I) = X: X must have the same size as I.`
costFunctionReg.m:
```matlab
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters.

% Initialize some useful values
m = length(y);  % number of training examples

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta

h = sigmoid(X*theta);

% Regularized cost; note that theta(1) (theta_0) is excluded from the penalty
J = -(log(h')*y + log(ones(1,m) - h')*(ones(m,1) - y))/m ...
    + lambda/(2*m)*sum(theta(2:end).^2);

% Preallocate grad first; otherwise the indexed assignments below can fail
% with "A(I) = X: X must have the same size as I" when the sizes disagree
grad = zeros(size(X,2), 1);
grad(2:end) = X(:,2:end)'*(h-y)/m + (lambda/m)*theta(2:end);
grad(1) = X(:,1)'*(h-y)/m;  % the bias term is not regularized

% ============================================================
```
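A minimal Octave sketch (with hypothetical values, not the course data) of why the preallocation matters: an indexed assignment requires the right-hand side to have exactly as many elements as there are indices, and a variable left over in the workspace with the wrong shape triggers the error above.

```matlab
% A stale variable of the wrong shape makes indexed assignment inconsistent:
grad = [1 2 3];           % e.g. a leftover 1x3 row vector
% grad(2:3) = [4; 5; 6];  % 2 indices, 3 elements -> in some Octave versions:
%                         % "A(I) = X: X must have the same size as I"

% Preallocating to the intended column shape keeps every assignment consistent:
grad = zeros(3, 1);
grad(2:3) = [4; 5];       % OK: 2 indices, 2 elements
```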