MachineLearning_LogisticRegression: a problem encountered while programming it in MATLAB

- **[cost, grad] = costFunctionReg(initial_theta, X, y, lambda);**
  Every call to costFunctionReg fails with the error `A(I) = X: X must have the same size as I.`

costFunctionReg.m:
function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters. 

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
h = sigmoid(X*theta);
J = -(log(h')*y + (log(ones(1,m)-h')*(ones(m,1)-y)))/m ...
    + lambda/(2*m)*sum(theta(2:end).^2);      % note: theta(1), i.e. theta_0, is not regularized

grad = zeros(size(X,2),1);                    % pre-allocate grad first; otherwise the indexed assignments below fail with a size-mismatch error
grad(2:end) = X(:,2:end)'*(h-y)/m + (lambda/m)*theta(2:end);
grad(1) = X(:,1)'*(h-y)/m;                    % the intercept term theta(1) gets no regularization

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta

% ============================================================
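To sanity-check the cost and gradient values produced by the MATLAB code above, the same formulas can be mirrored in plain Python. This is a hedged sketch, not part of the original exercise: the function name `cost_function_reg` and any data you feed it are made up for illustration, and `theta[0]` plays the role of MATLAB's `theta(1)`.

```python
import math

def cost_function_reg(theta, X, y, lam):
    """Regularized logistic-regression cost and gradient (pure-Python sketch).

    Mirrors the vectorized MATLAB formulas above: the intercept theta[0]
    is excluded from the regularization term in both J and grad.
    """
    m = len(y)
    n = len(theta)
    # hypothesis h_i = sigmoid(x_i . theta)
    h = [1.0 / (1.0 + math.exp(-sum(X[i][j] * theta[j] for j in range(n))))
         for i in range(m)]
    # cross-entropy cost plus L2 penalty on theta[1:]
    J = -sum(y[i] * math.log(h[i]) + (1 - y[i]) * math.log(1 - h[i])
             for i in range(m)) / m
    J += lam / (2 * m) * sum(t * t for t in theta[1:])
    # gradient; only the non-intercept entries get the lambda/m * theta_j term
    grad = [sum((h[i] - y[i]) * X[i][j] for i in range(m)) / m
            for j in range(n)]
    for j in range(1, n):
        grad[j] += lam / m * theta[j]
    return J, grad
```

With an all-zero theta the hypothesis is 0.5 for every example, so the cost is log(2) ≈ 0.693 regardless of the data, which makes a convenient first check against the MATLAB output.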