一、Hypothesis Function

The hypothesis for logistic regression passes the linear combination X*theta through the sigmoid function, producing a probability in (0, 1):

```matlab
function g = sigmoid(z)
%SIGMOID Compute the sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z (element-wise).
g = 1.0 ./ (1.0 + exp(-z));
end

function H = computeHypothesis(X, theta)
%COMPUTEHYPOTHESIS Compute the logistic regression hypothesis h = sigmoid(X*theta)
H = sigmoid(X*theta);
end
```
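For readers without Octave, the same hypothesis can be sketched in NumPy; the names `sigmoid` and `compute_hypothesis` mirror the Octave functions above and are not a library API:

```python
import numpy as np

def sigmoid(z):
    # Element-wise logistic function 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

def compute_hypothesis(X, theta):
    # One predicted probability per training example: h = sigmoid(X @ theta)
    return sigmoid(X @ theta)
```

Because `np.exp` broadcasts, `z` may be a scalar, a vector, or a matrix, matching Octave's element-wise `./` behavior.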
二、Cost Function

The cross-entropy cost measures how well theta fits the training set; note that this function takes theta as its first argument:

```matlab
function J = computeCost(theta, X, y)
%COMPUTECOST Compute cost for logistic regression
%   J = COMPUTECOST(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression.
m = length(y); % number of training examples
h = sigmoid(X * theta);
J = -(y' * log(h) + (1 - y)' * log(1 - h)) / m;
end
```
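A minimal NumPy sketch of the same cross-entropy cost, assuming `sigmoid` as defined earlier (repeated here so the snippet is self-contained):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def compute_cost(theta, X, y):
    # Cross-entropy cost: J = -(y·log(h) + (1-y)·log(1-h)) / m
    m = len(y)
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
```

A quick sanity check: with theta = 0 every prediction is 0.5, so the cost is log(2) ≈ 0.693 regardless of the labels.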
三、Training the Parameters

1. Gradient Descent

Two fixes relative to a common mistake: the hypothesis must be recomputed inside the loop (theta changes every step), and `computeCost` must be called with theta first, matching its definition.

```matlab
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha.
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
    % Recompute the hypothesis with the current theta on every iteration
    h = sigmoid(X * theta);
    theta = theta - alpha * X' * (h - y) / m;
    % Save the cost J in every iteration
    J_history(iter) = computeCost(theta, X, y);
end
end
```

四、Predict

The prediction step thresholds the hypothesis at 0.5: an example is classified as 1 when sigmoid(X*theta) >= 0.5, and 0 otherwise.
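The full training loop and the thresholded prediction step can be sketched in NumPy; `gradient_descent` and `predict` are illustrative names, not from the original post, and the helpers are repeated so the snippet runs on its own:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def compute_cost(theta, X, y):
    m = len(y)
    h = sigmoid(X @ theta)
    return -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m

def gradient_descent(X, y, theta, alpha, num_iters):
    # Batch gradient descent; records the cost at every step so
    # convergence can be checked from J_history.
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        h = sigmoid(X @ theta)          # recomputed each iteration
        theta = theta - alpha * (X.T @ (h - y)) / m
        J_history[it] = compute_cost(theta, X, y)
    return theta, J_history

def predict(theta, X):
    # Classify as 1 when the predicted probability reaches 0.5
    return (sigmoid(X @ theta) >= 0.5).astype(int)
```

On a small linearly separable set the recorded cost should fall from its initial value of log(2) (theta = 0) toward zero, and `predict` should recover the labels.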
[Machine Learning] Linear Classification: Logistic Regression