Octave
Baoli1008
Stanford Machine Learning, Week 2 Assignment: Linear Regression

Plotting the Data

```octave
data = load('ex1data1.txt');         % read comma-separated data
X = data(:, 1); y = data(:, 2);
m = length(y);                       % number of training examples
plot(X, y, 'rx', 'MarkerSize', 10);  % fixed: plot(x, ...) fails, variable names are case-sensitive
```

(Original post 2016-02-08 22:37:09 · 1465 reads · 0 comments)
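The quantity this assignment minimizes is the least-squares cost J(θ) = 1/(2m) Σ(Xθ − y)². As a cross-check, here is a minimal NumPy sketch of that cost (NumPy is used only for illustration — the assignment itself is Octave, and the name `compute_cost` is my own, not from the course starter code):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Unregularized linear-regression cost: J = 1/(2m) * sum((X@theta - y)^2)."""
    m = len(y)
    err = X @ theta - y
    return (err @ err) / (2 * m)

# Tiny example with an intercept column, mirroring ex1's design matrix.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 2.0, 3.0])
print(compute_cost(X, y, np.zeros(2)))            # ≈ 2.3333 at theta = [0, 0]
print(compute_cost(X, y, np.array([0.0, 1.0])))   # 0.0 — a perfect fit
```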
Stanford Machine Learning, Week 5 Assignment: Neural Networks: Learning

```octave
% randInitializeWeights
epsilon_init = 0.12;
W = rand(L_out, 1 + L_in) * 2 * epsilon_init - epsilon_init;

% sigmoidGradient
g = sigmoid(z) .* (1 - sigmoid(z));

% nnCostFunction: unroll nn_params back into the first weight matrix
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));
```

(Original post 2016-02-29 16:11:52 · 1548 reads · 0 comments)
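The two small helpers above translate directly to NumPy. This sketch (illustration only; the function names are mine, not the course's) shows the symmetric-breaking random initialization in [−ε, ε] and the sigmoid gradient g'(z) = g(z)(1 − g(z)):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_gradient(z):
    # g'(z) = g(z) * (1 - g(z)), same as the Octave one-liner above
    g = sigmoid(z)
    return g * (1 - g)

def rand_initialize_weights(l_in, l_out, epsilon_init=0.12):
    # Uniform in [-epsilon_init, epsilon_init]; shape (l_out, 1 + l_in) includes the bias column
    return np.random.rand(l_out, 1 + l_in) * 2 * epsilon_init - epsilon_init

print(sigmoid_gradient(0.0))                 # → 0.25, the maximum of the sigmoid's slope
print(rand_initialize_weights(3, 4).shape)   # → (4, 4)
```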
Stanford Machine Learning, Week 6 Assignment: Regularized Linear Regression and Bias vs. Variance

```octave
% linearRegCostFunction
m = length(y);
J = 0;
grad = zeros(size(theta));
J = 1.0 / (2 * m) * (sum((X * theta - y) .^ 2) + lambda * sum(theta(2:end) .^ 2));
grad = 1 / m * ((X * theta - y)' * X)';
grad(2:end) = grad(2:end) + lambda / m * theta(2:end);  % the bias term is not regularized
```

(Original post 2016-03-04 12:28:16 · 1518 reads · 0 comments)
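The same regularized cost and gradient, sketched in NumPy for comparison (illustration only; `linear_reg_cost` is my name for it). Note how `theta[1:]` mirrors Octave's `theta(2:end)` in excluding the intercept from the penalty:

```python
import numpy as np

def linear_reg_cost(theta, X, y, lam):
    """Regularized linear-regression cost and gradient; theta[0] is not penalized."""
    m = len(y)
    err = X @ theta - y
    J = (err @ err + lam * np.sum(theta[1:] ** 2)) / (2 * m)
    grad = X.T @ err / m
    grad[1:] += lam / m * theta[1:]
    return J, grad

# At a perfect fit the data term vanishes, leaving only the penalty lam/(2m) * theta_1^2.
X = np.array([[1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 2.0])
J, grad = linear_reg_cost(np.array([0.0, 1.0]), X, y, lam=1.0)
print(J, grad)   # → 0.25 [0.  0.5]
```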
Stanford Machine Learning, Week 3 Assignment: Logistic Regression

Visualizing the Data

```octave
pos = find(y == 1); neg = find(y == 0);
plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, ...
     'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', ...
     'MarkerSize', 7);
```

(Original post 2016-02-14 20:40:34 · 1546 reads · 0 comments)
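Octave's `find(y == 1)` returns the indices of the positive examples; in NumPy the same class split is usually done with boolean masks rather than index arrays. A small sketch of that equivalence (illustration only — the plotting itself would be a matplotlib call on `X[pos]` and `X[neg]`):

```python
import numpy as np

y = np.array([1, 0, 1, 0, 1])
X = np.arange(10).reshape(5, 2)   # five 2-D examples

# Boolean masks play the role of Octave's find(y == 1) / find(y == 0)
pos = (y == 1)
neg = (y == 0)

print(X[pos])   # rows of the positive class: [[0 1], [4 5], [8 9]]
print(X[neg])   # rows of the negative class: [[2 3], [6 7]]
```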
Stanford Machine Learning, Week 4 Assignment: Multi-class Classification and Neural Networks

Vectorizing regularized logistic regression

```octave
m = length(y);   % number of training examples
J = 0;
grad = zeros(size(theta));
J = sum(-y .* log(sigmoid(X * theta)) - (1 - y) .* log(1 - sigmoid(X * theta))) / m ...
    + lambda / (2 * m) * sum(theta(2:end) .^ 2);
```

(Original post 2016-02-22 14:20:48 · 2690 reads · 1 comment)
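The vectorized cross-entropy cost above has a direct NumPy counterpart. A sketch for comparison (the name `lr_cost_function` follows the course's Octave file, but this Python version is mine); at θ = 0 every prediction is 0.5, so the cost is exactly ln 2, a handy sanity check:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Regularized logistic-regression cost and gradient; bias term not penalized."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = np.sum(-y * np.log(h) - (1 - y) * np.log(1 - h)) / m
    J += lam / (2 * m) * np.sum(theta[1:] ** 2)
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]
    return J, grad

X = np.array([[1.0, 2.0], [1.0, -1.0], [1.0, 0.5]])
y = np.array([1.0, 0.0, 1.0])
J, _ = lr_cost_function(np.zeros(2), X, y, lam=1.0)
print(J)   # → 0.6931... (= ln 2, since sigmoid(0) = 0.5 for every example)
```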
Stanford Machine Learning, Week 8 Assignment: K-means Clustering and Principal Component Analysis

```octave
% findClosestCentroids
for i = 1:size(X, 1)
  dis = sum((centroids - X(i, :)) .^ 2, 2);  % squared distance to each centroid
  [t, idx(i)] = min(dis);
end

% computeCentroids
for i = 1:K
  id = find(idx == i);
  tot = X(id, :);
  centroids(i, :) = sum(tot) / length(id);   % mean of the points assigned to centroid i
end
```

(Original post 2016-03-20 23:22:41 · 1864 reads · 0 comments)
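The two K-means steps above — assign each point to its nearest centroid, then move each centroid to the mean of its points — can be written without loops via broadcasting. A NumPy sketch for comparison (illustration only; function names follow the Octave files but this code is mine):

```python
import numpy as np

def find_closest_centroids(X, centroids):
    # (n, 1, d) - (1, K, d) broadcasts to all point/centroid pairs;
    # sum over the feature axis gives an (n, K) squared-distance matrix.
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def compute_centroids(X, idx, K):
    # Mean of the points assigned to each centroid, as in the Octave loop.
    return np.array([X[idx == k].mean(axis=0) for k in range(K)])

# Two well-separated pairs of points: one K-means iteration should keep them apart.
X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
c = np.array([[0.0, 0.5], [10.0, 10.5]])
idx = find_closest_centroids(X, c)
print(idx)                            # → [0 0 1 1]
print(compute_centroids(X, idx, 2))   # → [[ 0.   0.5] [10.  10.5]]
```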