Andrew Ng Deep Learning Week 2 Programming Assignment

Copyright notice: this is an original post by the author and may not be reproduced without permission. https://blog.csdn.net/qq_35269302/article/details/79950246


Andrew Ng's programming assignments are quite tiring to work through; after a whole afternoon, I finally finished this one.

GitHub link

A few pitfalls, noted in passing:

1. Add the extra x0 column only after feature normalization: x0 is an auxiliary parameter that exists purely to make the matrix operations convenient, and normalizing it introduces a bug (see the first sketch after this list);

2. A theta trained on normalized sample data must be dot-multiplied with test data that has been normalized the same way;

3. The normal equation needs no normalization, because it does not use gradient descent; the complexity of its matrix computation depends only on the data size (see the second sketch below).
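
A minimal sketch of pitfalls 1 and 2. The names here (X_raw, mu, sigma, x_test) and the alpha/num_iters values are illustrative, not from the original post, and the element-wise normalization relies on implicit expansion (MATLAB R2016b+; use bsxfun on older versions):

% Sketch only: X_raw is an m x n raw feature matrix, y is m x 1.
mu     = mean(X_raw);                % column-wise means (MATLAB default)
sigma  = std(X_raw);                 % column-wise standard deviations
X_norm = (X_raw - mu) ./ sigma;      % pitfall 1: normalize features FIRST...

X = [ones(size(X_norm, 1), 1) X_norm];   % ...THEN append x0 = 1; normalizing a
                                         % constant column divides by std = 0

theta = gradientDescentMulti(X, y, zeros(size(X, 2), 1), 0.01, 400);

% Pitfall 2: a test example must be scaled with the SAME mu and sigma
x_test = [1650 3];                              % hypothetical raw test point
price  = [1, (x_test - mu) ./ sigma] * theta;   % dot with the trained theta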
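
And pitfall 3 as code, a sketch of the closed-form alternative on the same illustrative X_raw and y (pinv rather than inv, for robustness when X'*X is near-singular):

% Normal equation: no normalization, no learning rate, no iterations.
X = [ones(size(X_raw, 1), 1) X_raw];   % raw features plus the x0 column
theta = pinv(X' * X) * X' * y;         % closed-form least-squares solution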

Takeaways:

1. MATLAB functions such as mean(), std(), and sum() all operate along the vertical (column) direction by default (quick example after this list);

2. The vectorized derivation of gradient descent is quite likely to pass on the first try as long as you are careful! (The update rule is written out below.)
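
A quick illustration of the first point:

A = [1 2; 3 4];
mean(A)     % -> [2 3]   mean of each column
sum(A)      % -> [4 6]   sum of each column
sum(A, 2)   % -> [3; 7]  pass dim = 2 to reduce along rows instead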
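
And the second point written out: with cost \(J(\theta) = \frac{1}{2m}(X\theta - y)^{T}(X\theta - y)\), the gradient is \(\nabla_{\theta} J = \frac{1}{m} X^{T}(X\theta - y)\), so each step of the update is

\[
\theta := \theta - \frac{\alpha}{m} X^{T}(X\theta - y)
\]

which is exactly what the loop body in the code below implements.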

Batch Gradient Descent core code:

        

function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

    
for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %

    err   = X * theta - y;                    % m x 1 residuals, X*theta - y
    theta = theta - alpha / m * (X' * err);   % vectorized gradient step
    % ============================================================

    % Save the cost J in every iteration    
    J_history(iter) = computeCostMulti(X, y, theta);

end
end
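
A usage sketch (the alpha and num_iters values are illustrative, not from the post), assuming X already carries its x0 column:

alpha     = 0.01;                  % illustrative learning rate
num_iters = 400;                   % illustrative iteration budget
theta0    = zeros(size(X, 2), 1);  % start from the zero vector

[theta, J_history] = gradientDescentMulti(X, y, theta0, alpha, num_iters);

plot(1:num_iters, J_history);      % with a well-chosen alpha, J decreases
xlabel('Iteration'); ylabel('Cost J');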

