Numerical Gradient vs. Analytical Gradient


J(x1, x2) = x1^2 + 3*x1*x2

Analytical gradient:
function [value,grad] = simpleQuadraticFunction(x)
% This function accepts a 2D vector as input.
% Its outputs are:
%   value: J(x1, x2) = x1^2 + 3*x1*x2
%   grad: a 2x1 vector of the partial derivatives of J with respect to x1 and x2
value = x(1)^2 + 3*x(1)*x(2);


grad = zeros(2, 1);
grad(1)  = 2*x(1) + 3*x(2);
grad(2)  = 3*x(1);
end
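For comparison, here is a minimal Python sketch of the same function and its analytic gradient (the function name and list-based representation are my own, not part of the original MATLAB code):

```python
def simple_quadratic_function(x):
    """Return J(x1, x2) = x1^2 + 3*x1*x2 together with its analytic gradient."""
    x1, x2 = x
    value = x1 ** 2 + 3 * x1 * x2
    # Partial derivatives: dJ/dx1 = 2*x1 + 3*x2, dJ/dx2 = 3*x1
    grad = [2 * x1 + 3 * x2, 3 * x1]
    return value, grad
```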


Numerical gradient:
function numgrad = computeNumericalGradient(J, theta)
% numgrad = computeNumericalGradient(J, theta)
% theta: a vector of parameters
% J: a function that outputs a real number. Calling y = J(theta) returns the
% function value at theta.


% Initialize numgrad with zeros
numgrad = zeros(size(theta));


% Implement numerical gradient checking, and return the result in numgrad.  


epsilon = 10^(-4);
n = size(theta, 1);
for i=1:n
    theta1 = theta;
    theta1(i) = theta1(i) + epsilon;
    theta2 = theta;
    theta2(i) = theta2(i) - epsilon;
    J1 = J(theta1);
    J2 = J(theta2);
    numgrad(i) = (J1-J2)/(2*epsilon);
end
end
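The central-difference loop above can be sketched in Python as follows (a minimal version assuming `J` takes a list of parameters and returns a single number; names are illustrative):

```python
def compute_numerical_gradient(J, theta, epsilon=1e-4):
    """Central-difference approximation of the gradient of J at theta."""
    numgrad = [0.0] * len(theta)
    for i in range(len(theta)):
        # Perturb only the i-th component, once in each direction.
        theta_plus = list(theta)
        theta_plus[i] += epsilon
        theta_minus = list(theta)
        theta_minus[i] -= epsilon
        # (J(theta + eps*e_i) - J(theta - eps*e_i)) / (2*eps)
        numgrad[i] = (J(theta_plus) - J(theta_minus)) / (2 * epsilon)
    return numgrad
```

For the quadratic J above, the central difference is exact up to floating-point error, so the result should match the analytic gradient very closely.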


x = [4; 10];
[value, grad] = simpleQuadraticFunction(x);


numgrad = computeNumericalGradient(@simpleQuadraticFunction, x);


% Visually examine the two gradient computations.   
disp([numgrad grad]);


% Evaluate the norm of the difference between the two gradients,
% normalized by the norm of their sum; a small ratio indicates agreement.
diff = norm(numgrad-grad)/norm(numgrad+grad);
disp(diff); 