MXNet Quick Start: Automatic Differentiation

Basic usage

from mxnet import nd
from mxnet import autograd

Example: f(x) = 2 * x^2

x = nd.array([[1, 2], [3, 4]])
x

Tell the NDArray that we plan to record the computation so gradients can be computed later:

x.attach_grad()

Define y = f(x) inside an autograd.record() scope so the computation is recorded for the later gradient calculation. Note that we must call y.backward() to actually run backpropagation before reading x.grad:

with autograd.record():
    y = 2 * x * x
y.backward()
x.grad
Output:
[[ 4.  8.]
 [12. 16.]]
<NDArray 2x2 @cpu(0)>

Using Python control flows

def f(a):
    b = a * 2
    while b.norm().asscalar() < 1000:
        b = b * 2
    if b.sum().asscalar() >= 0:
        c = b[0]
    else:
        c = b[1]
    return c

a = nd.random.uniform(shape=2)
a.attach_grad()
with autograd.record():
    c = f(a)
c.backward()

We know that b is a linear function of a, and that c is one element of b. The gradient with respect to a will therefore be either [c/a[0], 0] or [0, c/a[1]], depending on which element of b was picked. Let's check the results:

[a.grad, c/a]
Output:
[
 [2048.    0.]
 <NDArray 2 @cpu(0)>, 
 [2048.      1008.60596]
 <NDArray 2 @cpu(0)>]