PyTorch Deep Learning Practice, Lecture 4: Back Propagation

Lecture 4: Back Propagation (反向传播)

Based on the Bilibili course by 刘二大人, with reference to notes by 错错莫, plus my own additions (special thanks to both).

1. w is a Tensor. A Tensor holds both data and grad, and data and grad are themselves Tensors. grad starts out as None; after calling l.backward() it becomes a Tensor, so when updating w.data you must use w.grad.data. If w requires a gradient, then every tensor in the computational graph that depends on w requires a gradient by default.
In the lecture video the teacher writes a = torch.Tensor([1.0]); this post uses a = torch.tensor([1.0]) instead. Both work, and torch.tensor is the recommended factory function; it is a matter of preference.

import torch
a=torch.tensor([1.0])
a.requires_grad=True # enable gradient computation (equivalently: a.requires_grad_(True))
print(a)
print(a.data)
print(a.type())
print(a.data.type())
print(a.grad)
print(type(a.grad))

Output:

tensor([1.], requires_grad=True)
tensor([1.])
torch.FloatTensor
torch.FloatTensor
None
<class 'NoneType'>
2. w is a Tensor, and the return values of forward() and loss() are Tensors as well.
3. The back propagation itself happens in l.backward(): after this call, w.grad changes from None to a Tensor, and w.grad.data is then used to update w.data.
l.backward() computes the gradient of l with respect to every tensor in the graph that requires one, stores each gradient in the corresponding parameter's .grad, and then frees the computational graph.

Taking .data from a tensor does not build a computational graph.
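A quick check of this point: an operation on w itself is recorded in the computational graph, while the same operation on w.data is not. A minimal sketch:

```python
import torch

w = torch.tensor([1.0], requires_grad=True)

# Operating on the tensor itself is recorded in the computational graph.
y = w * 2
print(y.requires_grad)        # True: y is part of the graph

# Operating on w.data is NOT recorded: it shares storage with w
# but is detached from the graph.
z = w.data * 2
print(z.requires_grad)        # False: no graph is built
```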

import torch

x_data=[1.0,2.0,3.0]
y_data=[2.0,3.0,4.0]

w=torch.tensor([1.0]) # initial weight 1.0
w.requires_grad=True  # w requires a gradient

def forward(x):
    return x*w  # w is a Tensor, so this builds a graph

def loss(x,y):
    y_pred=forward(x)
    return (y_pred-y)**2


print("predict (before training)",4,forward(4).item())

for epoch in range(100):
    for x,y in zip(x_data,y_data):
        l=loss(x,y) # l is a tensor; the forward pass builds the computational graph and computes the loss
        l.backward() # back propagation: compute and store gradients, then free the graph
        print('\tgrad',x,y,w.grad.item())
        w.data=w.data-0.01*w.grad.data # grad is also a tensor, so update through .data

        w.grad.data.zero_() # zero the gradient after each update

    print('progress:',epoch,l.item()) # read the loss with l.item(); using l directly (a tensor) would keep building graphs

print('predict (after training)',4,forward(4).item())


Output (note: y_data here equals x_data + 1, which the model y = w * x cannot fit exactly, so the loss plateaus near 0.087 instead of reaching 0):
predict (before training) 4 4.0
grad 1.0 2.0 -2.0
grad 2.0 3.0 -3.8400001525878906
grad 3.0 4.0 -4.948800086975098
progress: 0 0.6802950501441956
grad 1.0 2.0 -1.784224033355713
grad 2.0 3.0 -2.9941577911376953
grad 3.0 4.0 -3.197906970977783
progress: 1 0.2840724587440491
grad 1.0 2.0 -1.6246981620788574
grad 2.0 3.0 -2.368816375732422
grad 3.0 4.0 -1.9034500122070312
progress: 2 0.10064227879047394
grad 1.0 2.0 -1.506758689880371
grad 2.0 3.0 -1.906494140625
grad 3.0 4.0 -0.9464435577392578
progress: 3 0.02488209493458271
grad 1.0 2.0 -1.419564962387085
grad 2.0 3.0 -1.5646944046020508
grad 3.0 4.0 -0.23891830444335938
progress: 4 0.0015856098616495728
grad 1.0 2.0 -1.3551013469696045
grad 2.0 3.0 -1.311997413635254
grad 3.0 4.0 0.2841653823852539
progress: 5 0.0022430545650422573
grad 1.0 2.0 -1.3074429035186768
grad 2.0 3.0 -1.1251764297485352
grad 3.0 4.0 0.6708869934082031
progress: 6 0.012502482160925865
grad 1.0 2.0 -1.2722082138061523
grad 2.0 3.0 -0.987055778503418
grad 3.0 4.0 0.9567947387695312
progress: 7 0.025429338216781616
grad 1.0 2.0 -1.2461588382720947
grad 2.0 3.0 -0.8849430084228516
grad 3.0 4.0 1.1681671142578125
progress: 8 0.03790595754981041
grad 1.0 2.0 -1.226900339126587
grad 2.0 3.0 -0.8094491958618164
grad 3.0 4.0 1.324441909790039
progress: 9 0.04872628673911095
grad 1.0 2.0 -1.2126619815826416
grad 2.0 3.0 -0.7536354064941406
grad 3.0 4.0 1.4399757385253906
progress: 10 0.0575980581343174
grad 1.0 2.0 -1.2021355628967285
grad 2.0 3.0 -0.712371826171875
grad 3.0 4.0 1.5253887176513672
progress: 11 0.06463363021612167
grad 1.0 2.0 -1.1943533420562744
grad 2.0 3.0 -0.6818647384643555
grad 3.0 4.0 1.5885400772094727
progress: 12 0.07009609788656235
grad 1.0 2.0 -1.1885995864868164
grad 2.0 3.0 -0.6593103408813477
grad 3.0 4.0 1.6352291107177734
progress: 13 0.07427706569433212
grad 1.0 2.0 -1.1843459606170654
grad 2.0 3.0 -0.6426362991333008
grad 3.0 4.0 1.6697416305541992
progress: 14 0.07744547724723816
grad 1.0 2.0 -1.1812012195587158
grad 2.0 3.0 -0.6303091049194336
grad 3.0 4.0 1.6952590942382812
progress: 15 0.07983064651489258
grad 1.0 2.0 -1.1788763999938965
grad 2.0 3.0 -0.6211957931518555
grad 3.0 4.0 1.7141246795654297
progress: 16 0.0816173180937767
grad 1.0 2.0 -1.1771574020385742
grad 2.0 3.0 -0.6144571304321289
grad 3.0 4.0 1.728072166442871
progress: 17 0.08295092731714249
grad 1.0 2.0 -1.175886631011963
grad 2.0 3.0 -0.6094751358032227
grad 3.0 4.0 1.7383861541748047
progress: 18 0.08394406735897064
grad 1.0 2.0 -1.1749470233917236
grad 2.0 3.0 -0.6057920455932617
grad 3.0 4.0 1.7460107803344727
progress: 19 0.08468204736709595
grad 1.0 2.0 -1.1742522716522217
grad 2.0 3.0 -0.6030693054199219
grad 3.0 4.0 1.7516469955444336
progress: 20 0.08522964268922806
grad 1.0 2.0 -1.173738956451416
grad 2.0 3.0 -0.6010570526123047
grad 3.0 4.0 1.7558097839355469
progress: 21 0.0856352224946022
grad 1.0 2.0 -1.1733593940734863
grad 2.0 3.0 -0.5995683670043945
grad 3.0 4.0 1.7588939666748047
progress: 22 0.08593633025884628
grad 1.0 2.0 -1.1730787754058838
grad 2.0 3.0 -0.5984687805175781
grad 3.0 4.0 1.7611684799194336
progress: 23 0.08615873008966446
grad 1.0 2.0 -1.1728713512420654
grad 2.0 3.0 -0.5976552963256836
grad 3.0 4.0 1.7628536224365234
progress: 24 0.08632369339466095
grad 1.0 2.0 -1.172717809677124
grad 2.0 3.0 -0.5970535278320312
grad 3.0 4.0 1.7640981674194336
progress: 25 0.08644562214612961
grad 1.0 2.0 -1.1726043224334717
grad 2.0 3.0 -0.5966091156005859
grad 3.0 4.0 1.765019416809082
progress: 26 0.08653593063354492
grad 1.0 2.0 -1.172520637512207
grad 2.0 3.0 -0.5962810516357422
grad 3.0 4.0 1.7656974792480469
progress: 27 0.08660243451595306
grad 1.0 2.0 -1.1724584102630615
grad 2.0 3.0 -0.5960369110107422
grad 3.0 4.0 1.7662038803100586
progress: 28 0.08665211498737335
grad 1.0 2.0 -1.172412633895874
grad 2.0 3.0 -0.5958576202392578
grad 3.0 4.0 1.7665729522705078
progress: 29 0.086688332259655
grad 1.0 2.0 -1.1723787784576416
grad 2.0 3.0 -0.5957250595092773
grad 3.0 4.0 1.7668476104736328
progress: 30 0.08671528846025467
grad 1.0 2.0 -1.172353744506836
grad 2.0 3.0 -0.5956268310546875
grad 3.0 4.0 1.7670536041259766
progress: 31 0.08673550933599472
grad 1.0 2.0 -1.172335147857666
grad 2.0 3.0 -0.5955533981323242
grad 3.0 4.0 1.7672052383422852
progress: 32 0.08675039559602737
grad 1.0 2.0 -1.1723213195800781
grad 2.0 3.0 -0.5954999923706055
grad 3.0 4.0 1.7673139572143555
progress: 33 0.08676107227802277
grad 1.0 2.0 -1.1723113059997559
grad 2.0 3.0 -0.5954599380493164
grad 3.0 4.0 1.7673969268798828
progress: 34 0.0867692157626152
grad 1.0 2.0 -1.1723036766052246
grad 2.0 3.0 -0.5954303741455078
grad 3.0 4.0 1.7674598693847656
progress: 35 0.08677539974451065
grad 1.0 2.0 -1.1722981929779053
grad 2.0 3.0 -0.5954093933105469
grad 3.0 4.0 1.767502784729004
progress: 36 0.08677961677312851
grad 1.0 2.0 -1.1722941398620605
grad 2.0 3.0 -0.595393180847168
grad 3.0 4.0 1.7675342559814453
progress: 37 0.08678270131349564
grad 1.0 2.0 -1.1722912788391113
grad 2.0 3.0 -0.5953817367553711
grad 3.0 4.0 1.7675600051879883
progress: 38 0.08678523451089859
grad 1.0 2.0 -1.1722891330718994
grad 2.0 3.0 -0.5953731536865234
grad 3.0 4.0 1.7675800323486328
progress: 39 0.08678720146417618
grad 1.0 2.0 -1.1722872257232666
grad 2.0 3.0 -0.5953655242919922
grad 3.0 4.0 1.767594337463379
progress: 40 0.0867886021733284
grad 1.0 2.0 -1.172286033630371
grad 2.0 3.0 -0.5953617095947266
grad 3.0 4.0 1.7676029205322266
progress: 41 0.08678944408893585
grad 1.0 2.0 -1.1722850799560547
grad 2.0 3.0 -0.5953578948974609
grad 3.0 4.0 1.767608642578125
progress: 42 0.08679001033306122
grad 1.0 2.0 -1.1722846031188965
grad 2.0 3.0 -0.5953559875488281
grad 3.0 4.0 1.7676143646240234
progress: 43 0.08679056912660599
grad 1.0 2.0 -1.1722841262817383
grad 2.0 3.0 -0.5953540802001953
grad 3.0 4.0 1.7676172256469727
progress: 44 0.08679085224866867
grad 1.0 2.0 -1.17228364944458
grad 2.0 3.0 -0.5953521728515625
grad 3.0 4.0 1.7676200866699219
progress: 45 0.08679113537073135
grad 1.0 2.0 -1.172283411026001
grad 2.0 3.0 -0.5953512191772461
grad 3.0 4.0 1.767622947692871
progress: 46 0.08679141104221344
grad 1.0 2.0 -1.1722831726074219
grad 2.0 3.0 -0.5953502655029297
grad 3.0 4.0 1.7676258087158203
progress: 47 0.08679169416427612
grad 1.0 2.0 -1.1722829341888428
grad 2.0 3.0 -0.5953493118286133
grad 3.0 4.0 1.7676286697387695
progress: 48 0.0867919772863388
grad 1.0 2.0 -1.1722826957702637
grad 2.0 3.0 -0.5953483581542969
grad 3.0 4.0 1.7676286697387695
progress: 49 0.0867919772863388
(the same three grad lines and loss value repeat verbatim from epoch 50 through epoch 99: training has converged)
progress: 99 0.0867919772863388
predict (after training) 4 5.655434608459473
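For reference, the manual w.data update and w.grad.data.zero_() calls above can be replaced by torch.optim.SGD, which performs the update and the gradient zeroing for us. This is a sketch of the equivalent loop on the same toy data, not part of the lecture code:

```python
import torch

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 3.0, 4.0]

w = torch.tensor([1.0], requires_grad=True)
optimizer = torch.optim.SGD([w], lr=0.01)  # same learning rate as the manual loop

def forward(x):
    return x * w

def loss(x, y):
    return (forward(x) - y) ** 2

for epoch in range(100):
    for x, y in zip(x_data, y_data):
        l = loss(x, y)
        optimizer.zero_grad()   # replaces w.grad.data.zero_()
        l.backward()
        optimizer.step()        # replaces w.data = w.data - 0.01 * w.grad.data

print('predict (after training)', 4, forward(4).item())
```

The arithmetic is identical to the manual version, so it converges to the same prediction (about 5.655 for x = 4).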

Exercise

Implement back propagation with PyTorch for the linear model y = w * x.
Code:

import torch

x_data=[1.0,2.0,3.0]
y_data=[2.0,4.0,6.0]

w=torch.tensor([1.0]) # initial weight
w.requires_grad=True  # gradients are not computed by default

def forward(x):
    return  x*w

def loss(x,y): # builds the computational graph
    y_pred=forward(x)
    return (y_pred-y)**2

print('Predict (before training)',4,forward(4))

for epoch in range(100):
    for x,y in zip(x_data,y_data):
        l=loss(x,y)
        l.backward()
        print('\tgrad',x,y,w.grad.item())
        w.data=w.data-0.01*w.grad.data # grad is a tensor, so update through its data
        w.grad.data.zero_() # zero the accumulated gradient
    print('epoch',epoch,l.item())

print('Predict (after training)',4,forward(4).item())

Output:
Predict (before training) 4 tensor([4.], grad_fn=<MulBackward0>)
grad 1.0 2.0 -2.0
grad 2.0 4.0 -7.840000152587891
grad 3.0 6.0 -16.228801727294922
epoch 0 7.315943717956543
grad 1.0 2.0 -1.478623867034912
grad 2.0 4.0 -5.796205520629883
grad 3.0 6.0 -11.998146057128906
epoch 1 3.9987640380859375
grad 1.0 2.0 -1.0931644439697266
grad 2.0 4.0 -4.285204887390137
grad 3.0 6.0 -8.870372772216797
epoch 2 2.1856532096862793
grad 1.0 2.0 -0.8081896305084229
grad 2.0 4.0 -3.1681032180786133
grad 3.0 6.0 -6.557973861694336
epoch 3 1.1946394443511963
grad 1.0 2.0 -0.5975041389465332
grad 2.0 4.0 -2.3422164916992188
grad 3.0 6.0 -4.848389625549316
epoch 4 0.6529689431190491
grad 1.0 2.0 -0.4417421817779541
grad 2.0 4.0 -1.7316293716430664
grad 3.0 6.0 -3.58447265625
epoch 5 0.35690122842788696
grad 1.0 2.0 -0.3265852928161621
grad 2.0 4.0 -1.2802143096923828
grad 3.0 6.0 -2.650045394897461
epoch 6 0.195076122879982
grad 1.0 2.0 -0.24144840240478516
grad 2.0 4.0 -0.9464778900146484
grad 3.0 6.0 -1.9592113494873047
epoch 7 0.10662525147199631
grad 1.0 2.0 -0.17850565910339355
grad 2.0 4.0 -0.699742317199707
grad 3.0 6.0 -1.4484672546386719
epoch 8 0.0582793727517128
grad 1.0 2.0 -0.1319713592529297
grad 2.0 4.0 -0.5173273086547852
grad 3.0 6.0 -1.070866584777832
epoch 9 0.03185431286692619
grad 1.0 2.0 -0.09756779670715332
grad 2.0 4.0 -0.3824653625488281
grad 3.0 6.0 -0.7917022705078125
epoch 10 0.017410902306437492
grad 1.0 2.0 -0.07213282585144043
grad 2.0 4.0 -0.2827606201171875
grad 3.0 6.0 -0.5853137969970703
epoch 11 0.009516451507806778
grad 1.0 2.0 -0.053328514099121094
grad 2.0 4.0 -0.2090473175048828
grad 3.0 6.0 -0.43272972106933594
epoch 12 0.005201528314501047
grad 1.0 2.0 -0.039426326751708984
grad 2.0 4.0 -0.15455150604248047
grad 3.0 6.0 -0.3199195861816406
epoch 13 0.0028430151287466288
grad 1.0 2.0 -0.029148340225219727
grad 2.0 4.0 -0.11426162719726562
grad 3.0 6.0 -0.23652076721191406
epoch 14 0.0015539465239271522
grad 1.0 2.0 -0.021549701690673828
grad 2.0 4.0 -0.08447456359863281
grad 3.0 6.0 -0.17486286163330078
epoch 15 0.0008493617060594261
grad 1.0 2.0 -0.01593184471130371
grad 2.0 4.0 -0.062453269958496094
grad 3.0 6.0 -0.12927818298339844
epoch 16 0.00046424579340964556
grad 1.0 2.0 -0.011778593063354492
grad 2.0 4.0 -0.046172142028808594
grad 3.0 6.0 -0.09557533264160156
epoch 17 0.0002537401160225272
grad 1.0 2.0 -0.00870823860168457
grad 2.0 4.0 -0.03413581848144531
grad 3.0 6.0 -0.07066154479980469
epoch 18 0.00013869594840798527
grad 1.0 2.0 -0.006437778472900391
grad 2.0 4.0 -0.025236129760742188
grad 3.0 6.0 -0.052239418029785156
epoch 19 7.580435340059921e-05
grad 1.0 2.0 -0.004759550094604492
grad 2.0 4.0 -0.018657684326171875
grad 3.0 6.0 -0.038620948791503906
epoch 20 4.143271507928148e-05
grad 1.0 2.0 -0.003518819808959961
grad 2.0 4.0 -0.0137939453125
grad 3.0 6.0 -0.028553009033203125
epoch 21 2.264650902361609e-05
grad 1.0 2.0 -0.00260162353515625
grad 2.0 4.0 -0.010198593139648438
grad 3.0 6.0 -0.021108627319335938
epoch 22 1.2377059647405986e-05
grad 1.0 2.0 -0.0019233226776123047
grad 2.0 4.0 -0.0075397491455078125
grad 3.0 6.0 -0.0156097412109375
epoch 23 6.768445018678904e-06
grad 1.0 2.0 -0.0014221668243408203
grad 2.0 4.0 -0.0055751800537109375
grad 3.0 6.0 -0.011541366577148438
epoch 24 3.7000872907810844e-06
grad 1.0 2.0 -0.0010514259338378906
grad 2.0 4.0 -0.0041217803955078125
grad 3.0 6.0 -0.008531570434570312
epoch 25 2.021880391112063e-06
grad 1.0 2.0 -0.0007772445678710938
grad 2.0 4.0 -0.0030469894409179688
grad 3.0 6.0 -0.006305694580078125
epoch 26 1.1044940038118511e-06
grad 1.0 2.0 -0.0005745887756347656
grad 2.0 4.0 -0.0022525787353515625
grad 3.0 6.0 -0.0046634674072265625
epoch 27 6.041091182851233e-07
grad 1.0 2.0 -0.0004248619079589844
grad 2.0 4.0 -0.0016651153564453125
grad 3.0 6.0 -0.003444671630859375
epoch 28 3.296045179013163e-07
grad 1.0 2.0 -0.0003139972686767578
grad 2.0 4.0 -0.0012311935424804688
grad 3.0 6.0 -0.0025491714477539062
epoch 29 1.805076408345485e-07
grad 1.0 2.0 -0.00023221969604492188
grad 2.0 4.0 -0.0009107589721679688
grad 3.0 6.0 -0.0018854141235351562
epoch 30 9.874406714516226e-08
grad 1.0 2.0 -0.00017189979553222656
grad 2.0 4.0 -0.0006742477416992188
grad 3.0 6.0 -0.00139617919921875
epoch 31 5.4147676564753056e-08
grad 1.0 2.0 -0.0001270771026611328
grad 2.0 4.0 -0.0004978179931640625
grad 3.0 6.0 -0.00102996826171875
epoch 32 2.9467628337442875e-08
grad 1.0 2.0 -9.393692016601562e-05
grad 2.0 4.0 -0.0003681182861328125
grad 3.0 6.0 -0.0007610321044921875
epoch 33 1.6088051779661328e-08
grad 1.0 2.0 -6.937980651855469e-05
grad 2.0 4.0 -0.00027179718017578125
grad 3.0 6.0 -0.000560760498046875
epoch 34 8.734787115827203e-09
grad 1.0 2.0 -5.125999450683594e-05
grad 2.0 4.0 -0.00020122528076171875
grad 3.0 6.0 -0.0004177093505859375
epoch 35 4.8466972657479346e-09
grad 1.0 2.0 -3.790855407714844e-05
grad 2.0 4.0 -0.000148773193359375
grad 3.0 6.0 -0.000308990478515625
epoch 36 2.6520865503698587e-09
grad 1.0 2.0 -2.8133392333984375e-05
grad 2.0 4.0 -0.000110626220703125
grad 3.0 6.0 -0.0002288818359375
epoch 37 1.4551915228366852e-09
grad 1.0 2.0 -2.09808349609375e-05
grad 2.0 4.0 -8.20159912109375e-05
grad 3.0 6.0 -0.00016880035400390625
epoch 38 7.914877642178908e-10
grad 1.0 2.0 -1.5497207641601562e-05
grad 2.0 4.0 -6.103515625e-05
grad 3.0 6.0 -0.000125885009765625
epoch 39 4.4019543565809727e-10
grad 1.0 2.0 -1.1444091796875e-05
grad 2.0 4.0 -4.482269287109375e-05
grad 3.0 6.0 -9.1552734375e-05
epoch 40 2.3283064365386963e-10
grad 1.0 2.0 -8.344650268554688e-06
grad 2.0 4.0 -3.24249267578125e-05
grad 3.0 6.0 -6.580352783203125e-05
epoch 41 1.2028067430946976e-10
grad 1.0 2.0 -5.9604644775390625e-06
grad 2.0 4.0 -2.288818359375e-05
grad 3.0 6.0 -4.57763671875e-05
epoch 42 5.820766091346741e-11
grad 1.0 2.0 -4.291534423828125e-06
grad 2.0 4.0 -1.71661376953125e-05
grad 3.0 6.0 -3.719329833984375e-05
epoch 43 3.842615114990622e-11
grad 1.0 2.0 -3.337860107421875e-06
grad 2.0 4.0 -1.33514404296875e-05
grad 3.0 6.0 -2.86102294921875e-05
epoch 44 2.2737367544323206e-11
grad 1.0 2.0 -2.6226043701171875e-06
grad 2.0 4.0 -1.049041748046875e-05
grad 3.0 6.0 -2.288818359375e-05
epoch 45 1.4551915228366852e-11
grad 1.0 2.0 -1.9073486328125e-06
grad 2.0 4.0 -7.62939453125e-06
grad 3.0 6.0 -1.430511474609375e-05
epoch 46 5.6843418860808015e-12
grad 1.0 2.0 -1.430511474609375e-06
grad 2.0 4.0 -5.7220458984375e-06
grad 3.0 6.0 -1.1444091796875e-05
epoch 47 3.637978807091713e-12
grad 1.0 2.0 -1.1920928955078125e-06
grad 2.0 4.0 -4.76837158203125e-06
grad 3.0 6.0 -1.1444091796875e-05
epoch 48 3.637978807091713e-12
grad 1.0 2.0 -9.5367431640625e-07
grad 2.0 4.0 -3.814697265625e-06
grad 3.0 6.0 -8.58306884765625e-06
epoch 49 2.0463630789890885e-12
grad 1.0 2.0 -7.152557373046875e-07
grad 2.0 4.0 -2.86102294921875e-06
grad 3.0 6.0 -5.7220458984375e-06
epoch 50 9.094947017729282e-13
(the same three grad lines and loss value repeat verbatim from epoch 51 through epoch 99: training has converged)
epoch 99 9.094947017729282e-13
Predict(after training) 4 7.999998569488525
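A quick illustration of why w.grad must be zeroed in the loops above: backward() accumulates into .grad rather than overwriting it, so skipping zero_() sums gradients across iterations. A minimal sketch:

```python
import torch

w = torch.tensor([1.0], requires_grad=True)

l = (w * 2 - 4) ** 2    # loss at x=2, y=4; dl/dw = 2*(2w-4)*2 = -8 at w=1
l.backward()
print(w.grad.item())    # -8.0

# Calling backward on a fresh graph WITHOUT zeroing: the new gradient
# is added on top of the old one.
l = (w * 2 - 4) ** 2
l.backward()
print(w.grad.item())    # -16.0, i.e. -8 accumulated onto -8

w.grad.data.zero_()     # reset, as done after every update in the loops above
print(w.grad.item())    # 0.0
```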
1. Manually derive the back propagation process for the linear model y = w * x with loss = (ŷ - y)², at the data point x = 2, y = 4.
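A worked version of this derivation, assuming the initial weight w = 1 as in the code above:

```latex
% Forward pass (w = 1, x = 2, y = 4):
\hat{y} = w \cdot x = 1 \times 2 = 2, \qquad
\mathrm{loss} = (\hat{y} - y)^2 = (2 - 4)^2 = 4

% Backward pass (chain rule):
\frac{\partial \mathrm{loss}}{\partial \hat{y}} = 2(\hat{y} - y) = -4, \qquad
\frac{\partial \hat{y}}{\partial w} = x = 2

\frac{\partial \mathrm{loss}}{\partial w}
  = \frac{\partial \mathrm{loss}}{\partial \hat{y}}
    \cdot \frac{\partial \hat{y}}{\partial w}
  = -4 \times 2 = -8
```

So with learning rate 0.01, the first update would be w ← 1 − 0.01 × (−8) = 1.08.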
