PyTorch Deep Learning Practice, Lecture 3: Gradient Descent Algorithm

Lecture 3: Gradient Descent
In deep learning, loss surfaces do not actually contain that many troublesome local minima; the harder problem is avoiding saddle points.

Task: implement gradient descent by hand and, on the x_data / y_data training set, find a suitable weight w for the model y = w*x.
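For reference, these are the quantities the code below implements: the MSE cost, its gradient with respect to w, and the gradient descent update rule (learning rate α = 0.01):

$$\mathrm{cost}(w)=\frac{1}{N}\sum_{n=1}^{N}\left(w\,x_n-y_n\right)^2,\qquad \frac{\partial\,\mathrm{cost}}{\partial w}=\frac{1}{N}\sum_{n=1}^{N}2\,x_n\left(w\,x_n-y_n\right),\qquad w\leftarrow w-\alpha\,\frac{\partial\,\mathrm{cost}}{\partial w}$$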
The code is as follows:

import numpy as np
import matplotlib.pyplot as plt

#prepare the training set
#note: this data is not exactly y=2x, so the least-squares optimum is
#w = sum(x*y)/sum(x*x) = 20/14 = 10/7 ≈ 1.4286, the value the log below converges to
x_data=[1.0,2.0,3.0]
y_data=[2.0,3.0,4.0]

#initial guess of weight
w=1.0

#define the linear model y=w*x
def forward(x):
    return x*w

#define the cost function: MSE averaged over the whole training set
def cost(xs,ys):
    cost=0
    for x,y in zip(xs,ys):
        y_pred=forward(x)
        cost+=(y_pred-y)**2
    return cost/len(xs)

# define the gradient of the cost with respect to w
def gradient(xs,ys):
    grad=0
    for x,y in zip(xs,ys):
        grad+=2*x*(x*w-y)
    return grad/len(xs)

epoch_list=[]
cost_list=[]
print('predict(before training)',4,forward(4))
for epoch in range(100): # training loop: 100 epochs
    cost_val=cost(x_data,y_data)
    grad_val=gradient(x_data,y_data) # gradient averaged over the whole training set
    w-=0.01*grad_val  # gradient descent step; 0.01 is the learning rate
    print('epoch:',epoch,'w=',w,'loss=',cost_val) # log progress
    epoch_list.append(epoch)
    cost_list.append(cost_val)

# test the model on an unseen input (x=4) after training
print('predict(after training)',4,forward(4))
plt.plot(epoch_list,cost_list)
plt.ylabel('cost')
plt.xlabel('epoch')
plt.show()
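An aside: numpy is imported above but never actually used. If you wanted to vectorize, a numpy version of cost() and gradient() (my sketch, equivalent to the loops above) could look like this:

xs=np.array(x_data)
ys=np.array(y_data)

def cost_vec(w):
    # mean squared error over the whole training set
    return np.mean((w*xs-ys)**2)

def gradient_vec(w):
    # d(cost)/dw = mean of 2*x*(w*x - y)
    return np.mean(2*xs*(w*xs-ys))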

Result:
predict(before training) 4 4.0
epoch: 0 w= 1.04 loss= 1.0
epoch: 1 w= 1.0762666666666667 loss= 0.8474666666666666
epoch: 2 w= 1.1091484444444444 loss= 0.7220774874074073
epoch: 3 w= 1.1389612562962963 loss= 0.6190020092038849
epoch: 4 w= 1.1659915390419753 loss= 0.5342693849882246
epoch: 5 w= 1.1904989953980576 loss= 0.46461540198854245
epoch: 6 w= 1.2127190891609057 loss= 0.4073567322302258
epoch: 7 w= 1.2328653075058877 loss= 0.3602875608591223
epoch: 8 w= 1.2511312121386715 loss= 0.32159461002890344
epoch: 9 w= 1.2676922990057289 loss= 0.28978728475975996
epoch: 10 w= 1.2827076844318608 loss= 0.2636402497296231
epoch: 11 w= 1.296321633884887 loss= 0.24214622484440493
epoch: 12 w= 1.308664948055631 loss= 0.2244771810987606
epoch: 13 w= 1.3198562195704386 loss= 0.20995244184900785
epoch: 14 w= 1.3300029724105311 loss= 0.1980124606417445
epoch: 15 w= 1.3392026949855482 loss= 0.18819726542354243
epoch: 16 w= 1.347543776786897 loss= 0.1801287387232818
epoch: 17 w= 1.3551063576201199 loss= 0.17349605117448097
epoch: 18 w= 1.3619630975755754 loss= 0.16804368722325336
epoch: 19 w= 1.3681798751351884 loss= 0.16356160172805748
epoch: 20 w= 1.3738164201225709 loss= 0.1598771282472067
epoch: 21 w= 1.3789268875777976 loss= 0.15684832729157042
epoch: 22 w= 1.3835603780705366 loss= 0.15435851829266162
epoch: 23 w= 1.3877614094506199 loss= 0.1523117846373808
epoch: 24 w= 1.391570344568562 loss= 0.15062927860679973
epoch: 25 w= 1.3950237790754962 loss= 0.1492461838716164
epoch: 26 w= 1.39815489302845 loss= 0.1481092185284186
epoch: 27 w= 1.400993769679128 loss= 0.1471745824845168
epoch: 28 w= 1.4035676845090759 loss= 0.1464062701170499
epoch: 29 w= 1.4059013672882288 loss= 0.14577468320377582
epoch: 30 w= 1.4080172396746609 loss= 0.1452554906905351
epoch: 31 w= 1.4099356306383592 loss= 0.14482869136942825
epoch: 32 w= 1.411674971778779 loss= 0.14447784335861977
epoch: 33 w= 1.4132519744127596 loss= 0.14418943070049028
epoch: 34 w= 1.4146817901342355 loss= 0.14395234267716753
epoch: 35 w= 1.4159781563883735 loss= 0.14375744578475066
epoch: 36 w= 1.417153528458792 loss= 0.14359723187709994
epoch: 37 w= 1.4182191991359714 loss= 0.14346552892439282
epoch: 38 w= 1.4191854072166141 loss= 0.14335726324380313
epoch: 39 w= 1.4200614358763968 loss= 0.14326826404255036
epoch: 40 w= 1.4208557018612664 loss= 0.1431951027436005
epoch: 41 w= 1.4215758363542148 loss= 0.14313496090425046
epoch: 42 w= 1.4222287582944881 loss= 0.143085521639334
epoch: 43 w= 1.4228207408536693 loss= 0.14304488036627214
epoch: 44 w= 1.4233574717073267 loss= 0.14301147143353632
epoch: 45 w= 1.4238441076813095 loss= 0.1429840078059862
epoch: 46 w= 1.4242853242977207 loss= 0.14296143148353416
epoch: 47 w= 1.4246853606966001 loss= 0.14294287274308667
epoch: 48 w= 1.4250480603649174 loss= 0.14292761663360584
epoch: 49 w= 1.4253769080641918 loss= 0.14291507543356333
epoch: 50 w= 1.4256750633115338 loss= 0.14290476600974159
epoch: 51 w= 1.4259453907357906 loss= 0.14289629120516356
epoch: 52 w= 1.42619048760045 loss= 0.1428893245391424
epoch: 53 w= 1.4264127087577414 loss= 0.14288359763004355
epoch: 54 w= 1.4266141892736854 loss= 0.14287888985623487
epoch: 55 w= 1.4267968649414748 loss= 0.14287501985692977
epoch: 56 w= 1.4269624908802705 loss= 0.14287183854550112
epoch: 57 w= 1.427112658398112 loss= 0.14286922336611504
epoch: 58 w= 1.4272488102809548 loss= 0.14286707357242942
epoch: 59 w= 1.4273722546547325 loss= 0.14286530634647357
epoch: 60 w= 1.4274841775536242 loss= 0.14286385360819437
epoch: 61 w= 1.427585654315286 loss= 0.1428626593927629
epoch: 62 w= 1.427677659912526 loss= 0.14286167769460187
epoch: 63 w= 1.4277610783206902 loss= 0.14286087069508258
epoch: 64 w= 1.427836711010759 loss= 0.14286020730561083
epoch: 65 w= 1.4279052846497549 loss= 0.14285966196998137
epoch: 66 w= 1.4279674580824444 loss= 0.14285921367985657
epoch: 67 w= 1.4280238286614162 loss= 0.1428588451654501
epoch: 68 w= 1.4280749379863507 loss= 0.14285854223022956
epoch: 69 w= 1.4281212771076246 loss= 0.14285829320401458
epoch: 70 w= 1.4281632912442463 loss= 0.14285808849339784
epoch: 71 w= 1.42820138406145 loss= 0.1428579202121728
epoch: 72 w= 1.428235921549048 loss= 0.1428577818775266
epoch: 73 w= 1.4282672355378037 loss= 0.14285766816029924
epoch: 74 w= 1.4282956268876086 loss= 0.14285757467968418
epoch: 75 w= 1.4283213683780984 loss= 0.14285749783446386
epoch: 76 w= 1.428344707329476 loss= 0.14285743466427758
epoch: 77 w= 1.4283658679787248 loss= 0.14285738273557677
epoch: 78 w= 1.4283850536340439 loss= 0.14285734004787679
epoch: 79 w= 1.4284024486281999 loss= 0.14285730495669027
epoch: 80 w= 1.4284182200895679 loss= 0.14285727611017524
epoch: 81 w= 1.4284325195478749 loss= 0.14285725239705782
epoch: 82 w= 1.4284454843900731 loss= 0.14285723290382138
epoch: 83 w= 1.428457239180333 loss= 0.14285721687951472
epoch: 84 w= 1.4284678968568352 loss= 0.14285720370682234
epoch: 85 w= 1.428477559816864 loss= 0.1428571928782838
epoch: 86 w= 1.4284863209006233 loss= 0.1428571839767439
epoch: 87 w= 1.428494264283232 loss= 0.14285717665928255
epoch: 88 w= 1.4285014662834636 loss= 0.14285717064400386
epoch: 89 w= 1.428507996097007 loss= 0.14285716569917753
epoch: 90 w= 1.4285139164612863 loss= 0.1428571616343106
epoch: 91 w= 1.428519284258233 loss= 0.14285715829280934
epoch: 92 w= 1.428524151060798 loss= 0.1428571555459468
epoch: 93 w= 1.4285285636284568 loss= 0.14285715328790358
epoch: 94 w= 1.4285325643564675 loss= 0.14285715143169175
epoch: 95 w= 1.4285361916831971 loss= 0.14285714990580306
epoch: 96 w= 1.428539480459432 loss= 0.14285714865145496
epoch: 97 w= 1.4285424622832184 loss= 0.14285714762032486
epoch: 98 w= 1.4285451658034514 loss= 0.1428571467726902
epoch: 99 w= 1.4285476169951292 loss= 0.14285714607589672
predict(after training) 4 5.714190467980517
The loss decreases steadily and converges toward the least-squares optimum (w = 10/7 ≈ 1.4286, loss = 1/7 ≈ 0.1429).
Cost curve:
[figure: cost vs. epoch]
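A caveat worth adding (my note, not part of the original lecture): the learning rate 0.01 is small enough to converge here, but it is not arbitrary. For this data each batch update scales the error in w by 1 - (28/3)*lr, so any learning rate above 3/14 ≈ 0.21 overshoots and diverges. A quick check:

w=1.0 # restart from the initial guess, but with lr=0.5
for epoch in range(5):
    grad=sum(2*x*(x*w-y) for x,y in zip(x_data,y_data))/len(x_data)
    w-=0.5*grad # step too large: w oscillates with growing amplitude
    print(epoch,w) # 3.0, -4.33..., 22.55..., -76.03..., ...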
This addresses a problem batch gradient descent can run into, e.g. at a saddle point: the gradient summed over all training points is 0, so w = w - 0.01*0 and w never changes, and training cannot proceed. The remedy is stochastic gradient descent (SGD): use the gradient of a single randomly chosen sample (x, y) as the descent direction, instead of the gradient summed over all points.
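A toy illustration of that cancellation (my own example, not from the lecture): per-sample gradients can be nonzero even where the batch gradient sums to exactly zero, so the batch update stops while per-sample updates would still move w.

# per-sample gradients of (w*x-y)**2 at a point where the batch gradient is 0
xs=[1.0,2.0]
ys=[3.0,3.0]
w=1.8 # = sum(x*y)/sum(x*x), where the summed gradient vanishes

grads=[2*x*(w*x-y) for x,y in zip(xs,ys)]
print(grads)      # [-2.4, 2.4]: each sample still pulls on w
print(sum(grads)) # 0.0: the batch update w-=0.01*0 changes nothing

(In one dimension this zero-gradient point is just the minimum; in higher dimensions the same cancellation happens at saddle points, which is where it actually hurts.)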

Stochastic Gradient Descent
Stochastic gradient descent has proven effective for training neural networks. It is less efficient (higher cost in practice, since the per-sample updates run sequentially), but it tends to achieve better learning performance.
The main differences between stochastic gradient descent and batch gradient descent:
1. The cost function cost() is replaced by loss(): cost() computes the loss over all training data, while loss() computes the loss of a single sample. In the code this removes the two for loops inside the helper functions.
2. The gradient function gradient() computes the gradient of a single training sample instead of the average gradient over all training data.
3. "Stochastic" here means each update uses one training sample at a time. With 3 samples and 100 epochs, SGD performs 100 × 3 = 300 weight updates, whereas batch gradient descent performs 100.

The code is as follows:

import matplotlib.pyplot as plt

x_data=[1.0,2.0,3.0]
y_data=[2.0,4.0,6.0]

w=1.0 # initial guess of weight

def forward(x):
    return x*w

#loss of a single sample (cf. cost(), which averaged over the whole set)
def loss(x,y):
    y_pred=forward(x) # prediction for one sample only
    return (y_pred-y)**2

#stochastic gradient: the gradient of a single sample's loss
def gradient(x,y):
    return 2*x*(x*w-y)

epoch_list=[]
loss_list=[]
print('predict(before training)',4,forward(4))
for epoch in range(100):
    for x,y in zip(x_data,y_data):
        grad=gradient(x,y) # gradient of this one sample
        w=w-0.01*grad # update w immediately, once per sample
        print("\tgrad:",x,y,grad)
        l=loss(x,y)
    print("progress:",epoch,"w=",w,"loss=",l)
    epoch_list.append(epoch)
    loss_list.append(l)

print('predict (after training)',4,forward(4))
plt.plot(epoch_list,loss_list)
plt.ylabel('loss')
plt.xlabel('epoch')
plt.show()

Training result:
predict(before training) 4 4.0
grad: 1.0 2.0 -2.0
grad: 2.0 4.0 -7.84
grad: 3.0 6.0 -16.2288
progress: 0 w= 1.260688 loss= 4.919240100095999
grad: 1.0 2.0 -1.478624
grad: 2.0 4.0 -5.796206079999999
grad: 3.0 6.0 -11.998146585599997
progress: 1 w= 1.453417766656 loss= 2.688769240265834
grad: 1.0 2.0 -1.093164466688
grad: 2.0 4.0 -4.285204709416961
grad: 3.0 6.0 -8.87037374849311
progress: 2 w= 1.5959051959019805 loss= 1.4696334962911515
grad: 1.0 2.0 -0.8081896081960389
grad: 2.0 4.0 -3.1681032641284723
grad: 3.0 6.0 -6.557973756745939
progress: 3 w= 1.701247862192685 loss= 0.8032755585999681
grad: 1.0 2.0 -0.59750427561463
grad: 2.0 4.0 -2.3422167604093502
grad: 3.0 6.0 -4.848388694047353
progress: 4 w= 1.7791289594933983 loss= 0.43905614881022015
grad: 1.0 2.0 -0.44174208101320334
grad: 2.0 4.0 -1.7316289575717576
grad: 3.0 6.0 -3.584471942173538
progress: 5 w= 1.836707389300983 loss= 0.2399802903801062
grad: 1.0 2.0 -0.3265852213980338
grad: 2.0 4.0 -1.2802140678802925
grad: 3.0 6.0 -2.650043120512205
progress: 6 w= 1.8792758133988885 loss= 0.1311689630744999
grad: 1.0 2.0 -0.241448373202223
grad: 2.0 4.0 -0.946477622952715
grad: 3.0 6.0 -1.9592086795121197
progress: 7 w= 1.910747160155559 loss= 0.07169462478267678
grad: 1.0 2.0 -0.17850567968888198
grad: 2.0 4.0 -0.6997422643804168
grad: 3.0 6.0 -1.4484664872674653
progress: 8 w= 1.9340143044689266 loss= 0.03918700813247573
grad: 1.0 2.0 -0.13197139106214673
grad: 2.0 4.0 -0.5173278529636143
grad: 3.0 6.0 -1.0708686556346834
progress: 9 w= 1.9512159834655312 loss= 0.021418922423117836
grad: 1.0 2.0 -0.09756803306893769
grad: 2.0 4.0 -0.38246668963023644
grad: 3.0 6.0 -0.7917060475345892
progress: 10 w= 1.9639333911678687 loss= 0.01170720245384975
grad: 1.0 2.0 -0.07213321766426262
grad: 2.0 4.0 -0.2827622132439096
grad: 3.0 6.0 -0.5853177814148953
progress: 11 w= 1.9733355232910992 loss= 0.006398948863435593
grad: 1.0 2.0 -0.05332895341780164
grad: 2.0 4.0 -0.2090494973977819
grad: 3.0 6.0 -0.4327324596134101
progress: 12 w= 1.9802866323953892 loss= 0.003497551760830656
grad: 1.0 2.0 -0.039426735209221686
grad: 2.0 4.0 -0.15455280202014876
grad: 3.0 6.0 -0.3199243001817109
progress: 13 w= 1.9854256707695 loss= 0.001911699652671057
grad: 1.0 2.0 -0.02914865846100012
grad: 2.0 4.0 -0.11426274116712065
grad: 3.0 6.0 -0.2365238742159388
progress: 14 w= 1.9892250235079405 loss= 0.0010449010656399273
grad: 1.0 2.0 -0.021549952984118992
grad: 2.0 4.0 -0.08447581569774698
grad: 3.0 6.0 -0.17486493849433593
progress: 15 w= 1.9920339305797026 loss= 0.0005711243580809696
grad: 1.0 2.0 -0.015932138840594856
grad: 2.0 4.0 -0.062453984255132156
grad: 3.0 6.0 -0.12927974740812687
progress: 16 w= 1.994110589284741 loss= 0.0003121664271570621
grad: 1.0 2.0 -0.011778821430517894
grad: 2.0 4.0 -0.046172980007630926
grad: 3.0 6.0 -0.09557806861579543
progress: 17 w= 1.9956458879852805 loss= 0.0001706246229305199
grad: 1.0 2.0 -0.008708224029438938
grad: 2.0 4.0 -0.03413623819540135
grad: 3.0 6.0 -0.07066201306448505
progress: 18 w= 1.9967809527381737 loss= 9.326038746484765e-05
grad: 1.0 2.0 -0.006438094523652627
grad: 2.0 4.0 -0.02523733053271826
grad: 3.0 6.0 -0.052241274202728505
progress: 19 w= 1.9976201197307648 loss= 5.097447086306101e-05
grad: 1.0 2.0 -0.004759760538470381
grad: 2.0 4.0 -0.01865826131080439
grad: 3.0 6.0 -0.03862260091336722
progress: 20 w= 1.998240525958391 loss= 2.7861740127856012e-05
grad: 1.0 2.0 -0.0035189480832178432
grad: 2.0 4.0 -0.01379427648621423
grad: 3.0 6.0 -0.028554152326460525
progress: 21 w= 1.99869919972735 loss= 1.5228732143933469e-05
grad: 1.0 2.0 -0.002601600545300009
grad: 2.0 4.0 -0.01019827413757568
grad: 3.0 6.0 -0.021110427464781978
progress: 22 w= 1.9990383027488265 loss= 8.323754426231206e-06
grad: 1.0 2.0 -0.001923394502346909
grad: 2.0 4.0 -0.007539706449199102
grad: 3.0 6.0 -0.01560719234984198
progress: 23 w= 1.9992890056818404 loss= 4.549616284094891e-06
grad: 1.0 2.0 -0.0014219886363191492
grad: 2.0 4.0 -0.005574195454370212
grad: 3.0 6.0 -0.011538584590544687
progress: 24 w= 1.999474353368653 loss= 2.486739429417538e-06
grad: 1.0 2.0 -0.0010512932626940419
grad: 2.0 4.0 -0.004121069589761106
grad: 3.0 6.0 -0.008530614050808794
progress: 25 w= 1.9996113831376856 loss= 1.3592075910762856e-06
grad: 1.0 2.0 -0.0007772337246287897
grad: 2.0 4.0 -0.0030467562005451754
grad: 3.0 6.0 -0.006306785335127074
progress: 26 w= 1.9997126908902887 loss= 7.429187207079447e-07
grad: 1.0 2.0 -0.0005746182194226179
grad: 2.0 4.0 -0.002252503420136165
grad: 3.0 6.0 -0.00466268207967957
progress: 27 w= 1.9997875889274812 loss= 4.060661735575354e-07
grad: 1.0 2.0 -0.0004248221450375844
grad: 2.0 4.0 -0.0016653028085471533
grad: 3.0 6.0 -0.0034471768136938863
progress: 28 w= 1.9998429619451539 loss= 2.2194855602869353e-07
grad: 1.0 2.0 -0.00031407610969225175
grad: 2.0 4.0 -0.0012311783499932005
grad: 3.0 6.0 -0.0025485391844828342
progress: 29 w= 1.9998838998815958 loss= 1.213131374411496e-07
grad: 1.0 2.0 -0.00023220023680847746
grad: 2.0 4.0 -0.0009102249282886277
grad: 3.0 6.0 -0.0018841656015560204
progress: 30 w= 1.9999141657892625 loss= 6.630760559646474e-08
grad: 1.0 2.0 -0.00017166842147497974
grad: 2.0 4.0 -0.0006729402121816719
grad: 3.0 6.0 -0.0013929862392156878
progress: 31 w= 1.9999365417379913 loss= 3.624255915449335e-08
grad: 1.0 2.0 -0.0001269165240174175
grad: 2.0 4.0 -0.0004975127741477792
grad: 3.0 6.0 -0.0010298514424817995
progress: 32 w= 1.9999530845453979 loss= 1.9809538924707548e-08
grad: 1.0 2.0 -9.383090920422887e-05
grad: 2.0 4.0 -0.00036781716408107457
grad: 3.0 6.0 -0.0007613815296476645
progress: 33 w= 1.9999653148414271 loss= 1.0827542027017377e-08
grad: 1.0 2.0 -6.937031714571162e-05
grad: 2.0 4.0 -0.0002719316432120422
grad: 3.0 6.0 -0.0005628985014531906
progress: 34 w= 1.999974356846045 loss= 5.9181421028034105e-09
grad: 1.0 2.0 -5.1286307909848006e-05
grad: 2.0 4.0 -0.00020104232700646207
grad: 3.0 6.0 -0.0004161576169003922
progress: 35 w= 1.9999810417085633 loss= 3.2347513278475087e-09
grad: 1.0 2.0 -3.7916582873442906e-05
grad: 2.0 4.0 -0.0001486330048638962
grad: 3.0 6.0 -0.0003076703200690645
progress: 36 w= 1.9999859839076413 loss= 1.7680576050779005e-09
grad: 1.0 2.0 -2.8032184717474706e-05
grad: 2.0 4.0 -0.0001098861640933535
grad: 3.0 6.0 -0.00022746435967313516
progress: 37 w= 1.9999896377347262 loss= 9.6638887447731e-10
grad: 1.0 2.0 -2.0724530547688857e-05
grad: 2.0 4.0 -8.124015974608767e-05
grad: 3.0 6.0 -0.00016816713067413502
progress: 38 w= 1.999992339052936 loss= 5.282109892545845e-10
grad: 1.0 2.0 -1.5321894128117464e-05
grad: 2.0 4.0 -6.006182498197177e-05
grad: 3.0 6.0 -0.00012432797771566584
progress: 39 w= 1.9999943361699042 loss= 2.887107421958329e-10
grad: 1.0 2.0 -1.1327660191629008e-05
grad: 2.0 4.0 -4.4404427951505454e-05
grad: 3.0 6.0 -9.191716585732479e-05
progress: 40 w= 1.9999958126624442 loss= 1.5780416225633037e-10
grad: 1.0 2.0 -8.37467511161094e-06
grad: 2.0 4.0 -3.282872643772805e-05
grad: 3.0 6.0 -6.795546372551087e-05
progress: 41 w= 1.999996904251097 loss= 8.625295142578772e-11
grad: 1.0 2.0 -6.191497806007362e-06
grad: 2.0 4.0 -2.4270671399762023e-05
grad: 3.0 6.0 -5.0240289795056015e-05
progress: 42 w= 1.999997711275687 loss= 4.71443308235547e-11
grad: 1.0 2.0 -4.5774486259198e-06
grad: 2.0 4.0 -1.794359861406747e-05
grad: 3.0 6.0 -3.714324913239864e-05
progress: 43 w= 1.9999983079186507 loss= 2.5768253628059826e-11
grad: 1.0 2.0 -3.3841626985164908e-06
grad: 2.0 4.0 -1.326591777761621e-05
grad: 3.0 6.0 -2.7460449796734565e-05
progress: 44 w= 1.9999987490239537 loss= 1.4084469615916932e-11
grad: 1.0 2.0 -2.5019520926150562e-06
grad: 2.0 4.0 -9.807652203264183e-06
grad: 3.0 6.0 -2.0301840059744336e-05
progress: 45 w= 1.9999990751383971 loss= 7.698320862431846e-12
grad: 1.0 2.0 -1.8497232057157476e-06
grad: 2.0 4.0 -7.250914967116273e-06
grad: 3.0 6.0 -1.5009393983689279e-05
progress: 46 w= 1.9999993162387186 loss= 4.20776540913866e-12
grad: 1.0 2.0 -1.3675225627451937e-06
grad: 2.0 4.0 -5.3606884460322135e-06
grad: 3.0 6.0 -1.109662508014253e-05
progress: 47 w= 1.9999994944870796 loss= 2.299889814334344e-12
grad: 1.0 2.0 -1.0110258408246864e-06
grad: 2.0 4.0 -3.963221296032771e-06
grad: 3.0 6.0 -8.20386808086937e-06
progress: 48 w= 1.9999996262682318 loss= 1.2570789110540446e-12
grad: 1.0 2.0 -7.474635363990956e-07
grad: 2.0 4.0 -2.930057062755509e-06
grad: 3.0 6.0 -6.065218119744031e-06
progress: 49 w= 1.999999723695619 loss= 6.870969979249939e-13
grad: 1.0 2.0 -5.526087618612507e-07
grad: 2.0 4.0 -2.166226346744793e-06
grad: 3.0 6.0 -4.484088535150477e-06
progress: 50 w= 1.9999997957248556 loss= 3.7555501141274804e-13
grad: 1.0 2.0 -4.08550288710785e-07
grad: 2.0 4.0 -1.6015171322436572e-06
grad: 3.0 6.0 -3.3151404608133817e-06
progress: 51 w= 1.9999998489769344 loss= 2.052716967104274e-13
grad: 1.0 2.0 -3.020461312175371e-07
grad: 2.0 4.0 -1.1840208351543424e-06
grad: 3.0 6.0 -2.4509231284497446e-06
progress: 52 w= 1.9999998883468353 loss= 1.1219786256679713e-13
grad: 1.0 2.0 -2.2330632942768602e-07
grad: 2.0 4.0 -8.753608113920563e-07
grad: 3.0 6.0 -1.811996877876254e-06
progress: 53 w= 1.9999999174534755 loss= 6.132535848018759e-14
grad: 1.0 2.0 -1.6509304900935717e-07
grad: 2.0 4.0 -6.471647520100987e-07
grad: 3.0 6.0 -1.3396310407642886e-06
progress: 54 w= 1.999999938972364 loss= 3.351935118167793e-14
grad: 1.0 2.0 -1.220552721115098e-07
grad: 2.0 4.0 -4.784566662863199e-07
grad: 3.0 6.0 -9.904052991061008e-07
progress: 55 w= 1.9999999548815364 loss= 1.8321081844499955e-14
grad: 1.0 2.0 -9.023692726373156e-08
grad: 2.0 4.0 -3.5372875473171916e-07
grad: 3.0 6.0 -7.322185204827747e-07
progress: 56 w= 1.9999999666433785 loss= 1.0013977760018664e-14
grad: 1.0 2.0 -6.671324292994996e-08
grad: 2.0 4.0 -2.615159129248923e-07
grad: 3.0 6.0 -5.413379398078177e-07
progress: 57 w= 1.9999999753390494 loss= 5.473462367088053e-15
grad: 1.0 2.0 -4.932190122985958e-08
grad: 2.0 4.0 -1.9334185274999527e-07
grad: 3.0 6.0 -4.002176350326181e-07
progress: 58 w= 1.9999999817678633 loss= 2.991697274308627e-15
grad: 1.0 2.0 -3.6464273378555845e-08
grad: 2.0 4.0 -1.429399514307761e-07
grad: 3.0 6.0 -2.9588569994132286e-07
progress: 59 w= 1.9999999865207625 loss= 1.6352086111474931e-15
grad: 1.0 2.0 -2.6958475007887728e-08
grad: 2.0 4.0 -1.0567722164012139e-07
grad: 3.0 6.0 -2.1875184863517916e-07
progress: 60 w= 1.999999990034638 loss= 8.937759877335403e-16
grad: 1.0 2.0 -1.993072418216002e-08
grad: 2.0 4.0 -7.812843882959442e-08
grad: 3.0 6.0 -1.617258700292723e-07
progress: 61 w= 1.9999999926324883 loss= 4.885220495987371e-16
grad: 1.0 2.0 -1.473502342363986e-08
grad: 2.0 4.0 -5.7761292637792394e-08
grad: 3.0 6.0 -1.195658771990793e-07
progress: 62 w= 1.99999999455311 loss= 2.670175009618106e-16
grad: 1.0 2.0 -1.0893780100218464e-08
grad: 2.0 4.0 -4.270361841918202e-08
grad: 3.0 6.0 -8.839649012770678e-08
progress: 63 w= 1.9999999959730488 loss= 1.4594702493172377e-16
grad: 1.0 2.0 -8.05390243385773e-09
grad: 2.0 4.0 -3.1571296688071016e-08
grad: 3.0 6.0 -6.53525820126788e-08
progress: 64 w= 1.9999999970228268 loss= 7.977204100704301e-17
grad: 1.0 2.0 -5.9543463493128e-09
grad: 2.0 4.0 -2.334103754719763e-08
grad: 3.0 6.0 -4.8315948575350376e-08
progress: 65 w= 1.9999999977989402 loss= 4.360197735196887e-17
grad: 1.0 2.0 -4.402119557767037e-09
grad: 2.0 4.0 -1.725630838222969e-08
grad: 3.0 6.0 -3.5720557178819945e-08
progress: 66 w= 1.9999999983727301 loss= 2.3832065197304227e-17
grad: 1.0 2.0 -3.254539748809293e-09
grad: 2.0 4.0 -1.2757796596929438e-08
grad: 3.0 6.0 -2.6408640607655798e-08
progress: 67 w= 1.9999999987969397 loss= 1.3026183953845832e-17
grad: 1.0 2.0 -2.406120636067044e-09
grad: 2.0 4.0 -9.431992964437086e-09
grad: 3.0 6.0 -1.9524227568012975e-08
progress: 68 w= 1.999999999110563 loss= 7.11988308874388e-18
grad: 1.0 2.0 -1.7788739370416806e-09
grad: 2.0 4.0 -6.97318647269185e-09
grad: 3.0 6.0 -1.4434496264925656e-08
progress: 69 w= 1.9999999993424284 loss= 3.89160224698574e-18
grad: 1.0 2.0 -1.3151431055291596e-09
grad: 2.0 4.0 -5.155360582875801e-09
grad: 3.0 6.0 -1.067159693945996e-08
progress: 70 w= 1.9999999995138495 loss= 2.1270797208746147e-18
grad: 1.0 2.0 -9.72300906454393e-10
grad: 2.0 4.0 -3.811418736177075e-09
grad: 3.0 6.0 -7.88963561149103e-09
progress: 71 w= 1.9999999996405833 loss= 1.1626238773828175e-18
grad: 1.0 2.0 -7.18833437218791e-10
grad: 2.0 4.0 -2.8178277489132597e-09
grad: 3.0 6.0 -5.832902161273523e-09
progress: 72 w= 1.999999999734279 loss= 6.354692062078993e-19
grad: 1.0 2.0 -5.314420015167798e-10
grad: 2.0 4.0 -2.0832526814729135e-09
grad: 3.0 6.0 -4.31233715403323e-09
progress: 73 w= 1.9999999998035491 loss= 3.4733644793346653e-19
grad: 1.0 2.0 -3.92901711165905e-10
grad: 2.0 4.0 -1.5401742103904326e-09
grad: 3.0 6.0 -3.188159070077745e-09
progress: 74 w= 1.9999999998547615 loss= 1.8984796531526204e-19
grad: 1.0 2.0 -2.9047697580608656e-10
grad: 2.0 4.0 -1.1386696030513122e-09
grad: 3.0 6.0 -2.3570478902001923e-09
progress: 75 w= 1.9999999998926234 loss= 1.0376765851119951e-19
grad: 1.0 2.0 -2.1475310418850313e-10
grad: 2.0 4.0 -8.418314934033333e-10
grad: 3.0 6.0 -1.7425900722400911e-09
progress: 76 w= 1.9999999999206153 loss= 5.671751114309842e-20
grad: 1.0 2.0 -1.5876944203796484e-10
grad: 2.0 4.0 -6.223768167501476e-10
grad: 3.0 6.0 -1.2883241140571045e-09
progress: 77 w= 1.9999999999413098 loss= 3.100089617511693e-20
grad: 1.0 2.0 -1.17380327679939e-10
grad: 2.0 4.0 -4.601314884666863e-10
grad: 3.0 6.0 -9.524754318590567e-10
progress: 78 w= 1.9999999999566096 loss= 1.6944600977692705e-20
grad: 1.0 2.0 -8.678080476443029e-11
grad: 2.0 4.0 -3.4018121652934497e-10
grad: 3.0 6.0 -7.041780492045291e-10
progress: 79 w= 1.9999999999679208 loss= 9.2616919156479e-21
grad: 1.0 2.0 -6.415845632545825e-11
grad: 2.0 4.0 -2.5150193039280566e-10
grad: 3.0 6.0 -5.206075570640678e-10
progress: 80 w= 1.9999999999762834 loss= 5.062350511130293e-21
grad: 1.0 2.0 -4.743316850408519e-11
grad: 2.0 4.0 -1.8593837580738182e-10
grad: 3.0 6.0 -3.8489211817704927e-10
progress: 81 w= 1.999999999982466 loss= 2.7669155644059242e-21
grad: 1.0 2.0 -3.5067948545020045e-11
grad: 2.0 4.0 -1.3746692673066718e-10
grad: 3.0 6.0 -2.845563784603655e-10
progress: 82 w= 1.9999999999870368 loss= 1.5124150106147723e-21
grad: 1.0 2.0 -2.5926372160256506e-11
grad: 2.0 4.0 -1.0163070385260653e-10
grad: 3.0 6.0 -2.1037571684701106e-10
progress: 83 w= 1.999999999990416 loss= 8.26683933105326e-22
grad: 1.0 2.0 -1.9167778475548403e-11
grad: 2.0 4.0 -7.51381179497912e-11
grad: 3.0 6.0 -1.5553425214420713e-10
progress: 84 w= 1.9999999999929146 loss= 4.518126871054872e-22
grad: 1.0 2.0 -1.4170886686315498e-11
grad: 2.0 4.0 -5.555023108172463e-11
grad: 3.0 6.0 -1.1499068364173581e-10
progress: 85 w= 1.9999999999947617 loss= 2.469467919185614e-22
grad: 1.0 2.0 -1.0476508549572827e-11
grad: 2.0 4.0 -4.106759377009439e-11
grad: 3.0 6.0 -8.500933290633839e-11
progress: 86 w= 1.9999999999961273 loss= 1.349840097651456e-22
grad: 1.0 2.0 -7.745359908994942e-12
grad: 2.0 4.0 -3.036149109902908e-11
grad: 3.0 6.0 -6.285105769165966e-11
progress: 87 w= 1.999999999997137 loss= 7.376551550022107e-23
grad: 1.0 2.0 -5.726086271806707e-12
grad: 2.0 4.0 -2.2446045022661565e-11
grad: 3.0 6.0 -4.646416584819235e-11
progress: 88 w= 1.9999999999978835 loss= 4.031726170507742e-23
grad: 1.0 2.0 -4.233058348290797e-12
grad: 2.0 4.0 -1.659294923683774e-11
grad: 3.0 6.0 -3.4351188560322043e-11
progress: 89 w= 1.9999999999984353 loss= 2.2033851437431755e-23
grad: 1.0 2.0 -3.1294966618133913e-12
grad: 2.0 4.0 -1.226752033289813e-11
grad: 3.0 6.0 -2.539835008974478e-11
progress: 90 w= 1.9999999999988431 loss= 1.2047849775995315e-23
grad: 1.0 2.0 -2.3137047833188262e-12
grad: 2.0 4.0 -9.070078021977679e-12
grad: 3.0 6.0 -1.8779644506139448e-11
progress: 91 w= 1.9999999999991447 loss= 6.5840863393251405e-24
grad: 1.0 2.0 -1.7106316363424412e-12
grad: 2.0 4.0 -6.7057470687359455e-12
grad: 3.0 6.0 -1.3882228699912957e-11
progress: 92 w= 1.9999999999993676 loss= 3.5991747246272455e-24
grad: 1.0 2.0 -1.2647660696529783e-12
grad: 2.0 4.0 -4.957811938766099e-12
grad: 3.0 6.0 -1.0263789818054647e-11
progress: 93 w= 1.9999999999995324 loss= 1.969312363793734e-24
grad: 1.0 2.0 -9.352518759442319e-13
grad: 2.0 4.0 -3.666400516522117e-12
grad: 3.0 6.0 -7.58859641791787e-12
progress: 94 w= 1.9999999999996543 loss= 1.0761829795642296e-24
grad: 1.0 2.0 -6.914468997365475e-13
grad: 2.0 4.0 -2.7107205369247822e-12
grad: 3.0 6.0 -5.611511255665391e-12
progress: 95 w= 1.9999999999997444 loss= 5.875191475205477e-25
grad: 1.0 2.0 -5.111466805374221e-13
grad: 2.0 4.0 -2.0037305148434825e-12
grad: 3.0 6.0 -4.1460168631601846e-12
progress: 96 w= 1.999999999999811 loss= 3.2110109830478153e-25
grad: 1.0 2.0 -3.779199175824033e-13
grad: 2.0 4.0 -1.4814816040598089e-12
grad: 3.0 6.0 -3.064215547965432e-12
progress: 97 w= 1.9999999999998603 loss= 1.757455879087579e-25
grad: 1.0 2.0 -2.793321129956894e-13
grad: 2.0 4.0 -1.0942358130705543e-12
grad: 3.0 6.0 -2.2648549702353193e-12
progress: 98 w= 1.9999999999998967 loss= 9.608404711682446e-26
grad: 1.0 2.0 -2.0650148258027912e-13
grad: 2.0 4.0 -8.100187187665142e-13
grad: 3.0 6.0 -1.6786572132332367e-12
progress: 99 w= 1.9999999999999236 loss= 5.250973729513143e-26
predict (after training) 4 7.9999999999996945
[figure: loss vs. epoch]
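A closing note (my addition, beyond the original post): batch gradient descent computes one update from the whole training set, which is efficient but can stall where the summed gradient vanishes; SGD updates once per sample, which is slower per epoch but better at escaping such points. The usual compromise is mini-batch gradient descent, which updates on small random subsets. A minimal sketch under the same model:

import random

x_data=[1.0,2.0,3.0]
y_data=[2.0,4.0,6.0]
w=1.0

def batch_gradient(batch,w):
    # average gradient over one mini-batch
    return sum(2*x*(x*w-y) for x,y in batch)/len(batch)

data=list(zip(x_data,y_data))
for epoch in range(100):
    random.shuffle(data)           # reshuffle samples each epoch
    for i in range(0,len(data),2): # mini-batches of size 2
        w-=0.01*batch_gradient(data[i:i+2],w)

print('w after mini-batch training:',w) # approaches 2.0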
