Deep Learning: Gradient Descent Algorithms

Contents

Gradient Descent Algorithm

Gradient Descent Implementation

1. Import the required libraries

2. Set up the training set

3. Define the learning model

4. Define the cost function

5. Define the gradient function

6. Train the model

7. Plot the results

Stochastic Gradient Descent

Mini-Batch


Gradient Descent Algorithm

Gradient descent follows the idea of a greedy algorithm: at each step it moves the weight in the direction that decreases the cost the fastest, so it is only guaranteed to find the optimum of a local region (a local optimum).

Global optimum: the local minimum that no other local minimum of the cost function lies below.

Saddle point: a point where the gradient g = 0 but which is not a local minimum.

Differentiate the cost with respect to the weight to obtain the gradient, then move the weight a small step against the gradient direction to get the updated model.
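For the linear model used in the code below, ŷ = x·w with a mean-squared-error cost, the gradient and the update rule work out as follows (a standard derivation that matches the cost and gradient functions in the next section):

\mathrm{cost}(w) = \frac{1}{N}\sum_{n=1}^{N}\left(x_n w - y_n\right)^2

\frac{\partial\,\mathrm{cost}}{\partial w} = \frac{1}{N}\sum_{n=1}^{N} 2\,x_n\left(x_n w - y_n\right)

w \leftarrow w - \alpha\,\frac{\partial\,\mathrm{cost}}{\partial w} \qquad (\alpha = 0.01 \text{ in the code below})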

Gradient Descent Implementation

1. Import the required libraries

import numpy as np
import matplotlib.pyplot as plt

2. Set up the training set

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

w = 1.0  # initial weight

3. Define the learning model

def forward(x):
    return x * w  # linear model: y_hat = x * w

4. Define the cost function

def cost(xs, ys):
    # mean squared error over the whole training set
    cost = 0
    for x, y in zip(xs, ys):
        y_pred = forward(x)
        cost += (y_pred - y) ** 2
    return cost / len(xs)

5. Define the gradient function

def gradient(xs, ys):
    # d(cost)/dw averaged over the whole training set
    grad = 0
    for x, y in zip(xs, ys):
        grad += 2 * x * (x * w - y)
    return grad / len(xs)

6. Train the model

cost_list = []
epoch_list = []
print('Predict (before training)', 4, forward(4))
for epoch in range(100):  # train for 100 epochs
    cost_val = cost(x_data, y_data)      # cost over the whole training set
    grad_val = gradient(x_data, y_data)  # gradient over the whole training set
    w -= 0.01 * grad_val                 # update the weight (learning rate 0.01)
    print('Epoch:', epoch, 'w=', w, 'loss=', cost_val)
    cost_list.append(cost_val)
    epoch_list.append(epoch)
print('Predict (after training)', 4, forward(4))
# Batch gradient descent: lower learning performance, but low time cost
# (the computation over all samples can be parallelized).

Output

Predict (before training) 4 4.0
Epoch: 0 w= 1.0933333333333333 loss= 4.666666666666667
Epoch: 1 w= 1.1779555555555554 loss= 3.8362074074074086
Epoch: 2 w= 1.2546797037037036 loss= 3.1535329869958857
Epoch: 3 w= 1.3242429313580246 loss= 2.592344272332262
Epoch: 4 w= 1.3873135910979424 loss= 2.1310222071581117
Epoch: 5 w= 1.4444976559288012 loss= 1.7517949663820642
Epoch: 6 w= 1.4963445413754464 loss= 1.440053319920117
Epoch: 7 w= 1.5433523841804047 loss= 1.1837878313441108
Epoch: 8 w= 1.5859728283235668 loss= 0.9731262101573632
Epoch: 9 w= 1.6246153643467005 loss= 0.7999529948031382
Epoch: 10 w= 1.659651263674342 loss= 0.6575969151946154
Epoch: 11 w= 1.6914171457314033 loss= 0.5405738908195378
Epoch: 12 w= 1.7202182121298057 loss= 0.44437576375991855
Epoch: 13 w= 1.7463311789976905 loss= 0.365296627844598
Epoch: 14 w= 1.7700069356245727 loss= 0.3002900634939416
Epoch: 15 w= 1.7914729549662791 loss= 0.2468517784170642
Epoch: 16 w= 1.8109354791694263 loss= 0.2029231330489788
Epoch: 17 w= 1.8285815011136133 loss= 0.16681183417217407
Epoch: 18 w= 1.8445805610096762 loss= 0.1371267415488235
Epoch: 19 w= 1.8590863753154396 loss= 0.11272427607497944
Epoch: 20 w= 1.872238313619332 loss= 0.09266436490145864
Epoch: 21 w= 1.8841627376815275 loss= 0.07617422636521683
Epoch: 22 w= 1.8949742154979183 loss= 0.06261859959338009
Epoch: 23 w= 1.904776622051446 loss= 0.051475271914629306
Epoch: 24 w= 1.9136641373266443 loss= 0.04231496130368814
Epoch: 25 w= 1.9217221511761575 loss= 0.03478477885657844
Epoch: 26 w= 1.9290280837330496 loss= 0.02859463421027894
Epoch: 27 w= 1.9356521292512983 loss= 0.023506060193480772
Epoch: 28 w= 1.9416579305211772 loss= 0.01932302619282764
Epoch: 29 w= 1.9471031903392007 loss= 0.015884386331668398
Epoch: 30 w= 1.952040225907542 loss= 0.01305767153735723
Epoch: 31 w= 1.9565164714895047 loss= 0.010733986344664803
Epoch: 32 w= 1.9605749341504843 loss= 0.008823813841374291
Epoch: 33 w= 1.9642546069631057 loss= 0.007253567147113681
Epoch: 34 w= 1.9675908436465492 loss= 0.005962754575689583
Epoch: 35 w= 1.970615698239538 loss= 0.004901649272531298
Epoch: 36 w= 1.9733582330705144 loss= 0.004029373553099482
Epoch: 37 w= 1.975844797983933 loss= 0.0033123241439168096
Epoch: 38 w= 1.9780992835054327 loss= 0.0027228776607060357
Epoch: 39 w= 1.980143350378259 loss= 0.002238326453885249
Epoch: 40 w= 1.9819966376762883 loss= 0.001840003826269386
Epoch: 41 w= 1.983676951493168 loss= 0.0015125649231412608
Epoch: 42 w= 1.9852004360204722 loss= 0.0012433955919298103
Epoch: 43 w= 1.9865817286585614 loss= 0.0010221264385926248
Epoch: 44 w= 1.987834100650429 loss= 0.0008402333603648631
Epoch: 45 w= 1.9889695845897222 loss= 0.0006907091659248264
Epoch: 46 w= 1.9899990900280147 loss= 0.0005677936325753796
Epoch: 47 w= 1.9909325082920666 loss= 0.0004667516012495216
Epoch: 48 w= 1.9917788075181404 loss= 0.000383690560742734
Epoch: 49 w= 1.9925461188164473 loss= 0.00031541069384432885
Epoch: 50 w= 1.9932418143935788 loss= 0.0002592816085930997
Epoch: 51 w= 1.9938725783835114 loss= 0.0002131410058905752
Epoch: 52 w= 1.994444471067717 loss= 0.00017521137977565514
Epoch: 53 w= 1.9949629871013967 loss= 0.0001440315413480261
Epoch: 54 w= 1.9954331083052663 loss= 0.0001184003283899171
Epoch: 55 w= 1.9958593515301082 loss= 9.733033217332803e-05
Epoch: 56 w= 1.9962458120539648 loss= 8.000985883901657e-05
Epoch: 57 w= 1.9965962029289281 loss= 6.57716599593935e-05
Epoch: 58 w= 1.9969138906555615 loss= 5.406722767150764e-05
Epoch: 59 w= 1.997201927527709 loss= 4.444566413387458e-05
Epoch: 60 w= 1.9974630809584561 loss= 3.65363112808981e-05
Epoch: 61 w= 1.9976998600690001 loss= 3.0034471708953996e-05
Epoch: 62 w= 1.9979145397958935 loss= 2.4689670610172655e-05
Epoch: 63 w= 1.9981091827482769 loss= 2.0296006560253656e-05
Epoch: 64 w= 1.9982856590251044 loss= 1.6684219437262796e-05
Epoch: 65 w= 1.9984456641827613 loss= 1.3715169898293847e-05
Epoch: 66 w= 1.9985907355257035 loss= 1.1274479219506377e-05
Epoch: 67 w= 1.9987222668766378 loss= 9.268123006398985e-06
Epoch: 68 w= 1.9988415219681517 loss= 7.61880902783969e-06
Epoch: 69 w= 1.9989496465844576 loss= 6.262999634617916e-06
Epoch: 70 w= 1.9990476795699081 loss= 5.1484640551938914e-06
Epoch: 71 w= 1.9991365628100501 loss= 4.232266273994499e-06
Epoch: 72 w= 1.999217150281112 loss= 3.479110977946351e-06
Epoch: 73 w= 1.999290216254875 loss= 2.859983851026929e-06
Epoch: 74 w= 1.9993564627377531 loss= 2.3510338359374262e-06
Epoch: 75 w= 1.9994165262155628 loss= 1.932654303533636e-06
Epoch: 76 w= 1.999470983768777 loss= 1.5887277332523938e-06
Epoch: 77 w= 1.9995203586170245 loss= 1.3060048068548734e-06
Epoch: 78 w= 1.9995651251461022 loss= 1.0735939958924364e-06
Epoch: 79 w= 1.9996057134657994 loss= 8.825419799121559e-07
Epoch: 80 w= 1.9996425135423248 loss= 7.254887315754342e-07
Epoch: 81 w= 1.999675878945041 loss= 5.963839812987369e-07
Epoch: 82 w= 1.999706130243504 loss= 4.902541385825727e-07
Epoch: 83 w= 1.9997335580874436 loss= 4.0301069098738336e-07
Epoch: 84 w= 1.9997584259992822 loss= 3.312926995781724e-07
Epoch: 85 w= 1.9997809729060159 loss= 2.723373231729343e-07
Epoch: 86 w= 1.9998014154347876 loss= 2.2387338352920307e-07
Epoch: 87 w= 1.9998199499942075 loss= 1.8403387118941732e-07
Epoch: 88 w= 1.9998367546614149 loss= 1.5128402140063082e-07
Epoch: 89 w= 1.9998519908930161 loss= 1.2436218932547864e-07
Epoch: 90 w= 1.9998658050763347 loss= 1.0223124683409346e-07
Epoch: 91 w= 1.9998783299358769 loss= 8.403862850836479e-08
Epoch: 92 w= 1.9998896858085284 loss= 6.908348768398496e-08
Epoch: 93 w= 1.9998999817997325 loss= 5.678969725349543e-08
Epoch: 94 w= 1.9999093168317574 loss= 4.66836551287917e-08
Epoch: 95 w= 1.9999177805941268 loss= 3.8376039345125727e-08
Epoch: 96 w= 1.9999254544053418 loss= 3.154680994333735e-08
Epoch: 97 w= 1.9999324119941766 loss= 2.593287985380858e-08
Epoch: 98 w= 1.9999387202080534 loss= 2.131797981222471e-08
Epoch: 99 w= 1.9999444396553017 loss= 1.752432687141379e-08
Predict (after training) 4 7.999777758621207

7. Plot the results

plt.plot(epoch_list, cost_list)
plt.ylabel('Cost')
plt.xlabel('Epoch')
plt.show()

The plot of cost versus epoch shows that training converges.
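As an aside, the code above imports numpy but never uses it. For this linear model the cost and gradient can also be computed in vectorized form; the following is a minimal sketch under the same settings (the array names are my own, not from the original code):

import numpy as np

xs = np.array([1.0, 2.0, 3.0])
ys = np.array([2.0, 4.0, 6.0])
w = 1.0

for epoch in range(100):
    y_pred = xs * w                              # forward pass for all samples at once
    cost_val = np.mean((y_pred - ys) ** 2)       # MSE cost over the whole training set
    grad_val = np.mean(2 * xs * (y_pred - ys))   # d(cost)/dw averaged over the samples
    w -= 0.01 * grad_val                         # same update rule as above

print('w after training:', w)  # approaches 2.0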

Stochastic Gradient Descent

In batch gradient descent, the cost function is computed over all samples.

Principle of stochastic gradient descent: update the weight using the loss of a single sample at a time.

Benefit: even if the optimization reaches a saddle point, the random noise introduced by single-sample updates may carry it past the saddle point.
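Concretely, for the same linear model the single-sample loss and the resulting update are (these correspond to the loss and gradient functions in the code below):

\mathrm{loss}(w; x, y) = (x w - y)^2

\frac{\partial\,\mathrm{loss}}{\partial w} = 2x\,(x w - y)

w \leftarrow w - \alpha \cdot 2x\,(x w - y)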

# 1. Import the required libraries
import numpy as np
import matplotlib.pyplot as plt
# 2. Set up the training set
x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]
w = 1.0  # initial weight
# 3. Define the model
def forward(x):
    return x * w
# 4. Define the loss for a single sample
def loss(x, y):
    y_pred = forward(x)
    return (y_pred - y) ** 2
# 5. Define the gradient for a single sample
def gradient(x, y):
    return 2 * x * (x * w - y)
# 6. Train the model
w_list = []
epoch_list = []
print('Predict (before training)', 4, forward(4))
for epoch in range(100):  # train for 100 epochs
    for x, y in zip(x_data, y_data):
        grad = gradient(x, y)  # gradient from this single sample
        w = w - 0.01 * grad    # update the weight after every sample
        print("\tgrad: ", x, y, grad)
        l = loss(x, y)  # loss of this sample
        w_list.append(w)
        epoch_list.append(epoch)
print('Predict (after training)', 4, forward(4))
# 7. Plot the results
plt.plot(epoch_list, w_list)
plt.ylabel('w')
plt.xlabel('Epoch')
plt.show()
# Stochastic gradient descent usually reaches a better solution, but it cannot be
# parallelized: each weight update depends on the one before it, so the time cost is high.
# High performance, high time cost.

Output

Predict (before training) 4 4.0
    grad:  1.0 2.0 -2.0
    grad:  2.0 4.0 -7.84
    grad:  3.0 6.0 -16.2288
    grad:  1.0 2.0 -1.478624
    grad:  2.0 4.0 -5.796206079999999
    grad:  3.0 6.0 -11.998146585599997
    grad:  1.0 2.0 -1.093164466688
    grad:  2.0 4.0 -4.285204709416961
    grad:  3.0 6.0 -8.87037374849311
    grad:  1.0 2.0 -0.8081896081960389
    grad:  2.0 4.0 -3.1681032641284723
    grad:  3.0 6.0 -6.557973756745939
    grad:  1.0 2.0 -0.59750427561463
    grad:  2.0 4.0 -2.3422167604093502
    grad:  3.0 6.0 -4.848388694047353
    grad:  1.0 2.0 -0.44174208101320334
    grad:  2.0 4.0 -1.7316289575717576
    grad:  3.0 6.0 -3.584471942173538
    grad:  1.0 2.0 -0.3265852213980338
    grad:  2.0 4.0 -1.2802140678802925
    grad:  3.0 6.0 -2.650043120512205
    grad:  1.0 2.0 -0.241448373202223
    grad:  2.0 4.0 -0.946477622952715
    grad:  3.0 6.0 -1.9592086795121197
    grad:  1.0 2.0 -0.17850567968888198
    grad:  2.0 4.0 -0.6997422643804168
    grad:  3.0 6.0 -1.4484664872674653
    grad:  1.0 2.0 -0.13197139106214673
    grad:  2.0 4.0 -0.5173278529636143
    grad:  3.0 6.0 -1.0708686556346834
    grad:  1.0 2.0 -0.09756803306893769
    grad:  2.0 4.0 -0.38246668963023644
    grad:  3.0 6.0 -0.7917060475345892
    grad:  1.0 2.0 -0.07213321766426262
    grad:  2.0 4.0 -0.2827622132439096
    grad:  3.0 6.0 -0.5853177814148953
    grad:  1.0 2.0 -0.05332895341780164
    grad:  2.0 4.0 -0.2090494973977819
    grad:  3.0 6.0 -0.4327324596134101
    grad:  1.0 2.0 -0.039426735209221686
    grad:  2.0 4.0 -0.15455280202014876
    grad:  3.0 6.0 -0.3199243001817109
    grad:  1.0 2.0 -0.02914865846100012
    grad:  2.0 4.0 -0.11426274116712065
    grad:  3.0 6.0 -0.2365238742159388
    grad:  1.0 2.0 -0.021549952984118992
    grad:  2.0 4.0 -0.08447581569774698
    grad:  3.0 6.0 -0.17486493849433593
    grad:  1.0 2.0 -0.015932138840594856
    grad:  2.0 4.0 -0.062453984255132156
    grad:  3.0 6.0 -0.12927974740812687
    grad:  1.0 2.0 -0.011778821430517894
    grad:  2.0 4.0 -0.046172980007630926
    grad:  3.0 6.0 -0.09557806861579543
    grad:  1.0 2.0 -0.008708224029438938
    grad:  2.0 4.0 -0.03413623819540135
    grad:  3.0 6.0 -0.07066201306448505
    grad:  1.0 2.0 -0.006438094523652627
    grad:  2.0 4.0 -0.02523733053271826
    grad:  3.0 6.0 -0.052241274202728505
    grad:  1.0 2.0 -0.004759760538470381
    grad:  2.0 4.0 -0.01865826131080439
    grad:  3.0 6.0 -0.03862260091336722
    grad:  1.0 2.0 -0.0035189480832178432
    grad:  2.0 4.0 -0.01379427648621423
    grad:  3.0 6.0 -0.028554152326460525
    grad:  1.0 2.0 -0.002601600545300009
    grad:  2.0 4.0 -0.01019827413757568
    grad:  3.0 6.0 -0.021110427464781978
    grad:  1.0 2.0 -0.001923394502346909
    grad:  2.0 4.0 -0.007539706449199102
    grad:  3.0 6.0 -0.01560719234984198
    grad:  1.0 2.0 -0.0014219886363191492
    grad:  2.0 4.0 -0.005574195454370212
    grad:  3.0 6.0 -0.011538584590544687
    grad:  1.0 2.0 -0.0010512932626940419
    grad:  2.0 4.0 -0.004121069589761106
    grad:  3.0 6.0 -0.008530614050808794
    grad:  1.0 2.0 -0.0007772337246287897
    grad:  2.0 4.0 -0.0030467562005451754
    grad:  3.0 6.0 -0.006306785335127074
    grad:  1.0 2.0 -0.0005746182194226179
    grad:  2.0 4.0 -0.002252503420136165
    grad:  3.0 6.0 -0.00466268207967957
    grad:  1.0 2.0 -0.0004248221450375844
    grad:  2.0 4.0 -0.0016653028085471533
    grad:  3.0 6.0 -0.0034471768136938863
    grad:  1.0 2.0 -0.00031407610969225175
    grad:  2.0 4.0 -0.0012311783499932005
    grad:  3.0 6.0 -0.0025485391844828342
    grad:  1.0 2.0 -0.00023220023680847746
    grad:  2.0 4.0 -0.0009102249282886277
    grad:  3.0 6.0 -0.0018841656015560204
    grad:  1.0 2.0 -0.00017166842147497974
    grad:  2.0 4.0 -0.0006729402121816719
    grad:  3.0 6.0 -0.0013929862392156878
    grad:  1.0 2.0 -0.0001269165240174175
    grad:  2.0 4.0 -0.0004975127741477792
    grad:  3.0 6.0 -0.0010298514424817995
    grad:  1.0 2.0 -9.383090920422887e-05
    grad:  2.0 4.0 -0.00036781716408107457
    grad:  3.0 6.0 -0.0007613815296476645
    grad:  1.0 2.0 -6.937031714571162e-05
    grad:  2.0 4.0 -0.0002719316432120422
    grad:  3.0 6.0 -0.0005628985014531906
    grad:  1.0 2.0 -5.1286307909848006e-05
    grad:  2.0 4.0 -0.00020104232700646207
    grad:  3.0 6.0 -0.0004161576169003922
    grad:  1.0 2.0 -3.7916582873442906e-05
    grad:  2.0 4.0 -0.0001486330048638962
    grad:  3.0 6.0 -0.0003076703200690645
    grad:  1.0 2.0 -2.8032184717474706e-05
    grad:  2.0 4.0 -0.0001098861640933535
    grad:  3.0 6.0 -0.00022746435967313516
    grad:  1.0 2.0 -2.0724530547688857e-05
    grad:  2.0 4.0 -8.124015974608767e-05
    grad:  3.0 6.0 -0.00016816713067413502
    grad:  1.0 2.0 -1.5321894128117464e-05
    grad:  2.0 4.0 -6.006182498197177e-05
    grad:  3.0 6.0 -0.00012432797771566584
    grad:  1.0 2.0 -1.1327660191629008e-05
    grad:  2.0 4.0 -4.4404427951505454e-05
    grad:  3.0 6.0 -9.191716585732479e-05
    grad:  1.0 2.0 -8.37467511161094e-06
    grad:  2.0 4.0 -3.282872643772805e-05
    grad:  3.0 6.0 -6.795546372551087e-05
    grad:  1.0 2.0 -6.191497806007362e-06
    grad:  2.0 4.0 -2.4270671399762023e-05
    grad:  3.0 6.0 -5.0240289795056015e-05
    grad:  1.0 2.0 -4.5774486259198e-06
    grad:  2.0 4.0 -1.794359861406747e-05
    grad:  3.0 6.0 -3.714324913239864e-05
    grad:  1.0 2.0 -3.3841626985164908e-06
    grad:  2.0 4.0 -1.326591777761621e-05
    grad:  3.0 6.0 -2.7460449796734565e-05
    grad:  1.0 2.0 -2.5019520926150562e-06
    grad:  2.0 4.0 -9.807652203264183e-06
    grad:  3.0 6.0 -2.0301840059744336e-05
    grad:  1.0 2.0 -1.8497232057157476e-06
    grad:  2.0 4.0 -7.250914967116273e-06
    grad:  3.0 6.0 -1.5009393983689279e-05
    grad:  1.0 2.0 -1.3675225627451937e-06
    grad:  2.0 4.0 -5.3606884460322135e-06
    grad:  3.0 6.0 -1.109662508014253e-05
    grad:  1.0 2.0 -1.0110258408246864e-06
    grad:  2.0 4.0 -3.963221296032771e-06
    grad:  3.0 6.0 -8.20386808086937e-06
    grad:  1.0 2.0 -7.474635363990956e-07
    grad:  2.0 4.0 -2.930057062755509e-06
    grad:  3.0 6.0 -6.065218119744031e-06
    grad:  1.0 2.0 -5.526087618612507e-07
    grad:  2.0 4.0 -2.166226346744793e-06
    grad:  3.0 6.0 -4.484088535150477e-06
    grad:  1.0 2.0 -4.08550288710785e-07
    grad:  2.0 4.0 -1.6015171322436572e-06
    grad:  3.0 6.0 -3.3151404608133817e-06
    grad:  1.0 2.0 -3.020461312175371e-07
    grad:  2.0 4.0 -1.1840208351543424e-06
    grad:  3.0 6.0 -2.4509231284497446e-06
    grad:  1.0 2.0 -2.2330632942768602e-07
    grad:  2.0 4.0 -8.753608113920563e-07
    grad:  3.0 6.0 -1.811996877876254e-06
    grad:  1.0 2.0 -1.6509304900935717e-07
    grad:  2.0 4.0 -6.471647520100987e-07
    grad:  3.0 6.0 -1.3396310407642886e-06
    grad:  1.0 2.0 -1.220552721115098e-07
    grad:  2.0 4.0 -4.784566662863199e-07
    grad:  3.0 6.0 -9.904052991061008e-07
    grad:  1.0 2.0 -9.023692726373156e-08
    grad:  2.0 4.0 -3.5372875473171916e-07
    grad:  3.0 6.0 -7.322185204827747e-07
    grad:  1.0 2.0 -6.671324292994996e-08
    grad:  2.0 4.0 -2.615159129248923e-07
    grad:  3.0 6.0 -5.413379398078177e-07
    grad:  1.0 2.0 -4.932190122985958e-08
    grad:  2.0 4.0 -1.9334185274999527e-07
    grad:  3.0 6.0 -4.002176350326181e-07
    grad:  1.0 2.0 -3.6464273378555845e-08
    grad:  2.0 4.0 -1.429399514307761e-07
    grad:  3.0 6.0 -2.9588569994132286e-07
    grad:  1.0 2.0 -2.6958475007887728e-08
    grad:  2.0 4.0 -1.0567722164012139e-07
    grad:  3.0 6.0 -2.1875184863517916e-07
    grad:  1.0 2.0 -1.993072418216002e-08
    grad:  2.0 4.0 -7.812843882959442e-08
    grad:  3.0 6.0 -1.617258700292723e-07
    grad:  1.0 2.0 -1.473502342363986e-08
    grad:  2.0 4.0 -5.7761292637792394e-08
    grad:  3.0 6.0 -1.195658771990793e-07
    grad:  1.0 2.0 -1.0893780100218464e-08
    grad:  2.0 4.0 -4.270361841918202e-08
    grad:  3.0 6.0 -8.839649012770678e-08
    grad:  1.0 2.0 -8.05390243385773e-09
    grad:  2.0 4.0 -3.1571296688071016e-08
    grad:  3.0 6.0 -6.53525820126788e-08
    grad:  1.0 2.0 -5.9543463493128e-09
    grad:  2.0 4.0 -2.334103754719763e-08
    grad:  3.0 6.0 -4.8315948575350376e-08
    grad:  1.0 2.0 -4.402119557767037e-09
    grad:  2.0 4.0 -1.725630838222969e-08
    grad:  3.0 6.0 -3.5720557178819945e-08
    grad:  1.0 2.0 -3.254539748809293e-09
    grad:  2.0 4.0 -1.2757796596929438e-08
    grad:  3.0 6.0 -2.6408640607655798e-08
    grad:  1.0 2.0 -2.406120636067044e-09
    grad:  2.0 4.0 -9.431992964437086e-09
    grad:  3.0 6.0 -1.9524227568012975e-08
    grad:  1.0 2.0 -1.7788739370416806e-09
    grad:  2.0 4.0 -6.97318647269185e-09
    grad:  3.0 6.0 -1.4434496264925656e-08
    grad:  1.0 2.0 -1.3151431055291596e-09
    grad:  2.0 4.0 -5.155360582875801e-09
    grad:  3.0 6.0 -1.067159693945996e-08
    grad:  1.0 2.0 -9.72300906454393e-10
    grad:  2.0 4.0 -3.811418736177075e-09
    grad:  3.0 6.0 -7.88963561149103e-09
    grad:  1.0 2.0 -7.18833437218791e-10
    grad:  2.0 4.0 -2.8178277489132597e-09
    grad:  3.0 6.0 -5.832902161273523e-09
    grad:  1.0 2.0 -5.314420015167798e-10
    grad:  2.0 4.0 -2.0832526814729135e-09
    grad:  3.0 6.0 -4.31233715403323e-09
    grad:  1.0 2.0 -3.92901711165905e-10
    grad:  2.0 4.0 -1.5401742103904326e-09
    grad:  3.0 6.0 -3.188159070077745e-09
    grad:  1.0 2.0 -2.9047697580608656e-10
    grad:  2.0 4.0 -1.1386696030513122e-09
    grad:  3.0 6.0 -2.3570478902001923e-09
    grad:  1.0 2.0 -2.1475310418850313e-10
    grad:  2.0 4.0 -8.418314934033333e-10
    grad:  3.0 6.0 -1.7425900722400911e-09
    grad:  1.0 2.0 -1.5876944203796484e-10
    grad:  2.0 4.0 -6.223768167501476e-10
    grad:  3.0 6.0 -1.2883241140571045e-09
    grad:  1.0 2.0 -1.17380327679939e-10
    grad:  2.0 4.0 -4.601314884666863e-10
    grad:  3.0 6.0 -9.524754318590567e-10
    grad:  1.0 2.0 -8.678080476443029e-11
    grad:  2.0 4.0 -3.4018121652934497e-10
    grad:  3.0 6.0 -7.041780492045291e-10
    grad:  1.0 2.0 -6.415845632545825e-11
    grad:  2.0 4.0 -2.5150193039280566e-10
    grad:  3.0 6.0 -5.206075570640678e-10
    grad:  1.0 2.0 -4.743316850408519e-11
    grad:  2.0 4.0 -1.8593837580738182e-10
    grad:  3.0 6.0 -3.8489211817704927e-10
    grad:  1.0 2.0 -3.5067948545020045e-11
    grad:  2.0 4.0 -1.3746692673066718e-10
    grad:  3.0 6.0 -2.845563784603655e-10
    grad:  1.0 2.0 -2.5926372160256506e-11
    grad:  2.0 4.0 -1.0163070385260653e-10
    grad:  3.0 6.0 -2.1037571684701106e-10
    grad:  1.0 2.0 -1.9167778475548403e-11
    grad:  2.0 4.0 -7.51381179497912e-11
    grad:  3.0 6.0 -1.5553425214420713e-10
    grad:  1.0 2.0 -1.4170886686315498e-11
    grad:  2.0 4.0 -5.555023108172463e-11
    grad:  3.0 6.0 -1.1499068364173581e-10
    grad:  1.0 2.0 -1.0476508549572827e-11
    grad:  2.0 4.0 -4.106759377009439e-11
    grad:  3.0 6.0 -8.500933290633839e-11
    grad:  1.0 2.0 -7.745359908994942e-12
    grad:  2.0 4.0 -3.036149109902908e-11
    grad:  3.0 6.0 -6.285105769165966e-11
    grad:  1.0 2.0 -5.726086271806707e-12
    grad:  2.0 4.0 -2.2446045022661565e-11
    grad:  3.0 6.0 -4.646416584819235e-11
    grad:  1.0 2.0 -4.233058348290797e-12
    grad:  2.0 4.0 -1.659294923683774e-11
    grad:  3.0 6.0 -3.4351188560322043e-11
    grad:  1.0 2.0 -3.1294966618133913e-12
    grad:  2.0 4.0 -1.226752033289813e-11
    grad:  3.0 6.0 -2.539835008974478e-11
    grad:  1.0 2.0 -2.3137047833188262e-12
    grad:  2.0 4.0 -9.070078021977679e-12
    grad:  3.0 6.0 -1.8779644506139448e-11
    grad:  1.0 2.0 -1.7106316363424412e-12
    grad:  2.0 4.0 -6.7057470687359455e-12
    grad:  3.0 6.0 -1.3882228699912957e-11
    grad:  1.0 2.0 -1.2647660696529783e-12
    grad:  2.0 4.0 -4.957811938766099e-12
    grad:  3.0 6.0 -1.0263789818054647e-11
    grad:  1.0 2.0 -9.352518759442319e-13
    grad:  2.0 4.0 -3.666400516522117e-12
    grad:  3.0 6.0 -7.58859641791787e-12
    grad:  1.0 2.0 -6.914468997365475e-13
    grad:  2.0 4.0 -2.7107205369247822e-12
    grad:  3.0 6.0 -5.611511255665391e-12
    grad:  1.0 2.0 -5.111466805374221e-13
    grad:  2.0 4.0 -2.0037305148434825e-12
    grad:  3.0 6.0 -4.1460168631601846e-12
    grad:  1.0 2.0 -3.779199175824033e-13
    grad:  2.0 4.0 -1.4814816040598089e-12
    grad:  3.0 6.0 -3.064215547965432e-12
    grad:  1.0 2.0 -2.793321129956894e-13
    grad:  2.0 4.0 -1.0942358130705543e-12
    grad:  3.0 6.0 -2.2648549702353193e-12
    grad:  1.0 2.0 -2.0650148258027912e-13
    grad:  2.0 4.0 -8.100187187665142e-13
    grad:  3.0 6.0 -1.6786572132332367e-12
Predict (after training) 4 7.9999999999996945

Mini-Batch

Stochastic gradient descent run on small batches of samples is called batch, or mini-batch, stochastic gradient descent.

Using all samples at once gives poor learning performance, while updating on every single sample gives poor time cost. Mini-batch is the compromise: split the samples into groups of a few, and for each update compute the gradient on one group, as sketched below.
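The post does not include mini-batch code; the following is a minimal sketch of the idea on the same toy problem (the batch size and the shuffling scheme are my own choices for illustration):

import random

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]
w = 1.0
batch_size = 2  # assumed value for illustration

def gradient(xs, ys):
    # average gradient of (x*w - y)^2 over one mini-batch
    return sum(2 * x * (x * w - y) for x, y in zip(xs, ys)) / len(xs)

for epoch in range(100):
    # shuffle the sample order, then walk through it in chunks of batch_size
    order = list(range(len(x_data)))
    random.shuffle(order)
    for start in range(0, len(order), batch_size):
        idx = order[start:start + batch_size]
        xs = [x_data[i] for i in idx]
        ys = [y_data[i] for i in idx]
        w -= 0.01 * gradient(xs, ys)  # one update per mini-batch

print('w after training:', w)  # approaches 2.0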
