DL - CNN: A custom SimpleConvNet (3 layers, im2col-optimized) trained on the MNIST dataset for multi-class handwritten-digit recognition and model evaluation

Contents

Output

Design approach

Core code

More output

Output

Design approach

Core code

import pickle
import numpy as np
from collections import OrderedDict

# Relu, Affine, SoftmaxWithLoss, im2col and col2im are assumed to live in
# book-style helper modules; adjust these imports to your project layout.
from common.layers import Relu, Affine, SoftmaxWithLoss
from common.util import im2col, col2im


class Convolution:
    def __init__(self, W, b, stride=1, pad=0):
        self.W = W              # filter weights, shape (FN, C, FH, FW)
        self.b = b              # biases, shape (FN,)
        self.stride = stride
        self.pad = pad
        # caches used by backward
        self.x = None
        self.col = None
        self.col_W = None
        # parameter gradients filled in by backward
        self.dW = None
        self.db = None


    def forward(self, x):
        FN, C, FH, FW = self.W.shape
        N, C, H, W = x.shape
        out_h = 1 + int((H + 2*self.pad - FH) / self.stride)
        out_w = 1 + int((W + 2*self.pad - FW) / self.stride)

        # unfold the input so that each receptive field becomes one row,
        # turning the convolution into a single matrix product
        col = im2col(x, FH, FW, self.stride, self.pad)
        col_W = self.W.reshape(FN, -1).T    # one column per filter

        out = np.dot(col, col_W) + self.b
        out = out.reshape(N, out_h, out_w, -1).transpose(0, 3, 1, 2)

        # cache for backward
        self.x = x
        self.col = col
        self.col_W = col_W

        return out

    def backward(self, dout):
        FN, C, FH, FW = self.W.shape
        dout = dout.transpose(0, 2, 3, 1).reshape(-1, FN)

        self.db = np.sum(dout, axis=0)
        self.dW = np.dot(self.col.T, dout)
        self.dW = self.dW.transpose(1, 0).reshape(FN, C, FH, FW)

        dcol = np.dot(dout, self.col_W.T)
        # fold the 2-D column gradient back into the 4-D input shape
        # (the inverse of im2col)
        dx = col2im(dcol, self.x.shape, FH, FW, self.stride, self.pad)

        return dx
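
For reference, here is a minimal NumPy sketch of what the im2col helper imported above does: it unfolds every (FH, FW) receptive field of a (N, C, H, W) batch into one row of a 2-D matrix, which is exactly the shape the forward pass relies on. This mirrors the usual textbook implementation rather than any code shown in this post:

def im2col(input_data, filter_h, filter_w, stride=1, pad=0):
    # (N, C, H, W) -> (N*out_h*out_w, C*filter_h*filter_w)
    N, C, H, W = input_data.shape
    out_h = (H + 2*pad - filter_h) // stride + 1
    out_w = (W + 2*pad - filter_w) // stride + 1

    img = np.pad(input_data, [(0, 0), (0, 0), (pad, pad), (pad, pad)], 'constant')
    col = np.zeros((N, C, filter_h, filter_w, out_h, out_w))

    # gather each in-window offset (y, x) across all output positions at once
    for y in range(filter_h):
        y_max = y + stride * out_h
        for x in range(filter_w):
            x_max = x + stride * out_w
            col[:, :, y, x, :, :] = img[:, :, y:y_max:stride, x:x_max:stride]

    return col.transpose(0, 4, 5, 1, 2, 3).reshape(N * out_h * out_w, -1)

Trading memory for a single large matrix product this way is the "im2col optimization" of the title: it replaces four nested Python loops with one vectorized np.dot.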


class Pooling:
    def __init__(self, pool_h, pool_w, stride=1, pad=0): 

        self.pool_h = pool_h
        self.pool_w = pool_w
        self.stride = stride
        self.pad = pad
        
        self.x = None
        self.arg_max = None

……
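
The post elides Pooling.forward and Pooling.backward. A minimal sketch of the usual im2col-based max pooling, assuming the same im2col/col2im helpers as above; the arg_max cache records which element won each window so that backward can route the gradient back to it:

    def forward(self, x):
        N, C, H, W = x.shape
        out_h = int(1 + (H - self.pool_h) / self.stride)
        out_w = int(1 + (W - self.pool_w) / self.stride)

        # one pooling window per row, then a max over each row
        col = im2col(x, self.pool_h, self.pool_w, self.stride, self.pad)
        col = col.reshape(-1, self.pool_h * self.pool_w)

        self.x = x
        self.arg_max = np.argmax(col, axis=1)

        out = np.max(col, axis=1)
        return out.reshape(N, out_h, out_w, C).transpose(0, 3, 1, 2)

    def backward(self, dout):
        dout = dout.transpose(0, 2, 3, 1)
        pool_size = self.pool_h * self.pool_w

        # scatter each upstream gradient to the position of its window max
        dmax = np.zeros((dout.size, pool_size))
        dmax[np.arange(self.arg_max.size), self.arg_max.flatten()] = dout.flatten()
        dmax = dmax.reshape(dout.shape + (pool_size,))

        dcol = dmax.reshape(dmax.shape[0] * dmax.shape[1] * dmax.shape[2], -1)
        return col2im(dcol, self.x.shape, self.pool_h, self.pool_w,
                      self.stride, self.pad)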



class SimpleConvNet:
    """conv - relu - pool - affine - relu - affine - softmax"""
    def __init__(self, input_dim=(1, 28, 28),
                 conv_param={'filter_num':30, 'filter_size':5, 'pad':0, 'stride':1},
                 hidden_size=100, output_size=10, weight_init_std=0.01):
   
        filter_num = conv_param['filter_num']
        filter_size = conv_param['filter_size']
        filter_pad = conv_param['pad']
        filter_stride = conv_param['stride']
        input_size = input_dim[1]
        # with the defaults: (28 - 5 + 0)/1 + 1 = 24; 2x2 pooling halves it
        # to 12, so the flattened pool output is 30 * 12 * 12 = 4320
        conv_output_size = (input_size - filter_size + 2*filter_pad) / filter_stride + 1
        pool_output_size = int(filter_num * (conv_output_size/2) * (conv_output_size/2))

        self.params = {}
        self.params['W1'] = weight_init_std * \
                            np.random.randn(filter_num, input_dim[0], filter_size, filter_size)
        self.params['b1'] = np.zeros(filter_num)
        self.params['W2'] = weight_init_std * \
                            np.random.randn(pool_output_size, hidden_size)
        self.params['b2'] = np.zeros(hidden_size)
        self.params['W3'] = weight_init_std * \
                            np.random.randn(hidden_size, output_size)
        self.params['b3'] = np.zeros(output_size)

        self.layers = OrderedDict()
        self.layers['Conv1'] = Convolution(self.params['W1'], self.params['b1'],
                                           conv_param['stride'], conv_param['pad'])  
        self.layers['Relu1'] = Relu()                                                
        self.layers['Pool1'] = Pooling(pool_h=2, pool_w=2, stride=2)                 
        self.layers['Affine1'] = Affine(self.params['W2'], self.params['b2'])        
        self.layers['Relu2'] = Relu()
        self.layers['Affine2'] = Affine(self.params['W3'], self.params['b3'])

        self.last_layer = SoftmaxWithLoss()  
    
……
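
The elided methods here are the usual forward/backward drivers. A minimal sketch of what predict, loss and gradient typically look like for this layer stack (accuracy, which produces the per-epoch figures in the log below, is omitted):

    def predict(self, x):
        for layer in self.layers.values():
            x = layer.forward(x)
        return x

    def loss(self, x, t):
        y = self.predict(x)
        return self.last_layer.forward(y, t)

    def gradient(self, x, t):
        self.loss(x, t)                      # forward pass

        dout = self.last_layer.backward(1)   # backward pass
        for layer in reversed(list(self.layers.values())):
            dout = layer.backward(dout)

        # collect the gradients cached by each parameterized layer
        grads = {}
        grads['W1'], grads['b1'] = self.layers['Conv1'].dW, self.layers['Conv1'].db
        grads['W2'], grads['b2'] = self.layers['Affine1'].dW, self.layers['Affine1'].db
        grads['W3'], grads['b3'] = self.layers['Affine2'].dW, self.layers['Affine2'].db
        return grads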

    def save_params(self, file_name="params.pkl"):
        # pickle the learned parameters to disk
        params = {}
        for key, val in self.params.items():
            params[key] = val
        with open(file_name, 'wb') as f:
            pickle.dump(params, f)

    def load_params(self, file_name="params.pkl"):
        # restore the parameters, then re-bind them to the layers that use them
        with open(file_name, 'rb') as f:
            params = pickle.load(f)
        for key, val in params.items():
            self.params[key] = val

        for i, key in enumerate(['Conv1', 'Affine1', 'Affine2']):
            self.layers[key].W = self.params['W' + str(i + 1)]
            self.layers[key].b = self.params['b' + str(i + 1)]
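
The log in the next section (20 epochs, per-epoch accuracy lines, and the closing "Saved Network Parameters!") is the kind of output a short driver script produces. A sketch of such a script, assuming book-style load_mnist and Trainer helpers rather than anything shown above:

from dataset.mnist import load_mnist
from common.trainer import Trainer

# MNIST as (N, 1, 28, 28) images, labels as integer classes
(x_train, t_train), (x_test, t_test) = load_mnist(flatten=False)

network = SimpleConvNet(input_dim=(1, 28, 28),
                        conv_param={'filter_num': 30, 'filter_size': 5,
                                    'pad': 0, 'stride': 1},
                        hidden_size=100, output_size=10, weight_init_std=0.01)

trainer = Trainer(network, x_train, t_train, x_test, t_test,
                  epochs=20, mini_batch_size=100,
                  optimizer='Adam', optimizer_param={'lr': 0.001},
                  evaluate_sample_num_per_epoch=1000)
trainer.train()

# persist the learned parameters
network.save_params("params.pkl")
print("Saved Network Parameters!")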

More output

train_loss:2.29956519109714
=== epoch:1, train_acc:0.216, test_acc:0.218 ===
train_loss:2.2975110344641716
train_loss:2.291654113382576
train_loss:2.2858174689127875
train_loss:2.272262093336837
train_loss:2.267908303517325
train_loss:2.2584119706864336
train_loss:2.2258807222804693
train_loss:2.2111025085252543
train_loss:2.188119055308738
train_loss:2.163215575430596
train_loss:2.1191887076886724
train_loss:2.0542599060672186
train_loss:2.0244523646451915
train_loss:1.9779786923239808
train_loss:1.9248431928319325
train_loss:1.7920653808470397
train_loss:1.726860911000866
train_loss:1.7075144252509131
train_loss:1.6875413868425186
train_loss:1.6347461097804266
train_loss:1.5437112361395253
train_loss:1.4987893515035628
train_loss:1.3856720782969847
train_loss:1.2002110952243676
train_loss:1.2731100379603273
train_loss:1.117132621224333
train_loss:1.0622583460165833
train_loss:1.0960592785565957
train_loss:0.8692067763172185
train_loss:0.8548780420217317
train_loss:0.83872966253374
train_loss:0.7819342397053507
train_loss:0.7589812430284729
train_loss:0.7955332004991336
train_loss:0.8190930469691535
train_loss:0.6297212128196131
train_loss:0.8279837022068413
train_loss:0.6996430264702379
train_loss:0.5256550729087258
train_loss:0.7288553394002595
train_loss:0.7033049908220391
train_loss:0.5679669207218877
train_loss:0.6344174262581003
train_loss:0.7151382401438272
train_loss:0.5814593192354963
train_loss:0.5736217677325146
train_loss:0.5673622947809682
train_loss:0.48303413903204395
train_loss:0.452267909884157
train_loss:0.4009118158839013
=== epoch:2, train_acc:0.818, test_acc:0.806 ===
train_loss:0.5669686001623327
train_loss:0.5358187806595359
train_loss:0.3837535143737321
train_loss:0.544335563142595
train_loss:0.39288485196871803
train_loss:0.49770310644457566
train_loss:0.4610248131112265
train_loss:0.36641463191798196
train_loss:0.4874682221372042
train_loss:0.38796698110644817
train_loss:0.3620230776259665
train_loss:0.4744726274001774
train_loss:0.3086952062454927
train_loss:0.40012397040718645
train_loss:0.3634667070910744
train_loss:0.3204093812396573
train_loss:0.5063082359543781
train_loss:0.5624992123039615
train_loss:0.34281562891324663
train_loss:0.3415065217065326
train_loss:0.4946703009790488
train_loss:0.48942997572068253
train_loss:0.25416776815225534
train_loss:0.3808555005314615
train_loss:0.22793380858862108
train_loss:0.4709915396804245
train_loss:0.25826190862498605
train_loss:0.44862426522901516
train_loss:0.25519522472564815
train_loss:0.5063495442657376
train_loss:0.37233317168099206
train_loss:0.4027673899570495
train_loss:0.4234905061164214
train_loss:0.44590221111177714
train_loss:0.3846538639824134
train_loss:0.3371733857576183
train_loss:0.23612786737321756
train_loss:0.4814543539448962
train_loss:0.38362762929477556
train_loss:0.5105811329813293
train_loss:0.31729857191880056
train_loss:0.43677582454472663
train_loss:0.37362647454980324
train_loss:0.2696715797445873
train_loss:0.26682852302518134
train_loss:0.18763432881504752
train_loss:0.2886557425885745
train_loss:0.23833327847639763
train_loss:0.36315802981646
train_loss:0.21083779781027828
=== epoch:3, train_acc:0.89, test_acc:0.867 ===
train_loss:0.34070333399972674
train_loss:0.3356587138064409
train_loss:0.25919406618960505
train_loss:0.31537349840856743
train_loss:0.2276928810208216
train_loss:0.32171416950979326
train_loss:0.22754919179736025
train_loss:0.37619164258262944
train_loss:0.3221102374023198
train_loss:0.36724681541104537
train_loss:0.3310213819075522
train_loss:0.33583429981768936
train_loss:0.36054827740285833
train_loss:0.3002031789326344
train_loss:0.19480027104864756
train_loss:0.3074748184113467
train_loss:0.31035699050378
train_loss:0.37289594799797554
train_loss:0.38054981033442864
train_loss:0.2150866558286973
train_loss:0.4014488874986493
train_loss:0.2643304660197891
train_loss:0.31806887985854354
train_loss:0.29365139713396693
train_loss:0.33212651106203267
train_loss:0.29544164636048587
train_loss:0.4969991428069569
train_loss:0.3348535409949116
train_loss:0.18914984777413654
train_loss:0.3868380951987871
train_loss:0.26857192970788485
train_loss:0.373151707743815
train_loss:0.3522570704735893
train_loss:0.204823140388568
train_loss:0.3974239710544049
train_loss:0.21753509102652058
train_loss:0.26034229667679715
train_loss:0.26991319118062235
train_loss:0.30959776720795107
train_loss:0.2718109180045845
train_loss:0.2738413103423023
train_loss:0.22209179719364106
train_loss:0.5025051167945939
train_loss:0.23308114849307443
train_loss:0.24989561030033144
train_loss:0.4666621160650158
train_loss:0.3511547384608582
train_loss:0.32856542443039893
train_loss:0.29344954251556093
train_loss:0.21027623914222787
=== epoch:4, train_acc:0.905, test_acc:0.897 ===
train_loss:0.3912739685030935
train_loss:0.38209838818230624
train_loss:0.34743100915819064
train_loss:0.2466622246872034
train_loss:0.4342299239968299
train_loss:0.2691256872383198
train_loss:0.33061633649960986
train_loss:0.24714178601043
train_loss:0.27972544337302246
train_loss:0.2594663777039397
train_loss:0.3618566656990062
train_loss:0.46329147512107755
train_loss:0.24382989786183829
train_loss:0.30893321320835465
train_loss:0.32945962831674774
train_loss:0.14512986683598966
train_loss:0.18177996995372436
train_loss:0.33010123547450865
train_loss:0.22821102485978303
train_loss:0.13184290288561265
train_loss:0.1623416243274031
train_loss:0.15789928544006773
train_loss:0.28080142395723756
train_loss:0.37489571529660976
train_loss:0.14201501357680735
train_loss:0.2721256133343583
train_loss:0.3284216941766708
train_loss:0.18839612600685815
train_loss:0.22950135076005498
train_loss:0.3657428746249682
train_loss:0.2656377917932745
train_loss:0.18838799129016182
train_loss:0.2875731634059018
train_loss:0.4565329335709001
train_loss:0.18200894573118304
train_loss:0.2305260793504801
train_loss:0.2148999949995126
train_loss:0.28529427710203675
train_loss:0.2819535462668795
train_loss:0.2670982521557257
train_loss:0.2734307192256681
train_loss:0.1388387469300277
train_loss:0.2700532055195449
train_loss:0.2179124091178431
train_loss:0.19658434695884133
train_loss:0.2777291934300614
train_loss:0.20381437081332332
train_loss:0.32907137120155455
train_loss:0.27254826158873285
train_loss:0.22710678143573176
=== epoch:5, train_acc:0.913, test_acc:0.912 ===
train_loss:0.16794884237909946
train_loss:0.22785903063567253
train_loss:0.1704819172872827
train_loss:0.2525653382920443
train_loss:0.21185790294965987
train_loss:0.17767717976901584
train_loss:0.1889506605539382
train_loss:0.17273423199217824
train_loss:0.2510078095831616
train_loss:0.14205249835249428
train_loss:0.3129092704025964
train_loss:0.3117928731764807
train_loss:0.20503712236242064
train_loss:0.20318831742627225
train_loss:0.21303909770975452
train_loss:0.23190878850961483
train_loss:0.17291311185744473
train_loss:0.20334851907094717
train_loss:0.15855326731614855
train_loss:0.21942667459237625
train_loss:0.0924354215910217
train_loss:0.09567491107181217
train_loss:0.19180958792274005
train_loss:0.25969731631050624
train_loss:0.27574837165425986
train_loss:0.24987203428843377
train_loss:0.4377410898909417
train_loss:0.26026206472975066
train_loss:0.27954893992114804
train_loss:0.1699281856687059
train_loss:0.15934689245821898
train_loss:0.3161871226226364
train_loss:0.10976032096009508
train_loss:0.1763696866686196
train_loss:0.18580995761265345
train_loss:0.1842207131970236
train_loss:0.2443475666901613
train_loss:0.18738051698673439
train_loss:0.22270658116867303
train_loss:0.1662389219099242
train_loss:0.209158762880929
train_loss:0.22983617951577964
train_loss:0.2790296623615454
train_loss:0.24788172524111998
train_loss:0.1293738188409751
train_loss:0.1552172413660744
train_loss:0.23018276943562502
train_loss:0.16189165875684913
train_loss:0.24392025522410113
train_loss:0.13403840930108568
=== epoch:6, train_acc:0.921, test_acc:0.918 ===
train_loss:0.1961216529174243
train_loss:0.2924197504956213
train_loss:0.19465010122753057
train_loss:0.28290935332276435
train_loss:0.14427638876873242
train_loss:0.2566711475334627
train_loss:0.167375730919932
train_loss:0.3154511081448441
train_loss:0.15788775201275967
train_loss:0.17910954391766404
train_loss:0.23884644581690193
train_loss:0.09618189067278102
train_loss:0.24388882345961582
train_loss:0.08541530798998809
train_loss:0.06809986906621876
train_loss:0.24638946409490692
train_loss:0.18927011798228044
train_loss:0.09945981596350358
train_loss:0.18495019162631973
train_loss:0.15258840338866894
train_loss:0.19096173442426728
train_loss:0.14569967578533724
train_loss:0.1841763707949563
train_loss:0.0967340944259887
train_loss:0.0970240457283082
train_loss:0.15266131436990713
train_loss:0.11793802844679865
train_loss:0.23125882163453734
train_loss:0.15401815338201266
train_loss:0.11575841101176092
train_loss:0.1333871420622398
train_loss:0.08651040019662394
train_loss:0.216125204224472
train_loss:0.16165588422959304
train_loss:0.27869245421310007
train_loss:0.11198243521614289
train_loss:0.17313438972459186
train_loss:0.17212043609334862
train_loss:0.13791897831064198
train_loss:0.2267562895570335
train_loss:0.10722405971795468
train_loss:0.1149995899103652
train_loss:0.09703973400039906
train_loss:0.21399583320148452
train_loss:0.17101299029565184
train_loss:0.12963329125364453
train_loss:0.1946558983682687
train_loss:0.15189507558508436
train_loss:0.15603991257676963
train_loss:0.1894440989591196
=== epoch:7, train_acc:0.944, test_acc:0.921 ===
train_loss:0.1949166062126958
train_loss:0.16660652708551138
train_loss:0.11841422215045073
train_loss:0.09924967850906151
train_loss:0.20053562463811267
train_loss:0.15198739956171664
train_loss:0.23276767408280194
train_loss:0.11995565794860409
train_loss:0.21661120479200555
train_loss:0.17637313795453327
train_loss:0.172362454787868
train_loss:0.20851418734477065
train_loss:0.09537001525763981
train_loss:0.14146913793087992
train_loss:0.2617576866376055
train_loss:0.10500607559534571
train_loss:0.3396765217711637
train_loss:0.08427796011888775
train_loss:0.15303614654098532
train_loss:0.132821052254927
train_loss:0.1154173668832886
train_loss:0.12357953723411788
train_loss:0.18706847766652746
train_loss:0.2688341936588257
train_loss:0.16520252414666456
train_loss:0.08039280193318782
train_loss:0.1178618737147573
train_loss:0.1495808236060719
train_loss:0.13937468284703372
train_loss:0.09823544010832733
train_loss:0.1262785713216828
train_loss:0.17823790661433755
train_loss:0.08725751897376116
train_loss:0.1280730814886477
train_loss:0.16139747833498747
train_loss:0.13856299791286275
train_loss:0.11895206801034919
train_loss:0.12937502196848547
train_loss:0.10080232388997615
train_loss:0.1433918613109576
train_loss:0.15192895187892305
train_loss:0.1648711640447537
train_loss:0.15515860320952918
train_loss:0.11577427405176502
train_loss:0.04991838139950274
train_loss:0.16669192227101182
train_loss:0.18872017594842527
train_loss:0.13278044728094665
train_loss:0.14462363902692724
train_loss:0.12899222057327978
=== epoch:8, train_acc:0.953, test_acc:0.929 ===
train_loss:0.11614658829052528
train_loss:0.1283181306383869
train_loss:0.13602630519082037
train_loss:0.08820753814622587
train_loss:0.16890325196609468
train_loss:0.06370471015340015
train_loss:0.1380223598283016
train_loss:0.10414267340046371
train_loss:0.09350530384194355
train_loss:0.12745550967245167
train_loss:0.08580615867361312
train_loss:0.07332708433862614
train_loss:0.14091931565454754
train_loss:0.0760411000748177
train_loss:0.09505745644205849
train_loss:0.06360761624213854
train_loss:0.06541500736200513
train_loss:0.12404314553963294
train_loss:0.10167160576295751
train_loss:0.10616148380778018
train_loss:0.1346644429775604
train_loss:0.12441423831964894
train_loss:0.3573323396268424
train_loss:0.24916186199107485
train_loss:0.12530529822852685
train_loss:0.08754367015669812
train_loss:0.07334443956083914
train_loss:0.20917550197781243
train_loss:0.1847840883495349
train_loss:0.1183049487746507
train_loss:0.07881905605438366
train_loss:0.15063665903727463
train_loss:0.17107469503107173
train_loss:0.11236219217021456
train_loss:0.09393106092285483
train_loss:0.06416538395448765
train_loss:0.11236854428092079
train_loss:0.20945523787716333
train_loss:0.08337149369731861
train_loss:0.05732487355325358
train_loss:0.1570864506321766
train_loss:0.18076648840092233
train_loss:0.13745138865307854
train_loss:0.08714081091649845
train_loss:0.1435806754576637
train_loss:0.24435407501635567
train_loss:0.12994146376471538
train_loss:0.15372389864103003
train_loss:0.09813508945397395
train_loss:0.12535304105848438
=== epoch:9, train_acc:0.949, test_acc:0.929 ===
train_loss:0.12884389358627435
train_loss:0.07230903284506444
train_loss:0.13088479970015968
train_loss:0.08134419807781099
train_loss:0.13741150483980263
train_loss:0.11837091458319343
train_loss:0.0360333597933849
train_loss:0.10086706481279009
train_loss:0.07501685865192625
train_loss:0.07863162231090925
train_loss:0.13702724499254867
train_loss:0.08084087775983821
train_loss:0.12343541914233253
train_loss:0.07850160249109997
train_loss:0.09418802616477617
train_loss:0.09552050398868868
train_loss:0.07673580117804006
train_loss:0.026939052951253605
train_loss:0.04395589295983649
train_loss:0.038031816812409164
train_loss:0.06999557624936044
train_loss:0.1655966718000311
train_loss:0.06368445153357599
train_loss:0.04010530475275284
train_loss:0.12382479494357689
train_loss:0.1641936287301483
train_loss:0.18920478194308601
train_loss:0.05733130321010137
train_loss:0.17698603597887125
train_loss:0.10764127802606108
train_loss:0.09413680031262134
train_loss:0.08907267445559093
train_loss:0.15502890698462124
train_loss:0.1533752414611575
train_loss:0.12011510053939835
train_loss:0.09968853683767069
train_loss:0.0906986479553312
train_loss:0.06981896162587345
train_loss:0.125922628245562
train_loss:0.08376618287979185
train_loss:0.05995160730233552
train_loss:0.09389935503195222
train_loss:0.13350440149583398
train_loss:0.09142311542034161
train_loss:0.13335311846237471
train_loss:0.11711887232469347
train_loss:0.044254101034480256
train_loss:0.06471555203906754
train_loss:0.14891282539205272
train_loss:0.2014883194756923
=== epoch:10, train_acc:0.953, test_acc:0.94 ===
train_loss:0.07038223814736246
train_loss:0.04957925723048767
train_loss:0.1133203501417986
train_loss:0.06346746023246018
train_loss:0.09239005821377208
train_loss:0.09635593692155876
train_loss:0.08332106191636164
train_loss:0.09923978538225704
train_loss:0.0695841620944646
train_loss:0.06700538032716745
train_loss:0.0624946961727422
train_loss:0.08112967415293411
train_loss:0.07319622148310498
train_loss:0.060854721728220804
train_loss:0.10026635040038442
train_loss:0.10472330229823613
train_loss:0.10699083742922384
train_loss:0.11619034438665427
train_loss:0.11232902974524973
train_loss:0.20983846300025782
train_loss:0.06507078644782731
train_loss:0.04803232504884892
train_loss:0.11241615961989934
train_loss:0.10809407983258541
train_loss:0.11393344596723093
train_loss:0.0780092673392942
train_loss:0.14979393788923598
train_loss:0.12941990772896717
train_loss:0.11111693366947283
train_loss:0.09567980863367559
train_loss:0.09901129012576136
train_loss:0.10082353815636745
train_loss:0.12243756319120067
train_loss:0.08689941759333618
train_loss:0.05216551452802829
train_loss:0.10835939204484273
train_loss:0.07147497183981844
train_loss:0.08423764778379547
train_loss:0.07612742085525462
train_loss:0.041279006803477764
train_loss:0.09023533744854008
train_loss:0.1187026526641907
train_loss:0.07174824257614387
train_loss:0.08675031602602198
train_loss:0.04807893244994377
train_loss:0.1318909470505687
train_loss:0.19234102727794575
train_loss:0.0844066471575179
train_loss:0.1194799891798427
train_loss:0.11756051445361188
=== epoch:11, train_acc:0.964, test_acc:0.94 ===
train_loss:0.1741824301332884
train_loss:0.041286453453026034
train_loss:0.20004781800934587
train_loss:0.08271887641369358
train_loss:0.0606625239406979
train_loss:0.06538885049218911
train_loss:0.1356239427381109
train_loss:0.12831547213191985
train_loss:0.14952022857091044
train_loss:0.09204728635629016
train_loss:0.06343795479799186
train_loss:0.09542404144224398
train_loss:0.09551244124437158
train_loss:0.0891461114187921
train_loss:0.08209391054821052
train_loss:0.06472937443672702
train_loss:0.10047991184910417
train_loss:0.05707977543296623
train_loss:0.04815266262234755
train_loss:0.10651405686868827
train_loss:0.12602581734400617
train_loss:0.11018803681586739
train_loss:0.09593175516685674
train_loss:0.10567684258621385
train_loss:0.07294477870498717
train_loss:0.1567460170890917
train_loss:0.08316370852102375
train_loss:0.04109785490526308
train_loss:0.09704109927945906
train_loss:0.06787451589479968
train_loss:0.1423526303311424
train_loss:0.10986156365848007
train_loss:0.10423944228047448
train_loss:0.1028545207161217
train_loss:0.05618516378954049
train_loss:0.12271709492529449
train_loss:0.06721168644287813
train_loss:0.10895658850953614
train_loss:0.10775961729824406
train_loss:0.06743315701995885
train_loss:0.08305814341761182
train_loss:0.05321124556958834
train_loss:0.05756614795873562
train_loss:0.03164124719166145
train_loss:0.07571387158776285
train_loss:0.022717308653022045
train_loss:0.08454968003060453
train_loss:0.06985803163452406
train_loss:0.0735357209850279
train_loss:0.12137582450718915
=== epoch:12, train_acc:0.968, test_acc:0.953 ===
train_loss:0.07907120936971256
train_loss:0.08286032073978893
train_loss:0.04898870244905463
train_loss:0.034494833700644746
train_loss:0.0545292573630558
train_loss:0.09563509920019846
train_loss:0.04436742890869528
train_loss:0.10660676044922741
train_loss:0.019977276298316103
train_loss:0.1328083457613646
train_loss:0.0907383936554434
train_loss:0.17664993915612345
train_loss:0.05548546973911768
train_loss:0.0578792152572221
train_loss:0.038371068208326226
train_loss:0.12337543344621996
train_loss:0.04066448395658238
train_loss:0.0891017754256894
train_loss:0.048119613606837836
train_loss:0.09627189693299613
train_loss:0.0615439438317032
train_loss:0.03652546901286493
train_loss:0.04904481977735155
train_loss:0.03786403574522856
train_loss:0.04851347835633977
train_loss:0.03595106606907578
train_loss:0.04505040897006021
train_loss:0.09218815322372864
train_loss:0.0898107270167961
train_loss:0.06807205147334808
train_loss:0.11208901315010138
train_loss:0.02846301456851753
train_loss:0.03331721683136077
train_loss:0.027542070923049847
train_loss:0.06303924155306156
train_loss:0.13016506969855235
train_loss:0.03590030898483354
train_loss:0.033862974609868444
train_loss:0.039098987899974916
train_loss:0.1709281757500104
train_loss:0.0383273966279281
train_loss:0.03892162515633711
train_loss:0.10949855394502289
train_loss:0.0812137443231561
train_loss:0.14633906802587351
train_loss:0.10698167565558854
train_loss:0.02567424926759748
train_loss:0.08120468910017875
train_loss:0.08020246456611246
train_loss:0.08497396843283474
=== epoch:13, train_acc:0.972, test_acc:0.953 ===
train_loss:0.06180842566259316
train_loss:0.06275956683872176
train_loss:0.03597311434260791
train_loss:0.08955532839130037
train_loss:0.09472783598052546
train_loss:0.09784739962031823
train_loss:0.05449014569529458
train_loss:0.1539071976175351
train_loss:0.09529460808203737
train_loss:0.07943081264823855
train_loss:0.06282500883951327
train_loss:0.08120914933452372
train_loss:0.05394166809037722
train_loss:0.059178370081143274
train_loss:0.06097175155344926
train_loss:0.08850387282237344
train_loss:0.07763568680618946
train_loss:0.05984945146739694
train_loss:0.058515554469394306
train_loss:0.041470749797641863
train_loss:0.04641305484474891
train_loss:0.043105933680273774
train_loss:0.07810105339636093
train_loss:0.07343223348336785
train_loss:0.11328438379951372
train_loss:0.064209095862823
train_loss:0.058276521292794765
train_loss:0.08575165759210586
train_loss:0.03446469146442009
train_loss:0.08030590413200737
train_loss:0.06030731369033857
train_loss:0.059937874948476855
train_loss:0.09825030448814026
train_loss:0.033150548450314274
train_loss:0.06275798815573187
train_loss:0.07623978702315799
train_loss:0.06863532191157451
train_loss:0.09434234640572493
train_loss:0.05988773543728522
train_loss:0.0973386163099195
train_loss:0.037677231861936444
train_loss:0.04349353141613669
train_loss:0.054963630265228526
train_loss:0.07002833794183859
train_loss:0.11146208322987784
train_loss:0.0371527618982775
train_loss:0.07357346163635663
train_loss:0.05434699135953322
train_loss:0.05237280178266695
train_loss:0.061138199418957304
=== epoch:14, train_acc:0.977, test_acc:0.953 ===
train_loss:0.10066501587317372
train_loss:0.08921077888039124
train_loss:0.08231892225338307
train_loss:0.04772890908936607
train_loss:0.09184041168344921
train_loss:0.0938990402442275
train_loss:0.0494225943872303
train_loss:0.03844382368238921
train_loss:0.06391940914619151
train_loss:0.05342051430572013
train_loss:0.026444387224084483
train_loss:0.04130568390788019
train_loss:0.04355302798092278
train_loss:0.04368090744575301
train_loss:0.06303958330270483
train_loss:0.05266226318173275
train_loss:0.03821582056566959
train_loss:0.07639486631748305
train_loss:0.04911411347416994
train_loss:0.038169986550654546
train_loss:0.13870806289567578
train_loss:0.02962001734644125
train_loss:0.04476946757525486
train_loss:0.029287110761754498
train_loss:0.09072230859627803
train_loss:0.04213956443267707
train_loss:0.026866370710789175
train_loss:0.031073106822891664
train_loss:0.02913660454796326
train_loss:0.01717886084993834
train_loss:0.03947121149322037
train_loss:0.10302445790288721
train_loss:0.05921670277047061
train_loss:0.0441078831750056
train_loss:0.034245762460219924
train_loss:0.03702118405857356
train_loss:0.059523914896238844
train_loss:0.08474177088511838
train_loss:0.01984261067581143
train_loss:0.03649283528554719
train_loss:0.0696744613847696
train_loss:0.043124531467626355
train_loss:0.07847660225519426
train_loss:0.03110892663155919
train_loss:0.013048617405107545
train_loss:0.03058430961791362
train_loss:0.10944775307658777
train_loss:0.036016185483549956
train_loss:0.02334871888725246
train_loss:0.03343570584902615
=== epoch:15, train_acc:0.978, test_acc:0.955 ===
train_loss:0.03039950446343728
train_loss:0.08462547050837538
train_loss:0.032203680055763614
train_loss:0.03436650325431724
train_loss:0.07253946928673467
train_loss:0.06683830994435695
train_loss:0.06365612671518663
train_loss:0.038592355748068366
train_loss:0.017214805539273587
train_loss:0.03392215480646994
train_loss:0.06712344038335312
train_loss:0.08545444441474491
train_loss:0.03565551818896037
train_loss:0.03700222964797901
train_loss:0.05504566593144957
train_loss:0.06284156488557872
train_loss:0.01790621057871843
train_loss:0.04948893828174306
train_loss:0.04592254340798565
train_loss:0.06398640989500583
train_loss:0.10908329324005156
train_loss:0.09487084234534628
train_loss:0.053787562583242826
train_loss:0.05612223096492913
train_loss:0.024009003497293274
train_loss:0.03787210692940926
train_loss:0.09744410172518134
train_loss:0.02282525149417848
train_loss:0.06533342475382259
train_loss:0.08171715736560953
train_loss:0.04070724777349443
train_loss:0.06953272511044452
train_loss:0.02855280306742936
train_loss:0.0474283156516662
train_loss:0.04395351930213369
train_loss:0.04529719694665024
train_loss:0.11563204324980689
train_loss:0.031898844518736105
train_loss:0.027477227657423706
train_loss:0.023383771724825565
train_loss:0.049706631766448794
train_loss:0.031100655225489174
train_loss:0.09009450125248943
train_loss:0.030676528683159683
train_loss:0.01692270088282052
train_loss:0.025600749636003037
train_loss:0.023930285953440864
train_loss:0.05294293370777191
train_loss:0.08650284038477984
train_loss:0.10454565072160892
=== epoch:16, train_acc:0.98, test_acc:0.955 ===
train_loss:0.05020287465705867
train_loss:0.06582624488708202
train_loss:0.05263721175022644
train_loss:0.13467920218173793
train_loss:0.042511734082618255
train_loss:0.06410160534179558
train_loss:0.04919028612235428
train_loss:0.05743613134261321
train_loss:0.0654026197411463
train_loss:0.044988743028737746
train_loss:0.03509888962259968
train_loss:0.04152055661578496
train_loss:0.07984768703470407
train_loss:0.04598595090000615
train_loss:0.04695586870826502
train_loss:0.023194242317372736
train_loss:0.0727661396279491
train_loss:0.029529078635952798
train_loss:0.03247264667136894
train_loss:0.045715430493677864
train_loss:0.09389997032682505
train_loss:0.030092722641706086
train_loss:0.040039704380178245
train_loss:0.01691320967299449
train_loss:0.05070621322747806
train_loss:0.0225280810454206
train_loss:0.04835428643664134
train_loss:0.04789046408078379
train_loss:0.04612012129182796
train_loss:0.03235681563723572
train_loss:0.025013118629385985
train_loss:0.02686317762122873
train_loss:0.01619148759252484
train_loss:0.025772857201395855
train_loss:0.11601878857144289
train_loss:0.03260786464856165
train_loss:0.11699193164137509
train_loss:0.03512108879147574
train_loss:0.1296771456246295
train_loss:0.05990833703421112
train_loss:0.04814119058671268
train_loss:0.030508106418284164
train_loss:0.040792767467867204
train_loss:0.03729097681074012
train_loss:0.033829135343674634
train_loss:0.04572861828306607
train_loss:0.08219478878922817
train_loss:0.03992035364218883
train_loss:0.03877334387840298
train_loss:0.020135442415332494
=== epoch:17, train_acc:0.985, test_acc:0.952 ===
train_loss:0.0573879679439545
train_loss:0.021548063539220688
train_loss:0.02026094914055154
train_loss:0.017008292034281135
train_loss:0.03644381984642446
train_loss:0.014282373129234844
train_loss:0.016566814170416534
train_loss:0.0716841677349114
train_loss:0.03655291810668415
train_loss:0.021277181810570735
train_loss:0.031425444981420726
train_loss:0.023091189748999884
train_loss:0.03965608369203497
train_loss:0.02083114039735955
train_loss:0.019066995516890377
train_loss:0.031482705592815144
train_loss:0.01120953512484204
train_loss:0.02228841358023976
train_loss:0.019201103378694014
train_loss:0.0578870953252985
train_loss:0.06953714404223653
train_loss:0.01477906336701353
train_loss:0.03570613669823849
train_loss:0.032205423631456224
train_loss:0.017607830384249956
train_loss:0.022332266983392062
train_loss:0.02484238892631349
train_loss:0.024456964557631952
train_loss:0.014892596258498645
train_loss:0.02007855498244406
train_loss:0.10612949393231301
train_loss:0.027800458122900946
train_loss:0.02032975418675139
train_loss:0.0687399190755896
train_loss:0.045257181737845
train_loss:0.022502761141273062
train_loss:0.016465106232977655
train_loss:0.047075313910580195
train_loss:0.015330605341329271
train_loss:0.017603254364037816
train_loss:0.031170443502446705
train_loss:0.07249246022522765
train_loss:0.08642323375728528
train_loss:0.009238019288650805
train_loss:0.016168523687302924
train_loss:0.059189578742659926
train_loss:0.032899410552574435
train_loss:0.021636004794118757
train_loss:0.02361620610060937
train_loss:0.009924447333153601
=== epoch:18, train_acc:0.984, test_acc:0.955 ===
train_loss:0.03297920575719971
train_loss:0.023211974536229463
train_loss:0.023447487978865138
train_loss:0.02110348003690432
train_loss:0.01658551264501526
train_loss:0.027321771841294
train_loss:0.02393954174141599
train_loss:0.020660925712682302
train_loss:0.059811565901714096
train_loss:0.03889841545509195
train_loss:0.030567186107595505
train_loss:0.014637006415181588
train_loss:0.009532910801279574
train_loss:0.05419154580817005
train_loss:0.016191570395205506
train_loss:0.037379669867669094
train_loss:0.02203393293059752
train_loss:0.010187609365885714
train_loss:0.014143504678078544
train_loss:0.02286213697760976
train_loss:0.023042577643409064
train_loss:0.02471646877045257
train_loss:0.08498801234463908
train_loss:0.0242036001152738
train_loss:0.022578090133924276
train_loss:0.05970722708772782
train_loss:0.03202530556617804
train_loss:0.05338138039773102
train_loss:0.04463245296079495
train_loss:0.03206047252903087
train_loss:0.019347849251929422
train_loss:0.023362730340730487
train_loss:0.03485291969510898
train_loss:0.05924776862243811
train_loss:0.009056978954709626
train_loss:0.04308362839978184
train_loss:0.05077186188071209
train_loss:0.020649662647307408
train_loss:0.02737382223358688
train_loss:0.016355353461969535
train_loss:0.04353351414915996
train_loss:0.03866700258198946
train_loss:0.034930203176868485
train_loss:0.05397897521853895
train_loss:0.026402778273328865
train_loss:0.01689432395084394
train_loss:0.009645053179985376
train_loss:0.015939626217848713
train_loss:0.04521449099196396
train_loss:0.009337164608357028
=== epoch:19, train_acc:0.988, test_acc:0.957 ===
train_loss:0.07281545036894353
train_loss:0.03304679858053896
train_loss:0.017578649483574377
train_loss:0.035316237680244694
train_loss:0.06109649867654281
train_loss:0.11357374683767389
train_loss:0.02483234829972833
train_loss:0.012946971291290165
train_loss:0.023761433518867836
train_loss:0.026861396528693442
train_loss:0.038687220428920886
train_loss:0.025045999932346977
train_loss:0.030357339557961008
train_loss:0.015449713594176082
train_loss:0.029012299978168895
train_loss:0.013354758625586532
train_loss:0.024714900194681148
train_loss:0.03025567287666344
train_loss:0.020948136642865007
train_loss:0.022452751530621335
train_loss:0.017637320910353846
train_loss:0.037696091268993266
train_loss:0.04133004023875008
train_loss:0.02098629767264089
train_loss:0.027257711709578428
train_loss:0.03464859263099433
train_loss:0.024586449767853447
train_loss:0.031324097177386274
train_loss:0.03772372686263441
train_loss:0.016790171599489236
train_loss:0.015417473534566956
train_loss:0.014313304385103295
train_loss:0.018911987710428353
train_loss:0.03268773877599193
train_loss:0.03169852876511249
train_loss:0.016634851724425005
train_loss:0.022491226115508897
train_loss:0.012846097684058401
train_loss:0.04491637989670535
train_loss:0.026276411989839717
train_loss:0.046483783459765664
train_loss:0.027554142605377273
train_loss:0.045690054296902184
train_loss:0.007125631899693551
train_loss:0.030307882046600162
train_loss:0.043824235242418484
train_loss:0.012116814299235173
train_loss:0.02551120642569737
train_loss:0.020675326146158267
train_loss:0.01904337304161037
=== epoch:20, train_acc:0.988, test_acc:0.955 ===
train_loss:0.019811489377421325
train_loss:0.02904394417605083
train_loss:0.014182669827434878
train_loss:0.08310473502963105
train_loss:0.025266767052067554
train_loss:0.0145968293286221
train_loss:0.024431311092897222
train_loss:0.017772308902654126
train_loss:0.013775975044123154
train_loss:0.019179699126282618
train_loss:0.02050997906687725
train_loss:0.06601309296229428
train_loss:0.04328029600024481
train_loss:0.013779654928032846
train_loss:0.03548073194070947
train_loss:0.028314291416797463
train_loss:0.017903589499994797
train_loss:0.026682962872456875
train_loss:0.015331922374534714
train_loss:0.03510248118020717
train_loss:0.015798064472410285
train_loss:0.02278987724913449
train_loss:0.015320626099717495
train_loss:0.014856919374004763
train_loss:0.049061211134819704
train_loss:0.013149540835931117
train_loss:0.02876937879648784
train_loss:0.011511044682713648
train_loss:0.017319277626619986
train_loss:0.021966338633536506
train_loss:0.022826014668981102
train_loss:0.02972405077807331
train_loss:0.017999248202233014
train_loss:0.015019578338274385
train_loss:0.013615559543221783
train_loss:0.017157088527906976
train_loss:0.031165739705942195
train_loss:0.016688990000663685
train_loss:0.020805501326501673
train_loss:0.004446125733896681
train_loss:0.019461930759853602
train_loss:0.017395898859850177
train_loss:0.011972844953611752
train_loss:0.02855626286829241
train_loss:0.03471848511969467
train_loss:0.03534078528114222
train_loss:0.012080809790091997
train_loss:0.012558807787670045
train_loss:0.012191937787715228
=============== Final Test Accuracy ===============
test_acc:0.959
Saved Network Parameters!

 
