DL/DNN: Training a MultiLayerNet model [6*100+ReLU+SGD, weight_decay] on the MNIST dataset to suppress overfitting

Contents

Output

Design

Core code

More output


 

 

 

Output

 

Design
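As a rough sketch of the architecture named in the title (784-dimensional MNIST input, six hidden layers of 100 ReLU units, 10 output classes), a minimal self-contained forward pass could look like the following. The He-style initialization and every function name here are assumptions for illustration only, not the post's actual MultiLayerNet code:

```python
import numpy as np

# Hypothetical sketch of the 6*100+ReLU architecture: 784 -> 100 x 6 -> 10.
# Layer sizes come from the title; the initialization scheme (He) is an assumption.
def build_params(sizes, rng):
    params = {}
    for i in range(len(sizes) - 1):
        params[f'W{i+1}'] = rng.standard_normal((sizes[i], sizes[i+1])) * np.sqrt(2.0 / sizes[i])
        params[f'b{i+1}'] = np.zeros(sizes[i+1])
    return params

def forward(x, params, n_layers):
    for i in range(1, n_layers + 1):
        x = x @ params[f'W{i}'] + params[f'b{i}']
        if i < n_layers:
            x = np.maximum(0, x)   # ReLU on hidden layers only
    return x

sizes = [784] + [100] * 6 + [10]       # input, 6 hidden layers, output
rng = np.random.default_rng(0)
params = build_params(sizes, rng)
out = forward(np.zeros((1, 784)), params, len(sizes) - 1)
print(out.shape)   # (1, 10); zero input with zero biases gives a zero output
```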

 

 

Core code

# weight_decay_lambda = 0    # set to 0 to disable regularization
weight_decay_lambda = 0.1    # strength of the L2 (weight decay) penalty


for i in range(1000000):
    # draw a random mini-batch
    batch_mask = np.random.choice(train_size, batch_size)
    x_batch = x_train[batch_mask]
    t_batch = t_train[batch_mask]

    # backprop; the network folds the weight-decay term into the weight gradients
    grads = network.gradient(x_batch, t_batch)
    optimizer.update(network.params, grads)

    # evaluate once per epoch on the full train/test sets
    if i % iter_per_epoch == 0:
        train_acc = network.accuracy(x_train, t_train)
        test_acc = network.accuracy(x_test, t_test)
        train_acc_list.append(train_acc)
        test_acc_list.append(test_acc)

        print("epoch:" + str(epoch_cnt) + ", train_acc:" + str(round(train_acc, 4))
              + ", test_acc:" + str(round(test_acc, 4)))
        epoch_cnt += 1
        if epoch_cnt >= max_epochs:
            break
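The loop above relies on network.gradient and optimizer.update to apply the regularization. For clarity, here is a minimal self-contained sketch of what L2 weight decay actually does: the loss gains a penalty of (lambda/2) * sum ||W||^2, so each weight gradient gains a lambda * W term before the SGD step, while biases are left undecayed. The helper names below are hypothetical, not the post's API:

```python
import numpy as np

def weight_decay_penalty(params, lam):
    # L2 penalty added to the loss: lam/2 * sum of squared weights (biases excluded)
    return 0.5 * lam * sum((v ** 2).sum() for k, v in params.items() if k.startswith('W'))

def sgd_step_with_decay(params, grads, lam, lr):
    # d(penalty)/dW = lam * W, so it is added to each weight gradient;
    # biases are not decayed.
    for key in params:
        g = grads[key] + (lam * params[key] if key.startswith('W') else 0)
        params[key] -= lr * g

params = {'W1': np.ones((2, 2)), 'b1': np.zeros(2)}
grads = {'W1': np.zeros((2, 2)), 'b1': np.zeros(2)}
print(weight_decay_penalty(params, lam=0.1))   # 0.5 * 0.1 * 4 = 0.2
sgd_step_with_decay(params, grads, lam=0.1, lr=0.01)
print(float(params['W1'][0, 0]))               # 1 - 0.01 * 0.1 ≈ 0.999
```

Even with zero loss gradients, the weights shrink toward zero each step, which is exactly the pressure that keeps train_acc from pinning at 1.0 in the second run below.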

 

More output

1. MultiLayerNet [6*100+ReLU+SGD]: DIY overfitting data set based on MNIST, train_acc vs. test_acc

epoch:0, train_acc:0.06, test_acc:0.0834
epoch:1, train_acc:0.1233, test_acc:0.1109
epoch:2, train_acc:0.1467, test_acc:0.1292
epoch:3, train_acc:0.2233, test_acc:0.1717
epoch:4, train_acc:0.2567, test_acc:0.1891
epoch:5, train_acc:0.27, test_acc:0.2181
epoch:6, train_acc:0.31, test_acc:0.229
epoch:7, train_acc:0.32, test_acc:0.24
epoch:8, train_acc:0.3567, test_acc:0.2502
epoch:9, train_acc:0.37, test_acc:0.2651
epoch:10, train_acc:0.3767, test_acc:0.2743
epoch:11, train_acc:0.39, test_acc:0.2833
epoch:12, train_acc:0.3767, test_acc:0.2769
epoch:13, train_acc:0.4067, test_acc:0.295
epoch:14, train_acc:0.4667, test_acc:0.3169
epoch:15, train_acc:0.45, test_acc:0.3213
epoch:16, train_acc:0.5067, test_acc:0.3439
epoch:17, train_acc:0.54, test_acc:0.3593
epoch:18, train_acc:0.5233, test_acc:0.3687
epoch:19, train_acc:0.5367, test_acc:0.3691
epoch:20, train_acc:0.5667, test_acc:0.4051
epoch:21, train_acc:0.5967, test_acc:0.4265
epoch:22, train_acc:0.63, test_acc:0.4477
epoch:23, train_acc:0.6467, test_acc:0.4627
epoch:24, train_acc:0.6567, test_acc:0.4708
epoch:25, train_acc:0.6533, test_acc:0.4896
epoch:26, train_acc:0.66, test_acc:0.5034
epoch:27, train_acc:0.68, test_acc:0.5107
epoch:28, train_acc:0.6833, test_acc:0.5083
epoch:29, train_acc:0.7067, test_acc:0.5244
epoch:30, train_acc:0.7567, test_acc:0.5564
epoch:31, train_acc:0.7333, test_acc:0.5411
epoch:32, train_acc:0.7533, test_acc:0.5698
epoch:33, train_acc:0.7633, test_acc:0.5738
epoch:34, train_acc:0.7833, test_acc:0.5764
epoch:35, train_acc:0.7633, test_acc:0.5863
epoch:36, train_acc:0.7733, test_acc:0.5915
epoch:37, train_acc:0.8067, test_acc:0.608
epoch:38, train_acc:0.81, test_acc:0.6113
epoch:39, train_acc:0.8033, test_acc:0.5922
epoch:40, train_acc:0.8233, test_acc:0.6192
epoch:41, train_acc:0.83, test_acc:0.6203
epoch:42, train_acc:0.8033, test_acc:0.6066
epoch:43, train_acc:0.8333, test_acc:0.6311
epoch:44, train_acc:0.8433, test_acc:0.6273
epoch:45, train_acc:0.85, test_acc:0.6413
epoch:46, train_acc:0.85, test_acc:0.6375
epoch:47, train_acc:0.86, test_acc:0.6352
epoch:48, train_acc:0.8667, test_acc:0.6504
epoch:49, train_acc:0.8767, test_acc:0.6588
epoch:50, train_acc:0.8667, test_acc:0.6592
epoch:51, train_acc:0.89, test_acc:0.6648
epoch:52, train_acc:0.88, test_acc:0.6605
epoch:53, train_acc:0.88, test_acc:0.6654
epoch:54, train_acc:0.8967, test_acc:0.6674
epoch:55, train_acc:0.8967, test_acc:0.6701
epoch:56, train_acc:0.9, test_acc:0.6636
epoch:57, train_acc:0.9, test_acc:0.6755
epoch:58, train_acc:0.9167, test_acc:0.6763
epoch:59, train_acc:0.9133, test_acc:0.6748
epoch:60, train_acc:0.92, test_acc:0.6788
epoch:61, train_acc:0.9033, test_acc:0.6759
epoch:62, train_acc:0.9133, test_acc:0.6747
epoch:63, train_acc:0.9233, test_acc:0.6915
epoch:64, train_acc:0.9267, test_acc:0.687
epoch:65, train_acc:0.92, test_acc:0.6822
epoch:66, train_acc:0.9133, test_acc:0.6827
epoch:67, train_acc:0.92, test_acc:0.6932
epoch:68, train_acc:0.9333, test_acc:0.6976
epoch:69, train_acc:0.94, test_acc:0.6953
epoch:70, train_acc:0.94, test_acc:0.7031
epoch:71, train_acc:0.9367, test_acc:0.6951
epoch:72, train_acc:0.9433, test_acc:0.7036
epoch:73, train_acc:0.9367, test_acc:0.7051
epoch:74, train_acc:0.9433, test_acc:0.706
epoch:75, train_acc:0.95, test_acc:0.707
epoch:76, train_acc:0.9567, test_acc:0.7052
epoch:77, train_acc:0.9433, test_acc:0.6991
epoch:78, train_acc:0.9567, test_acc:0.7121
epoch:79, train_acc:0.9633, test_acc:0.7055
epoch:80, train_acc:0.96, test_acc:0.7088
epoch:81, train_acc:0.9567, test_acc:0.7105
epoch:82, train_acc:0.9633, test_acc:0.7091
epoch:83, train_acc:0.9567, test_acc:0.7159
epoch:84, train_acc:0.9567, test_acc:0.7072
epoch:85, train_acc:0.9633, test_acc:0.7138
epoch:86, train_acc:0.9767, test_acc:0.7127
epoch:87, train_acc:0.9733, test_acc:0.7167
epoch:88, train_acc:0.9733, test_acc:0.7241
epoch:89, train_acc:0.98, test_acc:0.721
epoch:90, train_acc:0.9767, test_acc:0.7202
epoch:91, train_acc:0.9767, test_acc:0.7232
epoch:92, train_acc:0.9833, test_acc:0.717
epoch:93, train_acc:0.9867, test_acc:0.7215
epoch:94, train_acc:0.9867, test_acc:0.7299
epoch:95, train_acc:0.9833, test_acc:0.728
epoch:96, train_acc:0.99, test_acc:0.7223
epoch:97, train_acc:0.9867, test_acc:0.7205
epoch:98, train_acc:0.99, test_acc:0.7287
epoch:99, train_acc:0.9967, test_acc:0.7298
epoch:100, train_acc:0.99, test_acc:0.7288
epoch:101, train_acc:1.0, test_acc:0.7258
epoch:102, train_acc:0.9967, test_acc:0.7274
epoch:103, train_acc:0.9967, test_acc:0.7238
epoch:104, train_acc:1.0, test_acc:0.7275
epoch:105, train_acc:0.9967, test_acc:0.7275
epoch:106, train_acc:1.0, test_acc:0.7209
epoch:107, train_acc:1.0, test_acc:0.7306
epoch:108, train_acc:0.9933, test_acc:0.7267
epoch:109, train_acc:0.9967, test_acc:0.7278
epoch:110, train_acc:1.0, test_acc:0.7306
epoch:111, train_acc:1.0, test_acc:0.7279
epoch:112, train_acc:0.9967, test_acc:0.7326
epoch:113, train_acc:0.9967, test_acc:0.7274
epoch:114, train_acc:0.9967, test_acc:0.7279
epoch:115, train_acc:1.0, test_acc:0.7301
epoch:116, train_acc:1.0, test_acc:0.7296
epoch:117, train_acc:1.0, test_acc:0.7327
epoch:118, train_acc:1.0, test_acc:0.7248
epoch:119, train_acc:1.0, test_acc:0.733
epoch:120, train_acc:1.0, test_acc:0.7286
epoch:121, train_acc:1.0, test_acc:0.7302
epoch:122, train_acc:1.0, test_acc:0.7346
epoch:123, train_acc:1.0, test_acc:0.7309
epoch:124, train_acc:1.0, test_acc:0.7309
epoch:125, train_acc:1.0, test_acc:0.7327
epoch:126, train_acc:1.0, test_acc:0.7353
epoch:127, train_acc:1.0, test_acc:0.7316
epoch:128, train_acc:1.0, test_acc:0.7296
epoch:129, train_acc:1.0, test_acc:0.731
epoch:130, train_acc:1.0, test_acc:0.733
epoch:131, train_acc:1.0, test_acc:0.7331
epoch:132, train_acc:1.0, test_acc:0.732
epoch:133, train_acc:1.0, test_acc:0.7333
epoch:134, train_acc:1.0, test_acc:0.7288
epoch:135, train_acc:1.0, test_acc:0.7347
epoch:136, train_acc:1.0, test_acc:0.7349
epoch:137, train_acc:1.0, test_acc:0.7356
epoch:138, train_acc:1.0, test_acc:0.7308
epoch:139, train_acc:1.0, test_acc:0.7359
epoch:140, train_acc:1.0, test_acc:0.7337
epoch:141, train_acc:1.0, test_acc:0.7355
epoch:142, train_acc:1.0, test_acc:0.7349
epoch:143, train_acc:1.0, test_acc:0.7327
epoch:144, train_acc:1.0, test_acc:0.7344
epoch:145, train_acc:1.0, test_acc:0.7367
epoch:146, train_acc:1.0, test_acc:0.7372
epoch:147, train_acc:1.0, test_acc:0.7353
epoch:148, train_acc:1.0, test_acc:0.7373
epoch:149, train_acc:1.0, test_acc:0.7362
epoch:150, train_acc:1.0, test_acc:0.7366
epoch:151, train_acc:1.0, test_acc:0.7376
epoch:152, train_acc:1.0, test_acc:0.7357
epoch:153, train_acc:1.0, test_acc:0.7341
epoch:154, train_acc:1.0, test_acc:0.7338
epoch:155, train_acc:1.0, test_acc:0.7351
epoch:156, train_acc:1.0, test_acc:0.7339
epoch:157, train_acc:1.0, test_acc:0.7383
epoch:158, train_acc:1.0, test_acc:0.7366
epoch:159, train_acc:1.0, test_acc:0.7376
epoch:160, train_acc:1.0, test_acc:0.7383
epoch:161, train_acc:1.0, test_acc:0.7404
epoch:162, train_acc:1.0, test_acc:0.7373
epoch:163, train_acc:1.0, test_acc:0.7357
epoch:164, train_acc:1.0, test_acc:0.7359
epoch:165, train_acc:1.0, test_acc:0.7392
epoch:166, train_acc:1.0, test_acc:0.7384
epoch:167, train_acc:1.0, test_acc:0.7381
epoch:168, train_acc:1.0, test_acc:0.734
epoch:169, train_acc:1.0, test_acc:0.7352
epoch:170, train_acc:1.0, test_acc:0.7356
epoch:171, train_acc:1.0, test_acc:0.7381
epoch:172, train_acc:1.0, test_acc:0.7384
epoch:173, train_acc:1.0, test_acc:0.7398
epoch:174, train_acc:1.0, test_acc:0.7395
epoch:175, train_acc:1.0, test_acc:0.7413
epoch:176, train_acc:1.0, test_acc:0.7387
epoch:177, train_acc:1.0, test_acc:0.7402
epoch:178, train_acc:1.0, test_acc:0.7378
epoch:179, train_acc:1.0, test_acc:0.7389
epoch:180, train_acc:1.0, test_acc:0.7396
epoch:181, train_acc:1.0, test_acc:0.7375
epoch:182, train_acc:1.0, test_acc:0.7403
epoch:183, train_acc:1.0, test_acc:0.7392
epoch:184, train_acc:1.0, test_acc:0.7382
epoch:185, train_acc:1.0, test_acc:0.7389
epoch:186, train_acc:1.0, test_acc:0.7385
epoch:187, train_acc:1.0, test_acc:0.7385
epoch:188, train_acc:1.0, test_acc:0.7401
epoch:189, train_acc:1.0, test_acc:0.7382
epoch:190, train_acc:1.0, test_acc:0.7401
epoch:191, train_acc:1.0, test_acc:0.7404
epoch:192, train_acc:1.0, test_acc:0.739
epoch:193, train_acc:1.0, test_acc:0.7398
epoch:194, train_acc:1.0, test_acc:0.7411
epoch:195, train_acc:1.0, test_acc:0.7401
epoch:196, train_acc:1.0, test_acc:0.7394
epoch:197, train_acc:1.0, test_acc:0.7415
epoch:198, train_acc:1.0, test_acc:0.7408
epoch:199, train_acc:1.0, test_acc:0.74
epoch:200, train_acc:1.0, test_acc:0.7395
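The first run shows the classic overfitting signature: train_acc locks at 1.0 from roughly epoch 115 onward while test_acc plateaus around 0.73. A small helper, assumed here purely for illustration, turns one of the log lines above into numbers so the train/test gap can be read off directly:

```python
def parse_log_line(line):
    # "epoch:200, train_acc:1.0, test_acc:0.7395" -> (200, 1.0, 0.7395)
    fields = dict(part.split(':') for part in line.split(', '))
    return int(fields['epoch']), float(fields['train_acc']), float(fields['test_acc'])

epoch, train_acc, test_acc = parse_log_line("epoch:200, train_acc:1.0, test_acc:0.7395")
print(epoch, round(train_acc - test_acc, 4))  # 200 0.2605
```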

2. MultiLayerNet [6*100+ReLU+SGD, weight_decay]: DIY overfitting data set based on MNIST, train_acc vs. test_acc

epoch:0, train_acc:0.1067, test_acc:0.1358
epoch:1, train_acc:0.1233, test_acc:0.1302
epoch:2, train_acc:0.1433, test_acc:0.1246
epoch:3, train_acc:0.16, test_acc:0.1227
epoch:4, train_acc:0.17, test_acc:0.1264
epoch:5, train_acc:0.1933, test_acc:0.1408
epoch:6, train_acc:0.2133, test_acc:0.147
epoch:7, train_acc:0.2433, test_acc:0.1677
epoch:8, train_acc:0.2867, test_acc:0.1847
epoch:9, train_acc:0.3333, test_acc:0.2162
epoch:10, train_acc:0.3667, test_acc:0.2291
epoch:11, train_acc:0.4067, test_acc:0.2616
epoch:12, train_acc:0.4367, test_acc:0.2827
epoch:13, train_acc:0.44, test_acc:0.3016
epoch:14, train_acc:0.4733, test_acc:0.32
epoch:15, train_acc:0.4733, test_acc:0.3389
epoch:16, train_acc:0.52, test_acc:0.3503
epoch:17, train_acc:0.5567, test_acc:0.363
epoch:18, train_acc:0.5367, test_acc:0.3748
epoch:19, train_acc:0.5933, test_acc:0.3959
epoch:20, train_acc:0.6033, test_acc:0.4055
epoch:21, train_acc:0.6133, test_acc:0.4226
epoch:22, train_acc:0.62, test_acc:0.4284
epoch:23, train_acc:0.64, test_acc:0.4561
epoch:24, train_acc:0.63, test_acc:0.4728
epoch:25, train_acc:0.6533, test_acc:0.4675
epoch:26, train_acc:0.6433, test_acc:0.4762
epoch:27, train_acc:0.6767, test_acc:0.4792
epoch:28, train_acc:0.6967, test_acc:0.4964
epoch:29, train_acc:0.6833, test_acc:0.489
epoch:30, train_acc:0.71, test_acc:0.504
epoch:31, train_acc:0.7067, test_acc:0.5129
epoch:32, train_acc:0.7167, test_acc:0.5162
epoch:33, train_acc:0.7167, test_acc:0.5191
epoch:34, train_acc:0.72, test_acc:0.5264
epoch:35, train_acc:0.7067, test_acc:0.5359
epoch:36, train_acc:0.6933, test_acc:0.5414
epoch:37, train_acc:0.6933, test_acc:0.5541
epoch:38, train_acc:0.7233, test_acc:0.5613
epoch:39, train_acc:0.7233, test_acc:0.5569
epoch:40, train_acc:0.7267, test_acc:0.5688
epoch:41, train_acc:0.7467, test_acc:0.5641
epoch:42, train_acc:0.75, test_acc:0.5778
epoch:43, train_acc:0.7667, test_acc:0.571
epoch:44, train_acc:0.7767, test_acc:0.5788
epoch:45, train_acc:0.7567, test_acc:0.5662
epoch:46, train_acc:0.7833, test_acc:0.5926
epoch:47, train_acc:0.7933, test_acc:0.6
epoch:48, train_acc:0.8, test_acc:0.6023
epoch:49, train_acc:0.7933, test_acc:0.6004
epoch:50, train_acc:0.8033, test_acc:0.6044
epoch:51, train_acc:0.7767, test_acc:0.596
epoch:52, train_acc:0.8, test_acc:0.5882
epoch:53, train_acc:0.82, test_acc:0.6071
epoch:54, train_acc:0.82, test_acc:0.6031
epoch:55, train_acc:0.8133, test_acc:0.6175
epoch:56, train_acc:0.8067, test_acc:0.6142
epoch:57, train_acc:0.81, test_acc:0.6024
epoch:58, train_acc:0.8333, test_acc:0.6019
epoch:59, train_acc:0.82, test_acc:0.6278
epoch:60, train_acc:0.8167, test_acc:0.6345
epoch:61, train_acc:0.8333, test_acc:0.6403
epoch:62, train_acc:0.8267, test_acc:0.6305
epoch:63, train_acc:0.8167, test_acc:0.6305
epoch:64, train_acc:0.8067, test_acc:0.64
epoch:65, train_acc:0.8167, test_acc:0.6473
epoch:66, train_acc:0.8433, test_acc:0.6434
epoch:67, train_acc:0.84, test_acc:0.6408
epoch:68, train_acc:0.8333, test_acc:0.6393
epoch:69, train_acc:0.8333, test_acc:0.6478
epoch:70, train_acc:0.8433, test_acc:0.6405
epoch:71, train_acc:0.84, test_acc:0.6541
epoch:72, train_acc:0.8433, test_acc:0.6414
epoch:73, train_acc:0.85, test_acc:0.6289
epoch:74, train_acc:0.8433, test_acc:0.6467
epoch:75, train_acc:0.85, test_acc:0.6302
epoch:76, train_acc:0.8433, test_acc:0.6449
epoch:77, train_acc:0.8433, test_acc:0.6508
epoch:78, train_acc:0.8567, test_acc:0.6455
epoch:79, train_acc:0.8433, test_acc:0.6471
epoch:80, train_acc:0.8667, test_acc:0.6641
epoch:81, train_acc:0.8633, test_acc:0.6516
epoch:82, train_acc:0.8567, test_acc:0.6501
epoch:83, train_acc:0.85, test_acc:0.6602
epoch:84, train_acc:0.8533, test_acc:0.6608
epoch:85, train_acc:0.85, test_acc:0.6535
epoch:86, train_acc:0.8533, test_acc:0.6701
epoch:87, train_acc:0.8433, test_acc:0.6629
epoch:88, train_acc:0.8633, test_acc:0.6659
epoch:89, train_acc:0.88, test_acc:0.6699
epoch:90, train_acc:0.86, test_acc:0.6624
epoch:91, train_acc:0.86, test_acc:0.6618
epoch:92, train_acc:0.85, test_acc:0.6644
epoch:93, train_acc:0.8567, test_acc:0.6593
epoch:94, train_acc:0.8633, test_acc:0.6718
epoch:95, train_acc:0.8667, test_acc:0.6734
epoch:96, train_acc:0.87, test_acc:0.6708
epoch:97, train_acc:0.87, test_acc:0.6643
epoch:98, train_acc:0.86, test_acc:0.6616
epoch:99, train_acc:0.8567, test_acc:0.6675
epoch:100, train_acc:0.8533, test_acc:0.6665
epoch:101, train_acc:0.8733, test_acc:0.6718
epoch:102, train_acc:0.8733, test_acc:0.6682
epoch:103, train_acc:0.8633, test_acc:0.6683
epoch:104, train_acc:0.8733, test_acc:0.6705
epoch:105, train_acc:0.8733, test_acc:0.6764
epoch:106, train_acc:0.8733, test_acc:0.6822
epoch:107, train_acc:0.8767, test_acc:0.674
epoch:108, train_acc:0.8633, test_acc:0.6744
epoch:109, train_acc:0.8767, test_acc:0.6698
epoch:110, train_acc:0.86, test_acc:0.6671
epoch:111, train_acc:0.8633, test_acc:0.6772
epoch:112, train_acc:0.8733, test_acc:0.6798
epoch:113, train_acc:0.87, test_acc:0.6808
epoch:114, train_acc:0.8767, test_acc:0.6723
epoch:115, train_acc:0.8933, test_acc:0.6751
epoch:116, train_acc:0.8733, test_acc:0.6779
epoch:117, train_acc:0.87, test_acc:0.6751
epoch:118, train_acc:0.8667, test_acc:0.6751
epoch:119, train_acc:0.8833, test_acc:0.6725
epoch:120, train_acc:0.8733, test_acc:0.6761
epoch:121, train_acc:0.88, test_acc:0.6768
epoch:122, train_acc:0.87, test_acc:0.6777
epoch:123, train_acc:0.8733, test_acc:0.6815
epoch:124, train_acc:0.87, test_acc:0.6835
epoch:125, train_acc:0.8833, test_acc:0.6808
epoch:126, train_acc:0.8633, test_acc:0.6693
epoch:127, train_acc:0.8733, test_acc:0.671
epoch:128, train_acc:0.87, test_acc:0.6723
epoch:129, train_acc:0.8767, test_acc:0.6769
epoch:130, train_acc:0.8767, test_acc:0.6815
epoch:131, train_acc:0.8933, test_acc:0.6834
epoch:132, train_acc:0.89, test_acc:0.6831
epoch:133, train_acc:0.8833, test_acc:0.6838
epoch:134, train_acc:0.87, test_acc:0.6846
epoch:135, train_acc:0.8967, test_acc:0.6797
epoch:136, train_acc:0.8833, test_acc:0.6835
epoch:137, train_acc:0.9033, test_acc:0.6854
epoch:138, train_acc:0.89, test_acc:0.6842
epoch:139, train_acc:0.8833, test_acc:0.6844
epoch:140, train_acc:0.8833, test_acc:0.6848
epoch:141, train_acc:0.88, test_acc:0.6799
epoch:142, train_acc:0.8867, test_acc:0.6839
epoch:143, train_acc:0.88, test_acc:0.6771
epoch:144, train_acc:0.88, test_acc:0.6788
epoch:145, train_acc:0.8867, test_acc:0.6898
epoch:146, train_acc:0.89, test_acc:0.6788
epoch:147, train_acc:0.8867, test_acc:0.685
epoch:148, train_acc:0.8833, test_acc:0.6782
epoch:149, train_acc:0.87, test_acc:0.6819
epoch:150, train_acc:0.89, test_acc:0.6852
epoch:151, train_acc:0.8933, test_acc:0.687
epoch:152, train_acc:0.8867, test_acc:0.6759
epoch:153, train_acc:0.8833, test_acc:0.6887
epoch:154, train_acc:0.8867, test_acc:0.6894
epoch:155, train_acc:0.89, test_acc:0.6785
epoch:156, train_acc:0.8833, test_acc:0.6815
epoch:157, train_acc:0.8833, test_acc:0.6843
epoch:158, train_acc:0.8867, test_acc:0.6871
epoch:159, train_acc:0.8867, test_acc:0.6851
epoch:160, train_acc:0.89, test_acc:0.6847
epoch:161, train_acc:0.89, test_acc:0.6752
epoch:162, train_acc:0.8867, test_acc:0.6885
epoch:163, train_acc:0.8933, test_acc:0.6884
epoch:164, train_acc:0.8767, test_acc:0.6875
epoch:165, train_acc:0.89, test_acc:0.6852
epoch:166, train_acc:0.88, test_acc:0.6882
epoch:167, train_acc:0.8867, test_acc:0.6886
epoch:168, train_acc:0.9033, test_acc:0.6815
epoch:169, train_acc:0.8867, test_acc:0.6813
epoch:170, train_acc:0.8733, test_acc:0.685
epoch:171, train_acc:0.88, test_acc:0.6916
epoch:172, train_acc:0.89, test_acc:0.6838
epoch:173, train_acc:0.8933, test_acc:0.674
epoch:174, train_acc:0.8867, test_acc:0.6918
epoch:175, train_acc:0.8967, test_acc:0.6863
epoch:176, train_acc:0.89, test_acc:0.6937
epoch:177, train_acc:0.8867, test_acc:0.6904
epoch:178, train_acc:0.8967, test_acc:0.6831
epoch:179, train_acc:0.8933, test_acc:0.6911
epoch:180, train_acc:0.8967, test_acc:0.6898
epoch:181, train_acc:0.8933, test_acc:0.684
epoch:182, train_acc:0.8933, test_acc:0.6833
epoch:183, train_acc:0.89, test_acc:0.6876
epoch:184, train_acc:0.8767, test_acc:0.6899
epoch:185, train_acc:0.8933, test_acc:0.6911
epoch:186, train_acc:0.8867, test_acc:0.6798
epoch:187, train_acc:0.89, test_acc:0.6849
epoch:188, train_acc:0.8933, test_acc:0.6907
epoch:189, train_acc:0.8933, test_acc:0.6935
epoch:190, train_acc:0.89, test_acc:0.6898
epoch:191, train_acc:0.89, test_acc:0.689
epoch:192, train_acc:0.8867, test_acc:0.688
epoch:193, train_acc:0.8933, test_acc:0.6847
epoch:194, train_acc:0.8933, test_acc:0.6858
epoch:195, train_acc:0.88, test_acc:0.6904
epoch:196, train_acc:0.8867, test_acc:0.6807
epoch:197, train_acc:0.8933, test_acc:0.677
epoch:198, train_acc:0.8867, test_acc:0.6831
epoch:199, train_acc:0.8933, test_acc:0.6905
epoch:200, train_acc:0.8867, test_acc:0.6953
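Comparing the two runs at epoch 200 (numbers copied from the logs above): without weight decay the train/test gap is about 0.26, while with weight_decay_lambda = 0.1 it shrinks to about 0.19 and train_acc stays below 1.0, i.e. the penalty does suppress memorization of the training set, although in this run the absolute test accuracy is also somewhat lower:

```python
# Final-epoch accuracies taken from the two logs above.
no_decay = {'train_acc': 1.0, 'test_acc': 0.7395}       # run 1, no weight decay
with_decay = {'train_acc': 0.8867, 'test_acc': 0.6953}  # run 2, lambda = 0.1

gap_no_decay = round(no_decay['train_acc'] - no_decay['test_acc'], 4)
gap_with_decay = round(with_decay['train_acc'] - with_decay['test_acc'], 4)
print(gap_no_decay, gap_with_decay)  # 0.2605 0.1914
```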

 

Related articles
On CSDN since 2019.04.09

 

 
