Chinese Journal Classification Results Log

Model name: Intel(R) Xeon(R) CPU E5620 @ 2.40GHz
CPU cores: 4
Mem: 12005 MB

Baseline: naive Bayes

Level-1 classification

Title|Keywords|Abstract
0|0|1
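The `Title|Keywords|Abstract` header and the digit triple under it record how each text field was used in a run; `0|0|1` here means abstract only. Since later runs use values above 1 (`1|2|0`, `2|1|0`), the digits appear to be per-field weights rather than booleans. A minimal sketch of one plausible interpretation (field names are assumptions, and weighting by repeating the field's text is a guess, not confirmed by the original script):

```python
def build_text(doc, w_title, w_keywords, w_abstract):
    """Concatenate fields with integer weights; weight 0 omits the field,
    weight 2 repeats the field's text twice (assumed weighting scheme)."""
    parts = []
    parts += [doc["title"]] * w_title
    parts += [doc["keywords"]] * w_keywords
    parts += [doc["abstract"]] * w_abstract
    return " ".join(parts)

doc = {"title": "t", "keywords": "k", "abstract": "a"}
print(build_text(doc, 0, 0, 1))  # "a": abstract only, as in this run
print(build_text(doc, 1, 2, 0))  # "t k k": keywords double-weighted, as in run 1|2|0
```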

[ 0.74959006  0.75502539  0.75502539  0.75396741  0.75111088]
Mean score: 0.753 (+/-0.001)
74.9633948803
[ 0.73197567  0.73180279  0.72979264  0.73111511  0.72921075]
Mean score: 0.731 (+/-0.001)
112.939093828
[ 0.75212907  0.75534278  0.75528989  0.75259204  0.754708  ]
Mean score: 0.754 (+/-0.001)
75.9662179947
[ 0.75942872  0.76703343  0.76470588  0.76327761  0.76412399]
Mean score: 0.764 (+/-0.001)
75.9926490784
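Each group of three lines above is one cross-validation run: the five fold scores, their mean with a spread estimate, and the wall-clock time in seconds. A minimal sketch of how such lines can be reproduced; the `+/-` term is assumed to be the standard error of the mean (it matches the first run's numbers), but the original script may use a different spread measure:

```python
import time
import numpy as np

def report_cv(fold_scores):
    """Print fold scores and mean +/- standard error in the log's format."""
    s = np.asarray(fold_scores)
    sem = s.std() / np.sqrt(len(s))  # standard error of the mean (assumed)
    print(s)
    print("Mean score: %.3f (+/-%.3f)" % (s.mean(), sem))

# First run of the abstract-only (0|0|1) experiment above:
start = time.time()
scores = [0.74959006, 0.75502539, 0.75502539, 0.75396741, 0.75111088]
report_cv(scores)
print(time.time() - start)  # elapsed seconds, as logged after each run
```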
Accuracy on training set:
0.880809555549
Accuracy on testing set:
0.775131917197
Classification Report:
             precision    recall  f1-score   support

          1       0.65      0.88      0.75       650
          2       0.59      0.65      0.62       772
          3       0.55      0.58      0.57       877
          4       0.64      0.71      0.67       824
          5       0.85      0.79      0.82      1399
          6       0.77      0.81      0.79      2219
          7       0.54      0.72      0.62       851
          8       0.82      0.91      0.86       813
          9       0.92      0.91      0.91      2515
         10       0.89      0.81      0.85      1507
         11       0.84      0.82      0.83      1203
         12       0.44      0.59      0.50       496
         13       0.48      0.76      0.59      1056
         14       0.78      0.74      0.76      1931
         15       0.71      0.75      0.73      2412
         16       0.95      0.79      0.86      4112
         17       0.92      0.80      0.86      4240
         18       0.89      0.69      0.78      4374
         19       0.75      0.90      0.82      1080
         20       0.57      0.84      0.68      1145
         21       0.71      0.71      0.71      2479

avg / total       0.80      0.78      0.78     36955

Confusion Matrix:
[[ 573   37    4   14    5    1    5    1    3    2    3    2    0    0
     0    0    0    0    0    0    0]
 [ 105  502   43    6    2    2   13   11   24   11   25   17    1    1
     2    5    0    0    1    1    0]
 [  11   38  510   52   19   66   56   18   19   15   16   32    1    5
     0    3    0    1    3    1   11]
 [  66    9   62  581   11   46   17    1    1    1   16    3    1    0
     1    1    0    0    0    0    7]
 [  18   14   24   30 1108    0    8    4    7    4   29   38    3    0
     2    6    1   20   11   68    4]
 [  25    9   69   82    1 1800   81    5    0    0    4   21    0   19
     0    0    7    1   12    1   82]
 [  30   37   17   17    3   48  614   12   14   22   14   11    1    0
     0    7    0    0    0    0    4]
 [   0    9    2    1    0    2   17  737   34    2    6    3    0    0
     0    0    0    0    0    0    0]
 [  15   57    6    0    0    0    8   55 2283   57   32    2    0    0
     0    0    0    0    0    0    0]
 [   5   27    5    5   11    2   57    6   70 1223   19    2    7    0
     1    2    0   49    2   14    0]
 [   4   15   23   21    2   15   42    6   13   21  990    8    0   23
     0    0    0    0    8    0   12]
 [  14   57    6    2    9   14   21    4    8    3    2  295   11    2
     5    0    0    5   16   10   12]
 [   0    0    3    0    1    1   15    6    0    0    0   60  802    3
    24    8    7   62   19   36    9]
 [   0    2   10    8    6   24   31    1    1    1   20   31   34 1429
     8    0   19   19   28  117  142]
 [   1    9    2    4    2    1   26   14    0    1    1   18  121   68
  1814   76  124   10    5   10  105]
 [   0   23  100   34   23   49   31    7    0    1    0   13  101    3
   398 3254   31   19    2    3   20]
 [   0    0    1    1    0   28    5    0    0    2    2   13   52  120
   253   40 3412   54   19   37  201]
 [   1    0    6    1   61   13   56    9    0    4    1   53  442   38
    28    9   22 3037  126  373   94]
 [   0    0    5    1    2    5    3    0    0    0    0    5    6    3
     0    0    0   26  976   29   19]
 [   0    0    0    2   31    1    6    3    0    1    0   16   28    9
     1    0    2   60   19  957    9]
 [  16    2   28   40   14  207   26    0    0    0    4   33   48  104
    25    5   90   31   47   11 1748]]
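The accuracy, classification report, and confusion matrix blocks match scikit-learn's standard metric output. A minimal sketch on a tiny synthetic 3-class problem, assuming a multinomial naive Bayes classifier (the log only says "naive Bayes"; the exact variant and features are not recorded):

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import classification_report, confusion_matrix

# Tiny synthetic count features for a 3-class toy problem (illustration only).
rng = np.random.RandomState(0)
X = rng.randint(0, 5, size=(60, 10))
y = np.repeat([1, 2, 3], 20)
X_train, X_test = X[::2], X[1::2]   # interleaved split keeps classes balanced
y_train, y_test = y[::2], y[1::2]

clf = MultinomialNB().fit(X_train, y_train)
print("Accuracy on training set:")
print(clf.score(X_train, y_train))
print("Accuracy on testing set:")
print(clf.score(X_test, y_test))
print("Classification Report:")
print(classification_report(y_test, clf.predict(X_test)))
print("Confusion Matrix:")
print(confusion_matrix(y_test, clf.predict(X_test)))
```

Row i of the confusion matrix is the true class, column j the predicted class, so each row sums to that class's support in the report.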

0|1|0

[ 0.74578154  0.74238256  0.74180068  0.74391663  0.73989632]
Mean score: 0.743 (+/-0.001)
10.5830168724
[ 0.72451732  0.72249259  0.72656581  0.7270419   0.72254549]
Mean score: 0.725 (+/-0.001)
21.9918198586
[ 0.74371859  0.73873254  0.74227677  0.74317605  0.73867964]
Mean score: 0.741 (+/-0.001)
10.532449007
[ 0.740968    0.73915573  0.73407744  0.73825645  0.73439484]
Mean score: 0.737 (+/-0.001)
10.6837120056
Accuracy on training set:
0.883063023032
Accuracy on testing set:
0.757061570258
Classification Report:
             precision    recall  f1-score   support

          1       0.62      0.82      0.71       752
          2       0.65      0.66      0.65      1183
          3       0.50      0.58      0.54       889
          4       0.56      0.65      0.61       819
          5       0.79      0.76      0.77      1446
          6       0.77      0.76      0.76      2217
          7       0.57      0.73      0.64       882
          8       0.81      0.89      0.85       868
          9       0.92      0.85      0.88      2966
         10       0.86      0.80      0.83      1780
         11       0.82      0.82      0.82      1531
         12       0.44      0.59      0.50       598
         13       0.47      0.73      0.57      1078
         14       0.74      0.75      0.75      1924
         15       0.71      0.71      0.71      2566
         16       0.93      0.79      0.85      4329
         17       0.90      0.79      0.84      4248
         18       0.87      0.70      0.77      4357
         19       0.71      0.86      0.78      1077
         20       0.59      0.80      0.68      1215
         21       0.69      0.71      0.70      2466

avg / total       0.78      0.76      0.76     39191

Confusion Matrix:
[[ 620   57    9   23    5    8    2    3    9    5    3    7    0    0
     0    0    0    0    0    0    1]
 [ 152  776   54   18    6    5   17   14   36   15   36   40    0    0
     2    9    0    0    1    0    2]
 [  13   39  513   62   20   70   48    6   16   16   23   27    1    2
     1    5    1    1    7    2   16]
 [  58   12   70  536   16   49   27    3    2    3   21    4    0    1
     1    7    0    0    2    0    7]
 [  28    7   21   32 1092    5   12    7    4    7   31   38   13    4
     1    4    1   35   21   79    4]
 [  22    8  110  112   10 1679   79    6    0    2    5   26    2   22
     0    1    5    2   10    4  112]
 [  15   23   21   25    9   32  642   13   12   30   21   24    2    1
     1    5    0    0    1    2    3]
 [   1    8    3    1    3    2   18  773   34    4   10    8    0    0
     0    0    0    1    1    1    0]
 [  27  103   24    6    9    1   27   87 2515   93   53   11    0    1
     2    2    2    0    1    2    0]
 [  10   33   14    8   15    2   53   11   85 1423   22    8   10    1
     0    3    1   59    1   20    1]
 [   6   27   36   26    6   15   43    8   15   26 1249    8    3   30
     1    0    2    1    3    0   26]
 [  21   43    8    7   14   12   25    2    3    3    5  353   26   10
     3    5    3    8   17   16   14]
 [   0    0    7    0    6    5    7    3    0    1    0   43  783    9
    25   15    8   76   36   43   11]
 [   3    3    5    6    6   24   12    0    1    4   22   23   33 1436
    14    0   35   29   40  104  124]
 [   4   16    1    4    2    6   12    7    0    3    2   22  112   89
  1822  122  161   34    8   13  126]
 [   0   42   90   30   32   43   42    1    3    3    1   19  112    3
   371 3421   52   23    3    7   31]
 [   0    0    2    2    2   39    5    1    0    0    1    9   67  117
   274   64 3340   65   20   26  214]
 [   1    2   13    2   72   16   34    9    0   13    2   69  404   52
    35   12   24 3044  141  331   81]
 [   1    0    3    2    5    8    3    0    0    0    1   13   17   14
     1    0    0   30  929   33   17]
 [   0    0    2    2   28    2    6    0    0    1    0   17   38   25
     2    1    1   69   32  977   12]
 [  17    1   20   46   17  169   21    2    1    1    8   35   53  114
    25    7   92   38   43    9 1747]]

1|0|0

[ 0.74012166  0.74238256  0.74714346  0.74513331  0.74560939]
Mean score: 0.744 (+/-0.001)
9.13238406181
[ 0.71827559  0.72132882  0.72587812  0.72148752  0.72185781]
Mean score: 0.722 (+/-0.001)
19.7019441128
[ 0.74419466  0.74640288  0.74920652  0.74761955  0.74777825]
Mean score: 0.747 (+/-0.001)
9.13390898705
[ 0.73837609  0.73725138  0.73931443  0.73894414  0.74026661]
Mean score: 0.739 (+/-0.001)
9.22805905342
Accuracy on training set:
0.886723585235
Accuracy on testing set:
0.756259567733
Classification Report:
             precision    recall  f1-score   support

          1       0.61      0.88      0.72       728
          2       0.67      0.64      0.65      1175
          3       0.46      0.58      0.51       790
          4       0.59      0.65      0.62       806
          5       0.76      0.74      0.75      1322
          6       0.76      0.77      0.77      2247
          7       0.53      0.68      0.59       863
          8       0.79      0.87      0.83       872
          9       0.90      0.86      0.88      2872
         10       0.85      0.79      0.82      1708
         11       0.82      0.81      0.82      1506
         12       0.34      0.56      0.42       425
         13       0.47      0.71      0.57      1075
         14       0.75      0.73      0.74      1924
         15       0.70      0.73      0.71      2580
         16       0.93      0.77      0.85      4309
         17       0.90      0.81      0.85      4234
         18       0.87      0.70      0.78      4355
         19       0.74      0.86      0.80      1071
         20       0.60      0.80      0.69      1193
         21       0.69      0.68      0.68      2486

avg / total       0.78      0.76      0.76     38541

Confusion Matrix:
[[ 637   42    3   13    8    1    6    0    7    1    0   10    0    0
     0    0    0    0    0    0    0]
 [ 145  747   62   13    9    5   19   17   35   17   37   43    3    0
     4   14    0    1    1    3    0]
 [  16   26  456   63   15   56   41    7   21   11   20   30    2    3
     0    4    1    2    3    3   10]
 [  54   10   68  522   21   55   31    3    4    4   17    6    0    0
     0    2    0    0    1    0    8]
 [  40   12   35   31  980    7   17    5   12   10   31   45    7    3
     1    6    0   18   10   45    7]
 [  20    9   94   92    9 1738   97    4    0    1   11   22    3   23
     1    2    7    6    7    2   99]
 [  22   37   27   16    7   40  583   18   12   30   26   25    0    2
     0   14    0    1    0    2    1]
 [   9    9    8    3    2    3   18  759   44    6    4    5    1    0
     0    0    0    0    0    1    0]
 [  36   71   10    5    3    2   23   89 2484   81   53    4    3    1
     1    1    0    1    3    1    0]
 [  11   30   18    3   15    7   56   12   88 1349   27   10   10    2
     3    3    0   50    2   11    1]
 [  10   24   27   21    9   18   43    7   26   26 1227    7    3   32
     3    0    2    1    5    0   15]
 [  13   47    9    2    7   14   17    9    3    2    1  237   16    8
     1    1    0    8    5   12   13]
 [   0    3    2    0    7    1   13    2    1    4    1   67  760    5
    31   19    9   76   26   39    9]
 [   2    2   13    6    6   27   16    2    3    5   20   22   27 1408
    23    1   30   32   35  104  140]
 [   1   10    3    3    3    2   13   13    1    2    3   19  110   75
  1885  106  165   26    8    8  124]
 [   0   28   80   44   34   43   39    5    3    4    1   15  119    3
   429 3336   43   31    6   11   35]
 [   0    0    3    4    1   25    6    0    0    0    4   10   68   99
   250   49 3431   53   15   31  185]
 [   2    2   21    3   86   14   39    6    0   34    3   43  384   51
    46   14   29 3040  146  306   86]
 [   1    0    2    1   10    4    3    1    0    0    1   13   15   12
     0    0    1   26  925   39   17]
 [   1    0    6    0   41    4    1    2    2    1    0   24   38   14
     1    3    1   74   18  956    6]
 [  16    1   36   42   10  222   21    0    3    0   13   38   44  134
    25    8  108   33   36    9 1687]]

1|1|0

[ 0.77651415  0.77729581  0.77094795  0.7779835   0.77290521]
Mean score: 0.775 (+/-0.001)
10.7098929882
[ 0.76064533  0.76052687  0.75597757  0.76211384  0.75724714]
Mean score: 0.759 (+/-0.001)
20.93759799
[ 0.77667284  0.77734871  0.77195303  0.7776661   0.77533855]
Mean score: 0.776 (+/-0.001)
10.896999836
[ 0.7704311   0.77174143  0.7687262   0.77264071  0.7732755 ]
Mean score: 0.771 (+/-0.001)
11.051887989
Accuracy on training set:
0.897483098994
Accuracy on testing set:
0.784648621172
Classification Report:
             precision    recall  f1-score   support

          1       0.66      0.89      0.76       729
          2       0.71      0.70      0.70      1187
          3       0.54      0.65      0.59       894
          4       0.64      0.68      0.66       815
          5       0.83      0.82      0.82      1456
          6       0.77      0.79      0.78      2257
          7       0.60      0.73      0.66       893
          8       0.84      0.91      0.87       895
          9       0.93      0.90      0.91      2983
         10       0.89      0.82      0.86      1781
         11       0.85      0.83      0.84      1528
         12       0.50      0.59      0.54       602
         13       0.49      0.74      0.59      1073
         14       0.77      0.76      0.76      1940
         15       0.74      0.75      0.75      2602
         16       0.94      0.80      0.86      4351
         17       0.90      0.82      0.86      4227
         18       0.88      0.72      0.79      4359
         19       0.75      0.91      0.82      1063
         20       0.64      0.84      0.73      1250
         21       0.69      0.70      0.70      2460

avg / total       0.80      0.78      0.79     39345

Confusion Matrix:
[[ 650   30    3   15    9    4    0    0    7    4    3    3    0    0
     0    0    0    0    0    0    1]
 [ 143  827   52    9    4    3   16   13   26   18   29   33    0    0
     2   11    0    0    0    0    1]
 [  13   32  583   45   15   58   32    5   11   12   23   32    2    5
     0    6    2    0    4    0   14]
 [  50   12   75  558   11   47   25    1    6    0   15    4    2    1
     2    0    0    1    1    0    4]
 [  26    5   17   24 1195    5    6    4    6    3   17   23    8    3
     1    6    0   27    9   67    4]
 [  19   14  102   93    8 1779   75    1    0    0    5   19    1   20
     1    1    4    0   10    2  103]
 [  27   20   25   15    5   40  654   15   14   29   20   15    1    1
     0    9    0    0    0    1    2]
 [   1    8    4    3    1    0   17  814   31    2   10    2    0    0
     1    0    0    0    0    0    1]
 [  14   65   15    0    6    2   17   75 2672   70   38    7    0    1
     0    0    0    0    1    0    0]
 [   1   33    7    4   20    8   45    5   68 1466   23    4   11    1
     1    2    0   60    0   21    1]
 [   3   26   36   24    8   31   32   12   14   17 1264    4    2   30
     3    0    1    0    5    0   16]
 [  19   52   14    4   17   11   23    1    6    6    5  358   17    2
     3    3    1    5   25   11   19]
 [   0    1    3    0    5    3   13    3    0    0    0   55  793    9
    19   13    8   68   39   32    9]
 [   0    2    8    5    8   35   21    4    0    1   18   26   27 1467
    10    1   27   19   30   87  144]
 [   1    5    2    5    1    4   13    6    0    0    0   20   93   68
  1953  100  164   21    2    9  135]
 [   0   32   95   28   31   51   28    1    1    2    2   14  103    3
   358 3479   57   22    4    9   31]
 [   0    0    0    0    1   26    3    1    0    0    1    2   58  104
   220   58 3467   59   13   16  198]
 [   0    2   16    3   54    7   36    4    0   15    1   54  396   43
    34   17   30 3132  121  317   77]
 [   0    0    1    1    2    3    2    0    0    0    2    3    8   10
     2    1    0   26  972   19   11]
 [   0    0    3    0   28    2    2    2    0    1    0   11   35   19
     1    1    1   63   19 1055    7]
 [  13    2   22   41   12  203   21    3    2    0    6   26   48  113
    26    5   87   40   42   14 1734]]

1|1|1

[ 0.77598519  0.77158273  0.7779835   0.77084215  0.77830089]
Mean score: 0.775 (+/-0.002)
49.5327510834
[ 0.75868818  0.75502539  0.75856961  0.75343843  0.76243123]
Mean score: 0.758 (+/-0.002)
68.0406198502
[ 0.77937054  0.77147694  0.78189801  0.7730639   0.78036394]
Mean score: 0.777 (+/-0.002)
50.9235990047
[ 0.78709336  0.78115743  0.78935675  0.77972916  0.78935675]
Mean score: 0.785 (+/-0.002)
51.271859169
Accuracy on training set:
0.887961405402
Accuracy on testing set:
0.791565656566
Classification Report:
             precision    recall  f1-score   support

          1       0.65      0.92      0.76       747
          2       0.70      0.68      0.69      1217
          3       0.54      0.58      0.56       902
          4       0.65      0.70      0.67       837
          5       0.87      0.82      0.84      1459
          6       0.80      0.81      0.81      2256
          7       0.57      0.74      0.64       892
          8       0.84      0.93      0.88       897
          9       0.94      0.92      0.93      2998
         10       0.91      0.87      0.89      1789
         11       0.87      0.82      0.84      1539
         12       0.50      0.64      0.56       607
         13       0.49      0.75      0.59      1090
         14       0.78      0.74      0.76      1943
         15       0.74      0.77      0.75      2608
         16       0.95      0.80      0.87      4335
         17       0.92      0.82      0.87      4254
         18       0.90      0.71      0.79      4386
         19       0.79      0.93      0.85      1089
         20       0.60      0.84      0.70      1248
         21       0.70      0.73      0.71      2507

avg / total       0.81      0.79      0.80     39600

Confusion Matrix:
[[ 684   37    2    5    0    0    3    1    6    1    6    2    0    0
     0    0    0    0    0    0    0]
 [ 166  827   60   14    3    6   14    8   24   16   44   25    1    0
     1    7    0    0    1    0    0]
 [  17   39  523   57   18   60   54   17   19   19   13   35    1    6
     1    3    0    2    2    0   16]
 [  59    8   64  587    6   46   27    2    2    0   18    3    1    0
     0    5    0    0    2    0    7]
 [  27   13   14   19 1192    6    9    4    8    4   27   22    5    0
     0    9    0   11   17   69    3]
 [  17    8   76   75    1 1832   74    4    1    2    3   19    0   20
     0    2    3    0   11    2  106]
 [  27   35   23   25    0   32  656   18   15   20   16   11    0    1
     0   10    0    0    1    0    2]
 [   1    6    2    1    0    0   12  831   36    2    4    2    0    0
     0    0    0    0    0    0    0]
 [  12   59    9    0    1    0    9   71 2757   55   22    2    0    0
     0    0    0    0    1    0    0]
 [   6    9    2    3    7    5   44    4   48 1558   12    3    4    1
     0    3    0   59    0   21    0]
 [   1   26   39   22    1   27   55    1   11   16 1268   10    4   30
     0    0    1    1    8    0   18]
 [  19   60    4    2    6    7   20    6    8    5    5  389   12    2
     1    1    1    2   15   20   22]
 [   0    3    2    0    4    6   18    5    0    0    0   81  822    3
    19   13    6   55   17   25   11]
 [   0    4   12    4    7   22   26    1    0    0   15   28   28 1447
     5    0   17   20   31  114  162]
 [   1    9    3    4    0    0   18   12    0    3    0   17  111   72
  2008   71  133   16    5    8  117]
 [   0   30   88   40   25   30   23    0    1    1    2   12   96    3
   410 3481   36   24    1    6   26]
 [   0    0    1    2    1   20    8    0    0    0    2    8   57  104
   227   44 3475   52   15   24  214]
 [   0    1   14    0   63    9   45    6    0    4    3   58  429   34
    25   11   23 3102   96  377   86]
 [   0    0    1    2    0    3    3    0    0    0    0    5    6    6
     1    0    0   25 1014   16    7]
 [   1    0    3    1   26    0    1    2    1    0    0   11   45   17
     1    0    1   63   15 1052    8]
 [  10    2   18   42   11  178   24    2    1    0    4   31   56  103
    24    5   84   26   39    6 1841]]

1|2|0

[ 0.76847395  0.77454507  0.76904359  0.76941388  0.77290521]
Mean score: 0.771 (+/-0.001)
13.7860519886
[ 0.74927268  0.75613627  0.75121667  0.75126957  0.7545493 ]
Mean score: 0.752 (+/-0.001)
26.2482299805
[ 0.76678127  0.77359289  0.76560516  0.76941388  0.76951968]
Mean score: 0.769 (+/-0.001)
13.9964339733
[ 0.76461254  0.77084215  0.76169065  0.76502328  0.76888489]
Mean score: 0.766 (+/-0.002)
14.1476941109
Accuracy on training set:
0.894319780789
Accuracy on testing set:
0.774996836644
Classification Report:
             precision    recall  f1-score   support

          1       0.68      0.87      0.76       748
          2       0.68      0.69      0.69      1202
          3       0.50      0.58      0.54       905
          4       0.60      0.67      0.64       826
          5       0.82      0.83      0.82      1465
          6       0.78      0.77      0.78      2248
          7       0.55      0.69      0.61       890
          8       0.83      0.90      0.86       892
          9       0.93      0.88      0.90      3003
         10       0.87      0.81      0.84      1786
         11       0.85      0.83      0.84      1531
         12       0.46      0.60      0.52       614
         13       0.48      0.74      0.58      1073
         14       0.76      0.75      0.75      1934
         15       0.74      0.75      0.74      2612
         16       0.94      0.80      0.86      4336
         17       0.90      0.81      0.86      4254
         18       0.88      0.70      0.78      4369
         19       0.74      0.89      0.81      1085
         20       0.61      0.83      0.70      1246
         21       0.70      0.71      0.71      2496

avg / total       0.79      0.77      0.78     39515

Confusion Matrix:
[[ 651   50    3   13    5    6    5    1    6    1    2    4    0    0
     0    0    0    0    0    0    1]
 [ 131  828   60   18    5    2   18    8   29   10   38   39    3    1
     4    7    0    0    0    0    1]
 [  16   35  522   66   36   55   56    4   14   15   24   33    2    4
     0    4    0    1    1    1   16]
 [  52   20   73  554   15   49   23    2    1    2   11    1    0    1
     2    4    0    0    1    0   15]
 [  12    6   26   23 1213    1   12    1    5    8   12   32    7    2
     1   10    1   15   15   60    3]
 [  19    8  100   84    8 1742   84    7    0    0    5   17    2   28
     1    1   10    3    7    4  118]
 [  16   30   29   27    6   43  618   23   17   25   16   21    0    0
     0   12    0    0    1    0    6]
 [   0    8    3    1    0    1   16  805   33    6    8   10    0    0
     0    0    1    0    0    0    0]
 [  16   84   18    4    7    1   19   74 2637   78   54    8    0    0
     0    2    0    0    1    0    0]
 [   7   32   10    4   19    5   72    6   73 1455   20    8    6    1
     0    0    0   50    1   16    1]
 [   5   28   36   27    6   16   34    7   22   26 1274    5    1   21
     4    0    0    1    2    0   16]
 [  20   38    8    4   12   11   21    8    6    5    2  371   32    8
     1    4    4    2   21   16   20]
 [   0    1    0    1    5    2   18    4    0    1    0   50  792    6
    15   12    9   81   28   33   15]
 [   0    3   10   10    8   26   26    2    3    2   21   32   32 1445
    16    1   29   19   35   98  116]
 [   1    7    2    2    1    4   12    8    0    3    0   21  119   78
  1954   96  148   26    7    6  117]
 [   0   32   87   30   20   55   35    1    1    7    2   18  105    7
   371 3455   47   25    3    8   27]
 [   0    0    4    2    0   30    4    0    1    0    1    2   63  116
   220   48 3466   42   25   24  206]
 [   1    1   13    4   72    6   30    6    0   16    1   74  412   43
    32    9   33 3068  125  351   72]
 [   0    0    3    0   10    5    1    0    0    0    3   10    9   11
     0    0    0   37  961   26    9]
 [   0    0    4    2   29    2    1    2    0    3    0   15   27   20
     0    0    2   77   15 1033   14]
 [  12    2   30   41   10  164   20    2    1    0    5   38   55  118
    33    6   85   41   44    9 1780]]

2|1|0

[ 0.77677863  0.77872408  0.7776132   0.77486246  0.78099873]
Mean score: 0.778 (+/-0.001)
13.3402330875
[ 0.75985189  0.75719424  0.75841092  0.75634786  0.76084427]
Mean score: 0.759 (+/-0.001)
26.3892529011
[ 0.77619677  0.77623783  0.77496826  0.77671392  0.7778777 ]
Mean score: 0.776 (+/-0.000)
13.3706729412
[ 0.77656705  0.77047186  0.77279941  0.77279941  0.77539145]
Mean score: 0.774 (+/-0.001)
13.6409230232
Accuracy on training set:
0.898202515843
Accuracy on testing set:
0.786110337494
Classification Report:
             precision    recall  f1-score   support

          1       0.68      0.88      0.77       755
          2       0.71      0.69      0.70      1200
          3       0.53      0.64      0.58       905
          4       0.63      0.69      0.66       823
          5       0.83      0.82      0.82      1456
          6       0.79      0.79      0.79      2241
          7       0.58      0.72      0.65       876
          8       0.84      0.90      0.87       900
          9       0.93      0.89      0.91      3004
         10       0.89      0.83      0.86      1783
         11       0.84      0.84      0.84      1526
         12       0.46      0.58      0.52       614
         13       0.49      0.74      0.59      1087
         14       0.78      0.77      0.77      1934
         15       0.73      0.76      0.75      2621
         16       0.94      0.80      0.87      4339
         17       0.90      0.82      0.86      4257
         18       0.89      0.73      0.80      4368
         19       0.76      0.89      0.82      1071
         20       0.65      0.83      0.73      1247
         21       0.70      0.72      0.71      2490

avg / total       0.80      0.79      0.79     39497

Confusion Matrix:
[[ 664   45    2   11   11    2    5    0    5    3    4    2    0    0
     0    0    0    0    0    0    1]
 [ 143  824   51   10    5    8   20   23   28   11   31   35    1    0
     2    7    0    0    1    0    0]
 [  12   29  578   54   23   69   38    6   16   10   22   21    4    3
     0    7    2    1    0    0   10]
 [  51    7   63  569   16   43   29    2    2    1   17    4    0    0
     5    2    0    1    2    0    9]
 [  21    9   27   22 1198    4   13    1   10    5   18   22    7    2
     1    8    0   22   11   53    2]
 [  12    5   92   96    2 1772   71    2    0    3    7   20    1   26
     1    0    5    2   10    4  110]
 [  14   22   33   14    2   44  635   16    8   23   24   18    1    0
     4   12    0    0    1    0    5]
 [   1    7    5    3    1    2   15  808   34   10    3    9    0    0
     0    0    0    1    0    0    1]
 [  22   64   11    0    5    2   18   71 2669   80   51   10    0    0
     0    0    0    0    0    1    0]
 [   7   25   15    1   21    5   44    5   60 1488   12    9    5    2
     0    2    0   65    2   15    0]
 [   2   25   24   29    3   19   48    6   18   15 1276   13    0   26
     0    0    0    0    5    0   17]
 [  13   50   15    3   11   12   19    6    5    6    7  356   29    6
     1    6    3    7   17   21   21]
 [   0    1    8    0    3    1   14    3    0    1    0   64  801    6
    27   11    7   72   29   26   13]
 [   0    5    9    6    7   15   22    2    3    2   26   29   23 1491
    12    0   31   23   31   85  112]
 [   2    4    5    3    3    1   12    6    1    2    1   19  101   76
  1991   89  145   20    6    7  127]
 [   0   33   93   30   25   54   23    2    1    3    1   12  109    4
   357 3478   51   23    2    6   32]
 [   0    0    4    4    1   28    4    0    0    0    2    6   57   86
   241   36 3501   49   14   19  205]
 [   0    0   21    4   61    5   34    4    0   15    3   61  388   36
    31   16   25 3168  116  293   87]
 [   0    0    8    1    6    7    1    0    0    0    2    8   14    9
     3    0    2   30  951   19   10]
 [   0    0    5    0   37    0    1    2    1    1    0   14   37   18
     0    2    2   59   23 1035   10]
 [  10    1   16   37   11  158   25    1    0    0   11   35   51  123
    33    5   99   34   30   14 1796]]

Level-2 classification

0|0|1

[ 0.83840304  0.82604563  0.84205519  0.81446242  0.81731684]
Mean score: 0.828 (+/-0.006)
6.32944202423
[ 0.79752852  0.80228137  0.80589914  0.77830637  0.7925785 ]
Mean score: 0.795 (+/-0.005)
8.90074586868
[ 0.82604563  0.81939163  0.83539486  0.80209324  0.81921979]
Mean score: 0.820 (+/-0.005)
5.76560211182
[ 0.84220532  0.83079848  0.84681256  0.82112274  0.83158896]
Mean score: 0.835 (+/-0.005)
5.89748215675
Accuracy on training set:
0.956439033669
Accuracy on testing set:
0.826347305389
Classification Report:
             precision    recall  f1-score   support

          1       0.88      0.95      0.91       221
          2       0.77      0.72      0.74       260
          3       0.84      0.72      0.77       315
          4       0.79      0.88      0.83       261
          5       0.91      0.88      0.89       269
          6       0.85      0.85      0.85       257
          7       0.76      0.84      0.80       254

avg / total       0.83      0.83      0.83      1837

Confusion Matrix:
[[210   2   2   5   0   2   0]
 [ 11 186  12   6   0   4  41]
 [ 10  18 226  30  11   7  13]
 [  2   8  11 229   0   3   8]
 [  2   0  11   2 236  15   3]
 [  3   6   2  15  10 218   3]

0|1|0

[ 0.81653992  0.82509506  0.82017127  0.80970504  0.79543292]
Mean score: 0.813 (+/-0.005)
0.627457141876
[ 0.80893536  0.81368821  0.80304472  0.80589914  0.78782112]
Mean score: 0.804 (+/-0.004)
2.57228112221
[ 0.81749049  0.82319392  0.81541389  0.81351094  0.80304472]
Mean score: 0.815 (+/-0.003)
0.586635828018
[ 0.79942966  0.79942966  0.79923882  0.77545195  0.77925785]
Mean score: 0.791 (+/-0.005)
0.58501291275
Accuracy on training set:
0.951493247099
Accuracy on testing set:
0.811499703616
Classification Report:
             precision    recall  f1-score   support

          1       0.89      0.90      0.89       200
          2       0.75      0.74      0.74       245
          3       0.81      0.70      0.75       293
          4       0.79      0.84      0.81       222
          5       0.86      0.90      0.88       260
          6       0.86      0.82      0.84       257
          7       0.74      0.83      0.78       210

avg / total       0.81      0.81      0.81      1687

Confusion Matrix:
[[179  12   5   1   1   1   1]
 [ 10 181   9   5   3   4  33]
 [  8  17 204  19  18  10  17]
 [  2   8  19 187   0   3   3]
 [  1   0  10   1 234  12   2]
 [  0   9   1  21  10 210   6]
 [  2  15   5   4   5   5 174]]

1|0|0

[ 0.80893536  0.79657795  0.78972407  0.7925785   0.80589914]
Mean score: 0.799 (+/-0.004)
0.649480104446
[ 0.79657795  0.77946768  0.75261656  0.77640343  0.7811608 ]
Mean score: 0.777 (+/-0.007)
2.45231199265
[ 0.81273764  0.79942966  0.7773549   0.79923882  0.80780209]
Mean score: 0.799 (+/-0.006)
0.683506965637
[ 0.78992395  0.77376426  0.77450048  0.7773549   0.76879163]
Mean score: 0.777 (+/-0.004)
0.656605005264
Accuracy on training set:
0.951112801978
Accuracy on testing set:
0.800578034682
Classification Report:
             precision    recall  f1-score   support

          1       0.88      0.88      0.88       216
          2       0.74      0.62      0.67       255
          3       0.79      0.74      0.76       318
          4       0.76      0.85      0.80       228
          5       0.90      0.89      0.90       265
          6       0.83      0.80      0.82       237
          7       0.70      0.85      0.77       211

avg / total       0.80      0.80      0.80      1730

Confusion Matrix:
[[191  12   2   3   0   5   3]
 [ 16 158  25  12   0   3  41]
 [  5  14 236  20  15  15  13]
 [  2   5  15 194   0   3   9]
 [  1   2  12   2 237   7   4]
 [  1   9   4  20   6 190   7]
 [  0  14   5   3   4   6 179]]

1|1|1

[ 0.84410646  0.83555133  0.81255947  0.82778306  0.84205519]
Mean score: 0.832 (+/-0.006)
4.53355813026
[ 0.81653992  0.81368821  0.78782112  0.80209324  0.82017127]
Mean score: 0.808 (+/-0.006)
6.23939990997
[ 0.84315589  0.83269962  0.81636537  0.82017127  0.84395814]
Mean score: 0.831 (+/-0.006)
3.91611194611
[ 0.84885932  0.85456274  0.83349191  0.83158896  0.83729781]
Mean score: 0.841 (+/-0.004)
3.94948601723
Accuracy on training set:
0.955487920867
Accuracy on testing set:
0.839546191248
Classification Report:
             precision    recall  f1-score   support

          1       0.91      0.95      0.93       222
          2       0.80      0.74      0.77       263
          3       0.84      0.77      0.80       318
          4       0.83      0.89      0.86       260
          5       0.89      0.88      0.89       268
          6       0.86      0.83      0.84       258
          7       0.76      0.84      0.80       262

avg / total       0.84      0.84      0.84      1851

Confusion Matrix:
[[212   4   2   3   0   0   1]
 [  6 194  13   2   2   4  42]
 [  7  12 244  20  10  13  12]
 [  1   8  12 232   0   3   4]
 [  1   0  13   1 237   9   7]
 [  5   7   0  16  11 215   4]
 [  1  19   5   5   5   7 220]]
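The per-class precision and recall in the report follow directly from the confusion matrix: with rows as true classes and columns as predictions, recall is the diagonal over the row sum and precision is the diagonal over the column sum. A check against the 1|1|1 matrix above:

```python
import numpy as np

# Confusion matrix for the 1|1|1 run, copied from the log
# (rows = true class, cols = predicted class).
cm = np.array([
    [212,   4,   2,   3,   0,   0,   1],
    [  6, 194,  13,   2,   2,   4,  42],
    [  7,  12, 244,  20,  10,  13,  12],
    [  1,   8,  12, 232,   0,   3,   4],
    [  1,   0,  13,   1, 237,   9,   7],
    [  5,   7,   0,  16,  11, 215,   4],
    [  1,  19,   5,   5,   5,   7, 220],
])

precision = np.diag(cm) / cm.sum(axis=0)  # correct / all predicted as that class
recall    = np.diag(cm) / cm.sum(axis=1)  # correct / all truly in that class
for i, (p, r) in enumerate(zip(precision, recall), start=1):
    print("class %d: precision %.2f  recall %.2f" % (i, p, r))
```

The values reproduce the report for this run, e.g. class 1: precision 0.91, recall 0.95.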

1|1|0

[ 0.84885932  0.84030418  0.82017127  0.82683159  0.83824929]
Mean score: 0.835 (+/-0.005)
0.653192043304
[ 0.82604563  0.83365019  0.81160799  0.81065652  0.83254044]
Mean score: 0.823 (+/-0.005)
2.51483106613
[ 0.8460076   0.84695817  0.82017127  0.81731684  0.84681256]
Mean score: 0.835 (+/-0.007)
0.614599943161
[ 0.8365019   0.83079848  0.81921979  0.81065652  0.83444339]
Mean score: 0.826 (+/-0.005)
0.611855983734
Accuracy on training set:
0.955107475747
Accuracy on testing set:
0.803468208092
Classification Report:
             precision    recall  f1-score   support

          1       0.92      0.88      0.90       196
          2       0.72      0.71      0.71       236
          3       0.79      0.67      0.73       307
          4       0.76      0.88      0.82       247
          5       0.85      0.87      0.86       251
          6       0.85      0.83      0.84       248
          7       0.76      0.82      0.79       245

avg / total       0.81      0.80      0.80      1730

Confusion Matrix:
[[173  14   3   3   2   0   1]
 [  7 168  15  10   2   5  29]
 [  4  16 205  32  21   8  21]
 [  2   5  19 218   0   2   1]
 [  1   0  10   2 219  14   5]
 [  1   9   1  15  10 205   7]
 [  1  22   5   5   4   6 202]]

1|2|0

[ 0.81178707  0.8269962   0.82873454  0.84966698  0.86108468]
Mean score: 0.836 (+/-0.009)
0.894863843918
[ 0.81653992  0.81463878  0.80780209  0.83444339  0.83824929]
Mean score: 0.822 (+/-0.006)
2.87404513359
[ 0.81939163  0.82509506  0.82112274  0.84966698  0.85442436]
Mean score: 0.834 (+/-0.007)
0.771161794662
[ 0.80228137  0.83745247  0.80209324  0.82873454  0.83920076]
Mean score: 0.822 (+/-0.008)
0.770865917206
Accuracy on training set:
0.953395472703
Accuracy on testing set:
0.820920977828
Classification Report:
             precision    recall  f1-score   support

          1       0.92      0.93      0.92       209
          2       0.74      0.73      0.73       251
          3       0.80      0.77      0.79       301
          4       0.78      0.82      0.80       261
          5       0.90      0.88      0.89       252
          6       0.88      0.82      0.85       259
          7       0.74      0.81      0.78       226

avg / total       0.82      0.82      0.82      1759

Confusion Matrix:
[[194   9   2   2   0   1   1]
 [  9 182  17   7   0   1  35]
 [  2  13 233  24  13   6  10]
 [  5  10  17 215   0   5   9]
 [  1   1  12   2 223   9   4]
 [  1  12   2  19   8 213   4]
 [  0  18   8   5   3   8 184]]

2|1|0

[ 0.8108365   0.85456274  0.82302569  0.81636537  0.81921979]
Mean score: 0.825 (+/-0.008)
0.980884075165
[ 0.80323194  0.84315589  0.81541389  0.80494767  0.80875357]
Mean score: 0.815 (+/-0.007)
2.84884095192
[ 0.81273764  0.85076046  0.82683159  0.81351094  0.81446242]
Mean score: 0.824 (+/-0.007)
0.8581199646
[ 0.80988593  0.8460076   0.82302569  0.80494767  0.80494767]
Mean score: 0.818 (+/-0.008)
0.853451013565
Accuracy on training set:
0.953205250143
Accuracy on testing set:
0.826704545455
Classification Report:
             precision    recall  f1-score   support

          1       0.87      0.94      0.90       203
          2       0.74      0.74      0.74       242
          3       0.80      0.74      0.77       308
          4       0.83      0.86      0.84       250
          5       0.91      0.91      0.91       262
          6       0.86      0.84      0.85       252
          7       0.78      0.80      0.79       243

avg / total       0.83      0.83      0.83      1760

Confusion Matrix:
[[190   7   4   0   0   2   0]
 [ 12 179   9   4   0   6  32]
 [  7  18 227  21  13   8  14]
 [  2   3  17 214   0   7   7]
 [  1   0  10   2 238   9   2]
 [  4  11   2  14   8 212   1]
 [  2  24  14   2   3   3 195]]

cnn+word_vec

Second-level test
0|0|1

Total words: 30826
5257
Step #1, avg. loss: 2.60984
Step #11, avg. loss: 2.07501
Step #21, avg. loss: 2.03285
Step #31, avg. loss: 2.00902
Step #41, avg. loss: 1.97503
Step #51, avg. loss: 1.94583
Step #61, avg. loss: 1.90271
Step #71, avg. loss: 1.82990
Step #81, avg. loss: 1.73817
Step #91, avg. loss: 1.52507
Accuracy: 0.514970
Step #101, avg. loss: 1.14542
Step #111, avg. loss: 1.31164
Step #121, avg. loss: 1.18494
Step #131, avg. loss: 1.09571
Step #141, avg. loss: 0.99312
Step #151, avg. loss: 0.96255
Step #161, avg. loss: 0.80787
Step #171, avg. loss: 0.75417
Step #181, avg. loss: 0.70739
Step #191, avg. loss: 0.56529
Accuracy: 0.702232
Step #201, avg. loss: 0.32927
Step #211, avg. loss: 0.57769
Step #221, avg. loss: 0.46946
Step #231, avg. loss: 0.44901
Step #241, avg. loss: 0.37742
Step #251, avg. loss: 0.35916
Step #261, avg. loss: 0.28955
Step #271, avg. loss: 0.31245
Step #281, avg. loss: 0.29318
Step #291, avg. loss: 0.21372
Accuracy: 0.766467
Step #301, avg. loss: 0.14286
Step #311, avg. loss: 0.32342
Step #321, avg. loss: 0.20449
Step #331, avg. loss: 0.17505
Step #341, avg. loss: 0.11531
Step #351, avg. loss: 0.13806
Step #361, avg. loss: 0.11443
Step #371, avg. loss: 0.13264
Step #381, avg. loss: 0.16203
Step #391, avg. loss: 0.08565
Accuracy: 0.770822
Step #401, avg. loss: 0.04767
Step #411, avg. loss: 0.25601
Step #421, avg. loss: 0.10386
Step #431, avg. loss: 0.08169
Step #441, avg. loss: 0.04171
Step #451, avg. loss: 0.04919
Step #461, avg. loss: 0.04442
Step #471, avg. loss: 0.05746
Step #481, avg. loss: 0.08983
Step #491, avg. loss: 0.05419
Accuracy: 0.770278
Step #501, avg. loss: 0.02359
Step #511, avg. loss: 0.20114
Step #521, avg. loss: 0.07504
Step #531, avg. loss: 0.06430
Step #541, avg. loss: 0.02656
Step #551, avg. loss: 0.07178
Step #561, avg. loss: 0.03692
Step #571, avg. loss: 0.04512
Step #581, avg. loss: 0.08238
Step #591, avg. loss: 0.06619
Accuracy: 0.768645
Step #601, avg. loss: 0.01325
Step #611, avg. loss: 0.17883
Step #621, avg. loss: 0.04890
Step #631, avg. loss: 0.05085
Step #641, avg. loss: 0.01578
Step #651, avg. loss: 0.02053
Step #661, avg. loss: 0.03840
Step #671, avg. loss: 0.03358
Step #681, avg. loss: 0.06809
Step #691, avg. loss: 0.05815
Accuracy: 0.778443
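The "Step #N, avg. loss" lines follow the logging style of skflow (early TensorFlow Learn): the printed value is the training loss averaged over the steps since the previous report. A minimal stand-in loop illustrating that bookkeeping; `fake_loss` is a hypothetical placeholder for the CNN's per-batch loss:

```python
def fake_loss(step):
    # Hypothetical smoothly decaying dummy loss standing in for the real batch loss.
    return 2.6 / (1 + 0.01 * step)

reports = []
window = []
for step in range(1, 101):
    window.append(fake_loss(step))
    if (step - 1) % 10 == 0:  # report at steps 1, 11, 21, ... as in the log
        reports.append((step, sum(window) / len(window)))
        window = []  # reset the averaging window after each report

for step, avg in reports:
    print("Step #%d, avg. loss: %.5f" % (step, avg))
```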

0|1|0

Total words: 7628
5257
Step #1, avg. loss: 2.48088
Step #11, avg. loss: 2.18275
Step #21, avg. loss: 1.97011
Step #31, avg. loss: 1.75935
Step #41, avg. loss: 1.65008
Step #51, avg. loss: 1.51694
Step #61, avg. loss: 1.65903
Step #71, avg. loss: 1.26060
Step #81, avg. loss: 1.18436
Step #91, avg. loss: 1.21504
Accuracy: 0.601067
Step #101, avg. loss: 1.28028
Step #111, avg. loss: 1.58967
Step #121, avg. loss: 1.20759
Step #131, avg. loss: 1.05909
Step #141, avg. loss: 0.82816
Step #151, avg. loss: 0.60828
Step #161, avg. loss: 0.68530
Step #171, avg. loss: 0.52466
Step #181, avg. loss: 0.53533
Step #191, avg. loss: 0.62278
Accuracy: 0.704801
Step #201, avg. loss: 0.71187
Step #211, avg. loss: 0.70493
Step #221, avg. loss: 0.44868
Step #231, avg. loss: 0.49362
Step #241, avg. loss: 0.37702
Step #251, avg. loss: 0.41256
Step #261, avg. loss: 0.33509
Step #271, avg. loss: 0.28316
Step #281, avg. loss: 0.26350
Step #291, avg. loss: 0.84105
Accuracy: 0.640782
Step #301, avg. loss: 0.39877
Step #311, avg. loss: 0.84267
Step #321, avg. loss: 0.67489
Step #331, avg. loss: 0.27592
Step #341, avg. loss: 0.17912
Step #351, avg. loss: 0.40831
Step #361, avg. loss: 0.60725
Step #371, avg. loss: 0.44976
Step #381, avg. loss: 0.27388
Step #391, avg. loss: 1.01070
Accuracy: 0.670421
Step #401, avg. loss: 0.16547
Step #411, avg. loss: 1.40274
Step #421, avg. loss: 0.71987
Step #431, avg. loss: 0.50604
Step #441, avg. loss: 0.61115
Step #451, avg. loss: 0.34493
Step #461, avg. loss: 0.26330
Step #471, avg. loss: 0.37650
Step #481, avg. loss: 0.19608
Step #491, avg. loss: 0.58911
Accuracy: 0.697095
Step #501, avg. loss: 0.21833
Step #511, avg. loss: 0.92526
Step #521, avg. loss: 0.22602
Step #531, avg. loss: 0.31629
Step #541, avg. loss: 0.19652
Step #551, avg. loss: 0.32109
Step #561, avg. loss: 0.35139
Step #571, avg. loss: 0.31259
Step #581, avg. loss: 0.25307
Step #591, avg. loss: 0.26362
Accuracy: 0.659751
Step #601, avg. loss: 0.47940
Step #611, avg. loss: 0.43072
Step #621, avg. loss: 0.31848
Step #631, avg. loss: 0.18140
Step #641, avg. loss: 0.16179
Step #651, avg. loss: 0.19835
Step #661, avg. loss: 0.08420
Step #671, avg. loss: 0.10776
Step #681, avg. loss: 0.18294
Step #691, avg. loss: 0.15314
Accuracy: 0.676349
Step #701, avg. loss: 0.00230
Step #711, avg. loss: 0.25458
Step #721, avg. loss: 0.14851
Step #731, avg. loss: 0.14122
Step #741, avg. loss: 0.19839
Step #751, avg. loss: 0.21696
Step #761, avg. loss: 0.13286
Step #771, avg. loss: 0.11289
Step #781, avg. loss: 0.32290
Step #791, avg. loss: 0.42129
Accuracy: 0.692353
Step #801, avg. loss: 0.27962
Step #811, avg. loss: 0.18117
Step #821, avg. loss: 0.15873
Step #831, avg. loss: 0.17153
Step #841, avg. loss: 0.22355
Step #851, avg. loss: 0.18404
Step #861, avg. loss: 0.18554
Step #871, avg. loss: 0.25874
Step #881, avg. loss: 0.16779
Step #891, avg. loss: 0.47159
Accuracy: 0.659158

1|0|0

Total words: 6588
5257
Step #1, avg. loss: 2.48208
Step #11, avg. loss: 2.12532
Step #21, avg. loss: 1.95332
Step #31, avg. loss: 1.80085
Step #41, avg. loss: 1.72175
Step #51, avg. loss: 1.45604
Step #61, avg. loss: 1.42045
Step #71, avg. loss: 1.38859
Step #81, avg. loss: 1.25315
Step #91, avg. loss: 1.09426
Accuracy: 0.623121
Step #101, avg. loss: 1.28541
Step #111, avg. loss: 0.89145
Step #121, avg. loss: 0.83071
Step #131, avg. loss: 0.74202
Step #141, avg. loss: 0.96515
Step #151, avg. loss: 1.92484
Step #161, avg. loss: 0.79284
Step #171, avg. loss: 1.14903
Step #181, avg. loss: 0.70839
Step #191, avg. loss: 0.50760
Accuracy: 0.657225
Step #201, avg. loss: 0.55511
Step #211, avg. loss: 0.52444
Step #221, avg. loss: 0.45119
Step #231, avg. loss: 0.39653
Step #241, avg. loss: 0.54471
Step #251, avg. loss: 0.79104
Step #261, avg. loss: 0.52441
Step #271, avg. loss: 0.51329
Step #281, avg. loss: 0.38252
Step #291, avg. loss: 0.38485
Accuracy: 0.664740
Step #301, avg. loss: 0.28400
Step #311, avg. loss: 0.32971
Step #321, avg. loss: 0.31350
Step #331, avg. loss: 0.35479
Step #341, avg. loss: 0.31836
Step #351, avg. loss: 0.33444
Step #361, avg. loss: 0.27853
Step #371, avg. loss: 0.33334
Step #381, avg. loss: 0.24205
Step #391, avg. loss: 0.29117
Accuracy: 0.684971
Step #401, avg. loss: 0.48210
Step #411, avg. loss: 0.23184
Step #421, avg. loss: 0.18448
Step #431, avg. loss: 0.25891
Step #441, avg. loss: 0.17348
Step #451, avg. loss: 0.20772
Step #461, avg. loss: 0.22145
Step #471, avg. loss: 0.22147
Step #481, avg. loss: 0.19612
Step #491, avg. loss: 0.19054
Accuracy: 0.714451
Step #501, avg. loss: 0.11372
Step #511, avg. loss: 0.16056
Step #521, avg. loss: 0.14230
Step #531, avg. loss: 0.16356
Step #541, avg. loss: 0.16440
Step #551, avg. loss: 0.16510
Step #561, avg. loss: 0.18192
Step #571, avg. loss: 0.12127
Step #581, avg. loss: 0.16471
Step #591, avg. loss: 0.13285
Accuracy: 0.717919
Step #601, avg. loss: 0.09409
Step #611, avg. loss: 0.13332
Step #621, avg. loss: 0.09279
Step #631, avg. loss: 0.12646
Step #641, avg. loss: 0.11714
Step #651, avg. loss: 0.13965
Step #661, avg. loss: 0.11399
Step #671, avg. loss: 0.12594
Step #681, avg. loss: 0.09913
Step #691, avg. loss: 0.09211
Accuracy: 0.718497
Step #701, avg. loss: 0.01086
Step #711, avg. loss: 0.06280
Step #721, avg. loss: 0.05045
Step #731, avg. loss: 0.11921
Step #741, avg. loss: 0.08486
Step #751, avg. loss: 0.09542
Step #761, avg. loss: 0.06182
Step #771, avg. loss: 0.07520
Step #781, avg. loss: 0.08585
Step #791, avg. loss: 0.08004
Accuracy: 0.720809
Step #801, avg. loss: 0.01061
Step #811, avg. loss: 0.04389
Step #821, avg. loss: 0.04086
Step #831, avg. loss: 0.06485
Step #841, avg. loss: 0.05515
Step #851, avg. loss: 0.07109
Step #861, avg. loss: 0.06121
Step #871, avg. loss: 0.05771
Step #881, avg. loss: 0.09106
Step #891, avg. loss: 0.06207
Accuracy: 0.728902
Step #901, avg. loss: 0.00392
Step #911, avg. loss: 0.03961
Step #921, avg. loss: 0.05221
Step #931, avg. loss: 0.07133
Step #941, avg. loss: 0.05964
Step #951, avg. loss: 0.06250
Step #961, avg. loss: 0.07394
Step #971, avg. loss: 0.06803
Step #981, avg. loss: 0.06939
Step #991, avg. loss: 0.04112
Accuracy: 0.726012
Step #1001, avg. loss: 0.00457
Step #1011, avg. loss: 0.04980
Step #1021, avg. loss: 0.04075
Step #1031, avg. loss: 0.04502
Step #1041, avg. loss: 0.04697
Step #1051, avg. loss: 0.07540
Step #1061, avg. loss: 0.05483
Step #1071, avg. loss: 0.04097
Step #1081, avg. loss: 0.07129
Step #1091, avg. loss: 0.06268
Accuracy: 0.729480
Step #1101, avg. loss: 0.00607
Step #1111, avg. loss: 0.03477
Step #1121, avg. loss: 0.07943
Step #1131, avg. loss: 0.04567
Step #1141, avg. loss: 0.05176
Step #1151, avg. loss: 0.07055
Step #1161, avg. loss: 0.05182
Step #1171, avg. loss: 0.03886
Step #1181, avg. loss: 0.06006
Step #1191, avg. loss: 0.03582
Accuracy: 0.727746
Step #1201, avg. loss: 0.00582
Step #1211, avg. loss: 0.04950
Step #1221, avg. loss: 0.05012
Step #1231, avg. loss: 0.04747
Step #1241, avg. loss: 0.07981
Step #1251, avg. loss: 0.06798
Step #1261, avg. loss: 0.04589
Step #1271, avg. loss: 0.03573
Step #1281, avg. loss: 0.06568
Step #1291, avg. loss: 0.04664
Accuracy: 0.732370
Step #1301, avg. loss: 0.00652
Step #1311, avg. loss: 0.03294
Step #1321, avg. loss: 0.03977
Step #1331, avg. loss: 0.03853
Step #1341, avg. loss: 0.05304
Step #1351, avg. loss: 0.07061
Step #1361, avg. loss: 0.05539
Step #1371, avg. loss: 0.06041
Step #1381, avg. loss: 0.05625
Step #1391, avg. loss: 0.04891
Accuracy: 0.713295
Step #1401, avg. loss: 0.00742
Step #1411, avg. loss: 0.04046
Step #1421, avg. loss: 0.11138
Step #1431, avg. loss: 0.08290
Step #1441, avg. loss: 0.05174
Step #1451, avg. loss: 0.07043
Step #1461, avg. loss: 0.05472
Step #1471, avg. loss: 0.04844
Step #1481, avg. loss: 0.10241
Step #1491, avg. loss: 0.05096
Accuracy: 0.721965
Step #1501, avg. loss: 0.01629
Step #1511, avg. loss: 0.02956
Step #1521, avg. loss: 0.04187
Step #1531, avg. loss: 0.03365
Step #1541, avg. loss: 0.07898
Step #1551, avg. loss: 0.04595
Step #1561, avg. loss: 0.04875
Step #1571, avg. loss: 0.03881
Step #1581, avg. loss: 0.09497
Step #1591, avg. loss: 0.03399
Accuracy: 0.736994

1|1|1

Total words: 31506
5257
Step #1, avg. loss: 2.48363
Step #11, avg. loss: 2.14880
Step #21, avg. loss: 1.99761
Step #31, avg. loss: 1.64937
Step #41, avg. loss: 1.33280
Step #51, avg. loss: 1.02822
Step #61, avg. loss: 1.10988
Step #71, avg. loss: 1.11106
Step #81, avg. loss: 1.01520
Step #91, avg. loss: 0.82767
Accuracy: 0.705024
Step #101, avg. loss: 0.69097
Step #111, avg. loss: 0.74883
Step #121, avg. loss: 0.61172
Step #131, avg. loss: 0.48499
Step #141, avg. loss: 0.43764
Step #151, avg. loss: 0.34538
Step #161, avg. loss: 0.35794
Step #171, avg. loss: 0.36683
Step #181, avg. loss: 0.28310
Step #191, avg. loss: 0.45695
Accuracy: 0.766613
Step #201, avg. loss: 0.10511
Step #211, avg. loss: 0.34317
Step #221, avg. loss: 0.20481
Step #231, avg. loss: 0.24296
Step #241, avg. loss: 0.32518
Step #251, avg. loss: 0.25472
Step #261, avg. loss: 0.27910
Step #271, avg. loss: 0.32864
Step #281, avg. loss: 0.47292
Step #291, avg. loss: 0.34084
Accuracy: 0.729335
Step #301, avg. loss: 0.52786
Step #311, avg. loss: 0.24127
Step #321, avg. loss: 0.18615
Step #331, avg. loss: 0.13355
Step #341, avg. loss: 0.17512
Step #351, avg. loss: 0.13426
Step #361, avg. loss: 0.13900
Step #371, avg. loss: 0.13461
Step #381, avg. loss: 0.26764
Step #391, avg. loss: 0.17848
Accuracy: 0.704484
Step #401, avg. loss: 0.15521
Step #411, avg. loss: 0.18452
Step #421, avg. loss: 0.09647
Step #431, avg. loss: 0.06679
Step #441, avg. loss: 0.05524
Step #451, avg. loss: 0.15033
Step #461, avg. loss: 0.10063
Step #471, avg. loss: 0.13175
Step #481, avg. loss: 0.05669
Step #491, avg. loss: 0.14631
Accuracy: 0.776877
Step #501, avg. loss: 0.00445
Step #511, avg. loss: 0.07686
Step #521, avg. loss: 0.06099
Step #531, avg. loss: 0.05241
Step #541, avg. loss: 0.02917
Step #551, avg. loss: 0.07375
Step #561, avg. loss: 0.07085
Step #571, avg. loss: 0.08030
Step #581, avg. loss: 0.03038
Step #591, avg. loss: 0.11781
Accuracy: 0.791464
Step #601, avg. loss: 0.00110
Step #611, avg. loss: 0.10858
Step #621, avg. loss: 0.06294
Step #631, avg. loss: 0.04764
Step #641, avg. loss: 0.04617
Step #651, avg. loss: 0.04382
Step #661, avg. loss: 0.05335
Step #671, avg. loss: 0.04960
Step #681, avg. loss: 0.02800
Step #691, avg. loss: 0.09886
Accuracy: 0.784981
Step #701, avg. loss: 0.00124
Step #711, avg. loss: 0.10994
Step #721, avg. loss: 0.04431
Step #731, avg. loss: 0.03322
Step #741, avg. loss: 0.01779
Step #751, avg. loss: 0.05028
Step #761, avg. loss: 0.02901
Step #771, avg. loss: 0.04095
Step #781, avg. loss: 0.03639
Step #791, avg. loss: 0.09229
Accuracy: 0.792004
Step #801, avg. loss: 0.00071
Step #811, avg. loss: 0.04342
Step #821, avg. loss: 0.02817
Step #831, avg. loss: 0.06035
Step #841, avg. loss: 0.04365
Step #851, avg. loss: 0.05640
Step #861, avg. loss: 0.04630
Step #871, avg. loss: 0.04445
Step #881, avg. loss: 0.03093
Step #891, avg. loss: 0.06585
Accuracy: 0.787682
Step #901, avg. loss: 0.00114
Step #911, avg. loss: 0.06990
Step #921, avg. loss: 0.03271
Step #931, avg. loss: 0.06902
Step #941, avg. loss: 0.04701
Step #951, avg. loss: 0.08250
Step #961, avg. loss: 0.04099
Step #971, avg. loss: 0.03976
Step #981, avg. loss: 0.01596
Step #991, avg. loss: 0.06201
Accuracy: 0.782820
Step #1001, avg. loss: 0.00056
Step #1011, avg. loss: 0.03215
Step #1021, avg. loss: 0.03692
Step #1031, avg. loss: 0.04677
Step #1041, avg. loss: 0.06234
Step #1051, avg. loss: 0.09073
Step #1061, avg. loss: 0.03345
Step #1071, avg. loss: 0.06644
Step #1081, avg. loss: 0.05855
Step #1091, avg. loss: 0.06261
Accuracy: 0.782280
Step #1101, avg. loss: 0.02160
Step #1111, avg. loss: 0.09026
Step #1121, avg. loss: 0.09194
Step #1131, avg. loss: 0.06511
Step #1141, avg. loss: 0.11725
Step #1151, avg. loss: 0.10635
Step #1161, avg. loss: 0.11397
Step #1171, avg. loss: 0.12594
Step #1181, avg. loss: 0.05972
Step #1191, avg. loss: 0.13401
Accuracy: 0.763911
Step #1201, avg. loss: 0.00855
Step #1211, avg. loss: 0.17799
Step #1221, avg. loss: 0.26357
Step #1231, avg. loss: 0.28681
Step #1241, avg. loss: 0.19954
Step #1251, avg. loss: 0.27488
Step #1261, avg. loss: 0.31896
Step #1271, avg. loss: 0.42444
Step #1281, avg. loss: 0.27173
Step #1291, avg. loss: 0.50698
Accuracy: 0.706645
Step #1301, avg. loss: 0.75622
Step #1311, avg. loss: 0.40409
Step #1321, avg. loss: 0.27314
Step #1331, avg. loss: 0.22736
Step #1341, avg. loss: 0.32036
Step #1351, avg. loss: 0.48177
Step #1361, avg. loss: 0.23720
Step #1371, avg. loss: 0.31059
Step #1381, avg. loss: 0.27563
Step #1391, avg. loss: 0.23397
Accuracy: 0.745003
Step #1401, avg. loss: 0.31600
Step #1411, avg. loss: 0.43339
Step #1421, avg. loss: 0.29391
Step #1431, avg. loss: 0.32390
Step #1441, avg. loss: 0.37443
Step #1451, avg. loss: 0.30197
Step #1461, avg. loss: 0.31757
Step #1471, avg. loss: 0.10893
Step #1481, avg. loss: 0.08277
Step #1491, avg. loss: 0.27095
Accuracy: 0.760670
Step #1501, avg. loss: 0.50784
Step #1511, avg. loss: 0.31752
Step #1521, avg. loss: 0.11838
Step #1531, avg. loss: 0.13945
Step #1541, avg. loss: 0.24255
Step #1551, avg. loss: 0.23503
Step #1561, avg. loss: 0.25411
Step #1571, avg. loss: 0.22777
Step #1581, avg. loss: 0.05948
Step #1591, avg. loss: 0.22746
Accuracy: 0.773096
Step #1601, avg. loss: 0.00236
Step #1611, avg. loss: 0.09493
Step #1621, avg. loss: 0.12085
Step #1631, avg. loss: 0.23651
Step #1641, avg. loss: 0.17632
Step #1651, avg. loss: 0.19220
Step #1661, avg. loss: 0.17943
Step #1671, avg. loss: 0.17162
Step #1681, avg. loss: 0.09902
Step #1691, avg. loss: 0.21995
Accuracy: 0.743382
Step #1701, avg. loss: 0.01831
Step #1711, avg. loss: 0.18908
Step #1721, avg. loss: 0.09967
Step #1731, avg. loss: 0.12377
Step #1741, avg. loss: 0.27931
Step #1751, avg. loss: 0.18140
Step #1761, avg. loss: 0.16377
Step #1771, avg. loss: 0.11880
Step #1781, avg. loss: 0.04303
Step #1791, avg. loss: 0.15357
Accuracy: 0.771475
Step #1801, avg. loss: 0.00006
Step #1811, avg. loss: 0.07291
Step #1821, avg. loss: 0.19227
Step #1831, avg. loss: 0.07443
Step #1841, avg. loss: 0.23928
Step #1851, avg. loss: 0.14014
Step #1861, avg. loss: 0.06113
Step #1871, avg. loss: 0.05166
Step #1881, avg. loss: 0.03867
Step #1891, avg. loss: 0.15002
Accuracy: 0.763371
Step #1901, avg. loss: 0.00179
Step #1911, avg. loss: 0.15428
Step #1921, avg. loss: 0.10005
Step #1931, avg. loss: 0.03174
Step #1941, avg. loss: 0.02644
Step #1951, avg. loss: 0.05929
Step #1961, avg. loss: 0.09231
Step #1971, avg. loss: 0.04724
Step #1981, avg. loss: 0.07728
Step #1991, avg. loss: 0.14659
Accuracy: 0.777418
Step #2001, avg. loss: 0.00002
Step #2011, avg. loss: 0.04346
Step #2021, avg. loss: 0.03909
Step #2031, avg. loss: 0.04912
Step #2041, avg. loss: 0.02303
Step #2051, avg. loss: 0.06839
Step #2061, avg. loss: 0.06950
Step #2071, avg. loss: 0.12245
Step #2081, avg. loss: 0.09497
Step #2091, avg. loss: 0.13069
Accuracy: 0.757428
Step #2101, avg. loss: 0.13961
Step #2111, avg. loss: 0.15652
Step #2121, avg. loss: 0.06639
Step #2131, avg. loss: 0.14090
Step #2141, avg. loss: 0.11430
Step #2151, avg. loss: 0.12167
Step #2161, avg. loss: 0.08155
Step #2171, avg. loss: 0.06276
Step #2181, avg. loss: 0.07679
Step #2191, avg. loss: 0.15325
Accuracy: 0.777958
Step #2201, avg. loss: 0.00019
Step #2211, avg. loss: 0.07348
Step #2221, avg. loss: 0.04676
Step #2231, avg. loss: 0.04889
Step #2241, avg. loss: 0.07799
Step #2251, avg. loss: 0.05798
Step #2261, avg. loss: 0.08042
Step #2271, avg. loss: 0.08573
Step #2281, avg. loss: 0.03103
Step #2291, avg. loss: 0.12172
Accuracy: 0.777958
Step #2301, avg. loss: 0.00000
Step #2311, avg. loss: 0.06340
Step #2321, avg. loss: 0.03024
Step #2331, avg. loss: 0.05157
Step #2341, avg. loss: 0.05220
Step #2351, avg. loss: 0.04540
Step #2361, avg. loss: 0.04089
Step #2371, avg. loss: 0.04823
Step #2381, avg. loss: 0.02149
Step #2391, avg. loss: 0.03980
Accuracy: 0.788763

1|1|0

Total words: 9044
5257
Step #1, avg. loss: 2.37940
Step #11, avg. loss: 2.48520
Step #21, avg. loss: 1.91082
Step #31, avg. loss: 1.67222
Step #41, avg. loss: 1.61798
Step #51, avg. loss: 1.44891
Step #61, avg. loss: 1.49822
Step #71, avg. loss: 1.43217
Step #81, avg. loss: 1.35551
Step #91, avg. loss: 1.30462
Accuracy: 0.447977
Step #101, avg. loss: 1.13616
Step #111, avg. loss: 1.16450
Step #121, avg. loss: 1.06632
Step #131, avg. loss: 1.21245
Step #141, avg. loss: 1.06876
Step #151, avg. loss: 0.88421
Step #161, avg. loss: 0.93588
Step #171, avg. loss: 0.91741
Step #181, avg. loss: 0.76064
Step #191, avg. loss: 0.70266
Accuracy: 0.675723
Step #201, avg. loss: 0.63346
Step #211, avg. loss: 0.58900
Step #221, avg. loss: 0.60008
Step #231, avg. loss: 0.58066
Step #241, avg. loss: 0.60372
Step #251, avg. loss: 0.41562
Step #261, avg. loss: 0.60802
Step #271, avg. loss: 0.49910
Step #281, avg. loss: 0.34695
Step #291, avg. loss: 0.35529
Accuracy: 0.723699
Step #301, avg. loss: 0.37804
Step #311, avg. loss: 0.32915
Step #321, avg. loss: 0.33634
Step #331, avg. loss: 0.38049
Step #341, avg. loss: 0.42563
Step #351, avg. loss: 0.28169
Step #361, avg. loss: 0.27199
Step #371, avg. loss: 0.29999
Step #381, avg. loss: 0.25761
Step #391, avg. loss: 0.18673
Accuracy: 0.746821
Step #401, avg. loss: 0.17649
Step #411, avg. loss: 0.15379
Step #421, avg. loss: 0.20245
Step #431, avg. loss: 0.14784
Step #441, avg. loss: 0.24599
Step #451, avg. loss: 59.09514
Step #461, avg. loss: 12.99777
Step #471, avg. loss: 7.30942
Step #481, avg. loss: 17.93662
Step #491, avg. loss: 8.66729
Accuracy: 0.615029
Step #501, avg. loss: 0.60415
Step #511, avg. loss: 2.50959
Step #521, avg. loss: 1.49566
Step #531, avg. loss: 0.51926
Step #541, avg. loss: 0.71975
Step #551, avg. loss: 0.74097
Step #561, avg. loss: 0.50689
Step #571, avg. loss: 0.43720
Step #581, avg. loss: 0.28355
Step #591, avg. loss: 0.59256
Accuracy: 0.381503
Step #601, avg. loss: 5.72632
Step #611, avg. loss: 6.17220
Step #621, avg. loss: 0.76089
Step #631, avg. loss: 0.60663
Step #641, avg. loss: 0.54188
Step #651, avg. loss: 0.48290
Step #661, avg. loss: 0.48782
Step #671, avg. loss: 0.29085
Step #681, avg. loss: 0.27194
Step #691, avg. loss: 0.27975
Accuracy: 0.705780
Step #701, avg. loss: 0.47740
Step #711, avg. loss: 0.15470
Step #721, avg. loss: 0.24586
Step #731, avg. loss: 0.24210
Step #741, avg. loss: 0.23996
Step #751, avg. loss: 0.25412
Step #761, avg. loss: 0.25905
Step #771, avg. loss: 0.16011
Step #781, avg. loss: 0.10156
Step #791, avg. loss: 0.09358
Accuracy: 0.759538
Step #801, avg. loss: 0.20353
Step #811, avg. loss: 0.11804
Step #821, avg. loss: 0.17109
Step #831, avg. loss: 0.09999
Step #841, avg. loss: 0.13220
Step #851, avg. loss: 0.15160
Step #861, avg. loss: 0.27356
Step #871, avg. loss: 0.11834
Step #881, avg. loss: 0.07729
Step #891, avg. loss: 0.06629
Accuracy: 0.759538
Step #901, avg. loss: 0.18303
Step #911, avg. loss: 0.12365
Step #921, avg. loss: 0.16016
Step #931, avg. loss: 0.05987
Step #941, avg. loss: 0.13885
Step #951, avg. loss: 0.14011
Step #961, avg. loss: 0.12496
Step #971, avg. loss: 0.09294
Step #981, avg. loss: 0.05823
Step #991, avg. loss: 0.06065
Accuracy: 0.768208
Step #1001, avg. loss: 0.15444
Step #1011, avg. loss: 0.04161
Step #1021, avg. loss: 0.12683
Step #1031, avg. loss: 0.07769
Step #1041, avg. loss: 0.08872
Step #1051, avg. loss: 0.12831
Step #1061, avg. loss: 0.12930
Step #1071, avg. loss: 0.09508
Step #1081, avg. loss: 0.06123
Step #1091, avg. loss: 0.03862
Accuracy: 0.760116
Step #1101, avg. loss: 0.14554
Step #1111, avg. loss: 0.03876
Step #1121, avg. loss: 0.12442
Step #1131, avg. loss: 0.04241
Step #1141, avg. loss: 0.08784
Step #1151, avg. loss: 0.08544
Step #1161, avg. loss: 0.08115
Step #1171, avg. loss: 0.08688
Step #1181, avg. loss: 0.06636
Step #1191, avg. loss: 0.08656
Accuracy: 0.758382
Step #1201, avg. loss: 0.14044
Step #1211, avg. loss: 0.05633
Step #1221, avg. loss: 0.11252
Step #1231, avg. loss: 0.02935
Step #1241, avg. loss: 0.10790
Step #1251, avg. loss: 0.14540
Step #1261, avg. loss: 0.07953
Step #1271, avg. loss: 0.08069
Step #1281, avg. loss: 0.06228
Step #1291, avg. loss: 0.04207
Accuracy: 0.759538
Step #1301, avg. loss: 0.21682
Step #1311, avg. loss: 0.08282
Step #1321, avg. loss: 0.10700
Step #1331, avg. loss: 0.03180
Step #1341, avg. loss: 0.07877
Step #1351, avg. loss: 0.10498
Step #1361, avg. loss: 0.07125
Step #1371, avg. loss: 0.04438
Step #1381, avg. loss: 0.02999
Step #1391, avg. loss: 0.02933
Accuracy: 0.765318
Step #1401, avg. loss: 0.20227
Step #1411, avg. loss: 0.04319
Step #1421, avg. loss: 0.10715
Step #1431, avg. loss: 0.02221
Step #1441, avg. loss: 0.07563
Step #1451, avg. loss: 0.09802
Step #1461, avg. loss: 0.06242
Step #1471, avg. loss: 0.05954
Step #1481, avg. loss: 0.03300
Step #1491, avg. loss: 0.02295
Accuracy: 0.769942
Step #1501, avg. loss: 0.16413
Step #1511, avg. loss: 0.03177
Step #1521, avg. loss: 0.08398
Step #1531, avg. loss: 0.02373
Step #1541, avg. loss: 0.07112
Step #1551, avg. loss: 0.10221
Step #1561, avg. loss: 0.05227
Step #1571, avg. loss: 0.07840
Step #1581, avg. loss: 0.03149
Step #1591, avg. loss: 0.05919
Accuracy: 0.771676
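The avg. loss spike in this 1|1|0 run (step #451 jumps to ~59 before recovering) looks like a transient gradient blow-up; global-norm gradient clipping is the usual guard against it. A minimal numpy sketch of the technique, with `grads` standing in for the CNN's gradients (the run above is not known to have used clipping):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    # Scale all gradients jointly so their combined L2 norm is at most max_norm.
    total = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = min(1.0, max_norm / (total + 1e-12))
    return [g * scale for g in grads], total

grads = [np.array([3.0, 4.0]), np.array([12.0])]  # global norm = 13
clipped, norm = clip_by_global_norm(grads, 5.0)
```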

1|2|0

Total words: 9132
5257
Step #1, avg. loss: 2.38115
Step #11, avg. loss: 2.25324
Step #21, avg. loss: 1.85541
Step #31, avg. loss: 1.53801
Step #41, avg. loss: 1.31748
Step #51, avg. loss: 1.40558
Step #61, avg. loss: 1.46173
Step #71, avg. loss: 1.47350
Step #81, avg. loss: 1.41357
Step #91, avg. loss: 1.40992
Accuracy: 0.554861
Step #101, avg. loss: 1.52205
Step #111, avg. loss: 0.93047
Step #121, avg. loss: 0.79347
Step #131, avg. loss: 0.67986
Step #141, avg. loss: 0.81088
Step #151, avg. loss: 0.70536
Step #161, avg. loss: 0.63104
Step #171, avg. loss: 0.60389
Step #181, avg. loss: 0.67646
Step #191, avg. loss: 0.65864
Accuracy: 0.717453
Step #201, avg. loss: 0.97313
Step #211, avg. loss: 0.51309
Step #221, avg. loss: 0.30900
Step #231, avg. loss: 0.25942
Step #241, avg. loss: 0.31040
Step #251, avg. loss: 0.25291
Step #261, avg. loss: 0.35289
Step #271, avg. loss: 0.36517
Step #281, avg. loss: 0.26135
Step #291, avg. loss: 0.29646
Accuracy: 0.737351
Step #301, avg. loss: 0.12710
Step #311, avg. loss: 0.38262
Step #321, avg. loss: 0.19509
Step #331, avg. loss: 0.24143
Step #341, avg. loss: 0.22133
Step #351, avg. loss: 0.17240
Step #361, avg. loss: 0.36916
Step #371, avg. loss: 0.24652
Step #381, avg. loss: 0.24759
Step #391, avg. loss: 0.21595
Accuracy: 0.739056
Step #401, avg. loss: 0.06048
Step #411, avg. loss: 0.17454
Step #421, avg. loss: 0.12367
Step #431, avg. loss: 0.20691
Step #441, avg. loss: 0.16205
Step #451, avg. loss: 0.12706
Step #461, avg. loss: 0.16436
Step #471, avg. loss: 0.12413
Step #481, avg. loss: 0.14805
Step #491, avg. loss: 0.14595
Accuracy: 0.757248
Step #501, avg. loss: 0.12736
Step #511, avg. loss: 0.12450
Step #521, avg. loss: 0.10273
Step #531, avg. loss: 0.18390
Step #541, avg. loss: 0.10305
Step #551, avg. loss: 0.08719
Step #561, avg. loss: 0.13535
Step #571, avg. loss: 0.06309
Step #581, avg. loss: 0.12031
Step #591, avg. loss: 0.10321
Accuracy: 0.754974
Step #601, avg. loss: 0.01492
Step #611, avg. loss: 0.11706
Step #621, avg. loss: 0.10047
Step #631, avg. loss: 0.14129
Step #641, avg. loss: 0.09785
Step #651, avg. loss: 0.03732
Step #661, avg. loss: 0.11060
Step #671, avg. loss: 0.05377
Step #681, avg. loss: 0.09368
Step #691, avg. loss: 0.08860
Accuracy: 0.761522

2|1|0

Total words: 9204
5257
Step #1, avg. loss: 2.48724
Step #11, avg. loss: 2.12572
Step #21, avg. loss: 1.95281
Step #31, avg. loss: 1.85331
Step #41, avg. loss: 1.58681
Step #51, avg. loss: 1.37555
Step #61, avg. loss: 1.29150
Step #71, avg. loss: 1.91069
Step #81, avg. loss: 1.44710
Step #91, avg. loss: 1.16064
Accuracy: 0.643750
Step #101, avg. loss: 1.09745
Step #111, avg. loss: 0.91309
Step #121, avg. loss: 1.00672
Step #131, avg. loss: 0.95821
Step #141, avg. loss: 0.78080
Step #151, avg. loss: 0.52617
Step #161, avg. loss: 0.69928
Step #171, avg. loss: 1.49659
Step #181, avg. loss: 1.64293
Step #191, avg. loss: 1.06832
Accuracy: 0.696591
Step #201, avg. loss: 1.05737
Step #211, avg. loss: 0.54399
Step #221, avg. loss: 0.67309
Step #231, avg. loss: 0.42267
Step #241, avg. loss: 0.38417
Step #251, avg. loss: 0.25370
Step #261, avg. loss: 0.38196
Step #271, avg. loss: 0.68776
Step #281, avg. loss: 0.64545
Step #291, avg. loss: 0.33590
Accuracy: 0.703977
Step #301, avg. loss: 0.26401
Step #311, avg. loss: 0.27628
Step #321, avg. loss: 0.27603
Step #331, avg. loss: 0.41377
Step #341, avg. loss: 0.25591
Step #351, avg. loss: 0.12166
Step #361, avg. loss: 0.13108
Step #371, avg. loss: 0.19064
Step #381, avg. loss: 0.42403
Step #391, avg. loss: 0.27741
Accuracy: 0.686932
Step #401, avg. loss: 0.45524
Step #411, avg. loss: 0.18493
Step #421, avg. loss: 0.24717
Step #431, avg. loss: 0.53812
Step #441, avg. loss: 0.49204
Step #451, avg. loss: 0.33116
Step #461, avg. loss: 0.43162
Step #471, avg. loss: 0.33759
Step #481, avg. loss: 0.61707
Step #491, avg. loss: 0.29440
Accuracy: 0.706250
Step #501, avg. loss: 0.46749
Step #511, avg. loss: 0.19618
Step #521, avg. loss: 0.30612
Step #531, avg. loss: 0.23623
Step #541, avg. loss: 0.16593
Step #551, avg. loss: 0.20524
Step #561, avg. loss: 0.16833
Step #571, avg. loss: 0.26346
Step #581, avg. loss: 0.29471
Step #591, avg. loss: 0.19858
Accuracy: 0.744886
Step #601, avg. loss: 0.33661
Step #611, avg. loss: 0.15075
Step #621, avg. loss: 0.21640
Step #631, avg. loss: 0.12732
Step #641, avg. loss: 0.19101
Step #651, avg. loss: 0.13084
Step #661, avg. loss: 0.08922
Step #671, avg. loss: 0.11181
Step #681, avg. loss: 0.11101
Step #691, avg. loss: 0.05266
Accuracy: 0.773295
Step #701, avg. loss: 0.23592
Step #711, avg. loss: 0.03922
Step #721, avg. loss: 0.14522
Step #731, avg. loss: 0.02798
Step #741, avg. loss: 0.05015
Step #751, avg. loss: 0.07405
Step #761, avg. loss: 0.08622
Step #771, avg. loss: 0.04989
Step #781, avg. loss: 0.06083
Step #791, avg. loss: 0.06015
Accuracy: 0.777273
Step #801, avg. loss: 0.05189
Step #811, avg. loss: 0.07796
Step #821, avg. loss: 0.08842
Step #831, avg. loss: 0.03814
Step #841, avg. loss: 0.02498
Step #851, avg. loss: 0.04924
Step #861, avg. loss: 0.03777
Step #871, avg. loss: 0.06801
Step #881, avg. loss: 0.04531
Step #891, avg. loss: 0.07067
Accuracy: 0.776136
Step #901, avg. loss: 0.04320
Step #911, avg. loss: 0.03710
Step #921, avg. loss: 0.04780
Step #931, avg. loss: 0.03119
Step #941, avg. loss: 0.02124
Step #951, avg. loss: 0.02722
Step #961, avg. loss: 0.03743
Step #971, avg. loss: 0.04466
Step #981, avg. loss: 0.03945
Step #991, avg. loss: 0.03632
Accuracy: 0.781818
learning_rate = 0.009
EMBEDDING_SIZE = 100
WINDOW_SIZE = 12

rnn+word_vec √
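The hyperparameters above can be illustrated with a minimal NumPy sketch of an Elman-style RNN text classifier. This is only a sketch, not the original TF.Learn model: the vocabulary size (taken from the run's "Total words: 9204"), the vanilla-RNN cell, and the use of WINDOW_SIZE as the truncated sequence length are all assumptions for illustration.

```python
import numpy as np

# Hyperparameters from the log above; everything else is assumed.
LEARNING_RATE = 0.009
EMBEDDING_SIZE = 100
WINDOW_SIZE = 12      # treated here as the truncated sequence length
VOCAB_SIZE = 9204     # "Total words" reported for this run
N_CLASSES = 21        # 21 top-level categories, per the report in the header

rng = np.random.default_rng(0)
embeddings = rng.normal(0.0, 0.1, (VOCAB_SIZE, EMBEDDING_SIZE))

# Parameters of a vanilla (Elman) RNN plus a linear output layer.
W_xh = rng.normal(0.0, 0.1, (EMBEDDING_SIZE, EMBEDDING_SIZE))
W_hh = rng.normal(0.0, 0.1, (EMBEDDING_SIZE, EMBEDDING_SIZE))
W_hy = rng.normal(0.0, 0.1, (EMBEDDING_SIZE, N_CLASSES))

def forward(token_ids):
    """Embed a window of token ids, run the RNN, return class logits."""
    h = np.zeros(EMBEDDING_SIZE)
    for t in token_ids[:WINDOW_SIZE]:
        h = np.tanh(embeddings[t] @ W_xh + h @ W_hh)
    return h @ W_hy

logits = forward([5, 17, 203, 42])
print(logits.shape)  # (21,)
```

Training this cell with the listed learning rate (e.g. via SGD on a cross-entropy loss) would produce "Step #N, avg. loss" traces of the kind logged above.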

0|0|1 (title|keywords|abstract)

Total words: 30826
Step #1, avg. loss: 3.31535
Step #101, avg. loss: 1.89723
Step #201, epoch #1, avg. loss: 0.68571
Step #301, epoch #1, avg. loss: 0.35212
Step #401, epoch #2, avg. loss: 0.15009
Step #501, epoch #3, avg. loss: 0.14185
Step #601, epoch #3, avg. loss: 0.05242
Step #701, epoch #4, avg. loss: 0.05473
Step #801, epoch #4, avg. loss: 0.04090
Step #901, epoch #5, avg. loss: 0.04055
Accuracy: 0.836146
Step #1001, avg. loss: 0.00270
Step #1101, avg. loss: 0.02793
Step #1201, epoch #1, avg. loss: 0.02933
Step #1301, epoch #1, avg. loss: 0.03249
Step #1401, epoch #2, avg. loss: 0.01914
Step #1501, epoch #3, avg. loss: 0.03303
Step #1601, epoch #3, avg. loss: 0.01968
Step #1701, epoch #4, avg. loss: 0.02493
Step #1801, epoch #4, avg. loss: 0.02230
Step #1901, epoch #5, avg. loss: 0.02993
Accuracy: 0.831791
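Each run begins by reporting "Total words", i.e. the vocabulary size built from the selected fields. A minimal sketch of how such a count is produced, assuming a toy corpus of already-tokenized Chinese titles (the real corpus and tokenizer, likely jieba-based, are not shown in the log):

```python
from collections import Counter

# Hypothetical tokenized documents standing in for the real corpus.
docs = [
    ["图书馆", "信息", "检索"],
    ["信息", "系统", "设计"],
    ["图书馆", "管理"],
]

# Count token frequencies across the corpus.
counts = Counter(tok for doc in docs for tok in doc)

# Vocabulary size, as in the "Total words: ..." lines above.
total_words = len(counts)
print(total_words)  # 6
```

In practice a frequency cutoff is often applied before fixing the vocabulary, which would explain why "Total words" differs between field combinations (30826 vs. 7628 vs. 9204).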

0|1|0 (title|keywords|abstract)

Total words: 7628
Step #1, avg. loss: 2.28448
Step #101, avg. loss: 1.97874
Step #201, epoch #1, avg. loss: 1.88785
Step #301, epoch #1, avg. loss: 1.10239
Step #401, epoch #2, avg. loss: 0.59162
Step #501, epoch #3, avg. loss: 0.42994
Step #601, epoch #3, avg. loss: 0.25851
Step #701, epoch #4, avg. loss: 0.19903
Step #801, epoch #4, avg. loss: 0.16137
Step #901, epoch #5, avg. loss: 0.11144
Accuracy: 0.802608

1|0|0 (title|keywords|abstract)
1|1|1
1|1|0
1|2|0
2|1|0

Total words: 9204
Step #1, avg. loss: 3.31588
Step #101, avg. loss: 1.77354
Step #201, epoch #1, avg. loss: 1.13782
Step #301, epoch #1, avg. loss: 0.53906
Step #401, epoch #2, avg. loss: 0.28272
Step #501, epoch #3, avg. loss: 0.24662
Step #601, epoch #3, avg. loss: 0.13302
Step #701, epoch #4, avg. loss: 0.13856
Step #801, epoch #4, avg. loss: 0.10473
Step #901, epoch #5, avg. loss: 0.07825
Accuracy: 0.790909
Step #1001, avg. loss: 0.05549
Step #1101, avg. loss: 0.05588
Step #1201, epoch #1, avg. loss: 0.05866
Step #1301, epoch #1, avg. loss: 0.05177
Step #1401, epoch #2, avg. loss: 0.03915
Step #1501, epoch #3, avg. loss: 0.05236
Step #1601, epoch #3, avg. loss: 0.03517
Step #1701, epoch #4, avg. loss: 0.04324
Step #1801, epoch #4, avg. loss: 0.03953
Step #1901, epoch #5, avg. loss: 0.04080
Accuracy: 0.810795
Step #2001, avg. loss: 0.03055
Step #2101, avg. loss: 0.04244
Step #2201, epoch #1, avg. loss: 0.03700
Step #2301, epoch #1, avg. loss: 0.03388
Step #2401, epoch #2, avg. loss: 0.03374
Step #2501, epoch #3, avg. loss: 0.05339
Step #2601, epoch #3, avg. loss: 0.03583
Step #2701, epoch #4, avg. loss: 0.04207
Step #2801, epoch #4, avg. loss: 0.04294
Step #2901, epoch #5, avg. loss: 0.03118
Accuracy: 0.804545
Step #3001, avg. loss: 0.02793
Step #3101, avg. loss: 0.02888
Step #3201, epoch #1, avg. loss: 0.03976
Step #3301, epoch #1, avg. loss: 0.03506
Step #3401, epoch #2, avg. loss: 0.03233
Step #3501, epoch #3, avg. loss: 0.03241
Step #3601, epoch #3, avg. loss: 0.02877
Step #3701, epoch #4, avg. loss: 0.04363
Step #3801, epoch #4, avg. loss: 0.03516
Step #3901, epoch #5, avg. loss: 0.02673
Accuracy: 0.811364
Step #4001, avg. loss: 0.03452
Step #4101, avg. loss: 0.02366
Step #4201, epoch #1, avg. loss: 0.04042
Step #4301, epoch #1, avg. loss: 0.04312
Step #4401, epoch #2, avg. loss: 0.02917
Step #4501, epoch #3, avg. loss: 0.03896
Step #4601, epoch #3, avg. loss: 0.03036
Step #4701, epoch #4, avg. loss: 0.03360
Step #4801, epoch #4, avg. loss: 0.03076
Step #4901, epoch #5, avg. loss: 0.02635
Accuracy: 0.808523
Step #5001, avg. loss: 0.04477
Step #5101, avg. loss: 0.02281
Step #5201, epoch #1, avg. loss: 0.02762
Step #5301, epoch #1, avg. loss: 0.02349
Step #5401, epoch #2, avg. loss: 0.02308
Step #5501, epoch #3, avg. loss: 0.02474
Step #5601, epoch #3, avg. loss: 0.02351
Step #5701, epoch #4, avg. loss: 0.02501
Step #5801, epoch #4, avg. loss: 0.02417
Step #5901, epoch #5, avg. loss: 0.01996
Accuracy: 0.816477

rnn_21

Step #1, avg. loss: 3.54451
Step #101, avg. loss: 3.05181
Step #201, avg. loss: 2.69855
Step #301, avg. loss: 2.31483
Step #401, avg. loss: 1.99817
Step #501, avg. loss: 1.71132
Step #601, avg. loss: 1.50262
Step #701, avg. loss: 1.35528
Step #801, avg. loss: 1.19330
Step #901, avg. loss: 1.07963
Accuracy: 0.718955
Step #1001, avg. loss: 0.88580
Step #1101, avg. loss: 0.85950
Step #1201, avg. loss: 0.84391
Step #1301, avg. loss: 0.75376
Step #1401, avg. loss: 0.69276
Step #1501, avg. loss: 0.61233
Step #1601, avg. loss: 0.58295
Step #1701, avg. loss: 0.59254
Step #1801, avg. loss: 0.52718
Step #1901, avg. loss: 0.45503
Accuracy: 0.762928
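The raw traces above can be turned into a step/loss/accuracy table for plotting. A small parser, assuming only the "Step #N, avg. loss: X" and "Accuracy: Y" line formats seen in this log (the two-line sample below is copied from the rnn_21 run):

```python
import re

# Sample log excerpt in the format produced by the training runs above.
log = """\
Step #1901, avg. loss: 0.45503
Accuracy: 0.762928
"""

step_re = re.compile(r"Step #(\d+).*avg\. loss: ([\d.]+)")
acc_re = re.compile(r"Accuracy: ([\d.]+)")

steps, accs = [], []
for line in log.splitlines():
    m = step_re.match(line)
    if m:
        steps.append((int(m.group(1)), float(m.group(2))))
    m = acc_re.match(line)
    if m:
        accs.append(float(m.group(1)))

print(steps)  # [(1901, 0.45503)]
print(accs)   # [0.762928]
```

Feeding the full log through this parser yields the loss curve and the per-evaluation accuracy series for each run.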