Notes on 《Web安全之机器学习入门》 (Machine Learning for Web Security), 16.2: Recognizing CAPTCHAs with a Recurrent Neural Network

This section uses an RNN built with the tflearn library to recognize CAPTCHA-style characters, and compares its accuracy against a DNN baseline.

1. Dataset

The MNIST handwritten-digit set stands in for CAPTCHA characters. With `one_hot=True`, labels are returned as one-hot vectors rather than integer class indices:

import tflearn.datasets.mnist as mnist

X, Y, testX, testY = mnist.load_data(one_hot=True)
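The `one_hot=True` flag means each label arrives as a 10-element vector with a single 1 at the true class index, which is the form `categorical_crossentropy` expects. A minimal NumPy sketch of that encoding (an illustration, not tflearn's internal implementation):

```python
import numpy as np

def one_hot(labels, num_classes=10):
    """Convert integer class labels to one-hot row vectors."""
    encoded = np.zeros((len(labels), num_classes))
    encoded[np.arange(len(labels)), labels] = 1.0
    return encoded

# A digit label of 3 becomes a vector with a 1 in position 3.
print(one_hot([3, 0], num_classes=10))
```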

2. Training a DNN on the dataset

import tflearn

def do_DNN(X, Y, testX, testY):
    # Build a deep neural network: 784 inputs, two 64-unit tanh layers with
    # dropout and L2 weight decay, and a 10-way softmax output
    input_layer = tflearn.input_data(shape=[None, 784])
    dense1 = tflearn.fully_connected(input_layer, 64, activation='tanh',
                                     regularizer='L2', weight_decay=0.001)
    dropout1 = tflearn.dropout(dense1, 0.8)
    dense2 = tflearn.fully_connected(dropout1, 64, activation='tanh',
                                     regularizer='L2', weight_decay=0.001)
    dropout2 = tflearn.dropout(dense2, 0.8)
    softmax = tflearn.fully_connected(dropout2, 10, activation='softmax')

    # Regression using SGD with learning-rate decay, tracking top-3 accuracy
    sgd = tflearn.SGD(learning_rate=0.1, lr_decay=0.96, decay_step=1000)
    top_k = tflearn.metrics.Top_k(3)
    net = tflearn.regression(softmax, optimizer=sgd, metric=top_k,
                             loss='categorical_crossentropy')

    # Training
    model = tflearn.DNN(net, tensorboard_verbose=0)
    model.fit(X, Y, n_epoch=20, validation_set=(testX, testY),
              show_metric=True, run_id="dense_model")

do_DNN(X, Y, testX, testY)
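The SGD optimizer above decays its learning rate over time: every `decay_step` training steps the rate is multiplied by `lr_decay`. A small standalone sketch of that schedule (an illustration of exponential decay under the section's settings, not tflearn's internal code, which delegates to TensorFlow):

```python
def decayed_lr(base_lr, lr_decay, decay_step, step, staircase=True):
    """Exponentially decayed learning rate: base_lr * lr_decay ** (step / decay_step)."""
    exponent = step // decay_step if staircase else step / decay_step
    return base_lr * (lr_decay ** exponent)

# With the section's settings (lr=0.1, decay=0.96, decay_step=1000),
# the rate drops by 4% every 1000 steps.
for step in (0, 1000, 5000):
    print(step, decayed_lr(0.1, 0.96, 1000, step))
```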

3. DNN results

After 20 epochs, the DNN reaches a validation top-3 accuracy of 0.9957:

Training Step: 17200  | total loss: 0.38923 | time: 3.821s
| SGD | epoch: 020 | loss: 0.38923 - top3: 0.9759 | val_loss: 0.11003 - val_top3: 0.9957 -- iter: 55000/55000
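The `top3` metric in the log counts a prediction as correct whenever the true class is among the model's three highest-scoring classes. A minimal NumPy sketch of that computation (illustrative, not tflearn's implementation):

```python
import numpy as np

def top_k_accuracy(probs, labels, k=3):
    """Fraction of samples whose true label is among the k highest scores."""
    top_k = np.argsort(probs, axis=1)[:, -k:]  # indices of the k best classes
    hits = [label in row for row, label in zip(top_k, labels)]
    return float(np.mean(hits))

# Two 5-class predictions: the first's true label (3) is the model's 3rd-best
# guess (a top-3 hit); the second's true label (4) is its worst guess (a miss).
probs = np.array([[0.05, 0.40, 0.30, 0.15, 0.10],
                  [0.50, 0.20, 0.15, 0.10, 0.05]])
labels = [3, 4]
print(top_k_accuracy(probs, labels, k=3))  # 0.5
```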

4. Training an RNN on the dataset

import numpy as np

def do_rnn(X, Y, testX, testY):
    # Reshape flat 784-pixel images into 28-step sequences of 28 features
    X = np.reshape(X, (-1, 28, 28))
    testX = np.reshape(testX, (-1, 28, 28))

    # Two stacked LSTM layers: the first returns its full output sequence so
    # the second can consume it; the second returns only its final output
    net = tflearn.input_data(shape=[None, 28, 28])
    net = tflearn.lstm(net, 128, return_seq=True)
    net = tflearn.lstm(net, 128)
    net = tflearn.fully_connected(net, 10, activation='softmax')
    net = tflearn.regression(net, optimizer='adam',
                             loss='categorical_crossentropy', name="output1")
    model = tflearn.DNN(net, tensorboard_verbose=2)
    model.fit(X, Y, n_epoch=1, validation_set=(testX, testY),
              show_metric=True, snapshot_step=100)

do_rnn(X, Y, testX, testY)
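The key step for the RNN is the reshape from flat 784-pixel vectors to (28, 28) sequences: each image row becomes one timestep with 28 features, so the LSTM reads the image one row at a time, top to bottom. A standalone sketch of that view, using random data in place of MNIST:

```python
import numpy as np

# Five fake "images" in MNIST's flat format: 784 pixels per sample.
X_flat = np.random.rand(5, 784)

# Reshape to (samples, timesteps, features): 28 rows of 28 pixels each.
X_seq = np.reshape(X_flat, (-1, 28, 28))
print(X_seq.shape)  # (5, 28, 28)

# Timestep t of sample i is exactly row t of that image's pixels.
assert np.array_equal(X_seq[0, 3], X_flat[0, 3 * 28:4 * 28])
```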

5. RNN results

Training loss falls steadily within the first epoch; with snapshot_step=100, validation accuracy is reported every 100 steps, reaching 0.6861 at step 100 and 0.8198 at step 200. The captured log (abridged below; it ends mid-epoch at step 248):

---------------------------------
Run id: NGI5YO
Log directory: /tmp/tflearn_logs/
---------------------------------
Training samples: 55000
Validation samples: 10000
--
Training Step: 1  | time: 2.087s
| Adam | epoch: 001 | loss: 0.00000 - acc: 0.0000 -- iter: 00064/55000
Training Step: 2  | total loss: 2.07202 | time: 2.170s
| Adam | epoch: 001 | loss: 2.07202 - acc: 0.1828 -- iter: 00128/55000
Training Step: 3  | total loss: 2.25508 | time: 2.265s
| Adam | epoch: 001 | loss: 2.25508 - acc: 0.2250 -- iter: 00192/55000
...
Training Step: 100  | total loss: 1.06813 | time: 12.306s
| Adam | epoch: 001 | loss: 1.06813 - acc: 0.6518 | val_loss: 0.92908 - val_acc: 0.6861 -- iter: 06400/55000
--
...
Training Step: 200  | total loss: 0.60992 | time: 24.289s
| Adam | epoch: 001 | loss: 0.60992 - acc: 0.7874 | val_loss: 0.58093 - val_acc: 0.8198 -- iter: 12800/55000
--
...
Training Step: 248  | total loss: 0.51136 | time: 28.673s
| Adam | epoch: 001 | loss: 0.51136 - acc: 0.8476 -- iter: 15872/55000
Training Step: 249  | total loss: 0.50336 | time: 28.772s
| Adam | epoch: 001 | loss: 0.50336 - acc: 0.8441 -- iter: 15936/55000
Training Step: 250  | total loss: 0.50236 | time: 28.868s
| Adam | epoch: 001 | loss: 0.50236 - acc: 0.8425 -- iter: 16000/55000
Training Step: 251  | total loss: 0.50802 | time: 28.962s
| Adam | epoch: 001 | loss: 0.50802 - acc: 0.8458 -- iter: 16064/55000
Training Step: 252  | total loss: 0.50830 | time: 29.057s
| Adam | epoch: 001 | loss: 0.50830 - acc: 0.8487 -- iter: 16128/55000
Training Step: 253  | total loss: 0.50290 | time: 29.157s
| Adam | epoch: 001 | loss: 0.50290 - acc: 0.8513 -- iter: 16192/55000
Training Step: 254  | total loss: 0.48157 | time: 29.253s
| Adam | epoch: 001 | loss: 0.48157 - acc: 0.8599 -- iter: 16256/55000
Training Step: 255  | total loss: 0.50546 | time: 29.351s
| Adam | epoch: 001 | loss: 0.50546 - acc: 0.8521 -- iter: 16320/55000
Training Step: 256  | total loss: 0.51568 | time: 29.448s
| Adam | epoch: 001 | loss: 0.51568 - acc: 0.8512 -- iter: 16384/55000
Training Step: 257  | total loss: 0.49285 | time: 29.539s
| Adam | epoch: 001 | loss: 0.49285 - acc: 0.8583 -- iter: 16448/55000
Training Step: 258  | total loss: 0.49434 | time: 29.634s
| Adam | epoch: 001 | loss: 0.49434 - acc: 0.8615 -- iter: 16512/55000
Training Step: 259  | total loss: 0.48816 | time: 29.727s
| Adam | epoch: 001 | loss: 0.48816 - acc: 0.8644 -- iter: 16576/55000
Training Step: 260  | total loss: 0.50207 | time: 29.827s
| Adam | epoch: 001 | loss: 0.50207 - acc: 0.8624 -- iter: 16640/55000
Training Step: 261  | total loss: 0.50872 | time: 29.921s
| Adam | epoch: 001 | loss: 0.50872 - acc: 0.8574 -- iter: 16704/55000
Training Step: 262  | total loss: 0.51540 | time: 30.030s
| Adam | epoch: 001 | loss: 0.51540 - acc: 0.8560 -- iter: 16768/55000
Training Step: 263  | total loss: 0.50671 | time: 30.119s
| Adam | epoch: 001 | loss: 0.50671 - acc: 0.8610 -- iter: 16832/55000
Training Step: 264  | total loss: 0.49352 | time: 30.216s
| Adam | epoch: 001 | loss: 0.49352 - acc: 0.8656 -- iter: 16896/55000
Training Step: 265  | total loss: 0.48045 | time: 30.309s
| Adam | epoch: 001 | loss: 0.48045 - acc: 0.8665 -- iter: 16960/55000
Training Step: 266  | total loss: 0.46459 | time: 30.407s
| Adam | epoch: 001 | loss: 0.46459 - acc: 0.8674 -- iter: 17024/55000
Training Step: 267  | total loss: 0.45058 | time: 30.493s
| Adam | epoch: 001 | loss: 0.45058 - acc: 0.8681 -- iter: 17088/55000
Training Step: 268  | total loss: 0.44620 | time: 30.588s
| Adam | epoch: 001 | loss: 0.44620 - acc: 0.8688 -- iter: 17152/55000
Training Step: 269  | total loss: 0.43431 | time: 30.688s
| Adam | epoch: 001 | loss: 0.43431 - acc: 0.8710 -- iter: 17216/55000
Training Step: 270  | total loss: 0.44274 | time: 30.775s
| Adam | epoch: 001 | loss: 0.44274 - acc: 0.8636 -- iter: 17280/55000
Training Step: 271  | total loss: 0.43197 | time: 30.872s
| Adam | epoch: 001 | loss: 0.43197 - acc: 0.8678 -- iter: 17344/55000
Training Step: 272  | total loss: 0.42902 | time: 30.963s
| Adam | epoch: 001 | loss: 0.42902 - acc: 0.8701 -- iter: 17408/55000
Training Step: 273  | total loss: 0.41816 | time: 31.053s
| Adam | epoch: 001 | loss: 0.41816 - acc: 0.8706 -- iter: 17472/55000
Training Step: 274  | total loss: 0.41583 | time: 31.145s
| Adam | epoch: 001 | loss: 0.41583 - acc: 0.8742 -- iter: 17536/55000
Training Step: 275  | total loss: 0.42648 | time: 31.239s
| Adam | epoch: 001 | loss: 0.42648 - acc: 0.8696 -- iter: 17600/55000
Training Step: 276  | total loss: 0.43864 | time: 31.328s
| Adam | epoch: 001 | loss: 0.43864 - acc: 0.8670 -- iter: 17664/55000
Training Step: 277  | total loss: 0.42572 | time: 31.413s
| Adam | epoch: 001 | loss: 0.42572 - acc: 0.8709 -- iter: 17728/55000
Training Step: 278  | total loss: 0.42181 | time: 31.501s
| Adam | epoch: 001 | loss: 0.42181 - acc: 0.8666 -- iter: 17792/55000
Training Step: 279  | total loss: 0.41171 | time: 31.588s
| Adam | epoch: 001 | loss: 0.41171 - acc: 0.8706 -- iter: 17856/55000
Training Step: 280  | total loss: 0.40621 | time: 31.671s
| Adam | epoch: 001 | loss: 0.40621 - acc: 0.8742 -- iter: 17920/55000
Training Step: 281  | total loss: 0.40933 | time: 31.758s
| Adam | epoch: 001 | loss: 0.40933 - acc: 0.8727 -- iter: 17984/55000
Training Step: 282  | total loss: 0.40694 | time: 31.855s
| Adam | epoch: 001 | loss: 0.40694 - acc: 0.8745 -- iter: 18048/55000
Training Step: 283  | total loss: 0.39700 | time: 31.948s
| Adam | epoch: 001 | loss: 0.39700 - acc: 0.8745 -- iter: 18112/55000
Training Step: 284  | total loss: 0.39910 | time: 32.071s
| Adam | epoch: 001 | loss: 0.39910 - acc: 0.8746 -- iter: 18176/55000
Training Step: 285  | total loss: 0.38803 | time: 32.243s
| Adam | epoch: 001 | loss: 0.38803 - acc: 0.8762 -- iter: 18240/55000
Training Step: 286  | total loss: 0.38485 | time: 32.361s
| Adam | epoch: 001 | loss: 0.38485 - acc: 0.8792 -- iter: 18304/55000
Training Step: 287  | total loss: 0.36997 | time: 32.524s
| Adam | epoch: 001 | loss: 0.36997 - acc: 0.8835 -- iter: 18368/55000
Training Step: 288  | total loss: 0.35344 | time: 32.627s
| Adam | epoch: 001 | loss: 0.35344 - acc: 0.8920 -- iter: 18432/55000
Training Step: 289  | total loss: 0.36071 | time: 32.724s
| Adam | epoch: 001 | loss: 0.36071 - acc: 0.8887 -- iter: 18496/55000
Training Step: 290  | total loss: 0.37615 | time: 32.835s
| Adam | epoch: 001 | loss: 0.37615 - acc: 0.8842 -- iter: 18560/55000
Training Step: 291  | total loss: 0.36957 | time: 32.983s
| Adam | epoch: 001 | loss: 0.36957 - acc: 0.8896 -- iter: 18624/55000
Training Step: 292  | total loss: 0.38195 | time: 33.120s
| Adam | epoch: 001 | loss: 0.38195 - acc: 0.8850 -- iter: 18688/55000
Training Step: 293  | total loss: 0.37388 | time: 33.246s
| Adam | epoch: 001 | loss: 0.37388 - acc: 0.8871 -- iter: 18752/55000
Training Step: 294  | total loss: 0.39191 | time: 33.457s
| Adam | epoch: 001 | loss: 0.39191 - acc: 0.8859 -- iter: 18816/55000
Training Step: 295  | total loss: 0.39700 | time: 33.576s
| Adam | epoch: 001 | loss: 0.39700 - acc: 0.8817 -- iter: 18880/55000
Training Step: 296  | total loss: 0.40140 | time: 33.657s
| Adam | epoch: 001 | loss: 0.40140 - acc: 0.8841 -- iter: 18944/55000
Training Step: 297  | total loss: 0.39640 | time: 33.736s
| Adam | epoch: 001 | loss: 0.39640 - acc: 0.8848 -- iter: 19008/55000
Training Step: 298  | total loss: 0.39757 | time: 33.830s
| Adam | epoch: 001 | loss: 0.39757 - acc: 0.8807 -- iter: 19072/55000
Training Step: 299  | total loss: 0.37859 | time: 33.925s
| Adam | epoch: 001 | loss: 0.37859 - acc: 0.8848 -- iter: 19136/55000
Training Step: 300  | total loss: 0.37637 | time: 36.404s
| Adam | epoch: 001 | loss: 0.37637 - acc: 0.8885 | val_loss: 0.34490 - val_acc: 0.9018 -- iter: 19200/55000
--
Training Step: 301  | total loss: 0.37632 | time: 36.504s
| Adam | epoch: 001 | loss: 0.37632 - acc: 0.8903 -- iter: 19264/55000
Training Step: 302  | total loss: 0.38254 | time: 36.591s
| Adam | epoch: 001 | loss: 0.38254 - acc: 0.8872 -- iter: 19328/55000
Training Step: 303  | total loss: 0.38173 | time: 36.685s
| Adam | epoch: 001 | loss: 0.38173 - acc: 0.8907 -- iter: 19392/55000
Training Step: 304  | total loss: 0.38329 | time: 36.779s
| Adam | epoch: 001 | loss: 0.38329 - acc: 0.8907 -- iter: 19456/55000
Training Step: 305  | total loss: 0.40761 | time: 36.867s
| Adam | epoch: 001 | loss: 0.40761 - acc: 0.8813 -- iter: 19520/55000
Training Step: 306  | total loss: 0.41300 | time: 36.953s
| Adam | epoch: 001 | loss: 0.41300 - acc: 0.8807 -- iter: 19584/55000
Training Step: 307  | total loss: 0.41070 | time: 37.042s
| Adam | epoch: 001 | loss: 0.41070 - acc: 0.8816 -- iter: 19648/55000
Training Step: 308  | total loss: 0.40795 | time: 37.128s
| Adam | epoch: 001 | loss: 0.40795 - acc: 0.8825 -- iter: 19712/55000
Training Step: 309  | total loss: 0.40971 | time: 37.216s
| Adam | epoch: 001 | loss: 0.40971 - acc: 0.8849 -- iter: 19776/55000
Training Step: 310  | total loss: 0.39931 | time: 37.314s
| Adam | epoch: 001 | loss: 0.39931 - acc: 0.8870 -- iter: 19840/55000
Training Step: 311  | total loss: 0.38668 | time: 37.399s
| Adam | epoch: 001 | loss: 0.38668 - acc: 0.8905 -- iter: 19904/55000
Training Step: 312  | total loss: 0.39050 | time: 37.484s
| Adam | epoch: 001 | loss: 0.39050 - acc: 0.8874 -- iter: 19968/55000
Training Step: 313  | total loss: 0.38798 | time: 37.567s
| Adam | epoch: 001 | loss: 0.38798 - acc: 0.8877 -- iter: 20032/55000
Training Step: 314  | total loss: 0.38902 | time: 37.648s
| Adam | epoch: 001 | loss: 0.38902 - acc: 0.8818 -- iter: 20096/55000
Training Step: 315  | total loss: 0.37789 | time: 37.734s
| Adam | epoch: 001 | loss: 0.37789 - acc: 0.8889 -- iter: 20160/55000
Training Step: 316  | total loss: 0.37405 | time: 37.824s
| Adam | epoch: 001 | loss: 0.37405 - acc: 0.8844 -- iter: 20224/55000
Training Step: 317  | total loss: 0.39121 | time: 37.907s
| Adam | epoch: 001 | loss: 0.39121 - acc: 0.8756 -- iter: 20288/55000
Training Step: 318  | total loss: 0.38995 | time: 38.000s
| Adam | epoch: 001 | loss: 0.38995 - acc: 0.8725 -- iter: 20352/55000
Training Step: 319  | total loss: 0.38934 | time: 38.099s
| Adam | epoch: 001 | loss: 0.38934 - acc: 0.8790 -- iter: 20416/55000
Training Step: 320  | total loss: 0.38129 | time: 38.184s
| Adam | epoch: 001 | loss: 0.38129 - acc: 0.8832 -- iter: 20480/55000
Training Step: 321  | total loss: 0.38735 | time: 38.270s
| Adam | epoch: 001 | loss: 0.38735 - acc: 0.8824 -- iter: 20544/55000
Training Step: 322  | total loss: 0.38933 | time: 38.362s
| Adam | epoch: 001 | loss: 0.38933 - acc: 0.8832 -- iter: 20608/55000
Training Step: 323  | total loss: 0.38899 | time: 38.451s
| Adam | epoch: 001 | loss: 0.38899 - acc: 0.8855 -- iter: 20672/55000
Training Step: 324  | total loss: 0.38883 | time: 38.535s
| Adam | epoch: 001 | loss: 0.38883 - acc: 0.8861 -- iter: 20736/55000
Training Step: 325  | total loss: 0.38510 | time: 38.623s
| Adam | epoch: 001 | loss: 0.38510 - acc: 0.8865 -- iter: 20800/55000
Training Step: 326  | total loss: 0.37045 | time: 38.705s
| Adam | epoch: 001 | loss: 0.37045 - acc: 0.8916 -- iter: 20864/55000
Training Step: 327  | total loss: 0.36675 | time: 38.793s
| Adam | epoch: 001 | loss: 0.36675 - acc: 0.8915 -- iter: 20928/55000
Training Step: 328  | total loss: 0.35701 | time: 38.879s
| Adam | epoch: 001 | loss: 0.35701 - acc: 0.8945 -- iter: 20992/55000
Training Step: 329  | total loss: 0.34784 | time: 38.978s
| Adam | epoch: 001 | loss: 0.34784 - acc: 0.8957 -- iter: 21056/55000
Training Step: 330  | total loss: 0.35356 | time: 39.073s
| Adam | epoch: 001 | loss: 0.35356 - acc: 0.8983 -- iter: 21120/55000
Training Step: 331  | total loss: 0.37451 | time: 39.163s
| Adam | epoch: 001 | loss: 0.37451 - acc: 0.8898 -- iter: 21184/55000
Training Step: 332  | total loss: 0.35935 | time: 39.250s
| Adam | epoch: 001 | loss: 0.35935 - acc: 0.8961 -- iter: 21248/55000
Training Step: 333  | total loss: 0.36283 | time: 39.346s
| Adam | epoch: 001 | loss: 0.36283 - acc: 0.8924 -- iter: 21312/55000
Training Step: 334  | total loss: 0.36516 | time: 39.438s
| Adam | epoch: 001 | loss: 0.36516 - acc: 0.8907 -- iter: 21376/55000
Training Step: 335  | total loss: 0.38816 | time: 39.525s
| Adam | epoch: 001 | loss: 0.38816 - acc: 0.8875 -- iter: 21440/55000
Training Step: 336  | total loss: 0.37840 | time: 39.613s
| Adam | epoch: 001 | loss: 0.37840 - acc: 0.8925 -- iter: 21504/55000
Training Step: 337  | total loss: 0.38874 | time: 39.706s
| Adam | epoch: 001 | loss: 0.38874 - acc: 0.8892 -- iter: 21568/55000
Training Step: 338  | total loss: 0.37756 | time: 39.802s
| Adam | epoch: 001 | loss: 0.37756 - acc: 0.8941 -- iter: 21632/55000
Training Step: 339  | total loss: 0.37569 | time: 39.893s
| Adam | epoch: 001 | loss: 0.37569 - acc: 0.8968 -- iter: 21696/55000
Training Step: 340  | total loss: 0.38372 | time: 39.984s
| Adam | epoch: 001 | loss: 0.38372 - acc: 0.8931 -- iter: 21760/55000
Training Step: 341  | total loss: 0.37554 | time: 40.069s
| Adam | epoch: 001 | loss: 0.37554 - acc: 0.8944 -- iter: 21824/55000
Training Step: 342  | total loss: 0.37197 | time: 40.159s
| Adam | epoch: 001 | loss: 0.37197 - acc: 0.8925 -- iter: 21888/55000
Training Step: 343  | total loss: 0.36774 | time: 40.250s
| Adam | epoch: 001 | loss: 0.36774 - acc: 0.8938 -- iter: 21952/55000
Training Step: 344  | total loss: 0.37025 | time: 40.339s
| Adam | epoch: 001 | loss: 0.37025 - acc: 0.8920 -- iter: 22016/55000
Training Step: 345  | total loss: 0.38004 | time: 40.428s
| Adam | epoch: 001 | loss: 0.38004 - acc: 0.8871 -- iter: 22080/55000
Training Step: 346  | total loss: 0.36049 | time: 40.516s
| Adam | epoch: 001 | loss: 0.36049 - acc: 0.8906 -- iter: 22144/55000
Training Step: 347  | total loss: 0.38193 | time: 40.615s
| Adam | epoch: 001 | loss: 0.38193 - acc: 0.8812 -- iter: 22208/55000
Training Step: 348  | total loss: 0.37212 | time: 40.703s
| Adam | epoch: 001 | loss: 0.37212 - acc: 0.8837 -- iter: 22272/55000
Training Step: 349  | total loss: 0.36912 | time: 40.798s
| Adam | epoch: 001 | loss: 0.36912 - acc: 0.8844 -- iter: 22336/55000
Training Step: 350  | total loss: 0.36219 | time: 40.886s
| Adam | epoch: 001 | loss: 0.36219 - acc: 0.8897 -- iter: 22400/55000
Training Step: 351  | total loss: 0.35955 | time: 40.983s
| Adam | epoch: 001 | loss: 0.35955 - acc: 0.8898 -- iter: 22464/55000
Training Step: 352  | total loss: 0.35965 | time: 41.079s
| Adam | epoch: 001 | loss: 0.35965 - acc: 0.8883 -- iter: 22528/55000
Training Step: 353  | total loss: 0.35203 | time: 41.173s
| Adam | epoch: 001 | loss: 0.35203 - acc: 0.8870 -- iter: 22592/55000
Training Step: 354  | total loss: 0.34208 | time: 41.269s
| Adam | epoch: 001 | loss: 0.34208 - acc: 0.8921 -- iter: 22656/55000
Training Step: 355  | total loss: 0.33428 | time: 41.369s
| Adam | epoch: 001 | loss: 0.33428 - acc: 0.8919 -- iter: 22720/55000
Training Step: 356  | total loss: 0.33393 | time: 41.469s
| Adam | epoch: 001 | loss: 0.33393 - acc: 0.8918 -- iter: 22784/55000
Training Step: 357  | total loss: 0.32117 | time: 41.560s
| Adam | epoch: 001 | loss: 0.32117 - acc: 0.8964 -- iter: 22848/55000
Training Step: 358  | total loss: 0.32490 | time: 41.664s
| Adam | epoch: 001 | loss: 0.32490 - acc: 0.8958 -- iter: 22912/55000
Training Step: 359  | total loss: 0.33063 | time: 41.771s
| Adam | epoch: 001 | loss: 0.33063 - acc: 0.8937 -- iter: 22976/55000
Training Step: 360  | total loss: 0.31911 | time: 41.859s
| Adam | epoch: 001 | loss: 0.31911 - acc: 0.8996 -- iter: 23040/55000
Training Step: 361  | total loss: 0.31413 | time: 41.954s
| Adam | epoch: 001 | loss: 0.31413 - acc: 0.9019 -- iter: 23104/55000
Training Step: 362  | total loss: 0.32081 | time: 42.048s
| Adam | epoch: 001 | loss: 0.32081 - acc: 0.8961 -- iter: 23168/55000
Training Step: 363  | total loss: 0.32152 | time: 42.137s
| Adam | epoch: 001 | loss: 0.32152 - acc: 0.8955 -- iter: 23232/55000
Training Step: 364  | total loss: 0.32897 | time: 42.228s
| Adam | epoch: 001 | loss: 0.32897 - acc: 0.8950 -- iter: 23296/55000
Training Step: 365  | total loss: 0.34269 | time: 42.320s
| Adam | epoch: 001 | loss: 0.34269 - acc: 0.8899 -- iter: 23360/55000
Training Step: 366  | total loss: 0.35117 | time: 42.414s
| Adam | epoch: 001 | loss: 0.35117 - acc: 0.8900 -- iter: 23424/55000
Training Step: 367  | total loss: 0.35084 | time: 42.512s
| Adam | epoch: 001 | loss: 0.35084 - acc: 0.8885 -- iter: 23488/55000
Training Step: 368  | total loss: 0.34076 | time: 42.605s
| Adam | epoch: 001 | loss: 0.34076 - acc: 0.8934 -- iter: 23552/55000
Training Step: 369  | total loss: 0.34877 | time: 42.698s
| Adam | epoch: 001 | loss: 0.34877 - acc: 0.8931 -- iter: 23616/55000
Training Step: 370  | total loss: 0.33392 | time: 42.788s
| Adam | epoch: 001 | loss: 0.33392 - acc: 0.9007 -- iter: 23680/55000
Training Step: 371  | total loss: 0.32597 | time: 42.881s
| Adam | epoch: 001 | loss: 0.32597 - acc: 0.9028 -- iter: 23744/55000
Training Step: 372  | total loss: 0.32353 | time: 42.974s
| Adam | epoch: 001 | loss: 0.32353 - acc: 0.9016 -- iter: 23808/55000
Training Step: 373  | total loss: 0.31970 | time: 43.070s
| Adam | epoch: 001 | loss: 0.31970 - acc: 0.9036 -- iter: 23872/55000
Training Step: 374  | total loss: 0.31284 | time: 43.170s
| Adam | epoch: 001 | loss: 0.31284 - acc: 0.9054 -- iter: 23936/55000
Training Step: 375  | total loss: 0.30259 | time: 43.260s
| Adam | epoch: 001 | loss: 0.30259 - acc: 0.9071 -- iter: 24000/55000
Training Step: 376  | total loss: 0.30535 | time: 43.351s
| Adam | epoch: 001 | loss: 0.30535 - acc: 0.9039 -- iter: 24064/55000
Training Step: 377  | total loss: 0.29525 | time: 43.450s
| Adam | epoch: 001 | loss: 0.29525 - acc: 0.9057 -- iter: 24128/55000
Training Step: 378  | total loss: 0.30400 | time: 43.538s
| Adam | epoch: 001 | loss: 0.30400 - acc: 0.9042 -- iter: 24192/55000
Training Step: 379  | total loss: 0.29770 | time: 43.623s
| Adam | epoch: 001 | loss: 0.29770 - acc: 0.9059 -- iter: 24256/55000
Training Step: 380  | total loss: 0.29628 | time: 43.709s
| Adam | epoch: 001 | loss: 0.29628 - acc: 0.9060 -- iter: 24320/55000
Training Step: 381  | total loss: 0.28486 | time: 43.799s
| Adam | epoch: 001 | loss: 0.28486 - acc: 0.9107 -- iter: 24384/55000
Training Step: 382  | total loss: 0.27606 | time: 43.892s
| Adam | epoch: 001 | loss: 0.27606 - acc: 0.9149 -- iter: 24448/55000
Training Step: 383  | total loss: 0.27226 | time: 43.970s
| Adam | epoch: 001 | loss: 0.27226 - acc: 0.9172 -- iter: 24512/55000
Training Step: 384  | total loss: 0.27467 | time: 44.059s
| Adam | epoch: 001 | loss: 0.27467 - acc: 0.9145 -- iter: 24576/55000
Training Step: 385  | total loss: 0.28205 | time: 44.152s
| Adam | epoch: 001 | loss: 0.28205 - acc: 0.9153 -- iter: 24640/55000
Training Step: 386  | total loss: 0.27151 | time: 44.238s
| Adam | epoch: 001 | loss: 0.27151 - acc: 0.9190 -- iter: 24704/55000
Training Step: 387  | total loss: 0.30284 | time: 44.326s
| Adam | epoch: 001 | loss: 0.30284 - acc: 0.9115 -- iter: 24768/55000
Training Step: 388  | total loss: 0.30912 | time: 44.417s
| Adam | epoch: 001 | loss: 0.30912 - acc: 0.9110 -- iter: 24832/55000
Training Step: 389  | total loss: 0.31314 | time: 44.510s
| Adam | epoch: 001 | loss: 0.31314 - acc: 0.9121 -- iter: 24896/55000
Training Step: 390  | total loss: 0.29608 | time: 44.603s
| Adam | epoch: 001 | loss: 0.29608 - acc: 0.9177 -- iter: 24960/55000
Training Step: 391  | total loss: 0.29237 | time: 44.699s
| Adam | epoch: 001 | loss: 0.29237 - acc: 0.9182 -- iter: 25024/55000
Training Step: 392  | total loss: 0.29324 | time: 44.787s
| Adam | epoch: 001 | loss: 0.29324 - acc: 0.9154 -- iter: 25088/55000
Training Step: 393  | total loss: 0.28436 | time: 44.880s
| Adam | epoch: 001 | loss: 0.28436 - acc: 0.9176 -- iter: 25152/55000
Training Step: 394  | total loss: 0.30271 | time: 44.976s
| Adam | epoch: 001 | loss: 0.30271 - acc: 0.9134 -- iter: 25216/55000
Training Step: 395  | total loss: 0.29094 | time: 45.067s
| Adam | epoch: 001 | loss: 0.29094 - acc: 0.9173 -- iter: 25280/55000
Training Step: 396  | total loss: 0.29069 | time: 45.157s
| Adam | epoch: 001 | loss: 0.29069 - acc: 0.9178 -- iter: 25344/55000
Training Step: 397  | total loss: 0.29224 | time: 45.252s
| Adam | epoch: 001 | loss: 0.29224 - acc: 0.9182 -- iter: 25408/55000
Training Step: 398  | total loss: 0.28607 | time: 45.343s
| Adam | epoch: 001 | loss: 0.28607 - acc: 0.9217 -- iter: 25472/55000
Training Step: 399  | total loss: 0.27686 | time: 45.436s
| Adam | epoch: 001 | loss: 0.27686 - acc: 0.9248 -- iter: 25536/55000
Training Step: 400  | total loss: 0.26747 | time: 47.924s
| Adam | epoch: 001 | loss: 0.26747 - acc: 0.9261 | val_loss: 0.27111 - val_acc: 0.9200 -- iter: 25600/55000
--
Training Step: 401  | total loss: 0.28078 | time: 48.066s
| Adam | epoch: 001 | loss: 0.28078 - acc: 0.9210 -- iter: 25664/55000
Training Step: 402  | total loss: 0.28125 | time: 48.155s
| Adam | epoch: 001 | loss: 0.28125 - acc: 0.9211 -- iter: 25728/55000
Training Step: 403  | total loss: 0.27041 | time: 48.236s
| Adam | epoch: 001 | loss: 0.27041 - acc: 0.9243 -- iter: 25792/55000
Training Step: 404  | total loss: 0.29024 | time: 48.319s
| Adam | epoch: 001 | loss: 0.29024 - acc: 0.9194 -- iter: 25856/55000
Training Step: 405  | total loss: 0.31285 | time: 48.398s
| Adam | epoch: 001 | loss: 0.31285 - acc: 0.9118 -- iter: 25920/55000
Training Step: 406  | total loss: 0.30578 | time: 48.480s
| Adam | epoch: 001 | loss: 0.30578 - acc: 0.9159 -- iter: 25984/55000
Training Step: 407  | total loss: 0.28847 | time: 48.564s
| Adam | epoch: 001 | loss: 0.28847 - acc: 0.9212 -- iter: 26048/55000
Training Step: 408  | total loss: 0.28124 | time: 48.649s
| Adam | epoch: 001 | loss: 0.28124 - acc: 0.9228 -- iter: 26112/55000
Training Step: 409  | total loss: 0.28338 | time: 48.739s
| Adam | epoch: 001 | loss: 0.28338 - acc: 0.9181 -- iter: 26176/55000
Training Step: 410  | total loss: 0.28091 | time: 48.829s
| Adam | epoch: 001 | loss: 0.28091 - acc: 0.9200 -- iter: 26240/55000
Training Step: 411  | total loss: 0.28786 | time: 48.918s
| Adam | epoch: 001 | loss: 0.28786 - acc: 0.9171 -- iter: 26304/55000
Training Step: 412  | total loss: 0.30154 | time: 49.006s
| Adam | epoch: 001 | loss: 0.30154 - acc: 0.9113 -- iter: 26368/55000
Training Step: 413  | total loss: 0.30173 | time: 49.095s
| Adam | epoch: 001 | loss: 0.30173 - acc: 0.9124 -- iter: 26432/55000
Training Step: 414  | total loss: 0.29888 | time: 49.178s
| Adam | epoch: 001 | loss: 0.29888 - acc: 0.9086 -- iter: 26496/55000
Training Step: 415  | total loss: 0.29349 | time: 49.263s
| Adam | epoch: 001 | loss: 0.29349 - acc: 0.9099 -- iter: 26560/55000
Training Step: 416  | total loss: 0.28413 | time: 49.348s
| Adam | epoch: 001 | loss: 0.28413 - acc: 0.9143 -- iter: 26624/55000
Training Step: 417  | total loss: 0.28710 | time: 49.434s
| Adam | epoch: 001 | loss: 0.28710 - acc: 0.9119 -- iter: 26688/55000
Training Step: 418  | total loss: 0.28235 | time: 49.521s
| Adam | epoch: 001 | loss: 0.28235 - acc: 0.9129 -- iter: 26752/55000
Training Step: 419  | total loss: 0.27088 | time: 49.603s
| Adam | epoch: 001 | loss: 0.27088 - acc: 0.9169 -- iter: 26816/55000
Training Step: 420  | total loss: 0.26766 | time: 49.697s
| Adam | epoch: 001 | loss: 0.26766 - acc: 0.9221 -- iter: 26880/55000
Training Step: 421  | total loss: 0.26507 | time: 49.785s
| Adam | epoch: 001 | loss: 0.26507 - acc: 0.9221 -- iter: 26944/55000
Training Step: 422  | total loss: 0.27545 | time: 49.873s
| Adam | epoch: 001 | loss: 0.27545 - acc: 0.9158 -- iter: 27008/55000
Training Step: 423  | total loss: 0.28538 | time: 49.959s
| Adam | epoch: 001 | loss: 0.28538 - acc: 0.9133 -- iter: 27072/55000
Training Step: 424  | total loss: 0.28023 | time: 50.050s
| Adam | epoch: 001 | loss: 0.28023 - acc: 0.9157 -- iter: 27136/55000
Training Step: 425  | total loss: 0.27403 | time: 50.136s
| Adam | epoch: 001 | loss: 0.27403 - acc: 0.9179 -- iter: 27200/55000
Training Step: 426  | total loss: 0.27857 | time: 50.225s
| Adam | epoch: 001 | loss: 0.27857 - acc: 0.9136 -- iter: 27264/55000
Training Step: 427  | total loss: 0.28135 | time: 50.311s
| Adam | epoch: 001 | loss: 0.28135 - acc: 0.9144 -- iter: 27328/55000
Training Step: 428  | total loss: 0.27431 | time: 50.393s
| Adam | epoch: 001 | loss: 0.27431 - acc: 0.9136 -- iter: 27392/55000
Training Step: 429  | total loss: 0.28024 | time: 50.477s
| Adam | epoch: 001 | loss: 0.28024 - acc: 0.9129 -- iter: 27456/55000
Training Step: 430  | total loss: 0.27908 | time: 50.567s
| Adam | epoch: 001 | loss: 0.27908 - acc: 0.9138 -- iter: 27520/55000
Training Step: 431  | total loss: 0.29298 | time: 50.654s
| Adam | epoch: 001 | loss: 0.29298 - acc: 0.9083 -- iter: 27584/55000
Training Step: 432  | total loss: 0.29865 | time: 50.743s
| Adam | epoch: 001 | loss: 0.29865 - acc: 0.9034 -- iter: 27648/55000
Training Step: 433  | total loss: 0.28957 | time: 50.829s
| Adam | epoch: 001 | loss: 0.28957 - acc: 0.9068 -- iter: 27712/55000
Training Step: 434  | total loss: 0.28535 | time: 50.913s
| Adam | epoch: 001 | loss: 0.28535 - acc: 0.9083 -- iter: 27776/55000
Training Step: 435  | total loss: 0.27743 | time: 51.002s
| Adam | epoch: 001 | loss: 0.27743 - acc: 0.9128 -- iter: 27840/55000
Training Step: 436  | total loss: 0.27568 | time: 51.087s
| Adam | epoch: 001 | loss: 0.27568 - acc: 0.9153 -- iter: 27904/55000
Training Step: 437  | total loss: 0.30309 | time: 51.171s
| Adam | epoch: 001 | loss: 0.30309 - acc: 0.9081 -- iter: 27968/55000
Training Step: 438  | total loss: 0.31027 | time: 51.257s
| Adam | epoch: 001 | loss: 0.31027 - acc: 0.9048 -- iter: 28032/55000
Training Step: 439  | total loss: 0.29842 | time: 51.342s
| Adam | epoch: 001 | loss: 0.29842 - acc: 0.9081 -- iter: 28096/55000
Training Step: 440  | total loss: 0.28792 | time: 51.430s
| Adam | epoch: 001 | loss: 0.28792 - acc: 0.9110 -- iter: 28160/55000
Training Step: 441  | total loss: 0.28661 | time: 51.520s
| Adam | epoch: 001 | loss: 0.28661 - acc: 0.9121 -- iter: 28224/55000
Training Step: 442  | total loss: 0.29104 | time: 51.613s
| Adam | epoch: 001 | loss: 0.29104 - acc: 0.9100 -- iter: 28288/55000
Training Step: 443  | total loss: 0.28199 | time: 51.698s
| Adam | epoch: 001 | loss: 0.28199 - acc: 0.9127 -- iter: 28352/55000
Training Step: 444  | total loss: 0.29588 | time: 51.781s
| Adam | epoch: 001 | loss: 0.29588 - acc: 0.9089 -- iter: 28416/55000
Training Step: 445  | total loss: 0.30631 | time: 51.866s
| Adam | epoch: 001 | loss: 0.30631 - acc: 0.9024 -- iter: 28480/55000
Training Step: 446  | total loss: 0.31217 | time: 51.960s
| Adam | epoch: 001 | loss: 0.31217 - acc: 0.8997 -- iter: 28544/55000
Training Step: 447  | total loss: 0.31907 | time: 52.047s
| Adam | epoch: 001 | loss: 0.31907 - acc: 0.9003 -- iter: 28608/55000
Training Step: 448  | total loss: 0.32689 | time: 52.134s
| Adam | epoch: 001 | loss: 0.32689 - acc: 0.8978 -- iter: 28672/55000
Training Step: 449  | total loss: 0.32722 | time: 52.223s
| Adam | epoch: 001 | loss: 0.32722 - acc: 0.8955 -- iter: 28736/55000
Training Step: 450  | total loss: 0.31954 | time: 52.313s
| Adam | epoch: 001 | loss: 0.31954 - acc: 0.8997 -- iter: 28800/55000
Training Step: 451  | total loss: 0.32053 | time: 52.402s
| Adam | epoch: 001 | loss: 0.32053 - acc: 0.9004 -- iter: 28864/55000
Training Step: 452  | total loss: 0.33080 | time: 52.497s
| Adam | epoch: 001 | loss: 0.33080 - acc: 0.9010 -- iter: 28928/55000
Training Step: 453  | total loss: 0.34598 | time: 52.586s
| Adam | epoch: 001 | loss: 0.34598 - acc: 0.8968 -- iter: 28992/55000
Training Step: 454  | total loss: 0.34480 | time: 52.672s
| Adam | epoch: 001 | loss: 0.34480 - acc: 0.8946 -- iter: 29056/55000
Training Step: 455  | total loss: 0.34515 | time: 52.760s
| Adam | epoch: 001 | loss: 0.34515 - acc: 0.8958 -- iter: 29120/55000
Training Step: 456  | total loss: 0.35199 | time: 52.851s
| Adam | epoch: 001 | loss: 0.35199 - acc: 0.8937 -- iter: 29184/55000
Training Step: 457  | total loss: 0.34351 | time: 52.935s
| Adam | epoch: 001 | loss: 0.34351 - acc: 0.8918 -- iter: 29248/55000
Training Step: 458  | total loss: 0.34607 | time: 53.022s
| Adam | epoch: 001 | loss: 0.34607 - acc: 0.8870 -- iter: 29312/55000
Training Step: 459  | total loss: 0.33570 | time: 53.114s
| Adam | epoch: 001 | loss: 0.33570 - acc: 0.8905 -- iter: 29376/55000
Training Step: 460  | total loss: 0.31562 | time: 53.204s
| Adam | epoch: 001 | loss: 0.31562 - acc: 0.8983 -- iter: 29440/55000
Training Step: 461  | total loss: 0.30676 | time: 53.284s
| Adam | epoch: 001 | loss: 0.30676 - acc: 0.9023 -- iter: 29504/55000
Training Step: 462  | total loss: 0.31818 | time: 53.368s
| Adam | epoch: 001 | loss: 0.31818 - acc: 0.8995 -- iter: 29568/55000
Training Step: 463  | total loss: 0.30283 | time: 53.458s
| Adam | epoch: 001 | loss: 0.30283 - acc: 0.9033 -- iter: 29632/55000
Training Step: 464  | total loss: 0.31961 | time: 53.544s
| Adam | epoch: 001 | loss: 0.31961 - acc: 0.9005 -- iter: 29696/55000
...
Training Step: 500  | total loss: 0.29700 | time: 59.190s
| Adam | epoch: 001 | loss: 0.29700 - acc: 0.9132 | val_loss: 0.24866 - val_acc: 0.9278 -- iter: 32000/55000
...
Training Step: 600  | total loss: 0.23668 | time: 71.199s
| Adam | epoch: 001 | loss: 0.23668 - acc: 0.9364 | val_loss: 0.23255 - val_acc: 0.9318 -- iter: 38400/55000
...
Training Step: 700  | total loss: 0.20164 | time: 83.875s
| Adam | epoch: 001 | loss: 0.20164 - acc: 0.9382 | val_loss: 0.19836 - val_acc: 0.9394 -- iter: 44800/55000
...
Training Step: 743  | total loss: 0.22670 | time: 86.971s
| Adam | epoch: 001 | loss: 0.22670 - acc: 0.9272 -- iter: 47552/55000
Training Step: 744  | total loss: 0.21105 | time: 87.042s
| Adam | epoch: 001 | loss: 0.21105 - acc: 0.9329 -- iter: 47616/55000
Training Step: 745  | total loss: 0.19581 | time: 87.120s
| Adam | epoch: 001 | loss: 0.19581 - acc: 0.9380 -- iter: 47680/55000
Training Step: 746  | total loss: 0.18522 | time: 87.191s
| Adam | epoch: 001 | loss: 0.18522 - acc: 0.9427 -- iter: 47744/55000
Training Step: 747  | total loss: 0.18683 | time: 87.265s
| Adam | epoch: 001 | loss: 0.18683 - acc: 0.9406 -- iter: 47808/55000
Training Step: 748  | total loss: 0.19719 | time: 87.335s
| Adam | epoch: 001 | loss: 0.19719 - acc: 0.9372 -- iter: 47872/55000
Training Step: 749  | total loss: 0.21564 | time: 87.406s
| Adam | epoch: 001 | loss: 0.21564 - acc: 0.9294 -- iter: 47936/55000
Training Step: 750  | total loss: 0.21400 | time: 87.483s
| Adam | epoch: 001 | loss: 0.21400 - acc: 0.9271 -- iter: 48000/55000
Training Step: 751  | total loss: 0.19969 | time: 87.550s
| Adam | epoch: 001 | loss: 0.19969 - acc: 0.9328 -- iter: 48064/55000
Training Step: 752  | total loss: 0.19807 | time: 87.623s
| Adam | epoch: 001 | loss: 0.19807 - acc: 0.9301 -- iter: 48128/55000
Training Step: 753  | total loss: 0.19585 | time: 87.694s
| Adam | epoch: 001 | loss: 0.19585 - acc: 0.9340 -- iter: 48192/55000
Training Step: 754  | total loss: 0.18653 | time: 87.762s
| Adam | epoch: 001 | loss: 0.18653 - acc: 0.9344 -- iter: 48256/55000
Training Step: 755  | total loss: 0.18122 | time: 87.839s
| Adam | epoch: 001 | loss: 0.18122 - acc: 0.9362 -- iter: 48320/55000
Training Step: 756  | total loss: 0.17422 | time: 87.918s
| Adam | epoch: 001 | loss: 0.17422 - acc: 0.9395 -- iter: 48384/55000
Training Step: 757  | total loss: 0.16488 | time: 87.993s
| Adam | epoch: 001 | loss: 0.16488 - acc: 0.9440 -- iter: 48448/55000
Training Step: 758  | total loss: 0.16894 | time: 88.064s
| Adam | epoch: 001 | loss: 0.16894 - acc: 0.9433 -- iter: 48512/55000
Training Step: 759  | total loss: 0.16144 | time: 88.132s
| Adam | epoch: 001 | loss: 0.16144 - acc: 0.9443 -- iter: 48576/55000
Training Step: 760  | total loss: 0.16045 | time: 88.201s
| Adam | epoch: 001 | loss: 0.16045 - acc: 0.9468 -- iter: 48640/55000
Training Step: 761  | total loss: 0.15813 | time: 88.275s
| Adam | epoch: 001 | loss: 0.15813 - acc: 0.9458 -- iter: 48704/55000
Training Step: 762  | total loss: 0.15714 | time: 88.348s
| Adam | epoch: 001 | loss: 0.15714 - acc: 0.9466 -- iter: 48768/55000
Training Step: 763  | total loss: 0.16029 | time: 88.424s
| Adam | epoch: 001 | loss: 0.16029 - acc: 0.9441 -- iter: 48832/55000
Training Step: 764  | total loss: 0.16730 | time: 88.496s
| Adam | epoch: 001 | loss: 0.16730 - acc: 0.9434 -- iter: 48896/55000
Training Step: 765  | total loss: 0.16261 | time: 88.570s
| Adam | epoch: 001 | loss: 0.16261 - acc: 0.9475 -- iter: 48960/55000
Training Step: 766  | total loss: 0.16430 | time: 88.648s
| Adam | epoch: 001 | loss: 0.16430 - acc: 0.9465 -- iter: 49024/55000
Training Step: 767  | total loss: 0.17839 | time: 88.721s
| Adam | epoch: 001 | loss: 0.17839 - acc: 0.9456 -- iter: 49088/55000
Training Step: 768  | total loss: 0.17249 | time: 88.792s
| Adam | epoch: 001 | loss: 0.17249 - acc: 0.9479 -- iter: 49152/55000
Training Step: 769  | total loss: 0.16527 | time: 88.863s
| Adam | epoch: 001 | loss: 0.16527 - acc: 0.9516 -- iter: 49216/55000
Training Step: 770  | total loss: 0.16534 | time: 88.931s
| Adam | epoch: 001 | loss: 0.16534 - acc: 0.9502 -- iter: 49280/55000
Training Step: 771  | total loss: 0.15372 | time: 89.001s
| Adam | epoch: 001 | loss: 0.15372 - acc: 0.9552 -- iter: 49344/55000
Training Step: 772  | total loss: 0.15823 | time: 89.069s
| Adam | epoch: 001 | loss: 0.15823 - acc: 0.9534 -- iter: 49408/55000
Training Step: 773  | total loss: 0.16452 | time: 89.140s
| Adam | epoch: 001 | loss: 0.16452 - acc: 0.9534 -- iter: 49472/55000
Training Step: 774  | total loss: 0.17468 | time: 89.214s
| Adam | epoch: 001 | loss: 0.17468 - acc: 0.9486 -- iter: 49536/55000
Training Step: 775  | total loss: 0.17736 | time: 89.283s
| Adam | epoch: 001 | loss: 0.17736 - acc: 0.9491 -- iter: 49600/55000
Training Step: 776  | total loss: 0.17341 | time: 89.359s
| Adam | epoch: 001 | loss: 0.17341 - acc: 0.9495 -- iter: 49664/55000
Training Step: 777  | total loss: 0.16625 | time: 89.440s
| Adam | epoch: 001 | loss: 0.16625 - acc: 0.9530 -- iter: 49728/55000
Training Step: 778  | total loss: 0.16662 | time: 89.521s
| Adam | epoch: 001 | loss: 0.16662 - acc: 0.9530 -- iter: 49792/55000
Training Step: 779  | total loss: 0.16557 | time: 89.597s
| Adam | epoch: 001 | loss: 0.16557 - acc: 0.9515 -- iter: 49856/55000
Training Step: 780  | total loss: 0.17553 | time: 89.682s
| Adam | epoch: 001 | loss: 0.17553 - acc: 0.9469 -- iter: 49920/55000
Training Step: 781  | total loss: 0.16541 | time: 89.772s
| Adam | epoch: 001 | loss: 0.16541 - acc: 0.9507 -- iter: 49984/55000
Training Step: 782  | total loss: 0.16057 | time: 89.880s
| Adam | epoch: 001 | loss: 0.16057 - acc: 0.9540 -- iter: 50048/55000
Training Step: 783  | total loss: 0.15984 | time: 89.983s
| Adam | epoch: 001 | loss: 0.15984 - acc: 0.9555 -- iter: 50112/55000
Training Step: 784  | total loss: 0.15548 | time: 90.053s
| Adam | epoch: 001 | loss: 0.15548 - acc: 0.9553 -- iter: 50176/55000
Training Step: 785  | total loss: 0.15253 | time: 90.122s
| Adam | epoch: 001 | loss: 0.15253 - acc: 0.9582 -- iter: 50240/55000
Training Step: 786  | total loss: 0.14975 | time: 90.193s
| Adam | epoch: 001 | loss: 0.14975 - acc: 0.9608 -- iter: 50304/55000
Training Step: 787  | total loss: 0.13937 | time: 90.264s
| Adam | epoch: 001 | loss: 0.13937 - acc: 0.9632 -- iter: 50368/55000
Training Step: 788  | total loss: 0.13485 | time: 90.340s
| Adam | epoch: 001 | loss: 0.13485 - acc: 0.9637 -- iter: 50432/55000
Training Step: 789  | total loss: 0.14574 | time: 90.411s
| Adam | epoch: 001 | loss: 0.14574 - acc: 0.9595 -- iter: 50496/55000
Training Step: 790  | total loss: 0.14575 | time: 90.482s
| Adam | epoch: 001 | loss: 0.14575 - acc: 0.9589 -- iter: 50560/55000
Training Step: 791  | total loss: 0.14412 | time: 90.553s
| Adam | epoch: 001 | loss: 0.14412 - acc: 0.9583 -- iter: 50624/55000
Training Step: 792  | total loss: 0.13804 | time: 90.622s
| Adam | epoch: 001 | loss: 0.13804 - acc: 0.9609 -- iter: 50688/55000
Training Step: 793  | total loss: 0.13593 | time: 90.691s
| Adam | epoch: 001 | loss: 0.13593 - acc: 0.9617 -- iter: 50752/55000
Training Step: 794  | total loss: 0.13382 | time: 90.757s
| Adam | epoch: 001 | loss: 0.13382 - acc: 0.9608 -- iter: 50816/55000
Training Step: 795  | total loss: 0.12559 | time: 90.825s
| Adam | epoch: 001 | loss: 0.12559 - acc: 0.9632 -- iter: 50880/55000
Training Step: 796  | total loss: 0.12814 | time: 90.899s
| Adam | epoch: 001 | loss: 0.12814 - acc: 0.9638 -- iter: 50944/55000
Training Step: 797  | total loss: 0.13070 | time: 90.971s
| Adam | epoch: 001 | loss: 0.13070 - acc: 0.9611 -- iter: 51008/55000
Training Step: 798  | total loss: 0.12740 | time: 91.048s
| Adam | epoch: 001 | loss: 0.12740 - acc: 0.9619 -- iter: 51072/55000
Training Step: 799  | total loss: 0.12608 | time: 91.121s
| Adam | epoch: 001 | loss: 0.12608 - acc: 0.9626 -- iter: 51136/55000
Training Step: 800  | total loss: 0.12542 | time: 93.332s
| Adam | epoch: 001 | loss: 0.12542 - acc: 0.9632 | val_loss: 0.14589 - val_acc: 0.9562 -- iter: 51200/55000
--
Training Step: 801  | total loss: 0.11808 | time: 93.401s
| Adam | epoch: 001 | loss: 0.11808 - acc: 0.9669 -- iter: 51264/55000
Training Step: 802  | total loss: 0.12174 | time: 93.474s
| Adam | epoch: 001 | loss: 0.12174 - acc: 0.9639 -- iter: 51328/55000
Training Step: 803  | total loss: 0.11769 | time: 93.550s
| Adam | epoch: 001 | loss: 0.11769 - acc: 0.9660 -- iter: 51392/55000
Training Step: 804  | total loss: 0.13054 | time: 93.623s
| Adam | epoch: 001 | loss: 0.13054 - acc: 0.9616 -- iter: 51456/55000
Training Step: 805  | total loss: 0.12359 | time: 93.692s
| Adam | epoch: 001 | loss: 0.12359 - acc: 0.9639 -- iter: 51520/55000
Training Step: 806  | total loss: 0.11626 | time: 93.766s
| Adam | epoch: 001 | loss: 0.11626 - acc: 0.9659 -- iter: 51584/55000
Training Step: 807  | total loss: 0.12184 | time: 93.845s
| Adam | epoch: 001 | loss: 0.12184 - acc: 0.9646 -- iter: 51648/55000
Training Step: 808  | total loss: 0.12606 | time: 93.919s
| Adam | epoch: 001 | loss: 0.12606 - acc: 0.9650 -- iter: 51712/55000
Training Step: 809  | total loss: 0.12741 | time: 93.992s
| Adam | epoch: 001 | loss: 0.12741 - acc: 0.9623 -- iter: 51776/55000
Training Step: 810  | total loss: 0.13205 | time: 94.064s
| Adam | epoch: 001 | loss: 0.13205 - acc: 0.9598 -- iter: 51840/55000
Training Step: 811  | total loss: 0.13328 | time: 94.139s
| Adam | epoch: 001 | loss: 0.13328 - acc: 0.9607 -- iter: 51904/55000
Training Step: 812  | total loss: 0.12705 | time: 94.209s
| Adam | epoch: 001 | loss: 0.12705 - acc: 0.9615 -- iter: 51968/55000
Training Step: 813  | total loss: 0.11683 | time: 94.286s
| Adam | epoch: 001 | loss: 0.11683 - acc: 0.9654 -- iter: 52032/55000
Training Step: 814  | total loss: 0.11281 | time: 94.363s
| Adam | epoch: 001 | loss: 0.11281 - acc: 0.9673 -- iter: 52096/55000
Training Step: 815  | total loss: 0.11937 | time: 94.439s
| Adam | epoch: 001 | loss: 0.11937 - acc: 0.9627 -- iter: 52160/55000
Training Step: 816  | total loss: 0.12048 | time: 94.521s
| Adam | epoch: 001 | loss: 0.12048 - acc: 0.9618 -- iter: 52224/55000
Training Step: 817  | total loss: 0.12611 | time: 94.609s
| Adam | epoch: 001 | loss: 0.12611 - acc: 0.9609 -- iter: 52288/55000
Training Step: 818  | total loss: 0.11939 | time: 94.692s
| Adam | epoch: 001 | loss: 0.11939 - acc: 0.9632 -- iter: 52352/55000
Training Step: 819  | total loss: 0.11085 | time: 94.804s
| Adam | epoch: 001 | loss: 0.11085 - acc: 0.9669 -- iter: 52416/55000
Training Step: 820  | total loss: 0.10814 | time: 94.928s
| Adam | epoch: 001 | loss: 0.10814 - acc: 0.9687 -- iter: 52480/55000
Training Step: 821  | total loss: 0.11846 | time: 95.035s
| Adam | epoch: 001 | loss: 0.11846 - acc: 0.9655 -- iter: 52544/55000
Training Step: 822  | total loss: 0.12206 | time: 95.200s
| Adam | epoch: 001 | loss: 0.12206 - acc: 0.9643 -- iter: 52608/55000
Training Step: 823  | total loss: 0.12090 | time: 95.303s
| Adam | epoch: 001 | loss: 0.12090 - acc: 0.9632 -- iter: 52672/55000
Training Step: 824  | total loss: 0.11694 | time: 95.403s
| Adam | epoch: 001 | loss: 0.11694 - acc: 0.9653 -- iter: 52736/55000
Training Step: 825  | total loss: 0.13020 | time: 95.498s
| Adam | epoch: 001 | loss: 0.13020 - acc: 0.9594 -- iter: 52800/55000
Training Step: 826  | total loss: 0.13500 | time: 95.589s
| Adam | epoch: 001 | loss: 0.13500 - acc: 0.9572 -- iter: 52864/55000
Training Step: 827  | total loss: 0.13624 | time: 95.679s
| Adam | epoch: 001 | loss: 0.13624 - acc: 0.9584 -- iter: 52928/55000
Training Step: 828  | total loss: 0.13759 | time: 95.764s
| Adam | epoch: 001 | loss: 0.13759 - acc: 0.9610 -- iter: 52992/55000
Training Step: 829  | total loss: 0.80818 | time: 95.852s
| Adam | epoch: 001 | loss: 0.80818 - acc: 0.8727 -- iter: 53056/55000
Training Step: 830  | total loss: 0.76036 | time: 95.949s
| Adam | epoch: 001 | loss: 0.76036 - acc: 0.8760 -- iter: 53120/55000
Training Step: 831  | total loss: 0.70348 | time: 96.038s
| Adam | epoch: 001 | loss: 0.70348 - acc: 0.8837 -- iter: 53184/55000
Training Step: 832  | total loss: 0.65525 | time: 96.132s
| Adam | epoch: 001 | loss: 0.65525 - acc: 0.8891 -- iter: 53248/55000
Training Step: 833  | total loss: 0.61280 | time: 96.222s
| Adam | epoch: 001 | loss: 0.61280 - acc: 0.8924 -- iter: 53312/55000
Training Step: 834  | total loss: 0.58883 | time: 96.310s
| Adam | epoch: 001 | loss: 0.58883 - acc: 0.8907 -- iter: 53376/55000
Training Step: 835  | total loss: 0.55146 | time: 96.401s
| Adam | epoch: 001 | loss: 0.55146 - acc: 0.8969 -- iter: 53440/55000
Training Step: 836  | total loss: 0.50885 | time: 96.490s
| Adam | epoch: 001 | loss: 0.50885 - acc: 0.9057 -- iter: 53504/55000
Training Step: 837  | total loss: 0.47707 | time: 96.583s
| Adam | epoch: 001 | loss: 0.47707 - acc: 0.9104 -- iter: 53568/55000
Training Step: 838  | total loss: 0.44479 | time: 96.674s
| Adam | epoch: 001 | loss: 0.44479 - acc: 0.9147 -- iter: 53632/55000
Training Step: 839  | total loss: 0.41088 | time: 96.766s
| Adam | epoch: 001 | loss: 0.41088 - acc: 0.9216 -- iter: 53696/55000
Training Step: 840  | total loss: 0.40279 | time: 96.854s
| Adam | epoch: 001 | loss: 0.40279 - acc: 0.9185 -- iter: 53760/55000
Training Step: 841  | total loss: 0.37355 | time: 96.935s
| Adam | epoch: 001 | loss: 0.37355 - acc: 0.9251 -- iter: 53824/55000
Training Step: 842  | total loss: 0.37156 | time: 97.017s
| Adam | epoch: 001 | loss: 0.37156 - acc: 0.9264 -- iter: 53888/55000
Training Step: 843  | total loss: 0.36377 | time: 97.098s
| Adam | epoch: 001 | loss: 0.36377 - acc: 0.9228 -- iter: 53952/55000
Training Step: 844  | total loss: 0.33810 | time: 97.181s
| Adam | epoch: 001 | loss: 0.33810 - acc: 0.9289 -- iter: 54016/55000
Training Step: 845  | total loss: 0.32184 | time: 97.264s
| Adam | epoch: 001 | loss: 0.32184 - acc: 0.9329 -- iter: 54080/55000
Training Step: 846  | total loss: 0.31845 | time: 97.351s
| Adam | epoch: 001 | loss: 0.31845 - acc: 0.9334 -- iter: 54144/55000
Training Step: 847  | total loss: 0.31341 | time: 97.432s
| Adam | epoch: 001 | loss: 0.31341 - acc: 0.9260 -- iter: 54208/55000
Training Step: 848  | total loss: 0.29649 | time: 97.516s
| Adam | epoch: 001 | loss: 0.29649 - acc: 0.9287 -- iter: 54272/55000
Training Step: 849  | total loss: 0.28754 | time: 97.598s
| Adam | epoch: 001 | loss: 0.28754 - acc: 0.9265 -- iter: 54336/55000
Training Step: 850  | total loss: 0.26791 | time: 97.677s
| Adam | epoch: 001 | loss: 0.26791 - acc: 0.9322 -- iter: 54400/55000
Training Step: 851  | total loss: 0.25911 | time: 97.760s
| Adam | epoch: 001 | loss: 0.25911 - acc: 0.9328 -- iter: 54464/55000
Training Step: 852  | total loss: 0.24474 | time: 97.839s
| Adam | epoch: 001 | loss: 0.24474 - acc: 0.9348 -- iter: 54528/55000
Training Step: 853  | total loss: 0.23969 | time: 97.921s
| Adam | epoch: 001 | loss: 0.23969 - acc: 0.9366 -- iter: 54592/55000
Training Step: 854  | total loss: 0.22799 | time: 98.007s
| Adam | epoch: 001 | loss: 0.22799 - acc: 0.9367 -- iter: 54656/55000
Training Step: 855  | total loss: 0.21135 | time: 98.090s
| Adam | epoch: 001 | loss: 0.21135 - acc: 0.9415 -- iter: 54720/55000
Training Step: 856  | total loss: 0.22039 | time: 98.178s
| Adam | epoch: 001 | loss: 0.22039 - acc: 0.9364 -- iter: 54784/55000
Training Step: 857  | total loss: 0.20773 | time: 98.260s
| Adam | epoch: 001 | loss: 0.20773 - acc: 0.9381 -- iter: 54848/55000
Training Step: 858  | total loss: 0.20429 | time: 98.338s
| Adam | epoch: 001 | loss: 0.20429 - acc: 0.9380 -- iter: 54912/55000
Training Step: 859  | total loss: 0.21189 | time: 98.418s
| Adam | epoch: 001 | loss: 0.21189 - acc: 0.9348 -- iter: 54976/55000
Training Step: 860  | total loss: 0.20062 | time: 100.993s
| Adam | epoch: 001 | loss: 0.20062 - acc: 0.9382 | val_loss: 0.15825 - val_acc: 0.9538 -- iter: 55000/55000
--

Process finished with exit code 0

Comparison: after a single epoch, the RNN (two stacked LSTM layers) reaches a validation accuracy of 0.9538, while the DNN trained for 20 epochs reports a Top-3 validation accuracy of 0.9957. The two numbers are not directly comparable (plain accuracy vs. Top-3 accuracy), but the LSTM comes close to the DNN's performance with far less training.
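The run above treats each MNIST image as a sequence: do_rnn() reshapes the flat 784-dimensional vectors into 28 time steps of 28 features each (one image row per step) before feeding the stacked LSTMs. A minimal sketch of that reshape, using a dummy NumPy batch in place of mnist.load_data() (the batch here is an illustrative assumption, not real data):

```python
import numpy as np

# Dummy batch standing in for the X returned by mnist.load_data():
# 5 samples, each a flattened 28x28 image (784 values).
X_flat = np.zeros((5, 784), dtype=np.float32)

# Reinterpret each 784-vector as 28 time steps of 28 features,
# exactly as do_rnn() does before the input_data/lstm layers.
X_seq = np.reshape(X_flat, (-1, 28, 28))

print(X_seq.shape)  # (5, 28, 28): samples x time steps x features
```

The -1 lets NumPy infer the batch dimension, so the same reshape works for both the 55000-sample training set and the test set.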
