【Applying TFLearn and TensorFlow】: Predicting Titanic Survival

Using TFLearn and TensorFlow, we estimate the survival chances of Titanic passengers from personal information such as sex and age. To solve this classic machine learning task, we will build a deep neural network classifier.

Introduction
On April 15, 1912, the Titanic sank after colliding with an iceberg, and 1,502 of its 2,224 passengers and crew died. While surviving the sinking involved some luck, certain groups, such as women, children, and the upper class, were more likely to survive than others. In this tutorial, we carry out an analysis to discover who these people were.


The code is as follows:

from __future__ import print_function
import numpy as np
import tflearn
# Download the Titanic dataset
from tflearn.datasets import titanic
titanic.download_dataset('titanic_dataset.csv')

# Load CSV file, indicate that the first column represents labels
from tflearn.data_utils import load_csv
data, labels = load_csv('titanic_dataset.csv', target_column=0,
                        categorical_labels=True, n_classes=2)
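With categorical_labels=True and n_classes=2, each 0/1 survival label is converted into a one-hot vector. A minimal NumPy sketch of that conversion, using hypothetical labels (this restates the idea, not TFLearn's actual internals):

```python
import numpy as np

def to_one_hot(y, n_classes):
    """Convert integer class labels to one-hot row vectors."""
    # Indexing the identity matrix picks out one-hot rows
    return np.eye(n_classes)[np.asarray(y, dtype=int)]

labels = to_one_hot([0, 1, 1], 2)
print(labels)
# [[1. 0.]
#  [0. 1.]
#  [0. 1.]]
```

Each row now matches the 2-unit softmax output layer built below, which is what lets the network be trained with a categorical loss.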


# Preprocessing function
def preprocess(passengers, columns_to_delete):
    # Delete columns from highest index to lowest so that earlier
    # deletions do not shift the positions of later ones
    for column_to_delete in sorted(columns_to_delete, reverse=True):
        for passenger in passengers:
            passenger.pop(column_to_delete)
    for i in range(len(passengers)):
        # Converting 'sex' field to float (id is 1 after removing labels column)
        passengers[i][1] = 1. if passengers[i][1] == 'female' else 0.
    return np.array(passengers, dtype=np.float32)

# Ignore 'name' and 'ticket' columns (id 1 & 6 of data array)
to_ignore = [1, 6]

# Preprocess data
data = preprocess(data, to_ignore)
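To see what preprocess produces, here is a self-contained sketch of the same logic applied to two made-up passenger records (the names and field values below are hypothetical, not rows from the dataset):

```python
import numpy as np

def preprocess(passengers, columns_to_delete):
    """Drop unused columns and encode 'sex' as a float."""
    # Delete columns from highest index to lowest so positions don't shift
    for column_to_delete in sorted(columns_to_delete, reverse=True):
        for passenger in passengers:
            passenger.pop(column_to_delete)
    for passenger in passengers:
        # After dropping the label column, 'sex' sits at index 1
        passenger[1] = 1. if passenger[1] == 'female' else 0.
    return np.array(passengers, dtype=np.float32)

# Two made-up rows: [pclass, name, sex, age, sibsp, parch, ticket, fare]
sample = [
    [3, 'John Doe', 'male', 30, 0, 0, 'A/5 21171', 7.25],
    [1, 'Jane Doe', 'female', 25, 1, 0, 'PC 17599', 71.28],
]
X = preprocess(sample, [1, 6])  # drop 'name' (index 1) and 'ticket' (index 6)
print(X.shape)   # (2, 6)
print(X[1][1])   # 1.0 -> 'female' encoded as 1.
```

The six remaining numeric columns per passenger are exactly what the input layer's shape=[None, 6] expects.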

# Build neural network
net = tflearn.input_data(shape=[None, 6])
net = tflearn.fully_connected(net, 32)
net = tflearn.fully_connected(net, 32)
net = tflearn.fully_connected(net, 2, activation='softmax')
net = tflearn.regression(net)

# Define model
model = tflearn.DNN(net)
# Start training (gradient-based optimization; TFLearn's regression layer
# defaults to the Adam optimizer, as seen in the log below)
model.fit(data, labels, n_epoch=10, batch_size=16, show_metric=True)

# Let's create some data for DiCaprio and Winslet
dicaprio = [3, 'Jack Dawson', 'male', 19, 0, 0, 'N/A', 5.0000]
winslet = [1, 'Rose DeWitt Bukater', 'female', 17, 1, 2, 'N/A', 100.0000]
# Preprocess data
dicaprio, winslet = preprocess([dicaprio, winslet], to_ignore)
# Predict surviving chances (class 1 results)
pred = model.predict([dicaprio, winslet])
print("DiCaprio Surviving Rate:", pred[0][1])
print("Winslet Surviving Rate:", pred[1][1])
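Because the output layer is a 2-unit softmax, each prediction is a pair of probabilities that sums to 1, and index 1 is the "survived" class read out above. A minimal NumPy sketch of the softmax applied to a hypothetical pair of raw network outputs (logits):

```python
import numpy as np

def softmax(logits):
    # Subtract the max before exponentiating for numerical stability
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

# Hypothetical logits for one passenger: [not-survived, survived]
logits = np.array([0.4, 1.2])
probs = softmax(logits)
print(probs.sum())  # sums to 1 (up to float rounding)
print(probs[1])     # probability assigned to class 1 ("survived")
```

This is why the script reads pred[0][1] and pred[1][1]: the second entry of each prediction is the survival probability.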

The output is as follows (intermediate steps abbreviated):

---------------------------------
Run id: BVF5U1
Log directory: /tmp/tflearn_logs/
---------------------------------
Training samples: 1309
Validation samples: 0
--
Training Step: 1  | time: 0.103s
| Adam | epoch: 001 | loss: 0.00000 - acc: 0.0000 -- iter: 0016/1309
Training Step: 2  | total loss: 0.62505 | time: 0.105s
| Adam | epoch: 001 | loss: 0.62505 - acc: 0.5062 -- iter: 0032/1309
Training Step: 3  | total loss: 0.68028 | time: 0.107s
| Adam | epoch: 001 | loss: 0.68028 - acc: 0.6034 -- iter: 0048/1309
Training Step: 4  | total loss: 0.68955 | time: 0.109s
| Adam | epoch: 001 | loss: 0.68955 - acc: 0.5727 -- iter: 0064/1309
...
Training Step: 82  | total loss: 0.63176 | time: 0.234s
| Adam | epoch: 001 | loss: 0.63176 - acc: 0.6518 -- iter: 1309/1309
--
...
Training Step: 164  | total loss: 0.58122 | time: 0.123s
| Adam | epoch: 002 | loss: 0.58122 - acc: 0.6998 -- iter: 1309/1309
--
...
Training Step: 246  | total loss: 0.55301 | time: 0.133s
| Adam | epoch: 003 | loss: 0.55301 - acc: 0.7147 -- iter: 1309/1309
--
Training Step: 247  | total loss: 0.56349 | time: 0.002s
| Adam | epoch: 004 | loss: 0.56349 - acc: 0.7057 -- iter: 0016/1309
...（训练步骤 248~327 的日志省略）
Training Step: 328  | total loss: 0.54777 | time: 0.140s
| Adam | epoch: 004 | loss: 0.54777 - acc: 0.7528 -- iter: 1309/1309
--
Training Step: 329  | total loss: 0.54949 | time: 0.002s
| Adam | epoch: 005 | loss: 0.54949 - acc: 0.7587 -- iter: 0016/1309
...（训练步骤 330~409 的日志省略）
Training Step: 410  | total loss: 0.52396 | time: 0.128s
| Adam | epoch: 005 | loss: 0.52396 - acc: 0.7589 -- iter: 1309/1309
--
Training Step: 411  | total loss: 0.50339 | time: 0.002s
| Adam | epoch: 006 | loss: 0.50339 - acc: 0.7705 -- iter: 0016/1309
...（训练步骤 412~491 的日志省略）
Training Step: 492  | total loss: 0.54774 | time: 0.120s
| Adam | epoch: 006 | loss: 0.54774 - acc: 0.7675 -- iter: 1309/1309
--
... (intermediate training steps of epoch 007 omitted) ...
Training Step: 574  | total loss: 0.46507 | time: 0.137s
| Adam | epoch: 007 | loss: 0.46507 - acc: 0.8013 -- iter: 1309/1309
--
... (intermediate training steps of epoch 008 omitted) ...
Training Step: 656  | total loss: 0.45371 | time: 0.127s
| Adam | epoch: 008 | loss: 0.45371 - acc: 0.8174 -- iter: 1309/1309
--
... (intermediate training steps of epoch 009 omitted) ...
Training Step: 738  | total loss: 0.54048 | time: 0.135s
| Adam | epoch: 009 | loss: 0.54048 - acc: 0.7453 -- iter: 1309/1309
--
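A quick sanity check on the log above: each epoch spans exactly 82 training steps (epoch 006 ends at step 492, epoch 007 at step 574, epoch 008 at step 656, epoch 009 at step 738). That count follows directly from the dataset size shown in the log (1309 passengers) and the `batch_size=16` passed to `model.fit`; the last, smaller batch still counts as one step. A minimal sketch verifying the arithmetic:

```python
import math

# Values taken from the tutorial: 1309 samples in the Titanic CSV,
# batch_size=16 as passed to model.fit above.
n_samples = 1309
batch_size = 16

# Steps per epoch: the final partial batch (1309 % 16 = 13 samples)
# is still processed as one training step, hence the ceiling.
steps_per_epoch = math.ceil(n_samples / batch_size)
print(steps_per_epoch)  # 82

# Cross-check against the log: epoch 006 ends at global step 6 * 82 = 492.
print(6 * steps_per_epoch)  # 492
```

This also explains why the `iter:` counter advances in increments of 16 up to 1296 and then jumps to 1309 at the end of every epoch.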
... (intermediate training steps of epoch 010 omitted) ...
Training Step: 761  | total loss: 0.46947 | time: 0.054s
| Adam | epoch: 010 | loss: 0.46947 - acc: 0.7901 -- iter: 0368/1309
Training Step: 762  | total loss: 0.46870 | time: 0.056s
| Adam | epoch: 010 | loss: 0.46870 - acc: 0.7986 -- iter: 0384/1309
Training Step: 763  | total loss: 0.49042 | time: 0.057s
| Adam | epoch: 010 | loss: 0.49042 - acc: 0.7938 -- iter: 0400/1309
Training Step: 764  | total loss: 0.49040 | time: 0.059s
| Adam | epoch: 010 | loss: 0.49040 - acc: 0.7831 -- iter: 0416/1309
Training Step: 765  | total loss: 0.51674 | time: 0.062s
| Adam | epoch: 010 | loss: 0.51674 - acc: 0.7673 -- iter: 0432/1309
Training Step: 766  | total loss: 0.49129 | time: 0.065s
| Adam | epoch: 010 | loss: 0.49129 - acc: 0.7843 -- iter: 0448/1309
Training Step: 767  | total loss: 0.49232 | time: 0.067s
| Adam | epoch: 010 | loss: 0.49232 - acc: 0.7872 -- iter: 0464/1309
Training Step: 768  | total loss: 0.48539 | time: 0.071s
| Adam | epoch: 010 | loss: 0.48539 - acc: 0.7897 -- iter: 0480/1309
Training Step: 769  | total loss: 0.49210 | time: 0.073s
| Adam | epoch: 010 | loss: 0.49210 - acc: 0.7857 -- iter: 0496/1309
Training Step: 770  | total loss: 0.48109 | time: 0.074s
| Adam | epoch: 010 | loss: 0.48109 - acc: 0.8009 -- iter: 0512/1309
Training Step: 771  | total loss: 0.49136 | time: 0.076s
| Adam | epoch: 010 | loss: 0.49136 - acc: 0.8021 -- iter: 0528/1309
Training Step: 772  | total loss: 0.50172 | time: 0.083s
| Adam | epoch: 010 | loss: 0.50172 - acc: 0.7969 -- iter: 0544/1309
Training Step: 773  | total loss: 0.52280 | time: 0.084s
| Adam | epoch: 010 | loss: 0.52280 - acc: 0.7672 -- iter: 0560/1309
Training Step: 774  | total loss: 0.51633 | time: 0.086s
| Adam | epoch: 010 | loss: 0.51633 - acc: 0.7717 -- iter: 0576/1309
Training Step: 775  | total loss: 0.51257 | time: 0.092s
| Adam | epoch: 010 | loss: 0.51257 - acc: 0.7695 -- iter: 0592/1309
Training Step: 776  | total loss: 0.54708 | time: 0.095s
| Adam | epoch: 010 | loss: 0.54708 - acc: 0.7488 -- iter: 0608/1309
Training Step: 777  | total loss: 0.51498 | time: 0.097s
| Adam | epoch: 010 | loss: 0.51498 - acc: 0.7677 -- iter: 0624/1309
Training Step: 778  | total loss: 0.52630 | time: 0.100s
| Adam | epoch: 010 | loss: 0.52630 - acc: 0.7659 -- iter: 0640/1309
Training Step: 779  | total loss: 0.52499 | time: 0.102s
| Adam | epoch: 010 | loss: 0.52499 - acc: 0.7706 -- iter: 0656/1309
Training Step: 780  | total loss: 0.53666 | time: 0.104s
| Adam | epoch: 010 | loss: 0.53666 - acc: 0.7560 -- iter: 0672/1309
Training Step: 781  | total loss: 0.55634 | time: 0.106s
| Adam | epoch: 010 | loss: 0.55634 - acc: 0.7429 -- iter: 0688/1309
Training Step: 782  | total loss: 0.53857 | time: 0.108s
| Adam | epoch: 010 | loss: 0.53857 - acc: 0.7561 -- iter: 0704/1309
Training Step: 783  | total loss: 0.52455 | time: 0.110s
| Adam | epoch: 010 | loss: 0.52455 - acc: 0.7618 -- iter: 0720/1309
Training Step: 784  | total loss: 0.50554 | time: 0.119s
| Adam | epoch: 010 | loss: 0.50554 - acc: 0.7793 -- iter: 0736/1309
Training Step: 785  | total loss: 0.50627 | time: 0.125s
| Adam | epoch: 010 | loss: 0.50627 - acc: 0.7764 -- iter: 0752/1309
Training Step: 786  | total loss: 0.50865 | time: 0.128s
| Adam | epoch: 010 | loss: 0.50865 - acc: 0.7800 -- iter: 0768/1309
Training Step: 787  | total loss: 0.53149 | time: 0.130s
| Adam | epoch: 010 | loss: 0.53149 - acc: 0.7708 -- iter: 0784/1309
Training Step: 788  | total loss: 0.51452 | time: 0.132s
| Adam | epoch: 010 | loss: 0.51452 - acc: 0.7812 -- iter: 0800/1309
Training Step: 789  | total loss: 0.48958 | time: 0.133s
| Adam | epoch: 010 | loss: 0.48958 - acc: 0.7968 -- iter: 0816/1309
Training Step: 790  | total loss: 0.48564 | time: 0.136s
| Adam | epoch: 010 | loss: 0.48564 - acc: 0.7921 -- iter: 0832/1309
Training Step: 791  | total loss: 0.47719 | time: 0.138s
| Adam | epoch: 010 | loss: 0.47719 - acc: 0.8004 -- iter: 0848/1309
Training Step: 792  | total loss: 0.46914 | time: 0.140s
| Adam | epoch: 010 | loss: 0.46914 - acc: 0.8079 -- iter: 0864/1309
Training Step: 793  | total loss: 0.45354 | time: 0.141s
| Adam | epoch: 010 | loss: 0.45354 - acc: 0.8146 -- iter: 0880/1309
Training Step: 794  | total loss: 0.48024 | time: 0.142s
| Adam | epoch: 010 | loss: 0.48024 - acc: 0.8019 -- iter: 0896/1309
Training Step: 795  | total loss: 0.47180 | time: 0.144s
| Adam | epoch: 010 | loss: 0.47180 - acc: 0.8092 -- iter: 0912/1309
Training Step: 796  | total loss: 0.45517 | time: 0.146s
| Adam | epoch: 010 | loss: 0.45517 - acc: 0.8220 -- iter: 0928/1309
Training Step: 797  | total loss: 0.44984 | time: 0.148s
| Adam | epoch: 010 | loss: 0.44984 - acc: 0.8211 -- iter: 0944/1309
Training Step: 798  | total loss: 0.44778 | time: 0.149s
| Adam | epoch: 010 | loss: 0.44778 - acc: 0.8202 -- iter: 0960/1309
Training Step: 799  | total loss: 0.43565 | time: 0.151s
| Adam | epoch: 010 | loss: 0.43565 - acc: 0.8194 -- iter: 0976/1309
Training Step: 800  | total loss: 0.44694 | time: 0.152s
| Adam | epoch: 010 | loss: 0.44694 - acc: 0.8125 -- iter: 0992/1309
Training Step: 801  | total loss: 0.45607 | time: 0.154s
| Adam | epoch: 010 | loss: 0.45607 - acc: 0.8062 -- iter: 1008/1309
Training Step: 802  | total loss: 0.46129 | time: 0.155s
| Adam | epoch: 010 | loss: 0.46129 - acc: 0.8006 -- iter: 1024/1309
Training Step: 803  | total loss: 0.46656 | time: 0.156s
| Adam | epoch: 010 | loss: 0.46656 - acc: 0.8018 -- iter: 1040/1309
Training Step: 804  | total loss: 0.47377 | time: 0.158s
| Adam | epoch: 010 | loss: 0.47377 - acc: 0.7966 -- iter: 1056/1309
Training Step: 805  | total loss: 0.49502 | time: 0.160s
| Adam | epoch: 010 | loss: 0.49502 - acc: 0.7795 -- iter: 1072/1309
Training Step: 806  | total loss: 0.48004 | time: 0.161s
| Adam | epoch: 010 | loss: 0.48004 - acc: 0.7828 -- iter: 1088/1309
Training Step: 807  | total loss: 0.48547 | time: 0.163s
| Adam | epoch: 010 | loss: 0.48547 - acc: 0.7670 -- iter: 1104/1309
Training Step: 808  | total loss: 0.50200 | time: 0.165s
| Adam | epoch: 010 | loss: 0.50200 - acc: 0.7528 -- iter: 1120/1309
Training Step: 809  | total loss: 0.52022 | time: 0.166s
| Adam | epoch: 010 | loss: 0.52022 - acc: 0.7650 -- iter: 1136/1309
Training Step: 810  | total loss: 0.54812 | time: 0.167s
| Adam | epoch: 010 | loss: 0.54812 - acc: 0.7635 -- iter: 1152/1309
Training Step: 811  | total loss: 0.56484 | time: 0.169s
| Adam | epoch: 010 | loss: 0.56484 - acc: 0.7497 -- iter: 1168/1309
Training Step: 812  | total loss: 0.65704 | time: 0.170s
| Adam | epoch: 010 | loss: 0.65704 - acc: 0.7309 -- iter: 1184/1309
Training Step: 813  | total loss: 0.66676 | time: 0.172s
| Adam | epoch: 010 | loss: 0.66676 - acc: 0.7204 -- iter: 1200/1309
Training Step: 814  | total loss: 0.64706 | time: 0.174s
| Adam | epoch: 010 | loss: 0.64706 - acc: 0.7358 -- iter: 1216/1309
Training Step: 815  | total loss: 0.62997 | time: 0.175s
| Adam | epoch: 010 | loss: 0.62997 - acc: 0.7372 -- iter: 1232/1309
Training Step: 816  | total loss: 0.60231 | time: 0.177s
| Adam | epoch: 010 | loss: 0.60231 - acc: 0.7448 -- iter: 1248/1309
Training Step: 817  | total loss: 0.58623 | time: 0.178s
| Adam | epoch: 010 | loss: 0.58623 - acc: 0.7578 -- iter: 1264/1309
Training Step: 818  | total loss: 0.57494 | time: 0.181s
| Adam | epoch: 010 | loss: 0.57494 - acc: 0.7633 -- iter: 1280/1309
Training Step: 819  | total loss: 0.56759 | time: 0.182s
| Adam | epoch: 010 | loss: 0.56759 - acc: 0.7619 -- iter: 1296/1309
Training Step: 820  | total loss: 0.55193 | time: 0.184s
| Adam | epoch: 010 | loss: 0.55193 - acc: 0.7732 -- iter: 1309/1309
--
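The two survival rates printed below come from the tutorial's final `model.predict` call, whose code was cut off above after the comment `# Predict surviving chances (class 1 results)`. A minimal sketch of that step follows; the softmax values here are stand-ins chosen to match the printed output, not results from a real run, and the commented `model.predict` line assumes the trained `model` from the code above.

```python
import numpy as np

# With the trained TFLearn model from the tutorial, the step would be:
#   pred = model.predict([dicaprio, winslet])
# model.predict returns one softmax row per passenger:
# [P(not survived), P(survived)] -- "class 1" is the second column.
pred = np.array([[0.9032, 0.0968],   # dicaprio (stand-in values)
                 [0.6705, 0.3295]])  # winslet  (stand-in values)

print("DiCaprio Surviving Rate:", pred[0][1])
print("Winslet Surviving Rate:", pred[1][1])
```

Reading the second column of each row is what produces the two rates printed at the end of the log.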
DiCaprio Surviving Rate: 0.09682169
Winslet Surviving Rate: 0.32949215

Process finished with exit code 0
