TrainData2Algorithms

1. Reading the dataset

import pandas as pd
import matplotlib.pyplot as plt

# Read the CSV; the first row holds the column names
df = pd.read_csv('sourcedata2.csv', header=0)
df
(Notebook output: a DataFrame of 42 rows × 8 columns — time, RollingSpeed, RollingForce, EntranceThickness, OutletThickness, Post-tensionForce, Pre-tensionForce, Vibration. RollingSpeed sits around 2.1, RollingForce around 2.7×10⁷, and the Vibration column is NaN for the last 12 rows.)
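Before modelling, it is worth checking for the NaN values visible in the Vibration column, since unlabelled rows cannot be used for training. A minimal sketch, using a tiny inline CSV with hypothetical rows standing in for sourcedata2.csv:

```python
import io
import pandas as pd

# Hypothetical stand-in rows; the column names follow the table above
csv = io.StringIO(
    "time,RollingSpeed,RollingForce,EntranceThickness,OutletThickness,"
    "Post-tensionForce,Pre-tensionForce,Vibration\n"
    "18:35:56.72,2.1168,16360,0.011031,0.004393,0.000,0.00,0.113\n"
    "18:37:32.86,2.0994,27210000,0.011311,0.004763,7.907,13.25,\n"
)
df = pd.read_csv(csv, header=0)

# The tail of the real data has NaN in the Vibration column; dropping those
# rows (or slicing df[0:30] as done later) keeps only labelled samples
clean = df.dropna(subset=["Vibration"])
print(len(df), len(clean))  # 2 1
```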

2. Visual exploration of the data

First, plot each input variable against the vibration signal:

X = df[df.columns[1:7]]   # the six process variables
y = df['Vibration']
plt.figure()
for i in range(1, 7):
    ax = plt.subplot(320 + i)   # 3x2 grid of subplots
    ax.locator_params(nbins=3)
    plt.title(list(df)[i])
    ax.scatter(df[df.columns[i]], y)
plt.tight_layout(pad=0.4, w_pad=0.5, h_pad=1.0)
plt.show()
[figure: scatter plots of the six process variables against Vibration]

Relationships between the variables

Entrance thickness vs. outlet thickness:

fig = plt.figure()
ax = fig.add_subplot(111)
plt.axis([0.010, 0.012, 0.003, 0.005])
ax.set_xlabel('EntranceThickness')
ax.set_ylabel('OutletThickness')
ax.scatter(df['EntranceThickness'], df['OutletThickness'])
plt.show()

[figure: entrance thickness vs. outlet thickness scatter plot]

These two variables do not show a clear linear relationship either; the scatter is still quite pronounced.

Rolling force vs. rolling speed:

fig = plt.figure()
ax = fig.add_subplot(111)
plt.axis([2.0, 2.3, 25000000, 30000000])
ax.set_xlabel("RollingSpeed")
ax.set_ylabel("RollingForce")
ax.scatter(df['RollingSpeed'], df['RollingForce'])
plt.show()

[figure: rolling speed vs. rolling force scatter plot]

After restricting the axis ranges, the scatter plot shows no obvious linear relationship between the two.
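A visual impression of "no linear relationship" can be backed up numerically with the Pearson correlation coefficient, which is also what the heatmap in the next section is built from. A self-contained sketch with illustrative (made-up) values rather than the real columns:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation: covariance normalised by both standard deviations."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))

# Illustrative values only, not the real RollingSpeed/RollingForce columns
speed = np.array([2.11, 2.17, 2.20, 2.09, 2.15, 2.21])
force = np.array([2.73e7, 2.68e7, 2.71e7, 2.70e7, 2.75e7, 2.64e7])
r = pearson_r(speed, force)
print(r)  # r lies in [-1, 1]; values near 0 indicate weak linear association
```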

3. Heatmap of the correlation matrix

That is, the matrix of pairwise Pearson correlation coefficients:

import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# Drop the non-numeric time column and the target before correlating
drop_elements = ['time', 'Vibration']
train = df.drop(drop_elements, axis=1)
plt.figure(figsize=(8, 6))
plt.title("Pearson Correlation of Features", y=1.0, size=15)
sns.heatmap(train.astype(float).corr(), linewidths=0.1, vmax=1.0,
            square=True, linecolor='white', annot=True)
plt.xticks(rotation=90)
plt.yticks(rotation=0)
plt.show()

[figure: Pearson correlation heatmap of the six process variables]

The heatmap does reveal some structure: rolling force and pre-tension force are fairly strongly correlated, as are outlet thickness and entrance thickness.
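Such observations can also be read off programmatically, rather than by eye, by ranking the off-diagonal entries of the correlation matrix. A sketch with a small hypothetical frame standing in for the real feature columns:

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in data; the real code would rank train.astype(float).corr()
rng = np.random.default_rng(0)
a = rng.normal(size=50)
train = pd.DataFrame({
    'RollingForce': a,
    'Pre-tensionForce': a + 0.1 * rng.normal(size=50),  # strongly related
    'RollingSpeed': rng.normal(size=50),                # unrelated
})

corr = train.corr()
# Keep the strict upper triangle so each pair appears once, then rank by |r|
pairs = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
ranked = pairs.stack().dropna().abs().sort_values(ascending=False)
print(ranked.head(3))
```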

4. Training the model

Choosing a model architecture:

We need a feed-forward neural network with six inputs and one output.

The network has two hidden layers, i.e. a four-layer 6×10×5×1 architecture:
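As a quick sanity check on the 6×10×5×1 architecture, the number of trainable parameters can be computed by hand: each dense layer contributes in×out weights plus out biases.

```python
def dense_params(layer_sizes):
    """Weights + biases of a fully connected net with the given layer widths."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

print(dense_params([6, 10, 5, 1]))  # 70 + 55 + 6 = 131
```

With only 131 parameters the model is small, which matters given how few training rows are available.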

# Keep only the first 30 rows (the remainder has NaN Vibration labels)
df = df[0:30]
df
(Notebook output: the first 30 rows of the DataFrame, all with numeric Vibration values.)
# Imports
from sklearn.model_selection import train_test_split
from sklearn import preprocessing

from keras.models import Sequential
from keras.layers import Dense

x = df[df.columns[1:7]]
y = df['Vibration']
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2)

# Standardise the inputs (fit the scaler on the training set only)
scaler = preprocessing.StandardScaler()
x_train = scaler.fit_transform(x_train)
x_test = scaler.transform(x_test)

# 6-10-5-1 feed-forward network
model = Sequential()
model.add(Dense(10, input_dim=6, kernel_initializer='normal', activation='relu'))
model.add(Dense(5, kernel_initializer='normal', activation='relu'))
model.add(Dense(1, kernel_initializer='normal'))
model.compile(loss='mean_squared_error', optimizer='adam')

# Train the model and watch the loss
model.fit(x_train, y_train, epochs=500, validation_split=0.33,
          shuffle=True, verbose=2)
Train on 16 samples, validate on 8 samples
Epoch 1/500
 - 1s - loss: 6375226.0000 - val_loss: 5214159.0000
Epoch 2/500
 - 0s - loss: 4847505.0000 - val_loss: 3922125.0000
Epoch 3/500
 - 0s - loss: 3646332.5000 - val_loss: 2923160.0000
...
Epoch 36/500
 - 0s - loss: 32.2987 - val_loss: 0.0055
Epoch 37/500
 - 0s - loss: 0.0045 - val_loss: 0.0055
...
Epoch 500/500
 - 0s - loss: 0.0044 - val_loss: 0.0054

(log truncated: the loss collapses within the first ~37 epochs and then stays essentially flat at loss ≈ 0.0044, val_loss ≈ 0.0054)

<keras.callbacks.History at 0x7fd4e6a34c50>

可以看出训练集和验证集的损失函数 都在减小,最后趋于一个稳定值.

  • 0
    点赞
  • 0
    收藏
    觉得还不错? 一键收藏
  • 0
    评论

“相关推荐”对你有帮助么?

  • 非常没帮助
  • 没帮助
  • 一般
  • 有帮助
  • 非常有帮助
提交
评论
添加红包

请填写红包祝福语或标题

红包个数最小为10个

红包金额最低5元

当前余额3.43前往充值 >
需支付:10.00
成就一亿技术人!
领取后你会自动成为博主和红包主的粉丝 规则
hope_wisdom
发出的红包
实付
使用余额支付
点击重新获取
扫码支付
钱包余额 0

抵扣说明:

1.余额是钱包充值的虚拟货币,按照1:1的比例进行支付金额的抵扣。
2.余额无法直接购买下载,可以购买VIP、付费专栏及课程。

余额充值