Keras CNN training: loss goes to nan & val_acc never changes

Initial network code (the data-processing part is not shown; the arrays passed to the network are x_train, y_train, x_test, y_test):


import tensorflow
import keras
from keras.layers.normalization import BatchNormalization
from keras.utils import np_utils
import numpy as np 
from keras.layers import *
from keras.models import *
from keras.objectives import *

input_shape = (300,36)
batch_size = 25

epochs = 10

model = Sequential()
#two Conv1D layers, each with 16 filters of kernel size 16
model.add(Conv1D(16, 16, activation='relu', input_shape=input_shape))
model.add(Conv1D(16, 16, activation='relu'))
#max-pooling layer, pool size 2
model.add(MaxPooling1D(pool_size=2))
#dropout layer, rate 0.25
model.add(Dropout(0.25))
# #a second conv block: 64 filters, kernel size 4
# model.add(Conv1D(64,4, activation='relu'))
# model.add(Conv1D(64,4, activation='relu'))
# #followed by a pooling layer
# model.add(MaxPooling1D(pool_size=2))
# model.add(Dropout(0.25))
#flatten the feature maps into a vector
model.add(Flatten())
#fully connected layer
model.add(Dense(100, activation='relu'))
#another dropout layer
model.add(Dropout(0.5))
#output (classification) layer
model.add(Dense(1, activation='relu'))

model.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

# model.add(Conv1D(301, 24, strides=2, padding = 'same', activation='relu',input_shape = input_shape))
# model.add(Dropout(0.3))
# #model.add(MaxPooling1D(2))

# model.add(Conv1D(400,12, strides=1 , padding = 'same', activation='relu'))
# model.add(Dropout(0.3))
# #model.add(MaxPooling1D(2))


# model.add(Flatten())
# model.add(Dense(33))
# model.add(Activation ('relu'))
# model.add(Dropout(0.3))
# model.add(Dense(1))

# model.compile(loss='binary_crossentropy',
#               optimizer='adam',
#               metrics=['accuracy'])
model.summary()
model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=epochs,
          verbose =1,
          validation_data=(x_test, y_test))
print("part 2 over")

Running this surfaced two problems:

Train on 6877 samples, validate on 16464 samples
Epoch 1/10
6877/6877 [==============================] - 3s 469us/step - loss: nan - acc: 0.2601 - val_loss: nan - val_acc: 0.5000
Epoch 2/10
6877/6877 [==============================] - 3s 428us/step - loss: nan - acc: 0.5001 - val_loss: nan - val_acc: 0.5000
Epoch 3/10
6877/6877 [==============================] - 3s 423us/step - loss: nan - acc: 0.5001 - val_loss: nan - val_acc: 0.5000
Epoch 4/10
6877/6877 [==============================] - 3s 426us/step - loss: nan - acc: 0.5001 - val_loss: nan - val_acc: 0.5000
Epoch 5/10
6877/6877 [==============================] - 3s 423us/step - loss: nan - acc: 0.5001 - val_loss: nan - val_acc: 0.5000
Epoch 6/10
6877/6877 [==============================] - 3s 422us/step - loss: nan - acc: 0.5001 - val_loss: nan - val_acc: 0.5000
Epoch 7/10
6877/6877 [==============================] - 3s 425us/step - loss: nan - acc: 0.5001 - val_loss: nan - val_acc: 0.5000
Epoch 8/10
6877/6877 [==============================] - 3s 418us/step - loss: nan - acc: 0.5001 - val_loss: nan - val_acc: 0.5000
Epoch 9/10
6877/6877 [==============================] - 3s 417us/step - loss: nan - acc: 0.5001 - val_loss: nan - val_acc: 0.5000
Epoch 10/10
6877/6877 [==============================] - 3s 419us/step - loss: nan - acc: 0.5001 - val_loss: nan - val_acc: 0.5000

 

1. The loss is nan

There are two likely causes:

1. Exploding gradients. Fixes: lower the learning rate, clip the gradients, normalize the input data.

2. A log(0) inside the loss computation, caused either by bad initialization or by the data itself.
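For the first cause, gradient clipping simply caps the size of each update. Keras exposes this through the optimizer's `clipnorm`/`clipvalue` arguments; the idea itself, in plain NumPy (a sketch, function name is mine):

```python
import numpy as np

def clip_by_norm(grad, max_norm):
    """Scale the gradient down if its L2 norm exceeds max_norm."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

g = np.array([30.0, 40.0])       # ||g|| = 50
print(clip_by_norm(g, 5.0))      # rescaled to norm 5 -> [3. 4.]
```

A huge gradient keeps its direction but loses its magnitude, which is what stops a single bad batch from blowing the weights up to nan.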

I'll focus on the second case, which comes up often when the loss function is categorical_crossentropy.

The fix most commonly suggested online is: in PyCharm, double-press Shift and search for the function; it is defined in the keras.losses module. The original function is:

def categorical_crossentropy(y_true, y_pred):
    return K.categorical_crossentropy(y_true, y_pred)

Change it to:
def categorical_crossentropy(y_true, y_pred):
    return K.categorical_crossentropy(y_true, y_pred+1e-5)
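The effect of that epsilon can be seen outside Keras. A minimal NumPy sketch (function name is mine) of how a hard zero in y_pred turns the loss into nan — the culprit is the 0 * log(0) term:

```python
import numpy as np

def crossentropy(y_true, y_pred, eps=0.0):
    """Plain categorical cross-entropy; eps guards against log(0)."""
    return -np.sum(y_true * np.log(y_pred + eps))

y_true = np.array([0.0, 1.0])
y_pred = np.array([0.0, 1.0])    # a hard zero in y_pred

with np.errstate(divide='ignore', invalid='ignore'):
    print(crossentropy(y_true, y_pred))           # nan: 0 * log(0)
print(crossentropy(y_true, y_pred, eps=1e-5))     # small finite value
```

Once one batch yields nan, every subsequent weight update is nan as well, which matches the training log above.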

My fix:

My network used loss='sparse_categorical_crossentropy'. Replacing it with binary_crossentropy made the nan go away. (sparse_categorical_crossentropy expects one output unit per class, while this model ends in Dense(1); for a single-unit output, binary_crossentropy is the matching loss.)
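Why the original combination broke: sparse_categorical_crossentropy looks up p[label] in an output vector that has one entry per class, and a Dense(1) output has no slot for label 1. A toy NumPy illustration of the shape mismatch (the real Keras kernel silently yields nan/garbage rather than raising):

```python
import numpy as np

# sparse_categorical_crossentropy reads p[label] from a probability
# vector with one entry per class; Dense(1) only ever provides p[0]
probs = np.array([1.0])   # the single output unit (hypothetical value)
label = 1                 # the "second" class has no matching slot
try:
    print(-np.log(probs[label]))
except IndexError:
    print("label 1 has no matching output unit")
```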

The run then looks like this:

Train on 6877 samples, validate on 16464 samples
Epoch 1/10
6877/6877 [==============================] - 4s 546us/step - loss: 8.0508 - acc: 0.0435 - val_loss: 7.9712 - val_acc: 0.0000e+00
Epoch 2/10
6877/6877 [==============================] - 3s 437us/step - loss: 7.9724 - acc: 0.0000e+00 - val_loss: 7.9712 - val_acc: 0.0000e+00
Epoch 3/10
6877/6877 [==============================] - 3s 440us/step - loss: 7.9724 - acc: 0.0000e+00 - val_loss: 7.9712 - val_acc: 0.0000e+00
Epoch 4/10
6877/6877 [==============================] - 3s 439us/step - loss: 7.9724 - acc: 0.0000e+00 - val_loss: 7.9712 - val_acc: 0.0000e+00
Epoch 5/10
6877/6877 [==============================] - 3s 435us/step - loss: 7.9724 - acc: 0.0000e+00 - val_loss: 7.9712 - val_acc: 0.0000e+00
Epoch 6/10
6877/6877 [==============================] - 3s 437us/step - loss: 7.9724 - acc: 0.0000e+00 - val_loss: 7.9712 - val_acc: 0.0000e+00
Epoch 7/10
6877/6877 [==============================] - 3s 436us/step - loss: 7.9724 - acc: 0.0000e+00 - val_loss: 7.9712 - val_acc: 0.0000e+00
Epoch 8/10
6877/6877 [==============================] - 3s 439us/step - loss: 7.9724 - acc: 0.0000e+00 - val_loss: 7.9712 - val_acc: 0.0000e+00
Epoch 9/10
6877/6877 [==============================] - 3s 439us/step - loss: 7.9724 - acc: 0.0000e+00 - val_loss: 7.9712 - val_acc: 0.0000e+00
Epoch 10/10
6877/6877 [==============================] - 3s 438us/step - loss: 7.9724 - acc: 0.0000e+00 - val_loss: 7.9712 - val_acc: 0.0000e+00

Which exposes the second problem:

2. acc and loss fluctuate, but val_acc never changes (in an earlier debugging session val_acc was stuck at 0.5; I couldn't reproduce that exact run...)

At this point a blog post gave me the hint. In the code above, the activation of the final Dense layer is relu; the loss is binary_crossentropy and the optimizer used to compile the Keras model is Adam. The problem lies here:

A softmax activation ensures the outputs sum to 1; it helps the network commit to exactly one of many classes.

Since you have only one output unit (only one class), that is of course a bad idea: you would likely end up with an output of 1 for every sample.

(The activation after the final Dense layer was relu) → switch to sigmoid instead; it pairs well with binary_crossentropy.

So I replaced every activation in my network with sigmoid. (Strictly, only the output layer needs the change; the hidden layers could keep relu.)
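The quoted claim is easy to check numerically: softmax over a single unit is identically 1 whatever the unit outputs, whereas sigmoid maps it to a real probability. A small NumPy sketch:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))    # shift for numerical stability
    return e / e.sum()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([2.3])              # the network's single output unit
print(softmax(z))                # [1.] -- always, whatever z is
print(sigmoid(z[0]))             # ~0.909, a usable probability
```

With softmax on one unit, every prediction is "class 1 with probability 1", so binary_crossentropy has nothing to learn from; sigmoid restores a gradient.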

With that change, training finally behaves normally:

Train on 6877 samples, validate on 16464 samples
Epoch 1/10
6877/6877 [==============================] - 4s 638us/step - loss: 4.4080 - acc: 0.6699 - val_loss: 0.7621 - val_acc: 0.4636
Epoch 2/10
6877/6877 [==============================] - 3s 470us/step - loss: 0.6628 - acc: 0.6145 - val_loss: 0.9911 - val_acc: 0.4840
Epoch 3/10
6877/6877 [==============================] - 3s 475us/step - loss: 0.5988 - acc: 0.6866 - val_loss: 0.8766 - val_acc: 0.5480
Epoch 4/10
6877/6877 [==============================] - 3s 471us/step - loss: 0.4709 - acc: 0.7999 - val_loss: 0.7022 - val_acc: 0.5863
Epoch 5/10
6877/6877 [==============================] - 3s 469us/step - loss: 0.3425 - acc: 0.8624 - val_loss: 0.6694 - val_acc: 0.6505
Epoch 6/10
6877/6877 [==============================] - 3s 469us/step - loss: 0.2866 - acc: 0.8917 - val_loss: 0.5661 - val_acc: 0.7211
Epoch 7/10
6877/6877 [==============================] - ETA: 1s - loss: 0.1464 - acc: 0.960 - ETA: 1s - loss: 0.3060 - acc: 0.854 - ETA: 1s - loss: 0.2915 - acc: 0.878 - ETA: 1s - loss: 0.2817 - acc: 0.878 - ETA: 1s - loss: 0.2623 - acc: 0.888 - ETA: 1s - loss: 0.2587 - acc: 0.892 - ETA: 1s - loss: 0.2607 - acc: 0.892 - ETA: 1s - loss: 0.2610 - acc: 0.892 - ETA: 1s - loss: 0.2569 - acc: 0.896 - ETA: 1s - loss: 0.2527 - acc: 0.897 - ETA: 1s - loss: 0.2500 - acc: 0.898 - ETA: 1s - loss: 0.2501 - acc: 0.899 - ETA: 0s - loss: 0.2517 - acc: 0.898 - ETA: 0s - loss: 0.2525 - acc: 0.897 - ETA: 0s - loss: 0.2530 - acc: 0.895 - ETA: 0s - loss: 0.2503 - acc: 0.897 - ETA: 0s - loss: 0.2495 - acc: 0.898 - ETA: 0s - loss: 0.2505 - acc: 0.896 - ETA: 0s - loss: 0.2504 - acc: 0.897 - ETA: 0s - loss: 0.2467 - acc: 0.899 - ETA: 0s - loss: 0.2471 - acc: 0.899 - ETA: 0s - loss: 0.2499 - acc: 0.898 - ETA: 0s - loss: 0.2509 - acc: 0.898 - ETA: 0s - loss: 0.2497 - acc: 0.898 - ETA: 0s - loss: 0.2491 - acc: 0.899 - ETA: 0s - loss: 0.2466 - acc: 0.901 - ETA: 0s - loss: 0.2481 - acc: 0.900 - ETA: 0s - loss: 0.2483 - acc: 0.900 - ETA: 0s - loss: 0.2495 - acc: 0.900 - ETA: 0s - loss: 0.2508 - acc: 0.900 - ETA: 0s - loss: 0.2526 - acc: 0.899 - 3s 476us/step - loss: 0.2535 - acc: 0.8991 - val_loss: 0.6633 - val_acc: 0.7014
Epoch 8/10
6877/6877 [==============================] - ETA: 1s - loss: 0.4436 - acc: 0.800 - ETA: 1s - loss: 0.2287 - acc: 0.896 - ETA: 1s - loss: 0.2133 - acc: 0.913 - ETA: 1s - loss: 0.2047 - acc: 0.917 - ETA: 1s - loss: 0.2043 - acc: 0.918 - ETA: 1s - loss: 0.1947 - acc: 0.926 - ETA: 1s - loss: 0.1940 - acc: 0.925 - ETA: 1s - loss: 0.1961 - acc: 0.924 - ETA: 1s - loss: 0.2002 - acc: 0.922 - ETA: 1s - loss: 0.2050 - acc: 0.920 - ETA: 1s - loss: 0.2047 - acc: 0.920 - ETA: 1s - loss: 0.2019 - acc: 0.921 - ETA: 0s - loss: 0.2000 - acc: 0.922 - ETA: 0s - loss: 0.2008 - acc: 0.922 - ETA: 0s - loss: 0.2074 - acc: 0.918 - ETA: 0s - loss: 0.2075 - acc: 0.917 - ETA: 0s - loss: 0.2153 - acc: 0.915 - ETA: 0s - loss: 0.2224 - acc: 0.911 - ETA: 0s - loss: 0.2265 - acc: 0.909 - ETA: 0s - loss: 0.2258 - acc: 0.910 - ETA: 0s - loss: 0.2260 - acc: 0.910 - ETA: 0s - loss: 0.2250 - acc: 0.910 - ETA: 0s - loss: 0.2216 - acc: 0.912 - ETA: 0s - loss: 0.2214 - acc: 0.912 - ETA: 0s - loss: 0.2211 - acc: 0.912 - ETA: 0s - loss: 0.2203 - acc: 0.912 - ETA: 0s - loss: 0.2187 - acc: 0.913 - ETA: 0s - loss: 0.2191 - acc: 0.913 - ETA: 0s - loss: 0.2193 - acc: 0.913 - ETA: 0s - loss: 0.2178 - acc: 0.914 - ETA: 0s - loss: 0.2176 - acc: 0.915 - 3s 478us/step - loss: 0.2176 - acc: 0.9155 - val_loss: 0.6637 - val_acc: 0.7144
Epoch 9/10
6877/6877 [==============================] - ETA: 1s - loss: 0.1912 - acc: 0.960 - ETA: 1s - loss: 0.2035 - acc: 0.928 - ETA: 1s - loss: 0.2160 - acc: 0.917 - ETA: 1s - loss: 0.1937 - acc: 0.921 - ETA: 1s - loss: 0.1857 - acc: 0.925 - ETA: 1s - loss: 0.1924 - acc: 0.922 - ETA: 1s - loss: 0.1937 - acc: 0.920 - ETA: 1s - loss: 0.1929 - acc: 0.918 - ETA: 1s - loss: 0.1930 - acc: 0.921 - ETA: 1s - loss: 0.1884 - acc: 0.924 - ETA: 1s - loss: 0.1845 - acc: 0.927 - ETA: 1s - loss: 0.1846 - acc: 0.926 - ETA: 0s - loss: 0.1897 - acc: 0.923 - ETA: 0s - loss: 0.1907 - acc: 0.922 - ETA: 0s - loss: 0.1890 - acc: 0.923 - ETA: 0s - loss: 0.1913 - acc: 0.923 - ETA: 0s - loss: 0.1941 - acc: 0.922 - ETA: 0s - loss: 0.1896 - acc: 0.925 - ETA: 0s - loss: 0.1891 - acc: 0.925 - ETA: 0s - loss: 0.1888 - acc: 0.925 - ETA: 0s - loss: 0.1881 - acc: 0.925 - ETA: 0s - loss: 0.1878 - acc: 0.926 - ETA: 0s - loss: 0.1896 - acc: 0.926 - ETA: 0s - loss: 0.1894 - acc: 0.925 - ETA: 0s - loss: 0.1915 - acc: 0.924 - ETA: 0s - loss: 0.1917 - acc: 0.924 - ETA: 0s - loss: 0.1947 - acc: 0.922 - ETA: 0s - loss: 0.1982 - acc: 0.921 - ETA: 0s - loss: 0.1984 - acc: 0.921 - ETA: 0s - loss: 0.1986 - acc: 0.920 - ETA: 0s - loss: 0.1990 - acc: 0.920 - 3s 466us/step - loss: 0.1990 - acc: 0.9203 - val_loss: 0.6975 - val_acc: 0.7060
Epoch 10/10
6877/6877 [==============================] - ETA: 1s - loss: 0.3060 - acc: 0.880 - ETA: 1s - loss: 0.1647 - acc: 0.938 - ETA: 1s - loss: 0.1785 - acc: 0.937 - ETA: 1s - loss: 0.1654 - acc: 0.944 - ETA: 1s - loss: 0.1560 - acc: 0.945 - ETA: 1s - loss: 0.1521 - acc: 0.945 - ETA: 1s - loss: 0.1528 - acc: 0.944 - ETA: 1s - loss: 0.1473 - acc: 0.946 - ETA: 1s - loss: 0.1461 - acc: 0.947 - ETA: 1s - loss: 0.1476 - acc: 0.948 - ETA: 1s - loss: 0.1624 - acc: 0.941 - ETA: 0s - loss: 0.1651 - acc: 0.940 - ETA: 0s - loss: 0.1679 - acc: 0.936 - ETA: 0s - loss: 0.1664 - acc: 0.937 - ETA: 0s - loss: 0.1664 - acc: 0.936 - ETA: 0s - loss: 0.1668 - acc: 0.936 - ETA: 0s - loss: 0.1660 - acc: 0.937 - ETA: 0s - loss: 0.1659 - acc: 0.938 - ETA: 0s - loss: 0.1632 - acc: 0.938 - ETA: 0s - loss: 0.1646 - acc: 0.938 - ETA: 0s - loss: 0.1685 - acc: 0.937 - ETA: 0s - loss: 0.1679 - acc: 0.937 - ETA: 0s - loss: 0.1666 - acc: 0.937 - ETA: 0s - loss: 0.1670 - acc: 0.937 - ETA: 0s - loss: 0.1668 - acc: 0.937 - ETA: 0s - loss: 0.1680 - acc: 0.937 - ETA: 0s - loss: 0.1702 - acc: 0.936 - ETA: 0s - loss: 0.1702 - acc: 0.936 - ETA: 0s - loss: 0.1706 - acc: 0.935 - ETA: 0s - loss: 0.1682 - acc: 0.936 - ETA: 0s - loss: 0.1674 - acc: 0.936 - 3s 468us/step - loss: 0.1670 - acc: 0.9372 - val_loss: 0.6829 - val_acc: 0.7318
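The run converges because the output layer and loss now agree with the binary labels. A minimal sketch of such a pairing, assuming 0/1 labels (layer sizes copied from the original network; the key change is one sigmoid unit with `binary_crossentropy` rather than a single relu unit with `sparse_categorical_crossentropy`, the combination that produced the NaN loss):

```python
# Sketch only: same architecture as above, but with an output/loss
# pairing that is valid for binary classification.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense

model = Sequential([
    Conv1D(16, 16, activation='relu', input_shape=(300, 36)),
    Conv1D(16, 16, activation='relu'),
    MaxPooling1D(pool_size=2),
    Dropout(0.25),
    Flatten(),
    Dense(100, activation='relu'),
    Dropout(0.5),
    # One sigmoid unit: relu here can saturate at 0, and
    # sparse_categorical_crossentropy expects >= 2 output units.
    Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])
```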

I believe that with a bit more hyperparameter tuning, this will be good to go!
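One low-effort next step for that tuning: the log shows training accuracy (0.94) pulling steadily away from validation accuracy (0.73), a sign of overfitting, so stopping once `val_loss` stops improving may help. A sketch using Keras's built-in `EarlyStopping` callback (the patience value is illustrative):

```python
from tensorflow.keras.callbacks import EarlyStopping

# Stop training once val_loss has not improved for 3 consecutive
# epochs, and roll back to the best weights seen so far.
early_stop = EarlyStopping(monitor='val_loss', patience=3,
                           restore_best_weights=True)

# Passed to fit() alongside the existing arguments:
# model.fit(x_train, y_train, batch_size=25, epochs=50,
#           validation_data=(x_test, y_test), callbacks=[early_stop])
```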

Note: my train/test ratio is off — that was intentional. I don't really understand machine learning; I spent more than four hours tweaking this network on guesswork before it trained normally. If there are mistakes or omissions, please go easy on me~
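On that train/test ratio: a sketch of a shuffled 80/20 split in plain NumPy (the helper name and ratio are my own choices for illustration; scikit-learn's `sklearn.model_selection.train_test_split` offers the same thing ready-made):

```python
import numpy as np

# Hypothetical helper: shuffle, then carve off test_ratio of the
# samples as the test set and keep the rest for training.
def train_test_split(x, y, test_ratio=0.2, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    n_test = int(len(x) * test_ratio)
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    return x[train_idx], x[test_idx], y[train_idx], y[test_idx]

x = np.zeros((100, 300, 36))  # dummy data with the network's input shape
y = np.zeros(100)
x_train, x_test, y_train, y_test = train_test_split(x, y)
print(len(x_train), len(x_test))  # prints "80 20"
```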

References:

https://www.zhihu.com/question/49346370

https://www.cnblogs.com/kjkj/p/10528259.html

http://www.voidcn.com/article/p-emsxaiqs-buk.html

 