Using Google's Cloud Servers for Deep Learning

Connecting Google Drive:

If you use Colab on its own, files are not preserved when the session disconnects. There are currently two ways to connect Google Drive (see here):

Method 1 (this one worked for me, though the directory was gone on my second login, so it apparently has to be reconfigured every session):

from google.colab import drive
drive.mount('/content/drive')
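
After the mount succeeds, the Drive contents live under /content/drive/My Drive. A minimal sanity check (the paths are of course whatever happens to be in your own Drive):

import os
# list the top level of the mounted Drive to confirm the mount worked
print(os.listdir('/content/drive/My Drive'))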

Method 2 (this first failed with an error I couldn't resolve; a day later it worked. It too has to be reconfigured after a disconnect):

!apt-get install -y -qq software-properties-common python-software-properties module-init-tools
!add-apt-repository -y ppa:alessandro-strada/ppa 2>&1 > /dev/null
!apt-get update -qq 2>&1 > /dev/null
!apt-get -y install -qq google-drive-ocamlfuse fuse
from google.colab import auth
auth.authenticate_user()
from oauth2client.client import GoogleCredentials
creds = GoogleCredentials.get_application_default()
import getpass
!google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL
vcode = getpass.getpass()
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}

Run the code above, click the link it prints, and enter the verification code twice. On success you will see:

Please enter the verification code: Access token retrieved correctly.

Then run:

!mkdir -p drive
!google-drive-ocamlfuse -o nonempty drive

Once this succeeds, a drive directory appears in the file pane.
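
With the mount in place, anything written under drive/ persists across disconnects, so it is worth copying training artifacts there. For example, for the learnCNN.h5 file saved by the exercises below:

# copy the saved model into Drive so it survives the session
!cp learnCNN.h5 drive/
!ls drive/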

A few other write-ups are also worth consulting: a Colab guide, a basic setup tutorial, and a WeChat official-account post.

First exercise:

A program that ran fine with Keras on my own computer errored out on the cloud server, possibly something to do with the network architecture I had built; the exact cause is unclear. Switching to the simplest possible network made it run. I have to say, the cloud server really is fast! A pity I can't afford a Tesla K80 of my own. The code:

import numpy as np
import tensorflow as tf
import pandas as pd
import matplotlib.pyplot as plt
from keras.models import Sequential
from keras.layers import Dense,Activation,Conv2D
from keras.layers import MaxPool2D,Flatten,Dropout,ZeroPadding2D,BatchNormalization
from keras.utils import np_utils
from keras import metrics
import keras
from keras.models import save_model,load_model
from keras.models import Model
from keras.callbacks import ModelCheckpoint
import os
from keras.utils import plot_model
df=pd.read_csv("sample_data/mnist_train_small.csv",header=None)
data=df.values  # DataFrame.as_matrix() was removed in newer pandas; .values is equivalent
df=None


np.random.shuffle(data)
x_train=data[:,1:]
x_train=x_train.reshape(data.shape[0],28,28,1).astype("float32")  # reshape x_train into data.shape[0] images of 28x28 with 1 channel
x_train=x_train/255.0

y_train=np_utils.to_categorical(data[:,0],10).astype("float32")  # map the class vector (integers 0..9) to a one-hot matrix, as required by a categorical_crossentropy loss
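# e.g. np_utils.to_categorical([1, 0, 3], 4) gives
# [[0,1,0,0], [1,0,0,0], [0,0,0,1]]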

# x_train_test=np.random.random((10,28,28,1))
# y_train_test=keras.utils.to_categorical(np.random.randint(10,size=(10,1)),num_classes=10)

print(x_train.shape)
print(y_train.shape)

batch_size=64
n_filters=32
pool_size=(2,2)

cnn_net=Sequential()
cnn_net.add(Conv2D(32,kernel_size=(3,3),strides=(1,1),input_shape=(28,28,1)))
cnn_net.add(Activation('relu'))

cnn_net.add(MaxPool2D(pool_size=(2, 2)))
cnn_net.add(Conv2D(filters=64,kernel_size=(3, 3),strides=(1,1)))
cnn_net.add(Activation('relu'))

cnn_net.add(MaxPool2D(pool_size=(2, 2)))
cnn_net.add(Conv2D(filters=64,kernel_size=(3, 3),strides=(1,1)))
cnn_net.add(Activation('relu'))



#The Flatten layer "flattens" the input, turning the multi-dimensional tensor into a 1-D vector; it is the usual transition from convolutional layers to fully connected layers.
cnn_net.add(Flatten())

  
cnn_net.add(Dense(64))
cnn_net.add(Activation('relu'))
 
cnn_net.add(Dense(10))
cnn_net.add(Activation('softmax'))
#summary() prints the network structure
cnn_net.summary()
cnn_net.compile(loss='categorical_crossentropy',optimizer='adam', metrics=['acc'])

plot_model(cnn_net, to_file='model_size.png')


hist=cnn_net.fit(x_train,y_train,batch_size=batch_size,epochs=50,verbose=1,validation_split=0.2)
cnn_net.save('learnCNN.h5')
plt.plot(hist.history['loss'])
plt.plot(hist.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
print('finished!')
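
After training, cnn_net.save('learnCNN.h5') writes the whole model to disk, and load_model (already imported above) restores it; a quick check that the round trip works:

# reload the saved model and compare its predictions with the labels
restored = load_model('learnCNN.h5')
print(restored.predict(x_train[:5]).argmax(axis=1))  # predicted digits
print(y_train[:5].argmax(axis=1))                    # true digits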

The training run:

(20000, 28, 28, 1)
(20000, 10)
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_11 (Conv2D)           (None, 26, 26, 32)        320       
_________________________________________________________________
activation_11 (Activation)   (None, 26, 26, 32)        0         
_________________________________________________________________
max_pooling2d_7 (MaxPooling2 (None, 13, 13, 32)        0         
_________________________________________________________________
conv2d_12 (Conv2D)           (None, 11, 11, 64)        18496     
_________________________________________________________________
activation_12 (Activation)   (None, 11, 11, 64)        0         
_________________________________________________________________
max_pooling2d_8 (MaxPooling2 (None, 5, 5, 64)          0         
_________________________________________________________________
conv2d_13 (Conv2D)           (None, 3, 3, 64)          36928     
_________________________________________________________________
activation_13 (Activation)   (None, 3, 3, 64)          0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 576)               0         
_________________________________________________________________
dense_3 (Dense)              (None, 64)                36928     
_________________________________________________________________
activation_14 (Activation)   (None, 64)                0         
_________________________________________________________________
dense_4 (Dense)              (None, 10)                650       
_________________________________________________________________
activation_15 (Activation)   (None, 10)                0         
=================================================================
Total params: 93,322
Trainable params: 93,322
Non-trainable params: 0
_________________________________________________________________
Train on 16000 samples, validate on 4000 samples
Epoch 1/50
16000/16000 [==============================] - 6s 383us/step - loss: 0.4437 - acc: 0.8639 - val_loss: 0.1460 - val_acc: 0.9565
Epoch 2/50
16000/16000 [==============================] - 3s 193us/step - loss: 0.1186 - acc: 0.9629 - val_loss: 0.0925 - val_acc: 0.9683
Epoch 3/50
16000/16000 [==============================] - 3s 193us/step - loss: 0.0759 - acc: 0.9759 - val_loss: 0.0773 - val_acc: 0.9750
Epoch 4/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0557 - acc: 0.9818 - val_loss: 0.0693 - val_acc: 0.9780
Epoch 5/50
16000/16000 [==============================] - 3s 195us/step - loss: 0.0426 - acc: 0.9868 - val_loss: 0.0573 - val_acc: 0.9852
Epoch 6/50
16000/16000 [==============================] - 3s 193us/step - loss: 0.0310 - acc: 0.9909 - val_loss: 0.0535 - val_acc: 0.9852
Epoch 7/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0277 - acc: 0.9910 - val_loss: 0.0563 - val_acc: 0.9835
Epoch 8/50
16000/16000 [==============================] - 3s 195us/step - loss: 0.0194 - acc: 0.9934 - val_loss: 0.0560 - val_acc: 0.9828
Epoch 9/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0160 - acc: 0.9948 - val_loss: 0.0750 - val_acc: 0.9770
Epoch 10/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0172 - acc: 0.9949 - val_loss: 0.0565 - val_acc: 0.9838
Epoch 11/50
16000/16000 [==============================] - 3s 195us/step - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0565 - val_acc: 0.9832
Epoch 12/50
16000/16000 [==============================] - 3s 195us/step - loss: 0.0110 - acc: 0.9966 - val_loss: 0.0831 - val_acc: 0.9792
Epoch 13/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0105 - acc: 0.9969 - val_loss: 0.0622 - val_acc: 0.9855
Epoch 14/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0099 - acc: 0.9969 - val_loss: 0.0647 - val_acc: 0.9840
Epoch 15/50
16000/16000 [==============================] - 3s 195us/step - loss: 0.0066 - acc: 0.9979 - val_loss: 0.0702 - val_acc: 0.9852
Epoch 16/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0075 - acc: 0.9978 - val_loss: 0.0583 - val_acc: 0.9868
Epoch 17/50
16000/16000 [==============================] - 3s 195us/step - loss: 0.0090 - acc: 0.9967 - val_loss: 0.0622 - val_acc: 0.9850
Epoch 18/50
16000/16000 [==============================] - 3s 193us/step - loss: 0.0159 - acc: 0.9943 - val_loss: 0.0972 - val_acc: 0.9758
Epoch 19/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0085 - acc: 0.9973 - val_loss: 0.0598 - val_acc: 0.9872
Epoch 20/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0083 - acc: 0.9972 - val_loss: 0.0586 - val_acc: 0.9865
Epoch 21/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0021 - acc: 0.9994 - val_loss: 0.0600 - val_acc: 0.9865
Epoch 22/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0028 - acc: 0.9992 - val_loss: 0.0624 - val_acc: 0.9860
Epoch 23/50
16000/16000 [==============================] - 3s 195us/step - loss: 0.0024 - acc: 0.9994 - val_loss: 0.0702 - val_acc: 0.9858
Epoch 24/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0072 - acc: 0.9974 - val_loss: 0.0666 - val_acc: 0.9868
Epoch 25/50
16000/16000 [==============================] - 3s 193us/step - loss: 0.0089 - acc: 0.9969 - val_loss: 0.0666 - val_acc: 0.9875
Epoch 26/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0071 - acc: 0.9975 - val_loss: 0.0679 - val_acc: 0.9865
Epoch 27/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0090 - acc: 0.9970 - val_loss: 0.0710 - val_acc: 0.9852
Epoch 28/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0042 - acc: 0.9983 - val_loss: 0.0811 - val_acc: 0.9860
Epoch 29/50
16000/16000 [==============================] - 3s 195us/step - loss: 0.0063 - acc: 0.9978 - val_loss: 0.0783 - val_acc: 0.9862
Epoch 30/50
16000/16000 [==============================] - 3s 194us/step - loss: 0.0036 - acc: 0.9989 - val_loss: 0.0736 - val_acc: 0.9878
Epoch 31/50
16000/16000 [==============================] - 3s 194us/step - loss: 4.6580e-04 - acc: 0.9999 - val_loss: 0.0700 - val_acc: 0.9898
Epoch 32/50
16000/16000 [==============================] - 3s 195us/step - loss: 6.9468e-05 - acc: 1.0000 - val_loss: 0.0711 - val_acc: 0.9892
Epoch 33/50
16000/16000 [==============================] - 3s 194us/step - loss: 3.4916e-05 - acc: 1.0000 - val_loss: 0.0714 - val_acc: 0.9895
Epoch 34/50
16000/16000 [==============================] - 3s 194us/step - loss: 2.5814e-05 - acc: 1.0000 - val_loss: 0.0721 - val_acc: 0.9892
Epoch 35/50
16000/16000 [==============================] - 3s 195us/step - loss: 2.0238e-05 - acc: 1.0000 - val_loss: 0.0728 - val_acc: 0.9892
Epoch 36/50
16000/16000 [==============================] - 3s 194us/step - loss: 1.6267e-05 - acc: 1.0000 - val_loss: 0.0732 - val_acc: 0.9892
Epoch 37/50
16000/16000 [==============================] - 3s 194us/step - loss: 1.3378e-05 - acc: 1.0000 - val_loss: 0.0737 - val_acc: 0.9892
Epoch 38/50
16000/16000 [==============================] - 3s 194us/step - loss: 1.1116e-05 - acc: 1.0000 - val_loss: 0.0743 - val_acc: 0.9892
Epoch 39/50
16000/16000 [==============================] - 3s 194us/step - loss: 9.3538e-06 - acc: 1.0000 - val_loss: 0.0747 - val_acc: 0.9892
Epoch 40/50
16000/16000 [==============================] - 3s 194us/step - loss: 7.9267e-06 - acc: 1.0000 - val_loss: 0.0753 - val_acc: 0.9890
Epoch 41/50
16000/16000 [==============================] - 3s 193us/step - loss: 6.7622e-06 - acc: 1.0000 - val_loss: 0.0758 - val_acc: 0.9890
Epoch 42/50
16000/16000 [==============================] - 3s 194us/step - loss: 5.7619e-06 - acc: 1.0000 - val_loss: 0.0761 - val_acc: 0.9890
Epoch 43/50
16000/16000 [==============================] - 3s 194us/step - loss: 4.9902e-06 - acc: 1.0000 - val_loss: 0.0767 - val_acc: 0.9890
Epoch 44/50
16000/16000 [==============================] - 3s 193us/step - loss: 4.3057e-06 - acc: 1.0000 - val_loss: 0.0771 - val_acc: 0.9892
Epoch 45/50
16000/16000 [==============================] - 3s 194us/step - loss: 3.7099e-06 - acc: 1.0000 - val_loss: 0.0776 - val_acc: 0.9892
Epoch 46/50
16000/16000 [==============================] - 3s 194us/step - loss: 3.2309e-06 - acc: 1.0000 - val_loss: 0.0780 - val_acc: 0.9892
Epoch 47/50
16000/16000 [==============================] - 3s 195us/step - loss: 2.8210e-06 - acc: 1.0000 - val_loss: 0.0784 - val_acc: 0.9890
Epoch 48/50
16000/16000 [==============================] - 3s 194us/step - loss: 2.4669e-06 - acc: 1.0000 - val_loss: 0.0791 - val_acc: 0.9890
Epoch 49/50
16000/16000 [==============================] - 3s 194us/step - loss: 2.1527e-06 - acc: 1.0000 - val_loss: 0.0794 - val_acc: 0.9890
Epoch 50/50
16000/16000 [==============================] - 3s 193us/step - loss: 1.8979e-06 - acc: 1.0000 - val_loss: 0.0799 - val_acc: 0.9890
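
Looking at the log, training accuracy reaches 1.0000 by around epoch 32 while val_loss keeps creeping upward: textbook overfitting. One common remedy, sketched here but not used in the run above, is Keras's EarlyStopping callback:

from keras.callbacks import EarlyStopping

# stop once val_loss has not improved for 5 consecutive epochs;
# restore_best_weights needs Keras >= 2.2.3 (drop it on older versions)
early_stop = EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)
hist = cnn_net.fit(x_train, y_train, batch_size=batch_size, epochs=50,
                   verbose=1, validation_split=0.2, callbacks=[early_stop])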

Second exercise:

import numpy as np
import tensorflow as tf
import pandas as pd
import matplotlib.pyplot as plt
from keras.models import Sequential
from keras.layers import Dense,Activation,Conv2D
from keras.layers import MaxPool2D,Flatten,Dropout,ZeroPadding2D,BatchNormalization
from keras.utils import np_utils
from keras import metrics
import keras
from keras.models import save_model,load_model
from keras.models import Model
from keras.callbacks import ModelCheckpoint
import os
from keras.utils import plot_model
df=pd.read_csv("sample_data/mnist_train_small.csv",header=None)
data=df.values  # DataFrame.as_matrix() was removed in newer pandas; .values is equivalent
df=None


np.random.shuffle(data)
x_train=data[:,1:]
x_train=x_train.reshape(data.shape[0],28,28,1).astype("float32")  # reshape x_train into data.shape[0] images of 28x28 with 1 channel
x_train=x_train/255.0

y_train=np_utils.to_categorical(data[:,0],10).astype("float32")  # map the class vector (integers 0..9) to a one-hot matrix, as required by a categorical_crossentropy loss



print(x_train.shape)
print(y_train.shape)

batch_size=64
n_filters=32
pool_size=(2,2)

model = Sequential()
model.add(ZeroPadding2D(padding=(1,1),input_shape=(28,28,1)))


model.add(Conv2D(filters=64, kernel_size=(3, 3),padding = 'same',activation='relu'))

model.add(Conv2D(filters=64, kernel_size=(3, 3),padding = 'same',activation='relu')) #30 after the zero padding


model.add(MaxPool2D(pool_size=(2,2)))

model.add(Conv2D(filters=128, kernel_size=(3, 3),padding = 'same',activation='relu'))

model.add(Conv2D(filters=128, kernel_size=(3, 3),padding = 'same',activation='relu')) #15


#model.add(MaxPool2D(pool_size=(2,2)))


 
model.add(Flatten())  # flatten the feature maps into a 1-D vector (here 15*15*128 = 28800)
model.add(Dense(512, activation='relu'))  # fully connected layer with 512 units
model.add(Dropout(0.5))  # drop units with probability 0.5
model.add(Dense(512, activation='relu'))  # a second fully connected layer
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))
#summary() prints the network structure
model.summary()
model.compile(loss='categorical_crossentropy',optimizer='adam', metrics=['acc'])

plot_model(model, to_file='model_size.png')


hist=model.fit(x_train,y_train,batch_size=batch_size,epochs=70,verbose=1,validation_split=0.2)
model.save('learnCNN.h5')
plt.plot(hist.history['loss'])
plt.plot(hist.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.savefig("loss.png")  # save before show(): with the inline backend, show() closes the figure and a later savefig writes a blank image
plt.show()
plt.clf()
plt.plot(hist.history['acc'])
plt.plot(hist.history['val_acc'])
plt.title('model acc')
plt.ylabel('acc')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.savefig("acc.png")
plt.show()
print('finished!')

Training output and results:

(20000, 28, 28, 1)
(20000, 10)
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
zero_padding2d_8 (ZeroPaddin (None, 30, 30, 1)         0         
_________________________________________________________________
conv2d_23 (Conv2D)           (None, 30, 30, 64)        640       
_________________________________________________________________
conv2d_24 (Conv2D)           (None, 30, 30, 64)        36928     
_________________________________________________________________
max_pooling2d_10 (MaxPooling (None, 15, 15, 64)        0         
_________________________________________________________________
conv2d_25 (Conv2D)           (None, 15, 15, 128)       73856     
_________________________________________________________________
conv2d_26 (Conv2D)           (None, 15, 15, 128)       147584    
_________________________________________________________________
flatten_2 (Flatten)          (None, 28800)             0         
_________________________________________________________________
dense_4 (Dense)              (None, 512)               14746112  
_________________________________________________________________
dropout_3 (Dropout)          (None, 512)               0         
_________________________________________________________________
dense_5 (Dense)              (None, 512)               262656    
_________________________________________________________________
dropout_4 (Dropout)          (None, 512)               0         
_________________________________________________________________
dense_6 (Dense)              (None, 10)                5130      
=================================================================
Total params: 15,272,906
Trainable params: 15,272,906
Non-trainable params: 0
_________________________________________________________________
Train on 16000 samples, validate on 4000 samples
Epoch 1/70
16000/16000 [==============================] - 17s 1ms/step - loss: 0.3283 - acc: 0.8982 - val_loss: 0.0905 - val_acc: 0.9705
Epoch 2/70
16000/16000 [==============================] - 12s 765us/step - loss: 0.1001 - acc: 0.9706 - val_loss: 0.0752 - val_acc: 0.9772
Epoch 3/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0629 - acc: 0.9819 - val_loss: 0.0522 - val_acc: 0.9852
Epoch 4/70
16000/16000 [==============================] - 12s 767us/step - loss: 0.0503 - acc: 0.9853 - val_loss: 0.0474 - val_acc: 0.9855
Epoch 5/70
16000/16000 [==============================] - 12s 768us/step - loss: 0.0378 - acc: 0.9894 - val_loss: 0.0546 - val_acc: 0.9862
Epoch 6/70
16000/16000 [==============================] - 12s 770us/step - loss: 0.0299 - acc: 0.9910 - val_loss: 0.0623 - val_acc: 0.9840
Epoch 7/70
16000/16000 [==============================] - 12s 763us/step - loss: 0.0276 - acc: 0.9908 - val_loss: 0.0532 - val_acc: 0.9880
Epoch 8/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0229 - acc: 0.9931 - val_loss: 0.0564 - val_acc: 0.9855
Epoch 9/70
16000/16000 [==============================] - 12s 767us/step - loss: 0.0207 - acc: 0.9940 - val_loss: 0.0753 - val_acc: 0.9842
Epoch 10/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0201 - acc: 0.9938 - val_loss: 0.0855 - val_acc: 0.9828
Epoch 11/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0172 - acc: 0.9947 - val_loss: 0.0541 - val_acc: 0.9875
Epoch 12/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0185 - acc: 0.9947 - val_loss: 0.0817 - val_acc: 0.9790
Epoch 13/70
16000/16000 [==============================] - 12s 765us/step - loss: 0.0171 - acc: 0.9943 - val_loss: 0.0705 - val_acc: 0.9852
Epoch 14/70
16000/16000 [==============================] - 12s 763us/step - loss: 0.0152 - acc: 0.9956 - val_loss: 0.0493 - val_acc: 0.9892
Epoch 15/70
16000/16000 [==============================] - 12s 766us/step - loss: 0.0075 - acc: 0.9977 - val_loss: 0.0773 - val_acc: 0.9880
Epoch 16/70
16000/16000 [==============================] - 12s 768us/step - loss: 0.0156 - acc: 0.9958 - val_loss: 0.0674 - val_acc: 0.9860
Epoch 17/70
16000/16000 [==============================] - 12s 767us/step - loss: 0.0080 - acc: 0.9973 - val_loss: 0.0822 - val_acc: 0.9845
Epoch 18/70
16000/16000 [==============================] - 12s 770us/step - loss: 0.0163 - acc: 0.9954 - val_loss: 0.0919 - val_acc: 0.9822
Epoch 19/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0273 - acc: 0.9925 - val_loss: 0.0790 - val_acc: 0.9818
Epoch 20/70
16000/16000 [==============================] - 12s 765us/step - loss: 0.0103 - acc: 0.9971 - val_loss: 0.0560 - val_acc: 0.9878
Epoch 21/70
16000/16000 [==============================] - 12s 766us/step - loss: 0.0110 - acc: 0.9972 - val_loss: 0.0738 - val_acc: 0.9852
Epoch 22/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0116 - acc: 0.9966 - val_loss: 0.0503 - val_acc: 0.9898
Epoch 23/70
16000/16000 [==============================] - 12s 762us/step - loss: 0.0079 - acc: 0.9979 - val_loss: 0.0835 - val_acc: 0.9868
Epoch 24/70
16000/16000 [==============================] - 12s 768us/step - loss: 0.0070 - acc: 0.9981 - val_loss: 0.0900 - val_acc: 0.9882
Epoch 25/70
16000/16000 [==============================] - 12s 768us/step - loss: 0.0170 - acc: 0.9952 - val_loss: 0.0666 - val_acc: 0.9880
Epoch 26/70
16000/16000 [==============================] - 12s 768us/step - loss: 0.0075 - acc: 0.9980 - val_loss: 0.0583 - val_acc: 0.9905
Epoch 27/70
16000/16000 [==============================] - 12s 774us/step - loss: 0.0056 - acc: 0.9978 - val_loss: 0.0981 - val_acc: 0.9882
Epoch 28/70
16000/16000 [==============================] - 12s 771us/step - loss: 0.0241 - acc: 0.9954 - val_loss: 0.1173 - val_acc: 0.9845
Epoch 29/70
16000/16000 [==============================] - 12s 765us/step - loss: 0.0052 - acc: 0.9987 - val_loss: 0.0826 - val_acc: 0.9870
Epoch 30/70
16000/16000 [==============================] - 12s 770us/step - loss: 0.0095 - acc: 0.9977 - val_loss: 0.0903 - val_acc: 0.9862
Epoch 31/70
16000/16000 [==============================] - 12s 766us/step - loss: 0.0152 - acc: 0.9974 - val_loss: 0.0708 - val_acc: 0.9875
Epoch 32/70
16000/16000 [==============================] - 12s 764us/step - loss: 0.0064 - acc: 0.9982 - val_loss: 0.0757 - val_acc: 0.9890
Epoch 33/70
16000/16000 [==============================] - 12s 771us/step - loss: 0.0089 - acc: 0.9977 - val_loss: 0.0755 - val_acc: 0.9882
Epoch 34/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0079 - acc: 0.9984 - val_loss: 0.0745 - val_acc: 0.9890
Epoch 35/70
16000/16000 [==============================] - 12s 768us/step - loss: 0.0032 - acc: 0.9990 - val_loss: 0.0695 - val_acc: 0.9900
Epoch 36/70
16000/16000 [==============================] - 12s 767us/step - loss: 0.0096 - acc: 0.9979 - val_loss: 0.0711 - val_acc: 0.9872
Epoch 37/70
16000/16000 [==============================] - 12s 771us/step - loss: 0.0081 - acc: 0.9981 - val_loss: 0.0791 - val_acc: 0.9878
Epoch 38/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0041 - acc: 0.9988 - val_loss: 0.0742 - val_acc: 0.9895
Epoch 39/70
16000/16000 [==============================] - 12s 770us/step - loss: 0.0020 - acc: 0.9992 - val_loss: 0.0839 - val_acc: 0.9882
Epoch 40/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0086 - acc: 0.9981 - val_loss: 0.0655 - val_acc: 0.9920
Epoch 41/70
16000/16000 [==============================] - 12s 766us/step - loss: 0.0060 - acc: 0.9981 - val_loss: 0.0795 - val_acc: 0.9882
Epoch 42/70
16000/16000 [==============================] - 13s 826us/step - loss: 0.0118 - acc: 0.9970 - val_loss: 0.0824 - val_acc: 0.9882
Epoch 43/70
16000/16000 [==============================] - 12s 779us/step - loss: 0.0128 - acc: 0.9971 - val_loss: 0.1105 - val_acc: 0.9870
Epoch 44/70
16000/16000 [==============================] - 12s 777us/step - loss: 0.0234 - acc: 0.9959 - val_loss: 0.1142 - val_acc: 0.9878
Epoch 45/70
16000/16000 [==============================] - 12s 774us/step - loss: 0.0181 - acc: 0.9969 - val_loss: 0.1035 - val_acc: 0.9870
Epoch 46/70
16000/16000 [==============================] - 12s 775us/step - loss: 0.0113 - acc: 0.9978 - val_loss: 0.0940 - val_acc: 0.9890
Epoch 47/70
16000/16000 [==============================] - 12s 779us/step - loss: 0.0063 - acc: 0.9991 - val_loss: 0.0950 - val_acc: 0.9882
Epoch 48/70
16000/16000 [==============================] - 12s 781us/step - loss: 0.0045 - acc: 0.9989 - val_loss: 0.0956 - val_acc: 0.9872
Epoch 49/70
16000/16000 [==============================] - 12s 767us/step - loss: 0.0087 - acc: 0.9981 - val_loss: 0.1008 - val_acc: 0.9888
Epoch 50/70
16000/16000 [==============================] - 12s 771us/step - loss: 0.0137 - acc: 0.9976 - val_loss: 0.0728 - val_acc: 0.9908
Epoch 51/70
16000/16000 [==============================] - 12s 776us/step - loss: 0.0074 - acc: 0.9983 - val_loss: 0.0860 - val_acc: 0.9895
Epoch 52/70
16000/16000 [==============================] - 12s 778us/step - loss: 0.0050 - acc: 0.9985 - val_loss: 0.0928 - val_acc: 0.9895
Epoch 53/70
16000/16000 [==============================] - 12s 766us/step - loss: 0.0056 - acc: 0.9985 - val_loss: 0.1286 - val_acc: 0.9875
Epoch 54/70
16000/16000 [==============================] - 12s 776us/step - loss: 0.0116 - acc: 0.9980 - val_loss: 0.1413 - val_acc: 0.9868
Epoch 55/70
16000/16000 [==============================] - 12s 778us/step - loss: 0.0166 - acc: 0.9973 - val_loss: 0.1010 - val_acc: 0.9882
Epoch 56/70
16000/16000 [==============================] - 12s 769us/step - loss: 0.0027 - acc: 0.9991 - val_loss: 0.1015 - val_acc: 0.9875
Epoch 57/70
16000/16000 [==============================] - 12s 778us/step - loss: 0.0019 - acc: 0.9996 - val_loss: 0.0946 - val_acc: 0.9905
Epoch 58/70
16000/16000 [==============================] - 12s 772us/step - loss: 0.0042 - acc: 0.9993 - val_loss: 0.1181 - val_acc: 0.9878
Epoch 59/70
16000/16000 [==============================] - 12s 770us/step - loss: 0.0077 - acc: 0.9986 - val_loss: 0.1102 - val_acc: 0.9880
Epoch 60/70
16000/16000 [==============================] - 12s 775us/step - loss: 0.0097 - acc: 0.9983 - val_loss: 0.1141 - val_acc: 0.9875
Epoch 61/70
16000/16000 [==============================] - 12s 780us/step - loss: 0.0158 - acc: 0.9979 - val_loss: 0.1061 - val_acc: 0.9870
Epoch 62/70
16000/16000 [==============================] - 12s 778us/step - loss: 0.0087 - acc: 0.9985 - val_loss: 0.0876 - val_acc: 0.9885
Epoch 63/70
16000/16000 [==============================] - 12s 772us/step - loss: 0.0070 - acc: 0.9984 - val_loss: 0.1457 - val_acc: 0.9848
Epoch 64/70
16000/16000 [==============================] - 12s 779us/step - loss: 0.0145 - acc: 0.9978 - val_loss: 0.1175 - val_acc: 0.9868
Epoch 65/70
16000/16000 [==============================] - 12s 774us/step - loss: 0.0114 - acc: 0.9979 - val_loss: 0.1168 - val_acc: 0.9872
Epoch 66/70
16000/16000 [==============================] - 12s 774us/step - loss: 0.0060 - acc: 0.9990 - val_loss: 0.1191 - val_acc: 0.9870
Epoch 67/70
16000/16000 [==============================] - 13s 786us/step - loss: 0.0104 - acc: 0.9983 - val_loss: 0.1297 - val_acc: 0.9860
Epoch 68/70
16000/16000 [==============================] - 12s 773us/step - loss: 0.0066 - acc: 0.9990 - val_loss: 0.1156 - val_acc: 0.9882
Epoch 69/70
16000/16000 [==============================] - 12s 771us/step - loss: 0.0030 - acc: 0.9992 - val_loss: 0.1482 - val_acc: 0.9855
Epoch 70/70
16000/16000 [==============================] - 12s 774us/step - loss: 0.0070 - acc: 0.9990 - val_loss: 0.1303 - val_acc: 0.9865

Error:

ValueError: Negative dimension size caused by subtracting 3 from 1 for 'conv2d_2/convolution' (op: 'Conv2D') with input shapes: [?,1,28,64], [3,3,64,64].

This kind of error is generally an architecture problem; in this exercise the mistake was in how the input was specified.
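
The shapes in the message are the hint: the convolution received a tensor whose height was already 1, so the input dimensions were fed in the wrong order or had shrunk too far. A quick way to localize this kind of failure (just a diagnostic sketch, not part of the exercise) is to print the output shape after each layer while building the model:

from keras.models import Sequential
from keras.layers import Conv2D, MaxPool2D

m = Sequential()
m.add(Conv2D(64, kernel_size=(3, 3), input_shape=(28, 28, 1)))  # channels-last: (H, W, C)
print(m.output_shape)  # (None, 26, 26, 64): 'valid' padding shrinks H and W by 2
m.add(MaxPool2D(pool_size=(2, 2)))
print(m.output_shape)  # (None, 13, 13, 64)
# a negative dimension means H or W dropped below the kernel size,
# e.g. because input_shape was given channels-first by mistake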

Third exercise:

The training worked reasonably well this time.

import numpy as np
import tensorflow as tf
import pandas as pd
import matplotlib.pyplot as plt
from keras.models import Sequential
from keras.layers import Dense,Activation,Conv2D
from keras.layers import MaxPool2D,Flatten,Dropout,ZeroPadding2D,BatchNormalization
from keras.utils import np_utils
from keras import metrics
import keras
from keras.models import save_model,load_model
from keras.models import Model
from keras.callbacks import ModelCheckpoint
import os
from keras.utils import plot_model
df=pd.read_csv("sample_data/mnist_train_small.csv",header=None)
data=df.values  # DataFrame.as_matrix() was removed in newer pandas; .values is equivalent
df=None


np.random.shuffle(data)
x_train=data[:,1:]
x_train=x_train.reshape(data.shape[0],28,28,1).astype("float32")  # reshape x_train into data.shape[0] images of 28x28 with 1 channel
x_train=x_train/255.0

y_train=np_utils.to_categorical(data[:,0],10).astype("float32")  # map the class vector (integers 0..9) to a one-hot matrix, as required by a categorical_crossentropy loss



print(x_train.shape)
print(y_train.shape)

batch_size=64
n_filters=32
pool_size=(2,2)

model = Sequential()
#model.add(ZeroPadding2D(padding=(1,1),input_shape=(28,28,1)))


model.add(Conv2D(filters=64, kernel_size=(3, 3),padding = 'same',activation='relu',input_shape=(28,28,1)))

model.add(Conv2D(filters=64, kernel_size=(3, 3),padding = 'same',activation='relu')) #28


model.add(MaxPool2D(pool_size=(2,2)))

model.add(Conv2D(filters=128, kernel_size=(3, 3),padding = 'same',activation='relu'))

model.add(Conv2D(filters=128, kernel_size=(3, 3),padding = 'same',activation='relu')) #14


model.add(MaxPool2D(pool_size=(2,2)))

model.add(Conv2D(filters=512, kernel_size=(3, 3),padding = 'same'))
model.add(Activation('relu'))
model.add(Conv2D(filters=512, kernel_size=(3, 3),padding = 'same'))
model.add(Activation('relu'))
model.add(Conv2D(filters=512, kernel_size=(3, 3),padding = 'same'))
model.add(Activation('relu'))

model.add(MaxPool2D(pool_size=(2,2)))
 
model.add(Flatten())  # flatten the feature maps into a 1-D vector (here 3*3*512 = 4608)
model.add(Dense(512, activation='relu'))  # fully connected layer with 512 units
model.add(Dropout(0.5))  # drop units with probability 0.5
model.add(Dense(512, activation='relu'))  # a second fully connected layer
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))
#summary() prints the network structure
model.summary()
model.compile(loss='categorical_crossentropy',optimizer='adam', metrics=['acc'])

plot_model(model, to_file='model_size.png')


hist=model.fit(x_train,y_train,batch_size=batch_size,epochs=70,verbose=1,validation_split=0.2)
model.save('learnCNN.h5')
plt.plot(hist.history['loss'])
plt.plot(hist.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.savefig("loss.png")  # save before show(), otherwise the saved image is blank
plt.show()
plt.clf()
plt.plot(hist.history['acc'])
plt.plot(hist.history['val_acc'])
plt.title('model acc')
plt.ylabel('acc')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.savefig("acc.png")
plt.show()
print('finished!')

Training output:

(20000, 28, 28, 1)
(20000, 10)
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_5 (Conv2D)            (None, 28, 28, 64)        640       
_________________________________________________________________
conv2d_6 (Conv2D)            (None, 28, 28, 64)        36928     
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 14, 14, 64)        0         
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 14, 14, 128)       73856     
_________________________________________________________________
conv2d_8 (Conv2D)            (None, 14, 14, 128)       147584    
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 7, 7, 128)         0         
_________________________________________________________________
conv2d_9 (Conv2D)            (None, 7, 7, 512)         590336    
_________________________________________________________________
activation_1 (Activation)    (None, 7, 7, 512)         0         
_________________________________________________________________
conv2d_10 (Conv2D)           (None, 7, 7, 512)         2359808   
_________________________________________________________________
activation_2 (Activation)    (None, 7, 7, 512)         0         
_________________________________________________________________
conv2d_11 (Conv2D)           (None, 7, 7, 512)         2359808   
_________________________________________________________________
activation_3 (Activation)    (None, 7, 7, 512)         0         
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 3, 3, 512)         0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 4608)              0         
_________________________________________________________________
dense_4 (Dense)              (None, 512)               2359808   
_________________________________________________________________
dropout_3 (Dropout)          (None, 512)               0         
_________________________________________________________________
dense_5 (Dense)              (None, 512)               262656    
_________________________________________________________________
dropout_4 (Dropout)          (None, 512)               0         
_________________________________________________________________
dense_6 (Dense)              (None, 10)                5130      
=================================================================
Total params: 8,196,554
Trainable params: 8,196,554
Non-trainable params: 0
_________________________________________________________________
Train on 16000 samples, validate on 4000 samples
Epoch 1/70
16000/16000 [==============================] - 19s 1ms/step - loss: 1.6370 - acc: 0.3870 - val_loss: 0.2657 - val_acc: 0.9207
Epoch 2/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.2069 - acc: 0.9433 - val_loss: 0.1121 - val_acc: 0.9670
Epoch 3/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.1116 - acc: 0.9691 - val_loss: 0.0732 - val_acc: 0.9775
Epoch 4/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0777 - acc: 0.9785 - val_loss: 0.0574 - val_acc: 0.9810
Epoch 5/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0619 - acc: 0.9822 - val_loss: 0.0980 - val_acc: 0.9742
Epoch 6/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0548 - acc: 0.9855 - val_loss: 0.0609 - val_acc: 0.9822
Epoch 7/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0445 - acc: 0.9876 - val_loss: 0.0569 - val_acc: 0.9842
Epoch 8/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0405 - acc: 0.9896 - val_loss: 0.0522 - val_acc: 0.9875
Epoch 9/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0424 - acc: 0.9881 - val_loss: 0.0654 - val_acc: 0.9845
Epoch 10/70
16000/16000 [==============================] - 19s 1ms/step - loss: 0.0233 - acc: 0.9931 - val_loss: 0.0520 - val_acc: 0.9890
Epoch 11/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0285 - acc: 0.9913 - val_loss: 0.0789 - val_acc: 0.9835
Epoch 12/70
16000/16000 [==============================] - 19s 1ms/step - loss: 0.0473 - acc: 0.9874 - val_loss: 0.0789 - val_acc: 0.9828
Epoch 13/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0291 - acc: 0.9927 - val_loss: 0.0631 - val_acc: 0.9855
Epoch 14/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0227 - acc: 0.9936 - val_loss: 0.0795 - val_acc: 0.9845
Epoch 15/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0276 - acc: 0.9925 - val_loss: 0.0781 - val_acc: 0.9798
Epoch 16/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0278 - acc: 0.9925 - val_loss: 0.0511 - val_acc: 0.9880
Epoch 17/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0222 - acc: 0.9938 - val_loss: 0.0630 - val_acc: 0.9878
Epoch 18/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0300 - acc: 0.9926 - val_loss: 0.0686 - val_acc: 0.9875
Epoch 19/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0220 - acc: 0.9951 - val_loss: 0.0488 - val_acc: 0.9912
Epoch 20/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0120 - acc: 0.9972 - val_loss: 0.0614 - val_acc: 0.9885
Epoch 21/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0168 - acc: 0.9959 - val_loss: 0.0631 - val_acc: 0.9862
Epoch 22/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0177 - acc: 0.9952 - val_loss: 0.0620 - val_acc: 0.9900
Epoch 23/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0246 - acc: 0.9935 - val_loss: 0.0950 - val_acc: 0.9848
Epoch 24/70
16000/16000 [==============================] - 19s 1ms/step - loss: 0.0212 - acc: 0.9949 - val_loss: 0.0608 - val_acc: 0.9902
Epoch 25/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0158 - acc: 0.9961 - val_loss: 0.0608 - val_acc: 0.9910
Epoch 26/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0046 - acc: 0.9989 - val_loss: 0.0703 - val_acc: 0.9910
Epoch 27/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0334 - acc: 0.9926 - val_loss: 0.0673 - val_acc: 0.9905
Epoch 28/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0247 - acc: 0.9944 - val_loss: 0.0751 - val_acc: 0.9862
Epoch 29/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0138 - acc: 0.9966 - val_loss: 0.1002 - val_acc: 0.9862
Epoch 30/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0234 - acc: 0.9949 - val_loss: 0.0763 - val_acc: 0.9868
Epoch 31/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0172 - acc: 0.9957 - val_loss: 0.0580 - val_acc: 0.9898
Epoch 32/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0087 - acc: 0.9979 - val_loss: 0.0790 - val_acc: 0.9902
Epoch 33/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0111 - acc: 0.9974 - val_loss: 0.0800 - val_acc: 0.9882
Epoch 34/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0179 - acc: 0.9957 - val_loss: 0.0699 - val_acc: 0.9890
Epoch 35/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0035 - acc: 0.9992 - val_loss: 0.0890 - val_acc: 0.9905
Epoch 36/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0315 - acc: 0.9942 - val_loss: 0.0787 - val_acc: 0.9902
Epoch 37/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0098 - acc: 0.9979 - val_loss: 0.0595 - val_acc: 0.9915
Epoch 38/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0151 - acc: 0.9965 - val_loss: 0.0692 - val_acc: 0.9888
Epoch 39/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0069 - acc: 0.9981 - val_loss: 0.0858 - val_acc: 0.9885
Epoch 40/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0331 - acc: 0.9929 - val_loss: 0.1283 - val_acc: 0.9828
Epoch 41/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0250 - acc: 0.9951 - val_loss: 0.0530 - val_acc: 0.9918
Epoch 42/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0106 - acc: 0.9978 - val_loss: 0.1921 - val_acc: 0.9792
Epoch 43/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0355 - acc: 0.9931 - val_loss: 0.0691 - val_acc: 0.9880
Epoch 44/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0094 - acc: 0.9976 - val_loss: 0.0809 - val_acc: 0.9902
Epoch 45/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0121 - acc: 0.9970 - val_loss: 0.0786 - val_acc: 0.9872
Epoch 46/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0142 - acc: 0.9969 - val_loss: 0.0769 - val_acc: 0.9882
Epoch 47/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0089 - acc: 0.9978 - val_loss: 0.0899 - val_acc: 0.9892
Epoch 48/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0036 - acc: 0.9992 - val_loss: 0.0816 - val_acc: 0.9902
Epoch 49/70
16000/16000 [==============================] - 18s 1ms/step - loss: 1.4944e-04 - acc: 1.0000 - val_loss: 0.0750 - val_acc: 0.9915
Epoch 50/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0078 - acc: 0.9990 - val_loss: 0.1065 - val_acc: 0.9875
Epoch 51/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0423 - acc: 0.9929 - val_loss: 0.1552 - val_acc: 0.9802
Epoch 52/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0249 - acc: 0.9952 - val_loss: 0.0895 - val_acc: 0.9882
Epoch 53/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0239 - acc: 0.9964 - val_loss: 0.0905 - val_acc: 0.9868
Epoch 54/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0146 - acc: 0.9973 - val_loss: 0.1291 - val_acc: 0.9838
Epoch 55/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0288 - acc: 0.9959 - val_loss: 0.1286 - val_acc: 0.9840
Epoch 56/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0267 - acc: 0.9955 - val_loss: 0.1017 - val_acc: 0.9898
Epoch 57/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0216 - acc: 0.9967 - val_loss: 0.0890 - val_acc: 0.9880
Epoch 58/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0207 - acc: 0.9973 - val_loss: 0.1011 - val_acc: 0.9858
Epoch 59/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0288 - acc: 0.9949 - val_loss: 0.1393 - val_acc: 0.9868
Epoch 60/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0089 - acc: 0.9986 - val_loss: 0.1173 - val_acc: 0.9875
Epoch 61/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0027 - acc: 0.9994 - val_loss: 0.1491 - val_acc: 0.9858
Epoch 62/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0245 - acc: 0.9963 - val_loss: 0.1463 - val_acc: 0.9858
Epoch 63/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0322 - acc: 0.9956 - val_loss: 0.1353 - val_acc: 0.9878
Epoch 64/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0443 - acc: 0.9939 - val_loss: 0.1150 - val_acc: 0.9862
Epoch 65/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0331 - acc: 0.9956 - val_loss: 0.1387 - val_acc: 0.9848
Epoch 66/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0241 - acc: 0.9968 - val_loss: 0.1167 - val_acc: 0.9862
Epoch 67/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0292 - acc: 0.9961 - val_loss: 0.1012 - val_acc: 0.9892
Epoch 68/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0190 - acc: 0.9973 - val_loss: 0.0743 - val_acc: 0.9918
Epoch 69/70
16000/16000 [==============================] - 20s 1ms/step - loss: 0.0154 - acc: 0.9971 - val_loss: 0.0846 - val_acc: 0.9890
Epoch 70/70
16000/16000 [==============================] - 18s 1ms/step - loss: 0.0191 - acc: 0.9978 - val_loss: 0.1131 - val_acc: 0.9868

Results: (loss/accuracy plots saved as loss.png and acc.png; not shown here)

Fourth exercise:

import numpy as np
import tensorflow as tf
import pandas as pd
import matplotlib.pyplot as plt
from keras.models import Sequential
from keras.layers import Dense,Activation,Conv2D
from keras.layers import MaxPool2D,Flatten,Dropout,ZeroPadding2D,BatchNormalization
from keras.utils import np_utils
from keras import metrics
import keras
from keras.models import save_model,load_model
from keras.models import Model
from keras.callbacks import ModelCheckpoint
import os
from keras.utils import plot_model
from keras.optimizers import SGD,Adam
df=pd.read_csv("drive/My Drive/GoogleAI/train.csv")
data=df.values  # DataFrame.as_matrix() was removed in newer pandas; .values is equivalent
df=None



np.random.shuffle(data)
x_train=data[0:,1:-1]
x_train=x_train.reshape(data.shape[0],40,40,1).astype("float32")  # reshape into data.shape[0] images of 40x40 with 1 channel
x_train=x_train/255.0
y_train=np_utils.to_categorical(data[0:,-1],2).astype("float32")  # the label is the last column; one-hot encode the 2 classes

print(x_train.shape)
print(y_train.shape)

batch_size=32
n_filters=32
pool_size=(2,2)

model = Sequential()
#model.add(ZeroPadding2D(padding=(1,1),input_shape=(28,28,1)))


model.add(Conv2D(filters=64, kernel_size=(3, 3),padding = 'same',activation='relu',input_shape=(40,40,1)))

model.add(Conv2D(filters=64, kernel_size=(3, 3),padding = 'same',activation='relu')) #40


model.add(MaxPool2D(pool_size=(2,2)))

model.add(Conv2D(filters=128, kernel_size=(3, 3),padding = 'same',activation='relu'))

model.add(Conv2D(filters=128, kernel_size=(3, 3),padding = 'same',activation='relu')) #20


model.add(MaxPool2D(pool_size=(2,2)))

model.add(Conv2D(filters=512, kernel_size=(3, 3),padding = 'same'))
model.add(Activation('relu'))
model.add(Conv2D(filters=512, kernel_size=(3, 3),padding = 'same'))
model.add(Activation('relu'))
model.add(Conv2D(filters=512, kernel_size=(3, 3),padding = 'same'))
model.add(Activation('relu'))

model.add(MaxPool2D(pool_size=(2,2)))
 
model.add(Flatten())  # flatten the feature maps into a 1-D vector (here 5*5*512 = 12800)
model.add(Dense(512, activation='relu'))  # fully connected layer with 512 units
model.add(Dropout(0.5))  # drop units with probability 0.5
model.add(Dense(512, activation='relu'))  # a second fully connected layer
model.add(Dropout(0.5))
model.add(Dense(2, activation='softmax'))
#summary() prints the network structure
model.summary()
sgd = SGD(lr=0.001, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy',optimizer=sgd, metrics=['acc'])

plot_model(model, to_file='model_size.png')


hist=model.fit(x_train,y_train,batch_size=batch_size,epochs=70,verbose=1,validation_split=0.2)
model.save('learnCNN.h5')
plt.plot(hist.history['loss'])
plt.plot(hist.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.savefig("loss.png")  # save before show(), otherwise the saved image is blank
plt.show()
plt.clf()
plt.plot(hist.history['acc'])
plt.plot(hist.history['val_acc'])
plt.title('model acc')
plt.ylabel('acc')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.savefig("acc.png")
plt.show()
print('finished!')

Training output:

(4000, 40, 40, 1)
(4000, 2)
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_36 (Conv2D)           (None, 40, 40, 64)        640       
_________________________________________________________________
conv2d_37 (Conv2D)           (None, 40, 40, 64)        36928     
_________________________________________________________________
max_pooling2d_16 (MaxPooling (None, 20, 20, 64)        0         
_________________________________________________________________
conv2d_38 (Conv2D)           (None, 20, 20, 128)       73856     
_________________________________________________________________
conv2d_39 (Conv2D)           (None, 20, 20, 128)       147584    
_________________________________________________________________
max_pooling2d_17 (MaxPooling (None, 10, 10, 128)       0         
_________________________________________________________________
conv2d_40 (Conv2D)           (None, 10, 10, 512)       590336    
_________________________________________________________________
activation_16 (Activation)   (None, 10, 10, 512)       0         
_________________________________________________________________
conv2d_41 (Conv2D)           (None, 10, 10, 512)       2359808   
_________________________________________________________________
activation_17 (Activation)   (None, 10, 10, 512)       0         
_________________________________________________________________
conv2d_42 (Conv2D)           (None, 10, 10, 512)       2359808   
_________________________________________________________________
activation_18 (Activation)   (None, 10, 10, 512)       0         
_________________________________________________________________
max_pooling2d_18 (MaxPooling (None, 5, 5, 512)         0         
_________________________________________________________________
flatten_6 (Flatten)          (None, 12800)             0         
_________________________________________________________________
dense_16 (Dense)             (None, 512)               6554112   
_________________________________________________________________
dropout_11 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_17 (Dense)             (None, 512)               262656    
_________________________________________________________________
dropout_12 (Dropout)         (None, 512)               0         
_________________________________________________________________
dense_18 (Dense)             (None, 2)                 1026      
=================================================================
Total params: 12,386,754
Trainable params: 12,386,754
Non-trainable params: 0
_________________________________________________________________
Train on 3200 samples, validate on 800 samples
Epoch 1/70
3200/3200 [==============================] - 8s 3ms/step - loss: 0.6926 - acc: 0.5222 - val_loss: 0.6898 - val_acc: 0.5400
Epoch 2/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.6907 - acc: 0.5353 - val_loss: 0.6873 - val_acc: 0.5400
Epoch 3/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.6883 - acc: 0.5525 - val_loss: 0.6846 - val_acc: 0.6000
Epoch 4/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.6850 - acc: 0.6047 - val_loss: 0.6784 - val_acc: 0.7100
Epoch 5/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.6779 - acc: 0.6653 - val_loss: 0.6669 - val_acc: 0.7662
Epoch 6/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.6643 - acc: 0.7041 - val_loss: 0.6415 - val_acc: 0.7812
Epoch 7/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.6283 - acc: 0.7222 - val_loss: 0.5647 - val_acc: 0.7800
Epoch 8/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.5310 - acc: 0.7531 - val_loss: 0.4189 - val_acc: 0.8113
Epoch 9/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.4246 - acc: 0.8128 - val_loss: 0.2940 - val_acc: 0.8688
Epoch 10/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.2660 - acc: 0.8934 - val_loss: 0.2026 - val_acc: 0.8938
Epoch 11/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.1484 - acc: 0.9444 - val_loss: 0.0621 - val_acc: 0.9838
Epoch 12/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0681 - acc: 0.9819 - val_loss: 0.0184 - val_acc: 0.9988
Epoch 13/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0317 - acc: 0.9928 - val_loss: 0.0102 - val_acc: 0.9988
Epoch 14/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0191 - acc: 0.9963 - val_loss: 0.0095 - val_acc: 0.9988
Epoch 15/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0155 - acc: 0.9963 - val_loss: 0.0073 - val_acc: 0.9988
Epoch 16/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0150 - acc: 0.9966 - val_loss: 0.0046 - val_acc: 0.9988
Epoch 17/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0128 - acc: 0.9963 - val_loss: 0.0040 - val_acc: 0.9988
Epoch 18/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0065 - acc: 0.9984 - val_loss: 0.0036 - val_acc: 0.9988
Epoch 19/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0045 - acc: 0.9991 - val_loss: 0.0033 - val_acc: 0.9988
Epoch 20/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0031 - acc: 0.9994 - val_loss: 9.9130e-04 - val_acc: 1.0000
Epoch 21/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0024 - acc: 0.9997 - val_loss: 0.0012 - val_acc: 0.9988
Epoch 22/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0018 - acc: 1.0000 - val_loss: 9.4619e-04 - val_acc: 1.0000
Epoch 23/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0011 - acc: 0.9997 - val_loss: 0.0039 - val_acc: 0.9988
Epoch 24/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0020 - acc: 0.9994 - val_loss: 0.0045 - val_acc: 0.9988
Epoch 25/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0010 - acc: 1.0000 - val_loss: 0.0026 - val_acc: 0.9988
Epoch 26/70
3200/3200 [==============================] - 7s 2ms/step - loss: 8.3724e-04 - acc: 1.0000 - val_loss: 5.5013e-04 - val_acc: 1.0000
Epoch 27/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0020 - acc: 0.9994 - val_loss: 0.0043 - val_acc: 0.9975
Epoch 28/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0036 - acc: 0.9988 - val_loss: 0.0012 - val_acc: 0.9988
Epoch 29/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0017 - acc: 0.9997 - val_loss: 4.0852e-04 - val_acc: 1.0000
Epoch 30/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0010 - acc: 0.9997 - val_loss: 0.0211 - val_acc: 0.9925
Epoch 31/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0027 - acc: 0.9988 - val_loss: 0.0012 - val_acc: 0.9988
Epoch 32/70
3200/3200 [==============================] - 7s 2ms/step - loss: 0.0011 - acc: 1.0000 - val_loss: 1.9495e-04 - val_acc: 1.0000
Epoch 33/70
3200/3200 [==============================] - 7s 2ms/step - loss: 5.4917e-04 - acc: 1.0000 - val_loss: 8.9707e-04 - val_acc: 1.0000
Epoch 34/70
3200/3200 [==============================] - 7s 2ms/step - loss: 4.6175e-04 - acc: 1.0000 - val_loss: 1.4151e-04 - val_acc: 1.0000
Epoch 35/70
3200/3200 [==============================] - 7s 2ms/step - loss: 3.0411e-04 - acc: 1.0000 - val_loss: 2.5414e-04 - val_acc: 1.0000
Epoch 36/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.6523e-04 - acc: 1.0000 - val_loss: 5.1636e-04 - val_acc: 1.0000
Epoch 37/70
3200/3200 [==============================] - 7s 2ms/step - loss: 2.5121e-04 - acc: 1.0000 - val_loss: 2.8750e-04 - val_acc: 1.0000
Epoch 38/70
3200/3200 [==============================] - 7s 2ms/step - loss: 2.6499e-04 - acc: 1.0000 - val_loss: 6.5903e-04 - val_acc: 1.0000
Epoch 39/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.6837e-04 - acc: 1.0000 - val_loss: 1.4067e-04 - val_acc: 1.0000
Epoch 40/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.4955e-04 - acc: 1.0000 - val_loss: 1.1061e-04 - val_acc: 1.0000
Epoch 41/70
3200/3200 [==============================] - 7s 2ms/step - loss: 2.2614e-04 - acc: 1.0000 - val_loss: 2.3998e-04 - val_acc: 1.0000
Epoch 42/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.8178e-04 - acc: 1.0000 - val_loss: 4.0963e-04 - val_acc: 1.0000
Epoch 43/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.0676e-04 - acc: 1.0000 - val_loss: 1.6768e-04 - val_acc: 1.0000
Epoch 44/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.4944e-04 - acc: 1.0000 - val_loss: 1.6885e-04 - val_acc: 1.0000
Epoch 45/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.6997e-04 - acc: 1.0000 - val_loss: 7.1427e-05 - val_acc: 1.0000
Epoch 46/70
3200/3200 [==============================] - 7s 2ms/step - loss: 9.8449e-05 - acc: 1.0000 - val_loss: 1.3498e-04 - val_acc: 1.0000
Epoch 47/70
3200/3200 [==============================] - 7s 2ms/step - loss: 6.3622e-05 - acc: 1.0000 - val_loss: 1.1161e-04 - val_acc: 1.0000
Epoch 48/70
3200/3200 [==============================] - 7s 2ms/step - loss: 2.6218e-04 - acc: 1.0000 - val_loss: 2.9487e-04 - val_acc: 1.0000
Epoch 49/70
3200/3200 [==============================] - 7s 2ms/step - loss: 6.8380e-05 - acc: 1.0000 - val_loss: 3.2897e-04 - val_acc: 1.0000
Epoch 50/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.3235e-04 - acc: 1.0000 - val_loss: 1.4400e-04 - val_acc: 1.0000
Epoch 51/70
3200/3200 [==============================] - 7s 2ms/step - loss: 7.6817e-05 - acc: 1.0000 - val_loss: 1.8169e-04 - val_acc: 1.0000
Epoch 52/70
3200/3200 [==============================] - 7s 2ms/step - loss: 7.9961e-05 - acc: 1.0000 - val_loss: 1.9066e-04 - val_acc: 1.0000
Epoch 53/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.3203e-04 - acc: 1.0000 - val_loss: 2.2335e-04 - val_acc: 1.0000
Epoch 54/70
3200/3200 [==============================] - 7s 2ms/step - loss: 6.2024e-05 - acc: 1.0000 - val_loss: 8.6982e-05 - val_acc: 1.0000
Epoch 55/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.3301e-04 - acc: 1.0000 - val_loss: 2.6485e-04 - val_acc: 1.0000
Epoch 56/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.0229e-04 - acc: 1.0000 - val_loss: 7.6931e-04 - val_acc: 1.0000
Epoch 57/70
3200/3200 [==============================] - 7s 2ms/step - loss: 2.9938e-04 - acc: 1.0000 - val_loss: 0.0026 - val_acc: 0.9988
Epoch 58/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.8570e-04 - acc: 1.0000 - val_loss: 1.3332e-04 - val_acc: 1.0000
Epoch 59/70
3200/3200 [==============================] - 7s 2ms/step - loss: 8.2858e-05 - acc: 1.0000 - val_loss: 8.2129e-05 - val_acc: 1.0000
Epoch 60/70
3200/3200 [==============================] - 7s 2ms/step - loss: 6.2264e-05 - acc: 1.0000 - val_loss: 1.0242e-04 - val_acc: 1.0000
Epoch 61/70
3200/3200 [==============================] - 7s 2ms/step - loss: 2.0275e-04 - acc: 1.0000 - val_loss: 1.0507e-04 - val_acc: 1.0000
Epoch 62/70
3200/3200 [==============================] - 7s 2ms/step - loss: 5.9341e-05 - acc: 1.0000 - val_loss: 3.0019e-04 - val_acc: 1.0000
Epoch 63/70
3200/3200 [==============================] - 7s 2ms/step - loss: 6.4440e-05 - acc: 1.0000 - val_loss: 7.8200e-05 - val_acc: 1.0000
Epoch 64/70
3200/3200 [==============================] - 7s 2ms/step - loss: 1.2058e-04 - acc: 1.0000 - val_loss: 1.4521e-04 - val_acc: 1.0000
Epoch 65/70
3200/3200 [==============================] - 7s 2ms/step - loss: 3.5251e-05 - acc: 1.0000 - val_loss: 1.5971e-04 - val_acc: 1.0000
Epoch 66/70
3200/3200 [==============================] - 7s 2ms/step - loss: 4.8072e-05 - acc: 1.0000 - val_loss: 2.3435e-04 - val_acc: 1.0000
Epoch 67/70
3200/3200 [==============================] - 7s 2ms/step - loss: 7.0874e-05 - acc: 1.0000 - val_loss: 1.8044e-04 - val_acc: 1.0000
Epoch 68/70
3200/3200 [==============================] - 7s 2ms/step - loss: 3.6950e-05 - acc: 1.0000 - val_loss: 1.5588e-04 - val_acc: 1.0000
Epoch 69/70
3200/3200 [==============================] - 7s 2ms/step - loss: 3.7245e-05 - acc: 1.0000 - val_loss: 2.4842e-04 - val_acc: 1.0000
Epoch 70/70
3200/3200 [==============================] - 7s 2ms/step - loss: 4.7517e-05 - acc: 1.0000 - val_loss: 1.9751e-04 - val_acc: 1.0000

Training results:

Fifth exercise:

Using VGG16 as a pretrained base, running 100 epochs in one go gives results that feel only so-so, yet splitting the training into two stages turns out better (I don't know why yet; a sketch of the second, fine-tuning stage appears after the training results below).

import matplotlib.pyplot as plt
from keras.models import Sequential
from keras.layers import Dense, Flatten
from keras.optimizers import SGD
from keras.preprocessing.image import ImageDataGenerator
from keras.applications import VGG16

# Augment the training images; the test images are only rescaled.
train_datagen = ImageDataGenerator(rescale=1./255, rotation_range=40,
                                   width_shift_range=0.2, height_shift_range=0.2,
                                   shear_range=0.2, zoom_range=0.2,
                                   horizontal_flip=True)
test_datagen = ImageDataGenerator(rescale=1./255)

batch_size = 32

train_dir = "drive/My Drive/GoogleAI/train/"
train_generator = train_datagen.flow_from_directory(train_dir, target_size=(150, 150),
                                                    batch_size=batch_size, class_mode='binary')
test_dir = "drive/My Drive/GoogleAI/test/"
test_generator = test_datagen.flow_from_directory(test_dir, target_size=(150, 150),
                                                  batch_size=batch_size, class_mode='binary')

# VGG16 convolutional base pretrained on ImageNet, without its classifier head.
conv_base = VGG16(weights='imagenet', include_top=False, input_shape=(150, 150, 3))

model = Sequential()
model.add(conv_base)
model.add(Flatten())
model.add(Dense(256, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Freeze the pretrained base before compiling, so only the new head trains.
conv_base.trainable = False
sgd = SGD(lr=0.001, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd, metrics=['acc'])

# 1420 training images / batch size 32 ≈ 44 steps per epoch.
# Note: validation_steps=32 exceeds 700/32 ≈ 22, so the test generator wraps around.
hist = model.fit_generator(train_generator, steps_per_epoch=44, epochs=100,
                           validation_data=test_generator, validation_steps=32)

model.save('learnCNN.h5')

# Save each figure before plt.show(); calling savefig afterwards writes a blank image.
plt.plot(hist.history['loss'])
plt.plot(hist.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.savefig("loss.png")
plt.show()
plt.clf()

plt.plot(hist.history['acc'])
plt.plot(hist.history['val_acc'])
plt.title('model acc')
plt.ylabel('acc')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.savefig("acc.png")
plt.show()
print('finished!')
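
As an aside, the step counts passed to fit_generator follow directly from the dataset sizes the generators report (see "Found 1420 images" / "Found 700 images" in the log below). A quick sanity check, just for illustration:

import math

# Dataset sizes reported by flow_from_directory in the log below.
n_train, n_val, batch = 1420, 700, 32

steps_per_epoch = n_train // batch           # 44, matching the script (drops the last partial batch)
validation_steps = math.ceil(n_val / batch)  # 22; the script's 32 makes the test generator wrap around
print(steps_per_epoch, validation_steps)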

The training process is as follows:

Using TensorFlow backend.
Found 1420 images belonging to 2 classes.
Found 700 images belonging to 2 classes.
Downloading data from https://github.com/fchollet/deep-learning-models/releases/download/v0.1/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5
58892288/58889256 [==============================] - 1s 0us/step
Epoch 1/100
44/44 [==============================] - 295s 7s/step - loss: 0.5924 - acc: 0.6754 - val_loss: 0.4320 - val_acc: 0.8010
Epoch 2/100
44/44 [==============================] - 19s 432ms/step - loss: 0.4807 - acc: 0.7674 - val_loss: 0.3717 - val_acc: 0.8422
Epoch 3/100
44/44 [==============================] - 19s 425ms/step - loss: 0.4478 - acc: 0.7709 - val_loss: 0.3626 - val_acc: 0.8484
Epoch 4/100
44/44 [==============================] - 19s 428ms/step - loss: 0.4345 - acc: 0.7979 - val_loss: 0.3914 - val_acc: 0.8157
Epoch 5/100
44/44 [==============================] - 19s 425ms/step - loss: 0.4138 - acc: 0.8085 - val_loss: 0.3095 - val_acc: 0.8720
Epoch 6/100
44/44 [==============================] - 19s 428ms/step - loss: 0.3826 - acc: 0.8232 - val_loss: 0.3410 - val_acc: 0.8490
Epoch 7/100
44/44 [==============================] - 19s 425ms/step - loss: 0.3806 - acc: 0.8243 - val_loss: 0.3260 - val_acc: 0.8524
Epoch 8/100
44/44 [==============================] - 19s 424ms/step - loss: 0.3444 - acc: 0.8407 - val_loss: 0.3096 - val_acc: 0.8716
Epoch 9/100
44/44 [==============================] - 19s 425ms/step - loss: 0.3629 - acc: 0.8305 - val_loss: 0.3100 - val_acc: 0.8760
Epoch 10/100
44/44 [==============================] - 19s 424ms/step - loss: 0.3494 - acc: 0.8532 - val_loss: 0.3032 - val_acc: 0.8784
Epoch 11/100
44/44 [==============================] - 19s 421ms/step - loss: 0.3643 - acc: 0.8355 - val_loss: 0.3287 - val_acc: 0.8612
Epoch 12/100
44/44 [==============================] - 18s 420ms/step - loss: 0.3397 - acc: 0.8482 - val_loss: 0.3542 - val_acc: 0.8451
Epoch 13/100
44/44 [==============================] - 19s 427ms/step - loss: 0.3264 - acc: 0.8495 - val_loss: 0.2886 - val_acc: 0.8657
Epoch 14/100
44/44 [==============================] - 18s 420ms/step - loss: 0.3214 - acc: 0.8513 - val_loss: 0.3188 - val_acc: 0.8642
Epoch 15/100
44/44 [==============================] - 19s 423ms/step - loss: 0.3171 - acc: 0.8566 - val_loss: 0.2929 - val_acc: 0.8775
Epoch 16/100
44/44 [==============================] - 19s 421ms/step - loss: 0.3094 - acc: 0.8613 - val_loss: 0.2972 - val_acc: 0.8789
Epoch 17/100
44/44 [==============================] - 18s 420ms/step - loss: 0.3145 - acc: 0.8655 - val_loss: 0.3274 - val_acc: 0.8559
Epoch 18/100
44/44 [==============================] - 19s 421ms/step - loss: 0.3045 - acc: 0.8575 - val_loss: 0.3328 - val_acc: 0.8632
Epoch 19/100
44/44 [==============================] - 18s 420ms/step - loss: 0.3062 - acc: 0.8508 - val_loss: 0.3238 - val_acc: 0.8735
Epoch 20/100
44/44 [==============================] - 18s 418ms/step - loss: 0.3173 - acc: 0.8511 - val_loss: 0.2906 - val_acc: 0.8789
Epoch 21/100
44/44 [==============================] - 19s 430ms/step - loss: 0.3446 - acc: 0.8412 - val_loss: 0.3020 - val_acc: 0.8706
Epoch 22/100
44/44 [==============================] - 19s 425ms/step - loss: 0.3006 - acc: 0.8693 - val_loss: 0.2839 - val_acc: 0.8711
Epoch 23/100
44/44 [==============================] - 19s 421ms/step - loss: 0.3300 - acc: 0.8513 - val_loss: 0.2935 - val_acc: 0.8716
Epoch 24/100
44/44 [==============================] - 19s 422ms/step - loss: 0.3018 - acc: 0.8650 - val_loss: 0.2859 - val_acc: 0.8784
Epoch 25/100
44/44 [==============================] - 19s 421ms/step - loss: 0.3195 - acc: 0.8560 - val_loss: 0.3059 - val_acc: 0.8858
Epoch 26/100
44/44 [==============================] - 19s 424ms/step - loss: 0.2878 - acc: 0.8705 - val_loss: 0.3211 - val_acc: 0.8608
Epoch 27/100
44/44 [==============================] - 18s 419ms/step - loss: 0.3106 - acc: 0.8650 - val_loss: 0.2834 - val_acc: 0.8848
Epoch 28/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2862 - acc: 0.8722 - val_loss: 0.2783 - val_acc: 0.8873
Epoch 29/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2931 - acc: 0.8686 - val_loss: 0.3319 - val_acc: 0.8661
Epoch 30/100
44/44 [==============================] - 19s 422ms/step - loss: 0.3181 - acc: 0.8710 - val_loss: 0.2749 - val_acc: 0.8804
Epoch 31/100
44/44 [==============================] - 18s 420ms/step - loss: 0.3117 - acc: 0.8603 - val_loss: 0.2725 - val_acc: 0.8888
Epoch 32/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2800 - acc: 0.8738 - val_loss: 0.2775 - val_acc: 0.8814
Epoch 33/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2918 - acc: 0.8767 - val_loss: 0.3082 - val_acc: 0.8750
Epoch 34/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2961 - acc: 0.8764 - val_loss: 0.2797 - val_acc: 0.8931
Epoch 35/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2988 - acc: 0.8663 - val_loss: 0.2865 - val_acc: 0.8804
Epoch 36/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2843 - acc: 0.8741 - val_loss: 0.2912 - val_acc: 0.8681
Epoch 37/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2786 - acc: 0.8913 - val_loss: 0.3202 - val_acc: 0.8804
Epoch 38/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2923 - acc: 0.8828 - val_loss: 0.3547 - val_acc: 0.8504
Epoch 39/100
44/44 [==============================] - 19s 425ms/step - loss: 0.2845 - acc: 0.8797 - val_loss: 0.2609 - val_acc: 0.8951
Epoch 40/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2768 - acc: 0.8708 - val_loss: 0.2962 - val_acc: 0.8789
Epoch 41/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2832 - acc: 0.8742 - val_loss: 0.3270 - val_acc: 0.8618
Epoch 42/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2803 - acc: 0.8892 - val_loss: 0.2795 - val_acc: 0.8839
Epoch 43/100
44/44 [==============================] - 18s 417ms/step - loss: 0.2823 - acc: 0.8781 - val_loss: 0.3086 - val_acc: 0.8696
Epoch 44/100
44/44 [==============================] - 19s 422ms/step - loss: 0.2718 - acc: 0.8821 - val_loss: 0.2743 - val_acc: 0.8839
Epoch 45/100
44/44 [==============================] - 19s 422ms/step - loss: 0.2813 - acc: 0.8726 - val_loss: 0.2785 - val_acc: 0.8863
Epoch 46/100
44/44 [==============================] - 19s 426ms/step - loss: 0.2762 - acc: 0.8729 - val_loss: 0.2963 - val_acc: 0.8873
Epoch 47/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2745 - acc: 0.8788 - val_loss: 0.2978 - val_acc: 0.8839
Epoch 48/100
44/44 [==============================] - 19s 422ms/step - loss: 0.2869 - acc: 0.8644 - val_loss: 0.2859 - val_acc: 0.8765
Epoch 49/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2817 - acc: 0.8757 - val_loss: 0.2784 - val_acc: 0.8858
Epoch 50/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2641 - acc: 0.8845 - val_loss: 0.2880 - val_acc: 0.8814
Epoch 51/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2634 - acc: 0.8899 - val_loss: 0.2829 - val_acc: 0.8829
Epoch 52/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2620 - acc: 0.8923 - val_loss: 0.2857 - val_acc: 0.8853
Epoch 53/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2600 - acc: 0.8892 - val_loss: 0.2579 - val_acc: 0.8907
Epoch 54/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2625 - acc: 0.8877 - val_loss: 0.2685 - val_acc: 0.8922
Epoch 55/100
44/44 [==============================] - 18s 418ms/step - loss: 0.2654 - acc: 0.8897 - val_loss: 0.2827 - val_acc: 0.8898
Epoch 56/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2756 - acc: 0.8890 - val_loss: 0.2785 - val_acc: 0.8814
Epoch 57/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2664 - acc: 0.8809 - val_loss: 0.2720 - val_acc: 0.8833
Epoch 58/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2471 - acc: 0.8941 - val_loss: 0.2878 - val_acc: 0.8858
Epoch 59/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2808 - acc: 0.8765 - val_loss: 0.2772 - val_acc: 0.8951
Epoch 60/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2693 - acc: 0.8814 - val_loss: 0.2947 - val_acc: 0.8888
Epoch 61/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2755 - acc: 0.8816 - val_loss: 0.2853 - val_acc: 0.8863
Epoch 62/100
44/44 [==============================] - 19s 420ms/step - loss: 0.2609 - acc: 0.8840 - val_loss: 0.2893 - val_acc: 0.8839
Epoch 63/100
44/44 [==============================] - 19s 424ms/step - loss: 0.2641 - acc: 0.8941 - val_loss: 0.2722 - val_acc: 0.8833
Epoch 64/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2718 - acc: 0.8897 - val_loss: 0.2790 - val_acc: 0.8839
Epoch 65/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2693 - acc: 0.8892 - val_loss: 0.2812 - val_acc: 0.8804
Epoch 66/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2477 - acc: 0.8906 - val_loss: 0.2823 - val_acc: 0.8917
Epoch 67/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2648 - acc: 0.8876 - val_loss: 0.2855 - val_acc: 0.8882
Epoch 68/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2468 - acc: 0.8916 - val_loss: 0.2743 - val_acc: 0.8853
Epoch 69/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2669 - acc: 0.8873 - val_loss: 0.3376 - val_acc: 0.8740
Epoch 70/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2684 - acc: 0.8877 - val_loss: 0.2963 - val_acc: 0.8814
Epoch 71/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2765 - acc: 0.8809 - val_loss: 0.2834 - val_acc: 0.8819
Epoch 72/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2637 - acc: 0.8873 - val_loss: 0.2616 - val_acc: 0.8892
Epoch 73/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2620 - acc: 0.8816 - val_loss: 0.3063 - val_acc: 0.8907
Epoch 74/100
44/44 [==============================] - 19s 422ms/step - loss: 0.2547 - acc: 0.8856 - val_loss: 0.2578 - val_acc: 0.8824
Epoch 75/100
44/44 [==============================] - 19s 422ms/step - loss: 0.2622 - acc: 0.8921 - val_loss: 0.2998 - val_acc: 0.8750
Epoch 76/100
44/44 [==============================] - 19s 426ms/step - loss: 0.2692 - acc: 0.8718 - val_loss: 0.2757 - val_acc: 0.8951
Epoch 77/100
44/44 [==============================] - 18s 418ms/step - loss: 0.2541 - acc: 0.8921 - val_loss: 0.3113 - val_acc: 0.8789
Epoch 78/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2588 - acc: 0.8941 - val_loss: 0.2987 - val_acc: 0.8765
Epoch 79/100
44/44 [==============================] - 18s 417ms/step - loss: 0.2529 - acc: 0.8944 - val_loss: 0.2942 - val_acc: 0.8696
Epoch 80/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2686 - acc: 0.8899 - val_loss: 0.3009 - val_acc: 0.8839
Epoch 81/100
44/44 [==============================] - 18s 418ms/step - loss: 0.2401 - acc: 0.8900 - val_loss: 0.2969 - val_acc: 0.8755
Epoch 82/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2461 - acc: 0.8965 - val_loss: 0.3009 - val_acc: 0.8760
Epoch 83/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2553 - acc: 0.8852 - val_loss: 0.2709 - val_acc: 0.8990
Epoch 84/100
44/44 [==============================] - 18s 416ms/step - loss: 0.2444 - acc: 0.8972 - val_loss: 0.2939 - val_acc: 0.8789
Epoch 85/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2476 - acc: 0.8958 - val_loss: 0.2865 - val_acc: 0.8775
Epoch 86/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2500 - acc: 0.8916 - val_loss: 0.3126 - val_acc: 0.8809
Epoch 87/100
44/44 [==============================] - 18s 420ms/step - loss: 0.2721 - acc: 0.8873 - val_loss: 0.2520 - val_acc: 0.8941
Epoch 88/100
44/44 [==============================] - 18s 418ms/step - loss: 0.2437 - acc: 0.8989 - val_loss: 0.3003 - val_acc: 0.8701
Epoch 89/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2495 - acc: 0.8985 - val_loss: 0.2855 - val_acc: 0.8814
Epoch 90/100
44/44 [==============================] - 19s 421ms/step - loss: 0.2575 - acc: 0.8863 - val_loss: 0.2869 - val_acc: 0.8804
Epoch 91/100
44/44 [==============================] - 19s 425ms/step - loss: 0.2431 - acc: 0.8942 - val_loss: 0.2874 - val_acc: 0.8848
Epoch 92/100
44/44 [==============================] - 19s 422ms/step - loss: 0.2587 - acc: 0.8814 - val_loss: 0.2972 - val_acc: 0.8882
Epoch 93/100
44/44 [==============================] - 19s 425ms/step - loss: 0.2696 - acc: 0.8877 - val_loss: 0.2675 - val_acc: 0.8780
Epoch 94/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2461 - acc: 0.8991 - val_loss: 0.2844 - val_acc: 0.8863
Epoch 95/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2558 - acc: 0.8918 - val_loss: 0.2923 - val_acc: 0.8760
Epoch 96/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2850 - acc: 0.8639 - val_loss: 0.2860 - val_acc: 0.8843
Epoch 97/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2344 - acc: 0.8972 - val_loss: 0.2983 - val_acc: 0.8819
Epoch 98/100
44/44 [==============================] - 19s 423ms/step - loss: 0.2414 - acc: 0.8949 - val_loss: 0.3104 - val_acc: 0.8706
Epoch 99/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2613 - acc: 0.8845 - val_loss: 0.3363 - val_acc: 0.8720
Epoch 100/100
44/44 [==============================] - 18s 419ms/step - loss: 0.2595 - acc: 0.8880 - val_loss: 0.2911 - val_acc: 0.8853

Training results:
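
The "two-stage training" mentioned above is the usual fine-tuning recipe: once the frozen-base run has converged, unfreeze the top convolutional block of VGG16 and continue training with a much smaller learning rate. A minimal sketch of stage two, assuming model, conv_base and the generators from the script above are still in scope (block5_conv1 is VGG16's standard layer name; the epoch count here is illustrative, not what I actually ran):

from keras.optimizers import SGD

# Unfreeze only the top convolutional block (block5) of VGG16;
# everything below it stays frozen.
conv_base.trainable = True
set_trainable = False
for layer in conv_base.layers:
    if layer.name == 'block5_conv1':
        set_trainable = True
    layer.trainable = set_trainable

# Recompile with a small learning rate so the pretrained weights
# are only gently adjusted.
model.compile(loss='binary_crossentropy',
              optimizer=SGD(lr=1e-5, momentum=0.9),
              metrics=['acc'])

hist2 = model.fit_generator(train_generator, steps_per_epoch=44, epochs=30,
                            validation_data=test_generator, validation_steps=22)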
