Keras implementation
Note that every convolutional layer is followed by a ReLU activation. Here we specify the activation directly as an argument when adding each convolutional layer, rather than adding the convolution and the activation as two separate layers.
## using TensorFlow 2.0
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D
from tensorflow.keras.layers import Flatten, Dense, Dropout
model = Sequential()
model.add(Conv2D(64,(3,3), strides = (1,1), input_shape = (224,224,3), padding = 'same', activation = 'relu'))
model.add(Conv2D(64,(3,3), strides = (1,1), padding = 'same', activation = 'relu'))
model.add(MaxPooling2D((2,2), strides = (2,2)))
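As noted above, passing activation = 'relu' inside Conv2D is equivalent to adding the convolution and a separate Activation layer in two steps. A minimal sketch of the two equivalent styles (layer configuration matches the first block of the model above; the variable names are illustrative):

```python
import tensorflow as tf
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import Conv2D, Activation

# Style 1: activation fused into the layer, as in the model above
inline = Sequential([
    Input((224, 224, 3)),
    Conv2D(64, (3, 3), padding='same', activation='relu'),
])

# Style 2: convolution and activation added as two separate layers
separate = Sequential([
    Input((224, 224, 3)),
    Conv2D(64, (3, 3), padding='same'),
    Activation('relu'),
])

# Both variants compute the same function for the same weights
# and produce the same output shape: (None, 224, 224, 64)
print(inline.output_shape)
print(separate.output_shape)
```

The fused style keeps the model definition shorter; the separate-layer style is occasionally useful when you want to insert something (e.g. batch normalization) between the convolution and its activation.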