- Advantage: unlike Sequential, the functional API allows shared layers and multiple inputs and outputs.
A quick look at Sequential
```python
from keras.models import Sequential
from keras.layers import Dense

# Build the model in one call
model = Sequential([Dense(2, input_shape=(1,)), Dense(1)])
# or layer by layer
model = Sequential()
model.add(Dense(2, input_shape=(1,)))
model.add(Dense(1))
```
The functional API
**Concept:** a model is defined by creating layer instances and connecting them to one another in pairs, then creating a `Model` that specifies which layers serve as the model's inputs and outputs.
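The "create a layer instance, then call it" pattern can be sketched without Keras at all. The toy `ToyDense` class below is hypothetical (not the real Keras class): constructing it configures the layer, and calling the instance on a tensor-like node records the connection and returns a new node.

```python
class ToyDense:
    """Toy stand-in for a Keras layer: configured on construction,
    connected to its input when called."""
    def __init__(self, units):
        self.units = units
        self.inbound = None

    def __call__(self, node):
        # Calling the instance wires it into the graph and returns
        # a new node representing this layer's output.
        self.inbound = node
        return {'layer': self, 'shape': (self.units,)}

inp = {'layer': None, 'shape': (128,)}
hidden = ToyDense(128)(inp)   # create the layer AND connect it in one line
output = ToyDense(64)(hidden)
print(output['shape'])        # (64,)
```

This is why `Dense(128)(input)` in Keras reads as two calls: the first creates the layer, the second connects it.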
- Input

An `Input` layer is required; it declares the shape of the input data tensor.

```python
from keras.layers import Input

input = Input(shape=(128,))
```
- Connecting layers

```python
hidden = Dense(128)(input)
output = Dense(64)(hidden)
```
- Creating the model

The original notes left the shapes as placeholders; `(128,)` is used below to match the earlier example.

```python
from keras.models import Model

model = Model(inputs=input, outputs=output)

# Single input
input = Input(shape=(128,), name='input')
model = Model(inputs=input, outputs=output)

# Multiple inputs
input_1 = Input(shape=(128,), name='input_1')
input_2 = Input(shape=(128,), name='input_2')
model = Model(inputs=[input_1, input_2], outputs=output)

# Multiple outputs
model = Model(inputs=input, outputs=[output_1, output_2])
```
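Conceptually, `Model(inputs, outputs)` fixes the graph by walking back from the declared outputs until the declared inputs are reached. A toy sketch (not Keras; `Node` and `collect_layers` are illustrative names):

```python
class Node:
    """Minimal graph node: a name plus the nodes feeding into it."""
    def __init__(self, name, parents=()):
        self.name = name
        self.parents = list(parents)

def collect_layers(output, inputs):
    """Return the names of all nodes between `output` and `inputs`,
    walking backwards from the output."""
    seen, stack = [], [output]
    while stack:
        node = stack.pop()
        if node.name in seen or node in inputs:
            continue
        seen.append(node.name)
        stack.extend(node.parents)
    return seen

input_1 = Node('input_1')
input_2 = Node('input_2')
hidden = Node('hidden', parents=[input_1, input_2])  # a layer fed by two inputs
output = Node('output', parents=[hidden])

print(collect_layers(output, [input_1, input_2]))    # ['output', 'hidden']
```

This is also why every layer on the path from inputs to outputs is included in the model automatically, without being listed explicitly.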
- Common layers at a glance

```python
from keras.layers import Input, Dense, Flatten, Conv2D, MaxPool2D
from keras.layers.recurrent import LSTM
from keras.layers.wrappers import TimeDistributed

Dense(num_cell, activation='relu', name='hidden')
Conv2D(num_cell, kernel_size=4, activation='relu', name='conv')
MaxPool2D(pool_size=(2, 2), name='pool')
LSTM(num_cell, name='lstm')
TimeDistributed(Dense(num_cell, activation='softmax'), name='output')
Flatten()
```
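For the convolution and pooling layers above, the spatial output size can be worked out by hand. Assuming 'valid' padding and stride 1 for the convolution (Keras defaults), and a 28x28 input chosen only for illustration:

```python
def conv_out(size, kernel, stride=1):
    # 'valid' padding: output = floor((size - kernel) / stride) + 1
    return (size - kernel) // stride + 1

def pool_out(size, pool=2):
    # non-overlapping pooling divides each spatial dimension by the pool size
    return size // pool

h = w = 28                                # e.g. a 28x28 input image
h, w = conv_out(h, 4), conv_out(w, 4)     # Conv2D(kernel_size=4) -> 25x25
h, w = pool_out(h), pool_out(w)           # MaxPool2D((2, 2))     -> 12x12
print(h, w)                               # 12 12
```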
- Concatenation

```python
from keras.layers.merge import concatenate

merged = concatenate([flat1, flat2])
```
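`concatenate` joins tensors along the feature axis, so each sample's feature vectors are placed end to end. A plain-Python sketch of that behaviour on a small batch:

```python
# What concatenate([flat1, flat2]) does along the last axis, per sample:
flat1 = [[1, 2, 3], [4, 5, 6]]    # batch of 2 samples, 3 features each
flat2 = [[7, 8], [9, 10]]         # batch of 2 samples, 2 features each
merged = [a + b for a, b in zip(flat1, flat2)]
print(merged)                     # [[1, 2, 3, 7, 8], [4, 5, 6, 9, 10]]
```

The batch dimension is untouched; only the feature dimensions add up (3 + 2 = 5).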
- Visualization

```python
from keras.utils import plot_model

plot_model(model, to_file='xxx.png')

from IPython.display import Image
# Display the network topology
Image('xxx.png')
```
- Printing the model structure

```python
model.summary()
```
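`model.summary()` reports a parameter count per layer. For a `Dense` layer this is `inputs * units + units` (weight matrix plus one bias per unit); checking the Sequential example from the top of these notes by hand:

```python
def dense_params(n_in, units):
    # weight matrix (n_in x units) plus one bias per unit
    return n_in * units + units

# Dense(2, input_shape=(1,)) followed by Dense(1)
layer1 = dense_params(1, 2)    # 1*2 + 2 = 4
layer2 = dense_params(2, 1)    # 2*1 + 1 = 3
print(layer1 + layer2)         # 7 trainable parameters in total
```

These are the numbers `summary()` would show in its "Param #" column for this model.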