Keras (5): Layers Outline

Layers

Common Methods
  • layer.get_weights(): returns the weights of the layer as a list of NumPy arrays.
  • layer.set_weights(weights): sets the weights of the layer from a list of NumPy arrays (with the same shapes as the output of get_weights).
  • layer.get_config(): returns a dictionary containing the configuration of the layer.
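
As a quick illustration, a minimal sketch (assuming standalone Keras 2.x; the layer name 'fc' and the shapes are arbitrary) using the three methods listed above:

```python
from keras.layers import Dense
from keras.models import Sequential

model = Sequential([Dense(4, input_shape=(8,), name='fc')])
layer = model.get_layer('fc')

weights = layer.get_weights()   # [kernel of shape (8, 4), bias of shape (4,)]
layer.set_weights(weights)      # list shapes must match get_weights() output
config = layer.get_config()     # dict with 'name', 'units', 'activation', ...
print(config['units'])          # 4
```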

If a layer has a single node (i.e. it is not a shared layer), its input and output can be accessed directly through:

  • layer.input
  • layer.output
  • layer.input_shape
  • layer.output_shape

If the layer has multiple nodes (for example, a shared layer applied to several inputs), use the per-node accessors:

  • layer.get_input_at(node_index)
  • layer.get_output_at(node_index)
  • layer.get_input_shape_at(node_index)
  • layer.get_output_shape_at(node_index)
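
These per-node accessors matter for shared layers. A minimal sketch with the functional API (the Conv2D configuration and input shapes are arbitrary):

```python
from keras.layers import Input, Conv2D

conv = Conv2D(16, (3, 3), padding='same')
a = Input(shape=(32, 32, 3))
b = Input(shape=(64, 64, 3))
conved_a = conv(a)                      # node 0
conved_b = conv(b)                      # node 1

print(conv.get_input_shape_at(0))       # (None, 32, 32, 3)
print(conv.get_output_shape_at(1))      # (None, 64, 64, 16)
# conv.input_shape raises an error here, because the layer now has
# more than one node with different input shapes.
```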
Core Layers

Dense: densely-connected NN layer

Activation

Dropout

Flatten: flattens the input

Input

Reshape

Permute: permutes the dimensions of the input according to a given pattern

RepeatVector

Lambda: wraps an arbitrary expression as a Layer object

ActivityRegularization: applies an update to the cost function based on input activity

Masking: masks timesteps in a sequence so that downstream layers skip them

SpatialDropout1D

SpatialDropout2D

SpatialDropout3D
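
A minimal sketch combining a few of the core layers above in one Sequential model (assuming standalone Keras 2.x; the shapes and the Lambda expression are arbitrary):

```python
from keras.layers import Dense, Activation, Dropout, Flatten, Lambda
from keras.models import Sequential

model = Sequential([
    Flatten(input_shape=(28, 28)),     # (28, 28) -> (784,)
    Dense(128),
    Activation('relu'),
    Dropout(0.5),                      # randomly zero 50% of units during training
    Lambda(lambda x: x * 2.0),         # wrap an arbitrary expression as a layer
    Dense(10, activation='softmax'),
])
model.summary()
```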

Convolutional Layers

Conv1D

Conv2D

SeparableConv1D (separable convolution)

SeparableConv2D

DepthwiseConv2D

Conv2DTranspose (transposed convolution / deconvolution)

Conv3D

Conv3DTranspose

Cropping1D (cropping layer)

Cropping2D

Cropping3D

UpSampling1D (upsampling layer)

UpSampling2D

UpSampling3D

ZeroPadding1D (zero-padding layer)

ZeroPadding2D

ZeroPadding3D
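
A minimal sketch chaining several of the convolution-family layers above (assuming the default 'channels_last' data format; all shapes are arbitrary):

```python
from keras.layers import Conv2D, Conv2DTranspose, ZeroPadding2D, Cropping2D, UpSampling2D
from keras.models import Sequential

model = Sequential([
    ZeroPadding2D(padding=1, input_shape=(28, 28, 1)),       # -> (30, 30, 1)
    Conv2D(16, (3, 3), activation='relu'),                   # -> (28, 28, 16)
    UpSampling2D(size=(2, 2)),                               # -> (56, 56, 16)
    Conv2DTranspose(8, (3, 3), strides=2, padding='same'),   # -> (112, 112, 8)
    Cropping2D(cropping=((4, 4), (4, 4))),                   # -> (104, 104, 8)
])
model.summary()
```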

Pooling Layers

MaxPooling1D

MaxPooling2D

MaxPooling3D

AveragePooling1D

AveragePooling2D

AveragePooling3D

GlobalMaxPooling1D

GlobalAveragePooling1D

GlobalMaxPooling2D

GlobalAveragePooling2D

GlobalMaxPooling3D

GlobalAveragePooling3D
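
A minimal sketch contrasting local pooling with global pooling, which removes the spatial dimensions entirely (shapes are arbitrary):

```python
from keras.layers import Conv2D, MaxPooling2D, GlobalAveragePooling2D, Dense
from keras.models import Sequential

model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32, 3)),  # -> (30, 30, 32)
    MaxPooling2D(pool_size=(2, 2)),                                  # -> (15, 15, 32)
    Conv2D(64, (3, 3), activation='relu'),                           # -> (13, 13, 64)
    GlobalAveragePooling2D(),                                        # -> (64,)
    Dense(10, activation='softmax'),
])
model.summary()
```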

Locally-connected Layers

LocallyConnected1D

LocallyConnected2D
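
A minimal sketch: LocallyConnected2D has the same interface as Conv2D but shares no weights across spatial positions, so the parameter count is much larger (shapes are arbitrary):

```python
from keras.layers import LocallyConnected2D
from keras.models import Sequential

model = Sequential([
    LocallyConnected2D(8, (3, 3), input_shape=(16, 16, 1)),  # -> (14, 14, 8)
])
model.summary()   # compare the parameter count with an equivalent Conv2D
```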

Recurrent Layers

RNN

SimpleRNN

GRU (Gated Recurrent Unit)

LSTM (Long Short-Term Memory)

ConvLSTM2D

SimpleRNNCell

GRUCell

LSTMCell

CuDNNGRU: fast GRU implementation backed by cuDNN (GPU only)

CuDNNLSTM
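
A minimal sketch of a recurrent model (assuming sequences of 10 timesteps with 8 features each; CuDNNLSTM would be a drop-in replacement for LSTM on a GPU with cuDNN available):

```python
from keras.layers import LSTM, Dense
from keras.models import Sequential

model = Sequential([
    LSTM(32, input_shape=(10, 8)),        # (batch, 10, 8) -> (batch, 32)
    Dense(1, activation='sigmoid'),
])
model.summary()
```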

Embedding Layers

Embedding: turns positive integers (indexes) into dense vectors of fixed size.
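
A minimal sketch (the vocabulary size, embedding dimension, and sequence length are arbitrary):

```python
from keras.layers import Embedding, LSTM, Dense
from keras.models import Sequential

model = Sequential([
    Embedding(input_dim=10000, output_dim=64, input_length=20),  # (batch, 20) -> (batch, 20, 64)
    LSTM(32),
    Dense(1, activation='sigmoid'),
])
model.summary()
```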

Merge Layers

Add

Subtract

Multiply

Average

Maximum

Concatenate

Dot (dot product)

add

subtract

multiply

average

maximum

concatenate

dot
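
The capitalized names (Add, Concatenate, ...) are layer classes, while the lowercase names (add, concatenate, ...) are functional-interface shortcuts that build the same layers. A minimal sketch with the functional API (shapes are arbitrary):

```python
from keras.layers import Input, Dense, Add, concatenate
from keras.models import Model

a = Input(shape=(16,))
b = Input(shape=(16,))

summed = Add()([a, b])              # layer-class form, -> (None, 16)
merged = concatenate([summed, a])   # functional shortcut form, -> (None, 32)

out = Dense(1)(merged)
model = Model(inputs=[a, b], outputs=out)
model.summary()
```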

Advanced Activation Layers

LeakyReLU (leaky version of a Rectified Linear Unit)

PReLU (Parametric Rectified Linear Unit)

ELU (Exponential Linear Unit)

ThresholdedReLU (Thresholded Rectified Linear Unit)

Softmax

ReLU (Rectified Linear Unit)
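
Advanced activations are standalone layers: they are placed after a layer that has no built-in activation, instead of passing activation='...'. A minimal sketch (the alpha value and shapes are arbitrary):

```python
from keras.layers import Dense, LeakyReLU, PReLU
from keras.models import Sequential

model = Sequential([
    Dense(64, input_shape=(20,)),
    LeakyReLU(alpha=0.1),       # fixed negative slope
    Dense(64),
    PReLU(),                    # negative slope learned per unit
    Dense(1),
])
model.summary()
```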

Normalization Layers

BatchNormalization
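
A minimal sketch: BatchNormalization is commonly inserted between a linear layer and its activation (shapes are arbitrary):

```python
from keras.layers import Dense, BatchNormalization, Activation
from keras.models import Sequential

model = Sequential([
    Dense(64, input_shape=(20,)),
    BatchNormalization(),       # normalize the previous layer's activations per batch
    Activation('relu'),
    Dense(1),
])
model.summary()
```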

Noise Layers

GaussianNoise

GaussianDropout

AlphaDropout
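
A minimal sketch with GaussianNoise, a regularizer that is only active at training time (the stddev and shapes are arbitrary):

```python
from keras.layers import Dense, GaussianNoise
from keras.models import Sequential

model = Sequential([
    GaussianNoise(stddev=0.1, input_shape=(20,)),  # add zero-centered noise while training
    Dense(1),
])
model.summary()
```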

Layer wrappers

TimeDistributed

Bidirectional
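
A minimal sketch of both wrappers: Bidirectional wraps a recurrent layer, while TimeDistributed applies a layer independently to every timestep (shapes are arbitrary):

```python
from keras.layers import LSTM, Dense, Bidirectional, TimeDistributed
from keras.models import Sequential

model = Sequential([
    Bidirectional(LSTM(32, return_sequences=True), input_shape=(10, 8)),  # -> (10, 64)
    TimeDistributed(Dense(5, activation='softmax')),                      # -> (10, 5)
])
model.summary()
```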
