Applying the Mish and Swish activation functions in Keras

Example code using mish and swish. Mish is defined as mish(x) = x * tanh(softplus(x)) and Swish as swish(x) = x * sigmoid(x):

from keras.models import Model
from keras.layers import Input
from keras.layers.core import Dense, Dropout, Activation, Reshape, Permute
from keras.layers.convolutional import Conv2D, Conv2DTranspose, ZeroPadding2D
from keras.layers.pooling import AveragePooling2D, GlobalAveragePooling2D
from keras.layers.normalization import BatchNormalization
from keras.regularizers import l2
import keras.backend as K
def mish(x):
    """Mish activation: x * tanh(softplus(x))."""
    return x * K.tanh(K.softplus(x))
def swish(x):
    """Swish activation function.
    # Arguments
        x: Input tensor.
    # Returns
        The Swish activation: `x * sigmoid(x)`.
    # References
        [Searching for Activation Functions](https://arxiv.org/abs/1710.05941)
    """
    if K.backend() == 'tensorflow':
        try:
            # The native TF implementation has a more
            # memory-efficient gradient implementation
            return K.tf.nn.swish(x)
        except AttributeError:
            pass
    return x * K.sigmoid(x)
def _cnn(input, nclass):
    # nclass and _dropout_rate are kept from the original snippet but unused here
    _dropout_rate = 0.2
    _weight_decay = 1e-4
    eps = 1.1e-5
    _nb_filter = 64
    # two 3x3 convs with 64 filters: stride 1, then stride 2 for downsampling
    x = Conv2D(_nb_filter, (3, 3), strides=(1, 1), kernel_initializer='he_normal', padding='same',
               use_bias=False, kernel_regularizer=l2(_weight_decay))(input)
    x = BatchNormalization(epsilon=eps, axis=-1)(x)
    x = Activation(mish)(x)
    x = Conv2D(_nb_filter, (3, 3), strides=(2, 2), kernel_initializer='he_normal', padding='same',
               use_bias=False, kernel_regularizer=l2(_weight_decay))(x)
    x = BatchNormalization(epsilon=eps, axis=-1)(x)
    x = Activation(swish)(x)
    return x
input = Input(shape=(32, 280, 1), name='the_input')
y_pred = _cnn(input, 5000)
basemodel = Model(inputs=input, outputs=y_pred)
basemodel.summary()
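
As a usage note (a minimal sketch, not from the original post): the custom activations can also be registered under string names via Keras's get_custom_objects, so any layer can refer to them with activation='mish'. This assumes a standalone Keras 2.x install and that the mish and swish functions above are already defined in scope.

from keras.utils.generic_utils import get_custom_objects
from keras.layers import Dense, Input
from keras.models import Model

# Register the custom activation functions under string names
# (mish and swish here are the functions defined earlier).
get_custom_objects().update({'mish': mish, 'swish': swish})

# They can now be referenced by name like any built-in activation.
inp = Input(shape=(16,))
out = Dense(8, activation='mish')(inp)
model = Model(inputs=inp, outputs=out)
model.summary()

Because the registration updates Keras's global custom-object table, a model saved with these activations can later be restored with load_model as long as the same registrations (or an explicit custom_objects mapping) are in place.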
