Activation Functions: Swish, a Self-Gated Activation Function

Today I came across Swish, a new activation function proposed by Google Brain in 2017.

Paper: https://arxiv.org/abs/1710.05941v1

In PyTorch it can be written like this:

import torch

def relu_fn(x):
    """Swish activation function: f(x) = x * sigmoid(x).
    (Named relu_fn here because it is meant as a drop-in replacement for ReLU.)"""
    return x * torch.sigmoid(x)
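A quick sanity check (a minimal sketch; note that newer PyTorch releases also ship this same function built in as torch.nn.SiLU / torch.nn.functional.silu, which computes the identical x * sigmoid(x)):

x = torch.linspace(-5.0, 5.0, steps=5)
print(relu_fn(x))                    # manual definition above
print(torch.nn.functional.silu(x))   # built-in equivalent in newer PyTorch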

From the paper's abstract: Swish, which is simply f(x) = x · sigmoid(x). Our experiments show that Swish tends to work better than ReLU on deeper models across a number of challenging datasets.

For example, simply replacing ReLUs with Swish units improves top-1 classification accuracy on ImageNet by 0.9% for Mobile NASNet-A and 0.6% for Inception-ResNet-v2.

The simplicity of Swish and its similarity to ReLU make it easy for practitioners to replace ReLUs with Swish units in any neural network.
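That drop-in replacement can be illustrated with a minimal sketch (the Swish module wrapper and the layer sizes here are my own choices for illustration, not from the paper):

import torch
import torch.nn as nn

class Swish(nn.Module):
    """Module wrapper so Swish can stand wherever nn.ReLU() would."""
    def forward(self, x):
        return x * torch.sigmoid(x)

# A small MLP: the only change from a ReLU network is the activation module.
model = nn.Sequential(
    nn.Linear(784, 256),
    Swish(),  # was: nn.ReLU()
    nn.Linear(256, 10),
)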

Another write-up (in Chinese): https://blog.csdn.net/wydbyxr/article/details/84615522

Reposted from: https://www.cnblogs.com/yjphhw/p/11083877.html
