Softsign vs. Tanh: A Speed Comparison

I've grown rather fond of the softsign activation lately, but while testing it I sometimes found its speed disappointing.

There isn't much material about this online; searching turned up one blog post:

深度学习日常发问(三):激活函数性能对比_明曦君的博客-CSDN博客_mish激活函数和relu对比

But that post claims that "Tanh is the fastest, while Softsign is the slowest."

I found this hard to believe: computing tanh involves exponentiation, while softsign needs only an absolute value, an add, and a divide, so on paper tanh has no business being faster. For reference, the two definitions:
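Softsign(x) = x / (1 + |x|)

tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))

So I wrote the following test: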

import time

import torch
from torch.nn import Softsign, Tanh

soft = Softsign()
tan = Tanh()

sum_soft = 0.
sum_tan = 0.
for i in range(1000):
    # fresh random input each iteration, scaled up so values span both tails
    x = torch.randn(3, 1000, 1000) * 10
    print(i)  # progress indicator

    time1 = time.time()
    y1 = soft(x)
    time2 = time.time()
    y2 = tan(x)
    time3 = time.time()

    sum_soft += time2 - time1
    sum_tan += time3 - time2

print("sum_soft = ", sum_soft)
print("sum_tan = ", sum_tan)

The output:

sum_soft =  4.364975929260254
sum_tan =  1.6701316833496094

The results suggest he may well be right: tanh really is faster than softsign.
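As an aside, wrapping individual calls in time.time() is fairly noisy. PyTorch ships torch.utils.benchmark for exactly this; here is a minimal sketch of the same comparison with it (assuming a PyTorch version recent enough to include torch.utils.benchmark):

import torch
from torch.nn import Softsign, Tanh
from torch.utils.benchmark import Timer

soft = Softsign()
tan = Tanh()
x = torch.randn(3, 1000, 1000) * 10

# Timer performs a warm-up run and reports per-call timing statistics
print(Timer(stmt="soft(x)", globals={"soft": soft, "x": x}).timeit(100))
print(Timer(stmt="tan(x)", globals={"tan": tan, "x": x}).timeit(100))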

What if we move the computation onto the GPU?

import time

import torch
from torch.nn import Softsign, Tanh

soft = Softsign()
tan = Tanh()
device = torch.device("cuda")
soft = soft.to(device)
tan = tan.to(device)

sum_soft = 0.
sum_tan = 0.
for i in range(1000):
    x = torch.randn(3, 1000, 1000) * 10
    x = x.to(device)
    print(i)  # progress indicator

    # NB: these timings do not synchronize with the GPU; see the note
    # after the results below
    time1 = time.time()
    y1 = soft(x)
    time2 = time.time()
    y2 = tan(x)
    time3 = time.time()

    sum_soft += time2 - time1
    sum_tan += time3 - time2

print("sum_soft = ", sum_soft)
print("sum_tan = ", sum_tan)

The output:

sum_soft =  0.11101865768432617
sum_tan =  0.021539688110351562

Softsign is still slower than tanh on the GPU.
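One caveat about these GPU numbers: CUDA kernels launch asynchronously, so time.time() around an un-synchronized call largely measures kernel launch overhead rather than actual execution time. A minimal sketch of the same benchmark with explicit synchronization (same shapes and iteration count as above):

import time

import torch
from torch.nn import Softsign, Tanh

device = torch.device("cuda")
soft = Softsign().to(device)
tan = Tanh().to(device)

sum_soft = 0.
sum_tan = 0.
for i in range(1000):
    x = (torch.randn(3, 1000, 1000) * 10).to(device)

    torch.cuda.synchronize()  # make sure the host-to-device copy has finished
    time1 = time.time()
    y1 = soft(x)
    torch.cuda.synchronize()  # wait for the softsign kernels to finish
    time2 = time.time()
    y2 = tan(x)
    torch.cuda.synchronize()  # wait for the tanh kernel to finish
    time3 = time.time()

    sum_soft += time2 - time1
    sum_tan += time3 - time2

print("sum_soft = ", sum_soft)
print("sum_tan = ", sum_tan)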

——————————————————————————

The reason was not obvious. Could taking an absolute value really be slower than exponentiation?

I wrote the following test to compare the speed of exponentiation and absolute value:

import time

import torch

sum_exp = 0.
sum_abs = 0.
for i in range(1000):
    x = torch.randn(3, 1000, 1000) * 10
    print(i)  # progress indicator

    time1 = time.time()
    y1 = torch.exp(x)
    time2 = time.time()
    y2 = torch.abs(x)
    time3 = time.time()

    sum_exp += time2 - time1
    sum_abs += time3 - time2

print("sum_exp = ", sum_exp)
print("sum_abs = ", sum_abs)

The output:

sum_exp =  1.2346689701080322
sum_abs =  1.3161406517028809

Taking the absolute value really is a little slower than exp in PyTorch. (A plausible explanation, though I have not verified it: elementwise ops on tensors this size are memory-bandwidth-bound, so the extra arithmetic in exp hardly shows.)

NumPy, however, tells a different story:

import time

import numpy as np

sum_exp = 0.
sum_abs = 0.
for i in range(1000):
    # note: np.random.randn returns float64, whereas torch.randn above is float32
    x = np.random.randn(3, 1000, 1000) * 10
    print(i)  # progress indicator

    time1 = time.time()
    y1 = np.exp(x)
    time2 = time.time()
    y2 = np.abs(x)
    time3 = time.time()

    sum_exp += time2 - time1
    sum_abs += time3 - time2

print("sum_exp = ", sum_exp)
print("sum_abs = ", sum_abs)

The output:

sum_exp =  17.33661985397339
sum_abs =  3.6888980865478516

Overall, NumPy is much slower than PyTorch here: exp differs by an order of magnitude, and abs by roughly a factor of two to three. (Note, though, that np.random.randn produces float64 arrays while torch.randn produced float32, so the comparison is not entirely apples to apples.)

Within NumPy itself, however, abs is several times faster than exp.

A bit of debugging showed that Softsign's implementation is plain Python code, while I could not find a Python implementation of Tanh at all; it is presumably implemented in the C++ backend.
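One quick way to check this yourself (a sketch; whether F.softsign is still a plain Python function depends on your PyTorch version):

import inspect

import torch
import torch.nn.functional as F

# if softsign is a plain Python function, this prints its source,
# e.g. a body along the lines of: return input / (input.abs() + 1)
print(inspect.getsource(F.softsign))

# torch.tanh is a built-in (ATen) operator, so getsource cannot find
# any Python source for it
try:
    print(inspect.getsource(torch.tanh))
except TypeError:
    print("torch.tanh is a builtin, implemented in C++")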
So this is really a contest between two environments: the Python-level softsign dispatches several separate kernels (abs, add, divide), each with its own launch overhead and temporary tensor, while tanh runs as a single C++ kernel. That would explain why softsign's speed never comes up to par.

But if the model were exported and executed with a C++ runtime, my guess is that softsign would come out faster after all.
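Short of actually exporting to a C++ runtime, torch.jit.script offers a rough way to probe that guess from Python: scripting the softsign expression lets the JIT fuser combine the abs/add/divide chain into fewer kernels. A minimal sketch (whether fusion actually kicks in depends on the PyTorch version and the device, so treat this as an experiment to run, not a settled result):

import time

import torch

# scripted version of the Python-level softsign expression
@torch.jit.script
def softsign_scripted(x: torch.Tensor) -> torch.Tensor:
    return x / (1 + x.abs())

x = torch.randn(3, 1000, 1000) * 10

# warm-up so JIT compilation time is not counted
for _ in range(10):
    softsign_scripted(x)
    torch.tanh(x)

t0 = time.time()
for _ in range(100):
    y1 = softsign_scripted(x)
t1 = time.time()
for _ in range(100):
    y2 = torch.tanh(x)
t2 = time.time()

print("scripted softsign:", t1 - t0)
print("tanh:", t2 - t1)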
