Gaussian non-linear activation function
TensorFlow ships with several common activation functions, but my current project uses neither relu nor sigmoid. It uses a much rarer one: the Gaussian activation function.
The custom Gaussian function has the form f(x) = exp(-(x^2) / (sigma^2)).
In the experiments below, sigma = 0.5.
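As a quick sanity check before wiring anything into TensorFlow, here is a minimal NumPy sketch (the names f and f_prime are mine, not from the project code): with sigma = 0.5 the analytic derivative is f'(x) = (-2x / sigma^2) * f(x) = -8x * f(x), which we can verify against a central finite difference.

```python
import numpy as np

def f(x, sigma=0.5):
    # Gaussian activation: f(x) = exp(-x^2 / sigma^2)
    return np.exp(-(x ** 2) / sigma ** 2)

def f_prime(x, sigma=0.5):
    # Analytic derivative: f'(x) = (-2x / sigma^2) * f(x)
    return (-2 * x / sigma ** 2) * f(x, sigma)

x = 0.3
h = 1e-6
# Central finite difference should agree with the analytic derivative
numeric = (f(x + h) - f(x - h)) / (2 * h)
print(abs(f_prime(x) - numeric) < 1e-5)  # True
```

Note that f(0) = 1 and the function decays toward 0 for large |x|, so unlike relu it is bounded and non-monotonic.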
The code for the custom activation function is as follows.
Create a new file named gaussian_activation.py:
#!/usr/bin/env python
# -*- encoding: utf-8 -*-
import numpy as np
import tensorflow as tf
import math
from tensorflow.python.framework import ops
def gaussian(x):
    # f(x) = exp(-x^2 / sigma^2) with sigma = 0.5, so sigma^2 = 0.25
    return math.exp(-(x * x) / 0.25)
def gaussian_grad(x):
    # f'(x) = (-2x / sigma^2) * exp(-x^2 / sigma^2) = -8x * exp(-x^2 / 0.25)
    return -8 * x * math.exp(-(x * x) / 0.25)
# Vectorize the scalar functions so they accept NumPy arrays
gaussian_np = np.vectorize(gaussian)
gaussian_grad_np = np.vectorize(gaussian_grad)
# np.vectorize returns float64 by default; TensorFlow expects float32
gaussian_np_32 = lambda x: gaussian_np(x).astype(np.float32)
gaussian_grad_np_32 = lambda x: gaussian_grad_np(x).astype(np.float32)
def gaussian_grad_tf(x, name=None):
    # Wrap the NumPy gradient as a TensorFlow op (standard tf.py_func pattern)
    with ops.name_scope(name, "gaussian_grad", [x]) as name:
        y = tf.py_func(gaussian_grad_np_32, [x], [tf.float32],
                       stateful=False, name=name)
        return y[0]
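To make the activation usable in training, the gradient wrapper still has to be attached to the forward op. Below is a sketch of the remaining steps of the standard tf.py_func custom-gradient recipe, assuming TensorFlow 1.x graph mode; the helper names py_func_with_grad, _gaussian_grad, and gaussian_tf are my own, not from the project code.

```python
import numpy as np
import tensorflow as tf
from tensorflow.python.framework import ops

def py_func_with_grad(func, inp, Tout, stateful=True, name=None, grad=None):
    # Register the custom gradient under a unique name, then override
    # the default (undefined) PyFunc gradient with it
    rnd_name = 'PyFuncGrad' + str(np.random.randint(0, 10 ** 8))
    tf.RegisterGradient(rnd_name)(grad)
    g = tf.get_default_graph()
    with g.gradient_override_map({"PyFunc": rnd_name}):
        return tf.py_func(func, inp, Tout, stateful=stateful, name=name)

def _gaussian_grad(op, grad):
    # Chain rule: incoming gradient times f'(x)
    x = op.inputs[0]
    return grad * gaussian_grad_tf(x)

def gaussian_tf(x, name=None):
    # Forward pass wrapping gaussian_np_32, with _gaussian_grad attached
    with ops.name_scope(name, "gaussian", [x]) as name:
        y = py_func_with_grad(gaussian_np_32, [x], [tf.float32],
                              stateful=False, name=name, grad=_gaussian_grad)
        return y[0]
```

After this, gaussian_tf can be called anywhere a built-in activation would be, e.g. h = gaussian_tf(tf.matmul(inputs, W) + b), and tf.gradients will use the registered analytic gradient. Note that this tf.py_func approach predates tf.custom_gradient; in TensorFlow 1.7+ the same thing can be written more simply with @tf.custom_gradient, and py_func-based ops run on the CPU only.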