Implement the SELU Activation Function

Implement the SELU (Scaled Exponential Linear Unit) activation function, a self-normalizing variant of ELU. Your task is to compute the SELU value for a given input while ensuring numerical stability.

Example:

Input:

selu(-1.0)

Output:

-1.1113

Reasoning:

For x = −1.0 (the x ≤ 0 branch), the SELU activation is SELU(x) = λ · α · (e^x − 1). Substituting λ = 1.0507 and α = 1.6733 gives SELU(−1.0) = 1.0507 × 1.6733 × (e^(−1.0) − 1) ≈ −1.1113.
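
A quick way to check this arithmetic is to evaluate the same expression with Python's standard math module (the variable names below are just for illustration):

import math

lam = 1.0507009873554804    # λ (scale)
alpha = 1.6732632423543772  # α
print(round(lam * alpha * (math.exp(-1.0) - 1), 4))  # -1.1113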

import math

def selu(x: float) -> float:
	"""
	Implements the SELU (Scaled Exponential Linear Unit) activation function.

	Args:
		x: Input value

	Returns:
		SELU activation value
	"""
	alpha = 1.6732632423543772
	scale = 1.0507009873554804
	if x > 0:
		# Positive inputs are simply scaled linearly.
		res = scale * x
	else:
		# exp(x) is only evaluated for x <= 0, so it cannot overflow;
		# this keeps the computation numerically stable.
		res = scale * alpha * (math.exp(x) - 1)
	return res
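
As a quick sanity check (these inputs are illustrative and not taken from the grader's test suite), the function covers both branches as expected:

print(selu(-1.0))  # ≈ -1.1113
print(selu(0.0))   # 0.0
print(selu(2.0))   # ≈ 2.1014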

Test Case Results

5 of 5 tests passed

Official Solution

import math

def selu(x: float) -> float:
    """
    Implements the SELU (Scaled Exponential Linear Unit) activation function.

    Args:
        x: Input value

    Returns:
        SELU activation value
    """
    alpha = 1.6732632423543772
    scale = 1.0507009873554804
    # Round to 4 decimal places to match the expected output format.
    return round(scale * x if x > 0 else scale * alpha * (math.exp(x) - 1), 4)
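
For applying SELU to whole arrays rather than single floats, a vectorized variant is a natural extension. The NumPy sketch below is an assumption on my part and not part of the original exercise:

import numpy as np

def selu_vec(x: np.ndarray) -> np.ndarray:
    # Hypothetical vectorized SELU (illustrative, not the official solution).
    alpha = 1.6732632423543772
    scale = 1.0507009873554804
    # np.where evaluates both branches, so clip the input fed to expm1
    # to avoid overflow for large positive x; expm1 is also more precise
    # than exp(x) - 1 for small |x|.
    return np.where(x > 0, scale * x, scale * alpha * np.expm1(np.minimum(x, 0)))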
