Implement the SELU Activation Function
Implement the SELU (Scaled Exponential Linear Unit) activation function, a self-normalizing variant of ELU. Your task is to compute the SELU value for a given input while ensuring numerical stability.
Example:
Input:
selu(-1.0)
Output:
-1.1113
Reasoning:
For x = -1.0, the negative branch of the formula applies: SELU(x) = λ · α · (e^x − 1) for x ≤ 0. Substituting λ = 1.0507 and α = 1.6733 gives SELU(−1.0) = 1.0507 × 1.6733 × (e^−1.0 − 1) ≈ −1.1113.
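As a quick arithmetic check, the result can be reproduced directly with the full-precision constants (a minimal sketch; with the rounded λ and α quoted above, the last decimal place can shift slightly):

import math

# Full-precision SELU constants (the same values used in the solutions below)
alpha = 1.6732632423543772
scale = 1.0507009873554804

x = -1.0
value = scale * alpha * (math.exp(x) - 1)  # negative branch of SELU
print(round(value, 4))  # -1.1113

The solution below implements both branches: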
import math

def selu(x: float) -> float:
    """
    Implements the SELU (Scaled Exponential Linear Unit) activation function.

    Args:
        x: Input value

    Returns:
        SELU activation value
    """
    alpha = 1.6732632423543772
    scale = 1.0507009873554804

    if x > 0:
        # Positive branch: linear scaling by lambda (scale)
        res = scale * x
    else:
        # Negative branch: scaled exponential, saturating at -scale * alpha
        res = scale * alpha * (math.exp(x) - 1)
    return res
Test Case Results
5 of 5 tests passed
Official Solution
import math

def selu(x: float) -> float:
    """
    Implements the SELU (Scaled Exponential Linear Unit) activation function.

    Args:
        x: Input value

    Returns:
        SELU activation value
    """
    alpha = 1.6732632423543772
    scale = 1.0507009873554804
    return round(scale * x if x > 0 else scale * alpha * (math.exp(x) - 1), 4)