Implement the Softplus Activation Function
Implement the Softplus activation function, a smooth approximation of the ReLU function. Your task is to compute the Softplus value for a given input, handling edge cases to prevent numerical overflow or underflow.
Example:
Input:
softplus(2)
Output:
2.1269
Reasoning:
For x = 2, the Softplus activation is calculated as log(1 + e^x) = log(1 + e^2) ≈ log(8.3891) ≈ 2.1269.
import math

def softplus(x: float) -> float:
    """
    Compute the softplus activation function.

    Args:
        x: Input value

    Returns:
        The softplus value: log(1 + e^x)
    """
    # Note: math.exp(x) can overflow for very large x; see the official solution below.
    val = math.log(1 + math.exp(x))
    return round(val, 4)
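A quick sanity check against the example above (hypothetical usage; the values follow directly from log(1 + e^x) rounded to four decimals):

    print(softplus(2))    # 2.1269
    print(softplus(-1))   # 0.3133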
Test Case Results
5 of 5 tests passed
Official Solution
import math

def softplus(x: float) -> float:
    """
    Compute the softplus activation function.

    Args:
        x: Input value

    Returns:
        The softplus value: log(1 + e^x)
    """
    # To prevent overflow for large positive values
    if x > 100:
        return x
    # To prevent underflow for large negative values
    if x < -100:
        return 0.0
    return round(math.log(1.0 + math.exp(x)), 4)
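As an alternative to the hard cutoffs at ±100, a common trick is the identity softplus(x) = max(x, 0) + log(1 + e^(-|x|)), which never exponentiates a large positive number. Below is a minimal sketch of that approach (not part of the official solution; the name softplus_stable is just for illustration, and math.log1p is used for precision when exp(-|x|) is tiny):

    import math

    def softplus_stable(x: float) -> float:
        """
        Numerically stable softplus using
        softplus(x) = max(x, 0) + log(1 + exp(-|x|)).
        """
        # exp(-|x|) is always <= 1, so it cannot overflow,
        # and log1p keeps precision when its argument is near zero.
        return round(max(x, 0.0) + math.log1p(math.exp(-abs(x))), 4)

For example, softplus_stable(2) still gives 2.1269, while softplus_stable(1000) returns 1000.0 without overflowing.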