Implement the Softplus Activation Function

Implement the Softplus activation function, a smooth approximation of the ReLU function. Your task is to compute the Softplus value for a given input, handling edge cases to prevent numerical overflow or underflow.
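
For reference, the standard facts that motivate the edge-case handling (not stated explicitly in the original problem):

softplus(x) = log(1 + e^x), with softplus(x) → x as x → +∞ and softplus(x) → 0 as x → −∞.

Hence very large inputs can simply return x and very negative inputs can return 0, without evaluating e^x at all.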

Example:

Input:

softplus(2)

Output:

2.1269

Reasoning:

For x = 2, the Softplus activation is softplus(2) = log(1 + e^2) = log(1 + 7.3891) = log(8.3891) ≈ 2.1269.

import math

def softplus(x: float) -> float:
    """
    Compute the softplus activation function.

    Args:
        x: Input value

    Returns:
        The softplus value: log(1 + e^x)
    """
    # Direct evaluation of log(1 + e^x), rounded to 4 decimal places
    val = math.log(1 + math.exp(x))
    return round(val, 4)
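
This direct version passes the listed tests, but math.exp(x) raises OverflowError ("math range error") once x exceeds roughly 709 for IEEE doubles, which is exactly the edge case the official solution below guards against:

softplus(2)     # 2.1269
softplus(1000)  # raises OverflowError: math range error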

Test Case Results

5 of 5 tests passed

Official Solution

import math

def softplus(x: float) -> float:
    """
    Compute the softplus activation function.

    Args:
        x: Input value

    Returns:
        The softplus value: log(1 + e^x)
    """
    # For large positive x, e^x dominates, so log(1 + e^x) ≈ x;
    # returning x directly sidesteps overflow in math.exp
    if x > 100:
        return x
    # For large negative x, e^x vanishes, so log(1 + e^x) ≈ 0;
    # returning 0.0 directly sidesteps underflow
    if x < -100:
        return 0.0

    return round(math.log(1.0 + math.exp(x)), 4)
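
As an aside (not part of the official solution), the hard cutoffs can be avoided entirely with the algebraic identity log(1 + e^x) = max(x, 0) + log1p(e^(-|x|)), which never overflows because exp is only ever applied to non-positive arguments. A minimal sketch, keeping the same 4-decimal rounding convention; the name softplus_stable is ours:

import math

def softplus_stable(x: float) -> float:
    """Numerically stable softplus via log(1 + e^x) = max(x, 0) + log1p(e^(-|x|)).

    exp(-abs(x)) is always in (0, 1], so math.exp cannot overflow,
    and math.log1p keeps precision when its argument is tiny.
    """
    return round(max(x, 0.0) + math.log1p(math.exp(-abs(x))), 4)

For example, softplus_stable(2) returns 2.1269, softplus_stable(1000) returns 1000.0, and softplus_stable(-1000) returns 0.0.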
