Derivation of the Soft Thresholding Function


Here I want to derive the soft thresholding function, i.e., work out its minimizer step by step.

For background, please refer to an introduction to l1 regularization. Let's go directly to the objective function

g(t) = 0.5(t-t_0)^2+r|t|

Given t_0 and r > 0, we want to find the t that minimizes g(t).

The result is the classic soft thresholding function:
when |t_0| > r, t = t_0 - r*sign(t_0); when |t_0| <= r, t = 0.
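The closed-form rule above can be written directly as code (a minimal Python sketch; the function name `soft_threshold` is mine, not from the post):

```python
def soft_threshold(t0, r):
    """Minimizer of g(t) = 0.5*(t - t0)**2 + r*abs(t), for r >= 0."""
    if abs(t0) > r:
        # Shrink t0 toward zero by r, keeping its sign.
        return t0 - r * (1.0 if t0 > 0 else -1.0)
    # |t0| <= r: the penalty dominates and the minimizer is exactly zero.
    return 0.0
```

For example, `soft_threshold(3.0, 1.0)` gives `2.0`, while `soft_threshold(0.5, 1.0)` gives `0.0` because the input falls inside the threshold.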


I have known this result for a long time, but was still confused by the derivation. This time I will work it out clearly.

(1) When t_0 >= 0, there is no reason to choose a negative t.
Reason: for any negative t', we have g(-t') <= g(t'), because |-t'| = |t'| while (-t' - t_0)^2 <= (t' - t_0)^2 when t' < 0 and t_0 >= 0. So a negative t' can never be the minimizer.
Thus the minimizer must satisfy t >= 0, and the problem becomes
g(t) = 0.5(t-t_0)^2 + rt  s.t. t_0 >= 0 and t >= 0.
This is a simple parabola in t, with unconstrained minimum at t = t_0 - r. Its constrained minimizer is:
when t_0 > r, t = t_0 - r; when 0 <= t_0 <= r, t = 0.
(2) When t_0 < 0, the same argument holds by symmetry: the minimizer satisfies t <= 0, giving t = t_0 + r when t_0 < -r, and t = 0 when -r <= t_0 < 0.
(3) Combining the two cases yields the soft thresholding function.
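The combined case analysis can also be verified numerically by brute force: evaluate g on a fine grid and check that the grid minimizer matches the closed form (a sketch; the grid bounds and resolution are arbitrary choices of mine):

```python
def g(t, t0, r):
    # The objective: 0.5*(t - t0)^2 + r*|t|
    return 0.5 * (t - t0) ** 2 + r * abs(t)

def soft_threshold(t0, r):
    # Closed-form minimizer derived above.
    if abs(t0) > r:
        return t0 - r * (1.0 if t0 > 0 else -1.0)
    return 0.0

def grid_argmin(t0, r, lo=-10.0, hi=10.0, n=200_001):
    # Exhaustively evaluate g on an even grid and return the best t.
    step = (hi - lo) / (n - 1)
    best_t = lo
    best_v = g(lo, t0, r)
    for i in range(1, n):
        t = lo + i * step
        v = g(t, t0, r)
        if v < best_v:
            best_t, best_v = t, v
    return best_t
```

For any (t_0, r) whose minimizer lies well inside the grid, `grid_argmin` agrees with `soft_threshold` up to the grid step.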

Further work:
g(t) = 0.5(t-t_0)^2+r|t-a|+w|t-b|
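One way to attack this generalized objective: it is still convex, now piecewise quadratic with kinks at a and b, so the minimizer must be either a kink or the stationary point of one smooth piece, where the gradient (t - t_0) + r*sign(t - a) + w*sign(t - b) vanishes. Evaluating g at a handful of candidates therefore suffices. A sketch (the helper name `minimize_two_hinge` and the candidate-enumeration approach are my own, not from the post):

```python
def minimize_two_hinge(t0, r, w, a, b):
    """Minimize g(t) = 0.5*(t - t0)**2 + r*|t - a| + w*|t - b|, r, w >= 0.

    g is convex, so its global minimizer is either a kink (t = a or t = b)
    or a stationary point of one smooth piece. Enumerating all four sign
    combinations plus the kinks covers every case; candidates that fall
    outside "their" piece simply evaluate to a larger g and are ignored.
    """
    candidates = [a, b,
                  t0 + r + w,   # t below both a and b
                  t0 - r - w,   # t above both a and b
                  t0 - r + w,   # between a and b (a < t < b)
                  t0 + r - w]   # between b and a (b < t < a)

    def g(t):
        return 0.5 * (t - t0) ** 2 + r * abs(t - a) + w * abs(t - b)

    return min(candidates, key=g)
```

For instance, with t_0 = 5, r = w = 1, a = 0, b = 1, the gradient on the piece t > 1 is (t - 5) + 2, so the minimizer is t = 3.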



                