Paper Reading (1): Medical Data Analysis

This post summarizes the following paper:

"Exact Top-k Feature Selection via ℓ2,0-Norm Constraint", by Xiao Cai, Feiping Nie, and Heng Huang

The paper builds on sparse methods for machine learning. Supplementary material can be found at this link: https://www.di.ens.fr/~fbach/Cours_peyresq_2010.pdf


Feature selection primarily addresses the problem of finding the most relevant and informative set of features.

Generally speaking, feature selection algorithms can be categorized into three main families: filter, wrapper, and embedded methods.

(1) In filter methods, features are pre-selected by the intrinsic properties of the data, without running the learning algorithm.

(2) In wrapper methods, the process of feature selection is wrapped around the learning algorithm that will ultimately be employed, taking advantage of the "feedback" from that algorithm.

(3) In embedded methods, feature search and the learning algorithm are incorporated into a single optimization problem.
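To make the filter family concrete, here is a minimal sketch (an illustration of mine, not code from the paper): each feature is scored independently by its absolute Pearson correlation with the label, and the top-k features are kept. No learning algorithm is trained, which is exactly what distinguishes filter methods from wrapper and embedded methods.

```python
import numpy as np

def filter_select(X, y, k):
    """Filter-style feature selection: rank each column of X by the
    absolute Pearson correlation with the label y, keep the top-k.
    No learning algorithm is involved -- only intrinsic data statistics."""
    # Center the data so dot products become covariances.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # Per-feature |correlation| scores (eps guards against zero-variance columns).
    scores = np.abs(Xc.T @ yc) / (
        np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12
    )
    # Indices of the k highest-scoring features, best first.
    return np.argsort(scores)[::-1][:k]

# Toy data: feature 0 is the label plus small noise; the rest are pure noise.
rng = np.random.default_rng(0)
y = rng.normal(size=200)
X = rng.normal(size=(200, 5))
X[:, 0] = y + 0.1 * rng.normal(size=200)
print(filter_select(X, y, 2))  # feature 0 ranks first
```

Wrapper methods would instead retrain a model on candidate subsets, and embedded methods (like the paper's) fold the selection into the training objective itself.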


Advantages of the method in this paper:

(1) A feature selection method based on a convex problem is NOT always better than its counterpart based on a non-convex problem.

(2) We tackle the original sparse problem with the ℓ2,0-norm constraint directly, instead of its relaxation or approximation. Therefore, we can get a more accurate solution.

(3) We avoid the computational burden of tuning the regularization parameter.

(4) We are the first to provide an efficient algorithm for the minimization problem of the ℓ2,1-norm loss with the ℓ2,0-norm constraint.
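The ℓ2,0-norm of a weight matrix W counts its nonzero rows, so the constraint ‖W‖2,0 ≤ k means at most k features (rows) may be used — this is what makes the top-k selection "exact" rather than a relaxation. The sketch below (my illustration, not the paper's full algorithm, which optimizes an ℓ2,1-norm loss under this constraint) shows the Euclidean projection onto the constraint set: keep the k rows with the largest ℓ2 norms, zero out the rest.

```python
import numpy as np

def project_l20(W, k):
    """Euclidean projection of W onto {W : ||W||_{2,0} <= k}:
    keep the k rows with the largest l2 norms and zero the rest.
    Each row of W corresponds to one feature, so the surviving
    rows are exactly the k selected features."""
    row_norms = np.linalg.norm(W, axis=1)        # per-feature l2 norm
    keep = np.argsort(row_norms)[::-1][:k]       # indices of the top-k rows
    P = np.zeros_like(W)
    P[keep] = W[keep]
    return P, np.sort(keep)

W = np.array([[3.0, 4.0],   # row norm 5.0
              [0.1, 0.0],   # row norm 0.1
              [1.0, 1.0],   # row norm ~1.41
              [0.0, 2.0]])  # row norm 2.0
P, selected = project_l20(W, 2)
print(selected)  # rows 0 and 3 have the largest norms
```

Because this hard constraint is handled directly rather than relaxed (e.g., into an ℓ2,1 penalty), there is no regularization weight to tune — matching advantage (3) above.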

