Feature Selection

1. Definition

In machine learning and statistics, feature selection, also known as variable selection, attribute selection, or variable subset selection, is the process of selecting a subset of relevant features for use in model construction, together with an evaluation measure that scores the different feature subsets.

The choice of evaluation metric heavily influences the algorithm, and it is this choice that distinguishes the three main categories of feature selection algorithms:

  • wrappers
  • filters
  • embedded methods
     There are five main types of evaluation functions (a minimal filter/wrapper sketch follows this list):
  • distance (e.g., Euclidean distance measure) → filter
  • information (entropy, information gain, etc.) → filter
  • dependency (mutual information, correlation coefficient) → filter
  • consistency (min-features bias) → filter
  • classifier error rate (the classifiers themselves) → wrapper
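
As a concrete illustration (a minimal sketch, assuming scikit-learn and a synthetic dataset; the names below are illustrative and not taken from the cited papers), a filter ranks features by mutual information, a dependency measure computed without consulting any classifier, while a wrapper scores candidate subsets directly by a classifier's cross-validated accuracy:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy data: 20 features, only a few of them informative.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=4, random_state=0)

# --- Filter: rank features by a dependency measure (mutual information),
# independently of any classifier.
mi_scores = mutual_info_classif(X, y, random_state=0)
filter_top5 = np.argsort(mi_scores)[::-1][:5]
print("filter (mutual information) picks:", sorted(filter_top5))

# --- Wrapper: greedy forward selection scored by classifier error rate
# (equivalently, by cross-validated accuracy of a logistic regression).
def forward_selection(X, y, k):
    selected = []
    remaining = list(range(X.shape[1]))
    clf = LogisticRegression(max_iter=1000)
    for _ in range(k):
        best_feat, best_acc = None, -np.inf
        for f in remaining:
            acc = cross_val_score(clf, X[:, selected + [f]], y, cv=3).mean()
            if acc > best_acc:
                best_feat, best_acc = f, acc
        selected.append(best_feat)
        remaining.remove(best_feat)
    return selected

print("wrapper (forward selection) picks:", sorted(forward_selection(X, y, 5)))
```

The filter is cheap and classifier-agnostic; the wrapper is far more expensive because every candidate subset requires retraining, but it optimizes the quantity that is ultimately reported, the classifier's error rate.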

2. Machine Learning Methods of Feature Selection for Classification

  • online boosting (online boosting and vision)
  • decision tree / random forest (see the feature-importance sketch below)
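
As a rough sketch of the embedded flavour of this idea (assuming scikit-learn; this is a generic impurity-based importance ranking, not the specific method of any cited paper), a random forest yields feature importances as a by-product of training, which can then be used to keep only the top-ranked features:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=4, random_state=0)

# Embedded-style selection: the forest is both the learner and the scorer;
# impurity-based feature_importances_ fall out of training for free.
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
top5 = np.argsort(forest.feature_importances_)[::-1][:5]
print("random forest keeps features:", sorted(top5))
```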

3. Feature Selection for Retrieval/Match

  • maximum conditional entropy [3]
  • Conditional Mutual Information Maximization (CMIM) [4] (see the greedy selection sketch after this list)
  • binary features such as BRIEF, BRISK, ORB, D-BRIEF, etc.
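
To make the CMIM criterion of [4] concrete, here is a minimal numpy sketch of the naive greedy version (without the paper's lazy-evaluation speedup); the toy data and variable names are made up for illustration. At each step the next feature is the one whose information about the label, conditioned on any already-selected feature, is largest in the worst case:

```python
import numpy as np

def mutual_info(a, b):
    """I(a; b) in nats for small discrete (e.g. binary) arrays."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    for x, y in zip(a, b):
        joint[x, y] += 1
    joint /= joint.sum()
    pa, pb = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (pa @ pb)[nz])).sum())

def cond_mutual_info(a, b, c):
    """I(a; b | c): average the within-slice mutual information over values of c."""
    return sum((c == v).mean() * mutual_info(a[c == v], b[c == v])
               for v in np.unique(c))

def cmim(X, y, k):
    """Greedy Conditional Mutual Information Maximization over discrete features:
    keep the feature whose worst-case information about y, given any feature
    already selected, is largest."""
    n_feat = X.shape[1]
    score = np.array([mutual_info(X[:, f], y) for f in range(n_feat)])
    selected = []
    for _ in range(k):
        f = int(np.argmax(score))
        selected.append(f)
        score[f] = -np.inf
        for g in range(n_feat):
            if np.isfinite(score[g]):
                score[g] = min(score[g], cond_mutual_info(X[:, g], y, X[:, f]))
    return selected

# Tiny demo on random binary descriptors; the label depends on features 3 and 7.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 32))
y = ((X[:, 3] + X[:, 7]) >= 1).astype(int)
print(cmim(X, y, 4))
```

Because the criterion penalizes a candidate by its redundancy with every feature already chosen, it tends to pick descriptors that are individually informative yet complementary, which is why it works well for compact binary descriptors used in matching.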

 

4. Reference

[1] http://en.wikipedia.org/wiki/Feature_selection

[2] Feature Selection for Classification (google)

[3] Real-time Large Scale Near-duplicate Web Video Retrieval

[4] Fast Binary Feature Selection with Conditional Mutual Information
