Feature Screening with RFE (Recursive Feature Elimination) - Wrapper Methods - Feature Selection - Dimensionality Reduction

RFE (Recursive Feature Elimination) trains a base model (here, logistic regression) over multiple rounds. After each round, the features with the smallest weight coefficients are eliminated, and the next round is trained on the remaining feature set.

RFE proceeds as follows:
1. Start with all available features as the initial feature set.
2. Fit a model on the current feature set and compute each feature's importance.
3. Remove the least important feature (or features) and update the feature set.
4. Go back to step 2 until every feature has been ranked, then keep the desired number of features.
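The loop above can be sketched directly in a few lines, using the mean absolute value of the logistic-regression coefficients as the importance score (a minimal illustration of the idea, not scikit-learn's internal implementation):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
remaining = list(range(X.shape[1]))  # step 1: start with all features
n_features_to_select = 2

while len(remaining) > n_features_to_select:
    # step 2: fit the base model on the current feature set
    model = LogisticRegression(max_iter=500).fit(X[:, remaining], y)
    # importance score: mean absolute coefficient across the classes
    importance = np.abs(model.coef_).mean(axis=0)
    # step 3: drop the least important feature and update the set
    remaining.pop(int(np.argmin(importance)))
    # step 4: repeat until the desired number of features remains

print("selected feature indices:", remaining)
```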

from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import load_iris

iris = load_iris()
print("iris.data:\n", iris.data)
print("iris.target:\n", iris.target)



# multi_class: 'ovr' or 'multinomial'.
#   For binary problems the two are nearly equivalent.
#   'ovr' fits one binary classifier per class (one-vs-rest);
#   'multinomial' fits a single softmax model over all classes jointly.

# solver: optimization algorithm.
#   One of newton-cg, lbfgs, liblinear, sag, saga (default lbfgs since
#   scikit-learn 0.22; liblinear before that). It determines how the
#   logistic-regression loss function is minimized.

# max_iter: maximum number of iterations for the solver to converge.
#   Int, default 100.

# Recursive feature elimination: returns the data restricted to the selected features.
# estimator: the base model.
# n_features_to_select: the number of features to keep.

data_rfe = RFE(estimator=LogisticRegression(solver='lbfgs', max_iter=500),
               n_features_to_select=2).fit_transform(iris.data, iris.target)

print("data_rfe:\n", data_rfe)

 
Output:

iris.data:
[[5.1 3.5 1.4 0.2]
[4.9 3. 1.4 0.2]
[4.7 3.2 1.3 0.2]
[4.6 3.1 1.5 0.2]
[5. 3.6 1.4 0.2]
[5.4 3.9 1.7 0.4]
[4.6 3.4 1.4 0.3]
[5. 3.4 1.5 0.2]
[4.4 2.9 1.4 0.2]
[4.9 3.1 1.5 0.1]
[5.4 3.7 1.5 0.2]
[4.8 3.4 1.6 0.2]
[4.8 3. 1.4 0.1]
[4.3 3. 1.1 0.1]
[5.8 4. 1.2 0.2]
[5.7 4.4 1.5 0.4]
[5.4 3.9 1.3 0.4]
[5.1 3.5 1.4 0.3]
[5.7 3.8 1.7 0.3]
[5.1 3.8 1.5 0.3]
[5.4 3.4 1.7 0.2]
[5.1 3.7 1.5 0.4]
[4.6 3.6 1. 0.2]
[5.1 3.3 1.7 0.5]
[4.8 3.4 1.9 0.2]
[5. 3. 1.6 0.2]
[5. 3.4 1.6 0.4]
[5.2 3.5 1.5 0.2]
[5.2 3.4 1.4 0.2]
[4.7 3.2 1.6 0.2]
[4.8 3.1 1.6 0.2]
[5.4 3.4 1.5 0.4]
[5.2 4.1 1.5 0.1]
[5.5 4.2 1.4 0.2]
[4.9 3.1 1.5 0.2]
[5. 3.2 1.2 0.2]
[5.5 3.5 1.3 0.2]
[4.9 3.6 1.4 0.1]
[4.4 3. 1.3 0.2]
[5.1 3.4 1.5 0.2]
[5. 3.5 1.3 0.3]
[4.5 2.3 1.3 0.3]
[4.4 3.2 1.3 0.2]
[5. 3.5 1.6 0.6]
[5.1 3.8 1.9 0.4]
[4.8 3. 1.4 0.3]
[5.1 3.8 1.6 0.2]
[4.6 3.2 1.4 0.2]
[5.3 3.7 1.5 0.2]
[5. 3.3 1.4 0.2]
[7. 3.2 4.7 1.4]
[6.4 3.2 4.5 1.5]
[6.9 3.1 4.9 1.5]
[5.5 2.3 4. 1.3]
[6.5 2.8 4.6 1.5]
[5.7 2.8 4.5 1.3]
[6.3 3.3 4.7 1.6]
[4.9 2.4 3.3 1. ]
[6.6 2.9 4.6 1.3]
[5.2 2.7 3.9 1.4]
[5. 2. 3.5 1. ]
[5.9 3. 4.2 1.5]
[6. 2.2 4. 1. ]
[6.1 2.9 4.7 1.4]
[5.6 2.9 3.6 1.3]
[6.7 3.1 4.4 1.4]
[5.6 3. 4.5 1.5]
[5.8 2.7 4.1 1. ]
[6.2 2.2 4.5 1.5]
[5.6 2.5 3.9 1.1]
[5.9 3.2 4.8 1.8]
[6.1 2.8 4. 1.3]
[6.3 2.5 4.9 1.5]
[6.1 2.8 4.7 1.2]
[6.4 2.9 4.3 1.3]
[6.6 3. 4.4 1.4]
[6.8 2.8 4.8 1.4]
[6.7 3. 5. 1.7]
[6. 2.9 4.5 1.5]
[5.7 2.6 3.5 1. ]
[5.5 2.4 3.8 1.1]
[5.5 2.4 3.7 1. ]
[5.8 2.7 3.9 1.2]
[6. 2.7 5.1 1.6]
[5.4 3. 4.5 1.5]
[6. 3.4 4.5 1.6]
[6.7 3.1 4.7 1.5]
[6.3 2.3 4.4 1.3]
[5.6 3. 4.1 1.3]
[5.5 2.5 4. 1.3]
[5.5 2.6 4.4 1.2]
[6.1 3. 4.6 1.4]
[5.8 2.6 4. 1.2]
[5. 2.3 3.3 1. ]
[5.6 2.7 4.2 1.3]
[5.7 3. 4.2 1.2]
[5.7 2.9 4.2 1.3]
[6.2 2.9 4.3 1.3]
[5.1 2.5 3. 1.1]
[5.7 2.8 4.1 1.3]
[6.3 3.3 6. 2.5]
[5.8 2.7 5.1 1.9]
[7.1 3. 5.9 2.1]
[6.3 2.9 5.6 1.8]
[6.5 3. 5.8 2.2]
[7.6 3. 6.6 2.1]
[4.9 2.5 4.5 1.7]
[7.3 2.9 6.3 1.8]
[6.7 2.5 5.8 1.8]
[7.2 3.6 6.1 2.5]
[6.5 3.2 5.1 2. ]
[6.4 2.7 5.3 1.9]
[6.8 3. 5.5 2.1]
[5.7 2.5 5. 2. ]
[5.8 2.8 5.1 2.4]
[6.4 3.2 5.3 2.3]
[6.5 3. 5.5 1.8]
[7.7 3.8 6.7 2.2]
[7.7 2.6 6.9 2.3]
[6. 2.2 5. 1.5]
[6.9 3.2 5.7 2.3]
[5.6 2.8 4.9 2. ]
[7.7 2.8 6.7 2. ]
[6.3 2.7 4.9 1.8]
[6.7 3.3 5.7 2.1]
[7.2 3.2 6. 1.8]
[6.2 2.8 4.8 1.8]
[6.1 3. 4.9 1.8]
[6.4 2.8 5.6 2.1]
[7.2 3. 5.8 1.6]
[7.4 2.8 6.1 1.9]
[7.9 3.8 6.4 2. ]
[6.4 2.8 5.6 2.2]
[6.3 2.8 5.1 1.5]
[6.1 2.6 5.6 1.4]
[7.7 3. 6.1 2.3]
[6.3 3.4 5.6 2.4]
[6.4 3.1 5.5 1.8]
[6. 3. 4.8 1.8]
[6.9 3.1 5.4 2.1]
[6.7 3.1 5.6 2.4]
[6.9 3.1 5.1 2.3]
[5.8 2.7 5.1 1.9]
[6.8 3.2 5.9 2.3]
[6.7 3.3 5.7 2.5]
[6.7 3. 5.2 2.3]
[6.3 2.5 5. 1.9]
[6.5 3. 5.2 2. ]
[6.2 3.4 5.4 2.3]
[5.9 3. 5.1 1.8]]
iris.target:
[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2
2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
2 2]
data_rfe:
[[1.4 0.2]
[1.4 0.2]
[1.3 0.2]
[1.5 0.2]
[1.4 0.2]
[1.7 0.4]
[1.4 0.3]
[1.5 0.2]
[1.4 0.2]
[1.5 0.1]
[1.5 0.2]
[1.6 0.2]
[1.4 0.1]
[1.1 0.1]
[1.2 0.2]
[1.5 0.4]
[1.3 0.4]
[1.4 0.3]
[1.7 0.3]
[1.5 0.3]
[1.7 0.2]
[1.5 0.4]
[1. 0.2]
[1.7 0.5]
[1.9 0.2]
[1.6 0.2]
[1.6 0.4]
[1.5 0.2]
[1.4 0.2]
[1.6 0.2]
[1.6 0.2]
[1.5 0.4]
[1.5 0.1]
[1.4 0.2]
[1.5 0.2]
[1.2 0.2]
[1.3 0.2]
[1.4 0.1]
[1.3 0.2]
[1.5 0.2]
[1.3 0.3]
[1.3 0.3]
[1.3 0.2]
[1.6 0.6]
[1.9 0.4]
[1.4 0.3]
[1.6 0.2]
[1.4 0.2]
[1.5 0.2]
[1.4 0.2]
[4.7 1.4]
[4.5 1.5]
[4.9 1.5]
[4. 1.3]
[4.6 1.5]
[4.5 1.3]
[4.7 1.6]
[3.3 1. ]
[4.6 1.3]
[3.9 1.4]
[3.5 1. ]
[4.2 1.5]
[4. 1. ]
[4.7 1.4]
[3.6 1.3]
[4.4 1.4]
[4.5 1.5]
[4.1 1. ]
[4.5 1.5]
[3.9 1.1]
[4.8 1.8]
[4. 1.3]
[4.9 1.5]
[4.7 1.2]
[4.3 1.3]
[4.4 1.4]
[4.8 1.4]
[5. 1.7]
[4.5 1.5]
[3.5 1. ]
[3.8 1.1]
[3.7 1. ]
[3.9 1.2]
[5.1 1.6]
[4.5 1.5]
[4.5 1.6]
[4.7 1.5]
[4.4 1.3]
[4.1 1.3]
[4. 1.3]
[4.4 1.2]
[4.6 1.4]
[4. 1.2]
[3.3 1. ]
[4.2 1.3]
[4.2 1.2]
[4.2 1.3]
[4.3 1.3]
[3. 1.1]
[4.1 1.3]
[6. 2.5]
[5.1 1.9]
[5.9 2.1]
[5.6 1.8]
[5.8 2.2]
[6.6 2.1]
[4.5 1.7]
[6.3 1.8]
[5.8 1.8]
[6.1 2.5]
[5.1 2. ]
[5.3 1.9]
[5.5 2.1]
[5. 2. ]
[5.1 2.4]
[5.3 2.3]
[5.5 1.8]
[6.7 2.2]
[6.9 2.3]
[5. 1.5]
[5.7 2.3]
[4.9 2. ]
[6.7 2. ]
[4.9 1.8]
[5.7 2.1]
[6. 1.8]
[4.8 1.8]
[4.9 1.8]
[5.6 2.1]
[5.8 1.6]
[6.1 1.9]
[6.4 2. ]
[5.6 2.2]
[5.1 1.5]
[5.6 1.4]
[6.1 2.3]
[5.6 2.4]
[5.5 1.8]
[4.8 1.8]
[5.4 2.1]
[5.6 2.4]
[5.1 2.3]
[5.1 1.9]
[5.9 2.3]
[5.7 2.5]
[5.2 2.3]
[5. 1.9]
[5.2 2. ]
[5.4 2.3]
[5.1 1.8]]

Process finished with exit code 0
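fit_transform returns only the transformed array; to see which of the four iris columns were actually kept, fit the selector and inspect its support_ and ranking_ attributes (standard RFE attributes; the feature names come from the iris dataset bundled with scikit-learn):

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

iris = load_iris()
selector = RFE(estimator=LogisticRegression(solver='lbfgs', max_iter=500),
               n_features_to_select=2).fit(iris.data, iris.target)

# support_ is a boolean mask over the columns: True = feature was kept.
# ranking_ assigns 1 to every selected feature; higher numbers mean the
# feature was eliminated earlier.
for name, keep, rank in zip(iris.feature_names, selector.support_, selector.ranking_):
    print(f"{name}: selected={keep}, rank={rank}")
```

If the number of features to keep is not known in advance, scikit-learn also provides RFECV, which runs RFE inside cross-validation and picks the feature count that scores best.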
