Machine Learning, Part 2 (Binary Classification)

This post analyzes a binary classification dataset (the sonar data used below), applies traditional statistical methods for feature selection, and then builds models with logistic regression, LDA, KNN, CART, Naive Bayes, SVM and other machine learning algorithms, evaluating them with cross-validation. Predictive performance is further improved through parameter tuning and ensemble methods; a GBM is ultimately selected and tuned, giving good results.

# -*- coding: utf-8 -*-
"""
Created on Mon Aug  6 20:37:19 2018

@author: wangxihe
"""
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats
from statsmodels.formula.api import ols
import statsmodels.api as sm
import statsmodels.formula.api as smf
import os
os.chdir(r'E:\spyderwork\wxh\数据科学\二分类问题')
columns=['A0','A1','A2','A3','A4','A5','A6','A7','A8','A9','A10','A11','A12','A13','A14','A15',
         'A16','A17','A18','A19','A20','A21','A22','A23','A24','A25','A26','A27','A28','A29',
         'A30','A31','A32','A33','A34','A35','A36','A37','A38','A39','A40','A41','A42','A43',
         'A44','A45','A46','A47','A48','A49','A50','A51','A52','A53','A54','A55',
         'A56','A57','A58','A59','Y']
sonar=pd.read_csv('sonar.all-data.csv',names=columns,header=None)
sonar.shape
sonar.dtypes
sonar['Y'].value_counts()  # the two classes are roughly balanced
# Recode the target: R=0, M=1
sonar['Y'].replace({'R':0},inplace=True)
sonar['Y'].replace({'M':1},inplace=True)

# Class distribution of the target
sonar['Y'].value_counts().plot(kind='bar')
#%% Feature selection with traditional statistical methods
# All predictors are continuous, so screen them with two-sample t-tests against the binary target
columned=[]
X=sonar.copy()
for ct in X.columns:
    if ct!='Y':
      TT0=X[X['Y']==0][ct]
      TT1=X[X['Y']==1][ct]
      # Levene's test for homogeneity of variance (center='median' is robust to non-normality)
      leveneTest=stats.levene(TT0,TT1,center='median')
#      print('w-value=%6.4f, p-value=%6.4f' %leveneTest)
      _,fp_value=leveneTest
      if fp_value<0.05:
          Flag=False   # variances differ: use Welch's t-test
      else:
          Flag=True    # equal variances assumed

      _,p_value=stats.ttest_ind(TT0,TT1,equal_var=Flag)
      if p_value<0.05:
         columned.append(ct) 
         print('p-value=%6.4f' %p_value)
len(columned)  # the t-tests suggest keeping 34 variables
## Next, screen for multicollinearity
#%% Collinearity: variance inflation factor (VIF)
# VIF of col_i: regress it on the remaining columns, VIF = 1 / (1 - R^2)
def vif(df, col_i):
    cols = list(df.columns)
    cols.remove(col_i)
    cols_noti = cols
    formula = col_i + '~' + '+'.join(cols_noti)
    r2 = ols(formula, df).fit().rsquared
    return 1. / (1. - r2)

#%%
exog = X[columned].copy()

for i in exog.columns:
    print(i, '\t', vif(df=exog, col_i=i))
   
# Drop columns flagged with high VIF (picked from the printout above)
exog.drop(['A19'],axis=1,inplace=True)
exog.drop(['A45'],axis=1,inplace=True)
exog.drop(['A35'],axis=1,inplace=True)
exog.drop(['A10'],axis=1,inplace=True)
exog.drop(['A47'],axis=1,inplace=True)
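#%% Optional: automate the VIF-based elimination (sketch)
# The five drops above were picked by hand from the printed VIF values.
# The helper below is a minimal sketch, not taken from the original post:
# it repeats the same idea automatically, dropping the column with the
# largest VIF until every remaining VIF is at or below a threshold.
# The function name and the threshold of 10 (a common rule of thumb)
# are illustrative assumptions.
def drop_high_vif(df, threshold=10.0):
    kept = df.copy()
    while True:
        vifs = {col: vif(kept, col) for col in kept.columns}
        worst_col, worst_vif = max(vifs.items(), key=lambda kv: kv[1])
        if worst_vif <= threshold:
            return kept
        print('dropping %s (VIF=%.1f)' % (worst_col, worst_vif))
        kept = kept.drop(columns=[worst_col])

#exog_auto = drop_high_vif(X[columned].copy(), threshold=10.0)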
#%%
# Forward selection of logistic-regression terms by AIC
def forward_select(data, response):
    remaining = set(data.columns)
    remaining.remove(response)
    selected = []
    current_score, best_new_score = float('inf'), float('inf')
    while remaining:
        aic_with_candidates=[]
        for candidate in remaining:
            formula = "{} ~ {}".format(
                response,' + '.join(selected + [candidate]))
            aic = smf.glm(
                formula=formula, data=data,
                family=sm.families.Binomial()  # logit is the default link
            ).fit().aic
            aic_with_candidates.append((aic, candidate))
        aic_with_candidates.sort(reverse=True)
        best_new_score, best_candidate=aic_with_candidates.pop()
        if current_score > best_new_score:
            remaining.remove(best_candidate)
            selected.append(best_candidate)
            current_score = best_new_score
            print('aic is {}, continuing!'.format(current_score))
        else:
            # no remaining candidate lowers the AIC: stop searching
            break
    # refit the final model on the selected variables and return it
    formula = "{} ~ {}".format(response, ' + '.join(selected))
    print('final formula is {}'.format(formula))
    model = smf.glm(formula=formula, data=data,
                    family=sm.families.Binomial()).fit()
    return model
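#%% Model spot-check and GBM tuning (sketch)
# The code excerpt ends with the forward-selection helper. A hypothetical call
# (the frame name is illustrative) would combine the VIF-filtered predictors
# with the target:
#candidate_data = exog.copy()
#candidate_data['Y'] = X['Y']
#lg_model = forward_select(data=candidate_data, response='Y')
#
# The abstract says the post goes on to fit logistic regression, LDA, KNN,
# CART, NB and SVM, compare them with cross-validation, and finally tune a
# GBM. The block below is a minimal scikit-learn sketch of that stage, not
# the original code: the 10-fold CV setup, the parameter grid and
# random_state=7 are assumptions.
from sklearn.model_selection import StratifiedKFold, cross_val_score, GridSearchCV
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.ensemble import GradientBoostingClassifier

Xmat = exog.values                  # predictors kept after the VIF screening
y = X['Y'].astype(int).values       # 0 = rock (R), 1 = mine (M)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=7)

models = {'LR': LogisticRegression(max_iter=1000),
          'LDA': LinearDiscriminantAnalysis(),
          'KNN': KNeighborsClassifier(),
          'CART': DecisionTreeClassifier(random_state=7),
          'NB': GaussianNB(),
          'SVM': SVC()}
for name, model in models.items():
    scores = cross_val_score(model, Xmat, y, cv=cv, scoring='accuracy')
    print('%s: %.3f (%.3f)' % (name, scores.mean(), scores.std()))

# Tune the GBM over a small illustrative grid
param_grid = {'n_estimators': [100, 200, 400],
              'learning_rate': [0.05, 0.1],
              'max_depth': [2, 3]}
grid = GridSearchCV(GradientBoostingClassifier(random_state=7),
                    param_grid, cv=cv, scoring='accuracy')
grid.fit(Xmat, y)
print('best GBM params:', grid.best_params_)
print('best CV accuracy: %.3f' % grid.best_score_)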
