Code to draw a decision tree:
from sklearn import tree
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
import pandas as pd

wine = load_wine()
wine.data.shape
wine.target
# View features and labels side by side
pd.concat([pd.DataFrame(wine.data), pd.DataFrame(wine.target)], axis=1)
wine.feature_names
wine.target_names

xtrain, xtest, ytrain, ytest = train_test_split(wine.data, wine.target, test_size=0.3)
clf = tree.DecisionTreeClassifier(criterion='entropy')
clf = clf.fit(xtrain, ytrain)
score = clf.score(xtest, ytest)  # accuracy on the test set
score
feature_name = ['alcohol', 'malic acid', 'ash', 'alcalinity of ash', 'magnesium', 'total phenols', 'flavanoids', 'nonflavanoid phenols', 'proanthocyanins', 'color intensity', 'hue', 'od280/od315 of diluted wines', 'proline']
import graphviz
dot_data = tree.export_graphviz(clf
                                , out_file=None
                                , feature_names=feature_name  # use the readable names defined above
                                , class_names=["Gin", "Sherry", "Vermouth"]
                                , filled=True
                                , rounded=True
                                )
graph = graphviz.Source(dot_data)
graph
CART with the Gini index:
Algorithm flow:
- For each feature, iterate over every candidate binary cut point and compute its Gini index
- Pick the feature and cut point with the smallest Gini index as the best split
- Split the samples once at that point
- Repeat the same procedure on each of the two resulting subsets, growing the tree downward
- Stop once every sample has been assigned to a leaf node
- End
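The split-selection step above can be sketched as follows (a minimal illustration, not sklearn's internal implementation; the helper names `gini` and `best_split` are made up here):

```python
import numpy as np

def gini(labels):
    # Gini impurity of one node: 1 - sum(p_k^2) over class proportions p_k
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    # x: one feature column, y: class labels.
    # Try every midpoint between consecutive sorted feature values as a
    # binary cut point and keep the one with the lowest weighted Gini index.
    best_gini, best_cut = float('inf'), None
    values = np.sort(np.unique(x))
    for left, right in zip(values[:-1], values[1:]):
        cut = (left + right) / 2
        mask = x <= cut
        w = mask.mean()  # fraction of samples going to the left child
        g = w * gini(y[mask]) + (1 - w) * gini(y[~mask])
        if g < best_gini:
            best_gini, best_cut = g, cut
    return best_cut, best_gini

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_split(x, y))  # → (6.5, 0.0): a cut at 6.5 separates the classes perfectly
```

A full CART implementation would run `best_split` over every feature, pick the overall minimum, and recurse on the two subsets.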
Pruning strategies:
Reduce model complexity and prevent overfitting.
Pre-pruning: before making a split, estimate whether the split would improve the model's generalization; if not, stop growing that branch.
Post-pruning: grow the complete tree first, then work from the bottom up, pruning a split node whenever doing so improves generalization.
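As a concrete sketch of both ideas with sklearn (one possible setup, not the only one): hyperparameters such as `max_depth` and `min_samples_leaf` act as pre-pruning controls, while `ccp_alpha` together with `cost_complexity_pruning_path` implements a form of post-pruning (cost-complexity pruning):

```python
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

wine = load_wine()
xtrain, xtest, ytrain, ytest = train_test_split(
    wine.data, wine.target, test_size=0.3, random_state=0)

# Pre-pruning: cap tree growth up front with hyperparameters.
pre = DecisionTreeClassifier(max_depth=3, min_samples_leaf=5, random_state=0)
pre.fit(xtrain, ytrain)

# Post-pruning: grow the full tree, enumerate the candidate alphas on the
# pruning path, then refit with a chosen ccp_alpha to cut back weak branches.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(xtrain, ytrain)
post = DecisionTreeClassifier(ccp_alpha=path.ccp_alphas[-2], random_state=0)
post.fit(xtrain, ytrain)

print(pre.score(xtest, ytest), post.score(xtest, ytest))
```

In practice the `ccp_alpha` value would be chosen by cross-validation over `path.ccp_alphas` rather than picked by index as above.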