Algorithm: Decision Tree, Entropy, Information Gain and Continuous Features

The decision tree is the foundation of the random forest.

A decision tree is a decision support tool that uses a tree-like model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility. It is one way to display an algorithm that only contains conditional control statements.

Based on historical samples and experience, we can build many different decision trees (models), so we need a method to evaluate the performance of a decision tree.
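
As a concrete illustration, here is a minimal sketch of building a tree from labeled samples and measuring its performance with held-out accuracy. It assumes scikit-learn and uses the bundled iris dataset purely as an example; none of these choices come from this post.

```python
# Minimal sketch (assumes scikit-learn): fit a decision tree on historical
# samples and evaluate it on held-out data.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)  # example labeled samples
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = DecisionTreeClassifier(criterion="entropy")  # split by information gain
model.fit(X_train, y_train)                          # build the tree (the model)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```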

Decision boundary of a decision tree

The structure of a decision tree is complex; learning it is a structured prediction problem.

NP-hard problem

A problem for which no polynomial-time algorithm is known is called NP-hard; finding the optimal decision tree is such a problem.

We therefore use approximate approaches, such as a greedy algorithm, to solve the problem.

Information Gain
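
Before comparing trees, recall the standard definitions this section relies on: the entropy of a sample set and the information gain of splitting it on a feature. The notation below is the usual one, not taken from this post.

```latex
% S is the sample set, p_i the class proportions in S,
% A a candidate feature and S_v the subset where A takes value v.
H(S) = -\sum_{i} p_i \log_2 p_i
\qquad
IG(S, A) = H(S) - \sum_{v \in \mathrm{values}(A)} \frac{|S_v|}{|S|}\, H(S_v)
```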

Type 1 of the decision tree

Type 2 of the decision tree

We change the order in which the two features are split to obtain a new decision tree.

If the dimension of the input data is large, the number of possible decision trees is very large!


We use a greedy algorithm to build the decision tree model.

The first step is to choose the root: the feature with the highest information gain, as sketched below.
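
Here is a minimal sketch of this greedy first step, assuming a tiny hand-made categorical dataset (the samples and feature meanings are hypothetical): compute each feature's information gain and pick the largest one as the root.

```python
# Minimal sketch of the greedy first step: score every feature by information
# gain on a toy categorical dataset and choose the best one as the root.
# The samples and feature meanings below are hypothetical, for illustration only.
from collections import Counter
import math

def entropy(labels):
    # H(S) = -sum_i p_i * log2(p_i) over the class proportions in `labels`.
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    # IG(S, A) = H(S) - sum_v |S_v|/|S| * H(S_v), where v ranges over the
    # values that the feature takes in `rows`.
    total = len(labels)
    subsets = {}
    for row, label in zip(rows, labels):
        subsets.setdefault(row[feature], []).append(label)
    remainder = sum(len(s) / total * entropy(s) for s in subsets.values())
    return entropy(labels) - remainder

# Hypothetical samples: feature 0 = outlook, feature 1 = windy; label = play?
rows = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"),
        ("rain", "yes"), ("overcast", "no")]
labels = ["no", "no", "yes", "no", "yes"]

gains = {i: information_gain(rows, labels, i) for i in range(len(rows[0]))}
root = max(gains, key=gains.get)
print("information gain per feature:", gains)
print("feature chosen as root:", root)
```

The same computation is then applied recursively: each child node repeats the search on its own subset of the samples.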
