Generating an Information-Entropy Decision Tree in Python

✨ In plain terms: the closer a node is to the root of an information-entropy decision tree, the lower the entropy (uncertainty) its split should leave behind. ✨

You have probably heard of entropy in high school: it measures how disordered something is.

So the less disordered (i.e., the more discriminative) a feature is, the more it deserves to be the root node.

The textbook one-sentence summary: choosing feature X as the root node reduces the uncertainty of the class Y.

Example: among ID number, height, and gender, which should become the root node?

Answer: the ID number, because an ID number identifies the person directly, while height and gender do not.


The entropy formula:

H(D) = -\sum_{i=1}^{n} p_i \log_2 p_i

We compute the information gain of each feature, Gain(D, a) = H(D) - \sum_{v} \frac{|D^v|}{|D|} H(D^v), and the feature with the largest gain becomes the parent (root) node.
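To make the formulas concrete, here is a minimal, self-contained sketch (independent of the main program below; the toy genders/classes lists are made up for illustration) that computes the class entropy H(D) and the information gain of one discrete feature:

import math
from collections import Counter


def entropy(labels):
    # H(D) = -sum(p_i * log2(p_i)) over the class distribution
    total = len(labels)
    return -sum((n / total) * math.log2(n / total) for n in Counter(labels).values())


def information_gain(feature_values, labels):
    # Gain(D, a) = H(D) - sum_v (|D^v| / |D|) * H(D^v)
    total = len(labels)
    remainder = 0.0
    for v in set(feature_values):
        subset = [c for f, c in zip(feature_values, labels) if f == v]
        remainder += (len(subset) / total) * entropy(subset)
    return entropy(labels) - remainder


genders = ['M', 'M', 'F', 'F', 'F', 'M']
classes = ['yes', 'no', 'yes', 'yes', 'no', 'no']
print(entropy(classes))                    # 1.0: the classes are perfectly mixed
print(information_gain(genders, classes))  # ~0.082: gender barely reduces the uncertainty

A feature like an ID number, whose every value is unique, drives each subset's entropy to 0 and therefore achieves the maximum possible gain, which is exactly why it wins the root in the example above.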

Python implementation:

1. Concrete steps

(1) Load the samples, with their different feature values, into an array.

(2) Use the formulas above to compute each feature's information gain; the larger the gain, the higher the feature sits in the tree.

(3) When all the information has been analyzed, stop: the decision tree is complete.

2. The Python code

⚠️ The code below treats numeric attributes as continuous (iscon = 0) and non-numeric ones as discrete (iscon = 1). To force discrete handling of a column, change

AttributeArray[i].iscon = 0 to AttributeArray[i].iscon = 1

at the point in read_file where iscon is set while the attribute values are scanned; 0 stands for continuous and 1 for discrete.
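For intuition on the continuous case: the program binarizes a continuous attribute by trying candidate thresholds at the midpoints between consecutive sorted values and keeping the one with the best gain (this is what the Ta list in find_attribute holds). A minimal sketch of the candidate generation, with a made-up values list:

def candidate_thresholds(values):
    # midpoints between consecutive distinct sorted values of a continuous attribute
    vals = sorted(set(float(v) for v in values))
    return [(a + b) / 2 for a, b in zip(vals, vals[1:])]


print(candidate_thresholds(['6.414', '6.313', '6.121']))  # -> [6.217, 6.3635] (up to float rounding)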

import math


class Attribute():
    def __init__(self, name, id, iscon=0):
        self.name = name
        self.kids = []
        self.id = id
        self.iscon = iscon  # 1: discrete (categorical), 0: continuous (numeric)


# Count how many samples take each value ("kid") of the attribute at position index; T is the split threshold for continuous attributes.
def count_sample(SampleArray, index, iscon, T=0):
    attribute = {}
    if len(SampleArray) == 0:
        return -1  # Sample is NULL
    if iscon == 1:
        for sample in SampleArray:
            samples = sample.split(',')
            if samples[index] not in attribute:
                attribute[samples[index]] = 1
            else:
                attribute[samples[index]] += 1
    else:
        for sample in SampleArray:
            samples = sample.split(',')
            if float(samples[index]) <= T:
                if 'less' not in attribute:
                    attribute['less'] = 1
                else:
                    attribute['less'] += 1
            else:
                if 'more' not in attribute:
                    attribute['more'] = 1
                else:
                    attribute['more'] += 1
    return attribute


# Count, for each value of the attribute, how many samples fall in each class.
def count_attribute(SampleArray, index, iscon, T=0):
    attribute = {}
    if len(SampleArray) == 0:
        return -1  # Sample is NULL
    if iscon == 1:  # discrete
        for sample in SampleArray:
            samples = sample.split(',')
            if str(samples[index] + samples[-1]) not in attribute:
                attribute[samples[index] + samples[-1]] = 1
            else:
                attribute[samples[index] + samples[-1]] += 1
    else:  # continuous
        for sample in SampleArray:
            samples = sample.split(',')
            if float(samples[index]) <= T:
                # bug fix: use samples[-1] (the class value), not sample[-1] (the last character)
                if ('less' + samples[-1]) not in attribute:
                    attribute['less' + samples[-1]] = 1
                else:
                    attribute['less' + samples[-1]] += 1
            else:
                if ('more' + samples[-1]) not in attribute:
                    attribute['more' + samples[-1]] = 1
                else:
                    attribute['more' + samples[-1]] += 1
    return attribute


def read_file(file_name, SampleArray, AttributeArray):
    with open(file_name, 'r', encoding="utf-8") as f:
        contents = f.readline()
        flag = 0
        index = -1
        if "编号" in contents:
            flag = 1
            index = contents.find(',')
            attributes = contents[index + 1:].split(',')
        else:
            attributes = contents.split(',')  # remove the last word in txt. '\n'
        id = 0
        for a in attributes:
            att = Attribute(a, id)
            id += 1
            AttributeArray.append(att)  # rocord the attribute
        per_att = []
        for contents in f:
            if flag == 1:
                index = contents.find(',')
                per_att = contents[index + 1:-1].split(',')
            else:
                per_att = contents[:-1].split(',')
            for i in range(len(AttributeArray)):
                if per_att[i] not in AttributeArray[i].kids:
                    AttributeArray[i].kids.append(per_att[i])
                    if per_att[i].replace('.', '', 1).isdigit():  # numeric value
                        AttributeArray[i].iscon = 0  # continuous
                    else:
                        AttributeArray[i].iscon = 1  # discrete
            SampleArray.append(contents[index + 1:].replace('\n', ''))
    del AttributeArray[-1].kids[-1]  # drop the last entry in the class's kids (guards against a stray value from the file's final line)
    max_mark = count_sample(SampleArray, -1, 1)
    max_class = max(max_mark, key=max_mark.get)  # find the max number of the classes
    return max_class


# find the best attribute for the node
def find_attribute(SampleArray, AttributeArray):
    entropy_D = 0
    entropy_Dv = 0
    entropy_Dv_total = 0
    max_index = 0
    max_gain = 0
    den = 0
    gains = []
    max_con_middle = 0  # best threshold found among the continuous attributes
    max_con_gain = 0
    classes = count_sample(SampleArray, -1, 1)
    total_nums = sum(classes.values())
    for value in classes.values():
        p = value / total_nums
        entropy_D += p * math.log(p, 2)
    entropy_D = -(entropy_D)

    for index in range(len(AttributeArray) - 1):  # skip the last attribute: it is the class label
        if AttributeArray[index].iscon == 1:  # discrete
            total_kids = count_sample(SampleArray, index, 1)
            per_kid = count_attribute(SampleArray, index, 1)
            for kid in AttributeArray[index].kids:
                for j in AttributeArray[-1].kids:
                    if str(kid + j) not in per_kid.keys():
                        continue  # this kid has no samples of class j
                    num = per_kid[str(kid + j)]
                    den = total_kids[kid]
                    p = num / den
                    entropy_Dv += p * math.log(p, 2)
                entropy_Dv_total += (den / total_nums) * (entropy_Dv)
                entropy_Dv = 0
            gain = entropy_D + entropy_Dv_total
            entropy_Dv_total = 0
            gains.append(gain)

        elif AttributeArray[index].iscon == 0:  # continuous
            Ta = []
            AttributeArray[index].kids.sort(key=float)  # numeric sort: a plain string sort would put '10' before '9'
            for i in range(len(AttributeArray[index].kids) - 1):
                Ta.append((float(AttributeArray[index].kids[i]) + float(AttributeArray[index].kids[i + 1])) / 2)
            for t in Ta:
                total_kids = count_sample(SampleArray, index, 0, t)
                per_kid = count_attribute(SampleArray, index, 0, t)

                for j in AttributeArray[-1].kids:
                    if str('less' + j) not in per_kid.keys():
                        continue
                    num = per_kid['less' + j]
                    den = total_kids['less']
                    p = num / den
                    entropy_Dv += p * math.log(p, 2)
                entropy_Dv_total += (den / total_nums) * (entropy_Dv)
                entropy_Dv = 0
                for j in AttributeArray[-1].kids:
                    if str('more' + j) not in per_kid.keys():
                        continue
                    num = per_kid['more' + j]
                    den = total_kids['more']
                    p = num / den
                    entropy_Dv += p * math.log(p, 2)
                entropy_Dv_total += (den / total_nums) * (entropy_Dv)
                entropy_Dv = 0
                con_gain = entropy_D + entropy_Dv_total
                entropy_Dv_total = 0
                if con_gain > max_con_gain:  # tracked across ALL continuous attributes, so max_con_middle matches the winner
                    max_con_gain = con_gain
                    max_con_middle = t
            gain = max_con_gain
            gains.append(gain)

        if gain > max_gain:
            max_gain = gain
            max_index = index
    return max_index, max_con_middle  # index of the best attribute, plus its threshold (used only if it is continuous)


treenode = []


# per tree node: [father_name, father_index, own_index, branch_value, result, leaf_flag]
def tree_generate(SampleArray, AttributeArray, father, father_index, pass_kid, max_class):
    treenode.append([])  # create a new tree node
    index = len(treenode) - 1
    treenode[index].append(father)  # record the father of the node
    treenode[index].append(father_index)
    treenode[index].append(index)
    treenode[index].append(pass_kid)
    '''case 1: judge whether there is only one class in SampleArray'''
    count = count_sample(SampleArray, -1, 1)
    if len(count) == 1:
        treenode[index].append(max_class)
        treenode[index].append(1)
        return

    '''case 2: AttributeArray is NULL or all the samples have the same attributes.'''
    i = 0
    for i in range(len(AttributeArray) - 1):
        if len(count_sample(SampleArray, i, 1)) != 1:
            break
    if i == (len(AttributeArray) - 1) or len(AttributeArray) == 1:  # class should not be included.
        treenode[index].append(max_class)
        treenode[index].append(1)  # leaf
        return
    treenode[index].append(0)  # no result
    treenode[index].append(0)  # not the leaf

    '''case 3: find the best attribute.'''
    best_index, best_middle = find_attribute(SampleArray, AttributeArray)
    kid_SampleArray = []
    new_index = 0
    # prepare to create the kid tree
    if AttributeArray[best_index].iscon == 1:
        for kid in AttributeArray[best_index].kids:
            kid_SampleArray = []  # fresh subset for each branch
            for sample in SampleArray:
                samples = sample.split(',')
                if samples[best_index] == kid:
                    kid_SampleArray.append(sample.replace(kid + ',', ''))
            if len(kid_SampleArray) == 0:
                # empty branch: make a leaf labelled with the parent's majority class
                treenode.append([])
                new_index = len(treenode) - 1
                treenode[new_index].append(AttributeArray[best_index].name)  # record the father of the node
                treenode[new_index].append(index)
                treenode[new_index].append(new_index)
                treenode[new_index].append(kid)
                treenode[new_index].append(max_class)
                treenode[new_index].append(1)  # leaf
                continue  # bug fix: keep building the remaining branches instead of returning
            else:
                kid_AttributeArray = list(AttributeArray)
                del kid_AttributeArray[best_index]
                max_class = count_sample(kid_SampleArray, -1, 1)
                max_class = max(max_class, key=max_class.get)
                tree_generate(kid_SampleArray, kid_AttributeArray, AttributeArray[best_index].name, index, kid,
                              max_class)
    else:
        kid_less_SampleArray = []
        kid_more_SampleArray = []
        for sample in SampleArray:
            samples = sample.split(',')
            if float(samples[best_index]) <= best_middle:
                kid_less_SampleArray.append(sample.replace(samples[best_index] + ',', ''))
            else:
                kid_more_SampleArray.append(sample.replace(samples[best_index] + ',', ''))
        if len(kid_less_SampleArray) == 0:
            treenode.append([])  # create a new tree node
            new_index = len(treenode) - 1
            treenode[new_index].append(AttributeArray[best_index].name)  # record the father of the node
            treenode[new_index].append(index)
            treenode[new_index].append(new_index)
            treenode[new_index].append("<=" + str(best_middle))
            treenode[new_index].append(max_class)
            treenode[new_index].append(1)  # leaf
            # no return here (bug fix): the '>' branch below must still be generated
        else:
            kid_AttributeArray = list(AttributeArray)
            del kid_AttributeArray[best_index]
            max_less_class = count_sample(kid_less_SampleArray, -1, 1)
            max_less_class = max(max_less_class, key=max_less_class.get)
            tree_generate(kid_less_SampleArray, kid_AttributeArray, AttributeArray[best_index].name, index,
                          "<=" + str(best_middle), max_less_class)
        if len(kid_more_SampleArray) == 0:
            treenode.append([])  # create a new tree node
            new_index = len(treenode) - 1
            treenode[new_index].append(AttributeArray[best_index].name)  # record the father of the node
            treenode[new_index].append(index)
            treenode[new_index].append(new_index)
            treenode[new_index].append(">" + str(best_middle))
            treenode[new_index].append(max_class)
            treenode[new_index].append(1)  # leaf
            return
        else:
            kid_AttributeArray = list(AttributeArray)
            del kid_AttributeArray[best_index]
            max_more_class = count_sample(kid_more_SampleArray, -1, 1)
            max_more_class = max(max_more_class, key=max_more_class.get)
            tree_generate(kid_more_SampleArray, kid_AttributeArray, AttributeArray[best_index].name, index,
                          ">" + str(best_middle), max_more_class)


def main():
    AttributeArray = []  # record attributes
    SampleArray = []  # record samples
    max_class = read_file('data.txt', SampleArray, AttributeArray)
    tree_generate(SampleArray, AttributeArray, -1, -1, -1, max_class)
    print(treenode[1:])  # skip the root record at index 0


if __name__ == '__main__':
    main()

The program reads its data from data.txt.

3. The generated decision tree:

[['Temperature', 0, 1, '<=10.094999999999999', 0, 0], ['Humidity', 1, 2, '<=52.855000000000004', '20870.88608', 1], ['Temperature', 0, 3, '>10.094999999999999', 0, 0], ['Humidity', 3, 4, '<=52.855000000000004', '30154.93671', 1], ['Humidity', 3, 5, '>52.855000000000004', 0, 0], ['Wind Speed', 5, 6, '<=0.0715', '31916.96203', 1], ['Wind Speed', 5, 7, '>0.0715', 0, 0], ['general diffuse flows', 7, 8, '<=0.0235', '23884.55696', 1], ['general diffuse flows', 7, 9, '>0.0235', 0, 0], ['diffuse flows', 9, 10, '<=0.065', '35191.89873', 1], ['diffuse flows', 9, 11, '>0.065', '19989.87342', 1]]

Format of the result:

Each tree node: [parent attribute, parent node's index, this node's index, branch value, decision result, leaf flag]

(1) A decision result of 0 means the decision is not yet finished at this node.

(2) A leaf flag of 1 means the decision is finished; 0 means it is not.

(3) The parent/child indices record which node of the decision tree each entry is.
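Because the tree is stored as this flat list, a small helper can walk it to classify a new sample. A minimal sketch, assuming the node layout above (classify and the sample dict are illustrative, not part of the program):

def classify(sample, nodes):
    # sample maps attribute name -> value; nodes is the full treenode list
    current = 0  # index of the root record
    while True:
        children = [n for n in nodes if n[1] == current]
        if not children:
            return None  # no branch was recorded for this path
        for child in children:
            attr, branch = child[0], child[3]
            if str(branch).startswith('<='):
                matched = float(sample[attr]) <= float(branch[2:])
            elif str(branch).startswith('>'):
                matched = float(sample[attr]) > float(branch[1:])
            else:
                matched = str(sample[attr]) == branch  # discrete branch value
            if matched:
                if child[5] == 1:    # leaf flag: decision finished
                    return child[4]  # the predicted class
                current = child[2]   # descend into this child
                break
        else:
            return None

With the tree printed above, classify({'Temperature': 6.4, 'Humidity': 50.0}, treenode) follows the '<=10.094...' branch, then the '<=52.855...' branch, and returns '20870.88608'.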

4. Test data format

The first row holds the attribute names; the program scans it to determine how many columns follow. If your data has 10 attributes, every data row below must also have 10 columns.

编号,Temperature,Humidity,Wind Speed,general diffuse flows,diffuse flows,Zone 1
1,6.414,74.5,0.083,0.07,0.085,29814.68354
2,6.313,74.5,0.08,0.062,0.1,29128.10127
3,6.121,75,0.083,0.091,0.096,28228.86076
4,5.921,75.7,0.081,0.048,0.085,27335.6962
5,5.853,76.9,0.081,0.059,0.108,26624.81013
6,5.641,77.7,0.08,0.048,0.096,25998.98734
7,5.496,78.2,0.085,0.055,0.093,25446.07595
8,5.678,78.1,0.081,0.066,0.141,24777.72152
9,5.491,77.3,0.082,0.062,0.111,24279.49367
10,5.516,77.5,0.081,0.051,0.108,23896.70886
11,5.471,76.7,0.083,0.059,0.126,23544.3038
12,5.059,78.6,0.081,0.07,0.096,23003.5443
13,4.968,78.8,0.084,0.07,0.134,22329.11392
14,4.975,78.9,0.083,0.055,0.152,22092.1519
15,4.897,79.1,0.083,0.07,0.096,21903.79747
16,5.02,79.7,0.081,0.051,0.134,21685.06329
17,5.407,78.5,0.082,0.062,0.163,21484.55696
18,5.169,77.9,0.083,0.066,0.108,21107.8481
19,5.081,77.7,0.084,0.051,0.13,20998.48101
20,5.041,77.2,0.081,0.062,0.152,20870.88608
21,5.034,76.9,0.083,0.051,0.185,20870.88608
22,4.896,76.6,0.085,0.07,0.137,20597.46835
23,4.805,76.2,0.081,0.059,0.134,20421.26582
24,4.753,75.7,0.083,0.044,0.134,20524.55696
25,4.901,74.4,0.083,0.07,0.122,20482.02532
26,5.203,74.1,0.085,0.062,0.096,20530.63291
27,5.394,71.9,0.081,0.073,0.1,20512.40506
28,5.156,74,0.079,0.062,0.148,20494.17722
29,5.179,74.2,0.083,0.037,0.137,20311.89873
30,4.934,72.9,0.082,0.055,0.134,20542.78481
31,4.718,75.8,0.08,0.051,0.152,20621.77215
32,5.546,74,0.082,0.055,0.093,20627.8481
33,4.658,73.5,0.08,0.044,0.104,20797.97468
34,4.382,76.9,0.081,0.073,0.148,20858.73418
35,4.212,78.3,0.081,0.117,0.082,21393.41772
36,4.308,77.2,0.081,0.062,0.126,22219.74684
37,4.735,74.3,0.08,0.04,0.156,21928.10127
38,4.769,75.6,0.082,0.099,0.063,21776.20253
39,4.92,73.7,0.083,0.099,0.096,21654.68354
40,4.408,76.7,0.082,0.037,0.119,21466.32911
41,4.29,77,0.085,0.033,0.193,20846.58228
42,4.304,76,0.082,0.048,0.152,19983.79747
43,4.513,74.6,0.084,0.055,0.134,18908.35443
44,4.489,74.3,0.082,0.081,0.119,18167.08861
45,4.356,72.3,0.082,0.07,0.119,18075.94937
46,4.478,72.2,0.083,0.073,0.089,18063.79747
47,4.583,71,0.083,0.066,0.104,18045.56962
48,4.794,72,0.084,0.187,0.145,18385.82278
49,4.807,73.1,0.082,0.955,0.949,18987.34177
50,4.757,73.5,0.083,3.188,3.134,19576.70886
51,4.509,74.5,0.084,6.643,6.494,19837.97468
52,4.346,74.8,0.082,12.64,10.13,20166.07595
53,4.718,73.7,0.081,58.97,17,20676.4557
54,4.704,71.8,0.083,82.5,20.15,21296.20253
55,4.624,74.1,0.082,106.9,22.55,22043.5443
56,4.629,74.8,0.083,132.1,24.3,22651.13924
57,4.599,74,0.082,156.1,26.9,23131.13924
58,4.524,74.7,0.081,182.4,28.19,23641.51899
59,4.575,74.5,0.082,208.8,29.2,24668.35443
60,5.124,73.7,0.076,234.8,29.23,25275.94937
61,5.836,71.3,2.66,257.9,31.01,25920
62,5.996,69.85,4.93,282.7,31.96,26393.92405
63,6.22,68.81,4.924,307,32.42,26861.77215
64,6.703,68.01,4.923,327.6,33.22,27511.89873
65,6.993,66.14,4.918,349.6,33.41,28149.87342
66,7.54,64.21,4.916,371.1,33.43,28714.93671
67,8.22,61.9,4.916,388.2,33.89,29043.03797
68,9.49,59.3,2.451,401.3,34.4,29261.77215
69,10.65,56.03,0.084,419.5,35.29,29474.43038
70,11.06,53.52,0.082,430.9,37.58,29523.03797
71,12.4,53.26,0.08,437.5,43.46,29711.39241
72,13.08,54.36,0.077,450.4,43.2,29644.55696
73,14.38,54.42,0.076,470.5,37.92,29802.53165
74,15.02,56.46,0.074,480,38.81,29887.59494
75,15.55,58.7,0.075,486.1,39.46,30039.49367
76,15.56,59.23,0.076,490.2,40.92,29966.58228
77,15.49,59.3,0.076,495.9,40.9,29996.96203
78,15.57,58.06,0.077,498.1,40.94,30258.22785
79,15.65,58.7,0.077,498.1,42.75,30404.05063
80,15.73,59.33,0.076,498.8,44.65,30094.17722
81,15.68,58.6,0.079,497,44.26,30161.01266
82,15.7,55.29,0.074,489.2,39.32,30318.98734
83,15.75,52.45,0.077,482.4,38.65,30154.93671
84,15.83,54.99,0.079,476.1,39.49,29984.81013
85,15.79,56.66,0.076,467.7,39.94,30021.26582
86,15.72,55.86,0.077,456.1,39.95,30021.26582
87,15.8,54.06,0.075,449.2,41.52,29911.89873
88,15.74,55.56,0.075,437.4,44.13,29747.8481
89,15.72,56.69,0.077,422,43.87,29705.31646
90,15.64,57.26,0.077,409,44.9,29571.64557
91,15.69,57.96,0.074,391.8,46.89,29553.41772
92,15.69,58.93,0.075,369.4,51.51,29383.29114
93,15.6,58.7,0.073,321,57.14,29061.26582
94,15.39,57.6,0.075,180.3,60.31,28885.06329
95,15.34,58.7,0.075,149.5,67.85,28854.68354
96,15.33,59.23,0.077,209.1,128.9,28836.4557
97,15.48,57.46,0.074,265.1,219.1,28708.86076
98,15.54,58.3,0.076,241.3,246,28556.96203
99,15.53,58.8,0.077,219,248.1,28514.43038
100,15.48,57.73,0.075,195.7,231.6,28666.32911
101,15.47,58.23,0.077,174.9,211.9,29097.72152
102,15.44,59.07,0.076,165.8,201.7,29723.5443
103,15.35,59.3,0.077,132,163,30871.89873
104,15.34,59,0.079,117.4,144.3,32700.75949
105,15.11,59.53,0.076,97.6,120.6,35793.41772
106,14.93,60.83,0.077,76.5,94,38321.01266
107,14.71,61.87,0.086,26.51,28.08,39129.11392
108,14.48,63.27,0.08,18,18.29,39560.50633
109,14.29,63.94,0.078,6.994,7.02,39517.97468
110,14.09,64.87,0.078,2.818,2.763,39596.96203
111,13.89,66.11,0.076,0.809,0.89,39469.36709
112,13.79,66.74,0.076,0.146,0.197,39682.02532
113,13.54,66.71,0.076,0.062,0.089,39888.60759
114,12.51,68.35,0.077,0.055,0.085,39991.89873
115,12.06,70.2,0.078,0.066,0.096,40210.63291
116,11.75,70.7,0.079,0.055,0.126,40216.70886
117,11.61,70.7,0.077,0.081,0.126,40435.44304
118,11.5,70.6,0.078,0.059,0.074,40514.43038
119,11.43,70.7,0.076,0.048,0.093,40544.81013
120,11.42,71.2,0.076,0.073,0.085,40283.5443
121,11.69,71.1,0.078,0.091,0.104,40216.70886
122,12.05,69.82,0.077,0.048,0.108,40477.97468
123,11.87,69.98,0.076,0.051,0.093,40611.64557
124,11.61,71.2,0.078,0.04,0.119,40435.44304
125,11.5,72,0.078,0.07,0.078,40204.55696
126,11.43,72,0.075,0.051,0.07,39584.81013
127,11.29,72.2,0.075,0.051,0.119,39274.93671
128,11.49,73.1,0.077,0.044,0.085,39141.26582
129,12.24,70.1,0.077,0.073,0.089,38643.03797
130,11.84,68.78,0.074,0.066,0.093,38011.13924
131,12.2,68.78,0.074,0.066,0.122,37622.27848
132,13.1,65.44,0.074,0.04,0.141,36953.92405
133,13.2,63.84,0.074,0.081,0.093,36394.93671
134,13.18,63.97,0.075,0.081,0.093,35289.11392
135,12.64,66.77,0.072,0.044,0.141,34438.48101
136,12.39,68.04,0.074,0.084,0.115,33678.98734
137,12.24,69.18,0.077,0.059,0.119,33077.46835
138,12.11,69.94,0.074,0.088,0.082,32117.46835
139,12.07,70.7,0.076,0.033,0.078,31357.97468
140,11.99,70.8,0.075,0.055,0.134,30701.77215
141,11.32,74.7,0.075,0.07,0.1,29681.01266
142,11.37,74.8,0.074,0.084,0.082,28830.37975
143,11.41,74.1,0.076,0.077,0.122,27748.86076
144,11.01,75.8,0.076,0.033,0.163,26703.79747
145,10.74,77.3,0.078,0.084,0.108,26169.11392
146,10.47,78.2,0.078,0.088,0.13,25622.27848
147,10.31,79.2,0.076,0.091,0.134,24972.1519
148,10.63,79.3,0.079,0.037,0.152,24437.46835
149,10.84,78.4,0.073,0.062,0.156,23914.93671
150,10.94,78.1,0.074,0.084,0.108,23629.36709
151,11.12,77.9,0.073,0.055,0.145,23118.98734
152,11.02,77.5,0.081,0.062,0.178,22736.20253
153,10.5,78.9,0.076,0.066,0.145,22049.62025
154,10.22,79.8,0.078,0.088,0.126,21855.18987
155,10.42,81.3,0.075,0.073,0.13,21685.06329
156,10.86,80,0.076,0.066,0.159,21533.16456
157,11.17,77.3,0.077,0.066,0.13,21107.8481
158,11.1,75.6,0.077,0.077,0.156,20986.32911
159,11.14,75,0.081,0.07,0.156,20725.06329
160,11.21,75.2,0.077,0.062,0.137,20889.11392
161,10.71,77.4,0.079,0.07,0.148,20433.41772
162,10.7,77.7,0.079,0.062,0.159,20372.65823
163,10.58,78.6,0.075,0.055,0.137,20135.6962
164,10.59,79.5,0.078,0.084,0.145,20129.62025
165,10.68,79.4,0.079,0.059,0.156,20117.46835
166,10.52,79.8,0.076,0.066,0.145,20087.08861
167,10.34,80.6,0.078,0.088,0.119,19740.75949
168,10.18,81.7,0.079,0.059,0.159,19831.89873
169,10.41,82.3,0.074,0.077,0.126,19989.87342
170,10.41,81.4,0.078,0.095,0.134,20166.07595
171,10.39,81.5,0.074,0.088,0.119,20214.68354
172,10.16,81.9,0.075,0.073,0.122,20141.77215
173,9.97,82.8,0.077,0.066,0.178,20056.70886
174,10.03,83.8,0.08,0.029,0.163,20257.21519
175,9.98,83,0.078,0.073,0.115,20500.25316
176,9.74,82.8,0.081,0.106,0.126,20712.91139
177,9.54,83.7,0.078,0.081,0.174,20810.12658
178,9.53,83.9,0.072,0.066,0.182,21041.01266
179,9.56,84.1,0.073,0.077,0.134,21709.36709
180,9.58,84.5,0.075,0.048,0.134,22742.27848
181,9.44,84.1,0.075,0.07,0.145,22948.86076
182,9.4,84.4,0.077,0.07,0.122,23258.73418
183,9.38,84.6,0.076,0.044,0.141,23471.39241
184,9.16,84.7,0.077,0.048,0.145,23653.67089
185,9.06,85.6,0.076,0.059,0.141,23769.11392
186,9.17,85.8,0.078,0.073,0.163,23380.25316
187,9.46,86.2,0.076,0.07,0.152,23033.92405
188,9.74,85.4,0.072,0.077,0.111,22803.03797
189,9.8,84.4,0.077,0.081,0.122,23082.53165
190,9.58,83.9,0.079,0.073,0.13,22803.03797
191,9.51,83.7,0.082,0.081,0.137,22268.35443
192,9.44,83.2,0.084,0.124,0.223,22742.27848
193,9.18,82.5,0.083,0.871,0.89,23526.07595
194,8.96,83.4,0.083,2.95,2.867,24340.25316
195,8.71,83.9,0.081,6.657,6.472,24881.01266
196,8.52,85.1,0.081,13.02,11.56,25190.88608
197,8.43,85.4,0.079,39.1,19.49,25531.13924
198,8.44,85.9,0.08,64.11,24.49,25980.75949
199,8.69,86.7,0.079,92.3,27.88,23495.6962
200,9.02,86.7,0.082,117.2,30.43,23629.36709
201,9.17,86.9,0.079,134.7,33.24,24006.07595
202,10.48,87,0.079,169.2,35.2,24771.64557
203,11.13,83.5,0.08,194,36.92,25148.35443
204,11.68,81.7,0.08,218.3,39.07,25518.98734
205,12.22,78,0.075,246.6,40.01,25695.18987
206,12.61,74.8,0.076,270.6,40.59,26351.39241
207,13.07,72.7,0.075,288.3,42.31,26667.34177
208,13.61,70.2,0.077,309.8,43.74,26794.93671
209,14,69.22,0.077,337.2,43.39,27281.01266
210,14.55,64.11,0.077,360.7,41.43,27645.56962
211,14.94,58.46,0.077,381.4,41.13,27773.16456
212,15.08,57.23,0.076,396.6,40.66,27894.68354
213,15.18,55.83,0.076,410.8,41.23,28532.65823
214,15.28,56.49,0.075,425.6,41.84,28818.22785
215,15.27,55.93,0.076,440.2,41.83,28684.55696
216,15.21,55.69,0.077,452.5,41.73,28939.74684
217,15.18,57.2,0.078,460.5,42.38,29152.40506
218,15.11,57.86,0.074,469.9,43.43,29674.93671
219,15.1,58.33,0.076,476.2,43.25,29826.83544
220,15.26,58.8,0.076,482.3,44.1,29905.82278
221,15.15,60.87,0.075,485.1,44.53,29990.88608
222,15.23,60.53,0.076,487.1,44.09,29966.58228
223,15.22,60.2,0.077,487.6,44.43,29881.51899
224,15.22,59.83,0.076,486.9,44.29,29760
225,15.33,59.16,0.076,484.1,43.72,29292.1519
226,15.41,58.13,0.076,481.1,43.82,29140.25316
227,15.43,58,0.077,477.2,43.68,28872.91139
228,15.56,56.03,0.075,470.9,42.8,28672.40506
229,15.59,55.73,0.075,463.4,42.58,28502.27848
230,15.47,56.29,0.077,451.8,42.83,28684.55696
231,15.55,55.99,0.074,441.9,41.88,28617.72152
232,15.57,56.53,0.076,427.9,42.14,28635.94937
233,15.64,55.35,0.077,413.9,41.5,28556.96203
234,15.59,55.76,0.077,399.4,40.61,28721.01266
235,15.58,57.03,0.077,381.1,42.48,28696.70886
236,15.49,57.19,0.075,364.3,45.96,29049.11392
237,15.45,57.86,0.075,345.8,57.35,29255.6962
238,15.36,57.72,0.075,324.6,79.5,29012.65823
239,15.38,58.02,0.074,302.9,121.3,29091.64557
240,15.37,58.59,0.077,280.3,184,29030.88608
241,15.23,59.76,0.077,260.6,238,29267.8481
242,15.23,61.12,0.075,238.2,253.4,29480.50633
243,15.24,61.63,0.075,216.8,248.1,30161.01266
244,15.19,61.72,0.075,192,227.3,31011.64557
245,15.17,61.76,0.077,165.5,200.6,31029.87342
246,15.22,59.95,0.076,142.6,175,32390.88608
247,15.17,61.92,0.076,119.9,148.2,33685.06329
248,15.02,62.32,0.076,98.7,123.7,35659.74684
249,14.9,61.82,0.079,77.5,98.6,38187.34177
250,14.81,61.79,0.078,58.1,75.3,40648.10127
251,14.68,62.05,0.079,22.4,25.24,41565.56962
252,14.54,64.02,0.074,11.04,11.2,41644.55696
253,14.38,64.55,0.077,6.504,6.524,41814.68354
254,14.23,65.55,0.076,3.071,3.089,41766.07595
255,14.11,65.79,0.081,0.933,0.946,41753.92405
256,14.03,66.72,0.074,0.143,0.189,41808.60759
257,13.95,67.52,0.074,0.037,0.119,41699.24051
258,13.87,68.59,0.075,0.066,0.1,41280
259,13.74,69.46,0.074,0.081,0.093,41583.79747
260,13.55,69.9,0.076,0.029,0.104,41851.13924
261,13.43,69.86,0.076,0.04,0.108,41608.10127
262,13.35,69.86,0.076,0.073,0.078,41395.44304
263,13.2,70.5,0.076,0.055,0.122,41273.92405
264,13.42,68.89,0.073,0.055,0.096,40939.74684
265,13.55,68.12,0.077,0.055,0.074,41249.62025
266,13.42,68.52,0.075,0.055,0.093,41316.4557
267,13.3,69.03,0.075,0.055,0.093,41182.78481
268,13.33,68.72,0.076,0.066,0.108,41061.26582
269,13.45,68.59,0.075,0.066,0.122,40830.37975
270,13.47,66.99,0.074,0.048,0.078,40830.37975
271,13.29,66.99,0.075,0.051,0.067,40356.4557
272,13.15,66.42,0.078,0.055,0.093,39797.46835
273,13.03,66.62,0.074,0.07,0.085,39116.96203
274,12.93,67.29,0.076,0.07,0.111,38564.05063
275,12.88,67.46,0.075,0.051,0.119,38102.27848
276,12.86,67.76,0.074,0.084,0.093,37792.40506
277,12.87,67.96,0.075,0.066,0.134,37117.97468
278,12.92,68.52,0.074,0.037,0.148,36218.73418
279,12.96,68.62,0.075,0.102,0.063,35191.89873
280,12.88,69.33,0.075,0.062,0.152,34754.43038
281,12.77,69.86,0.075,0.11,0.1,33909.87342
282,12.82,70.1,0.075,0.077,0.1,32785.82278
283,12.75,70.4,0.071,0.051,0.152,31916.96203
284,12.54,71.5,0.077,0.051,0.1,30252.1519
285,12.23,72.9,0.077,0.059,0.111,29346.83544
286,12.19,72.8,0.077,0.07,0.122,28593.41772
287,12.31,72.6,0.072,0.044,0.152,27779.24051
288,12.08,74,0.076,0.117,0.093,26661.26582
289,12.16,74,0.072,0.07,0.141,26150.88608
290,12.07,74.1,0.074,0.102,0.119,25433.92405
291,11.48,76.8,0.077,0.07,0.145,24947.8481
292,11.45,77.7,0.074,0.073,0.115,24540.75949
293,11.8,76.6,0.076,0.018,0.23,23884.55696
294,12.23,75.1,0.074,0.062,0.145,23422.78481
295,12.39,71,0.078,0.062,0.148,23222.27848
296,12.42,71.4,0.077,0.059,0.159,22766.58228
297,12.4,71.5,0.075,0.062,0.156,22365.56962
298,11.84,73.4,0.079,0.073,0.152,22098.22785
299,11.51,73,0.076,0.044,0.148,21946.32911
300,11.35,73.6,0.081,0.095,0.134,21691.13924
301,11.27,73.8,0.077,0.055,0.159,21575.6962
302,11.33,74.5,0.073,0.044,0.13,21502.78481
303,11.44,75,0.078,0.081,0.119,21332.65823
304,12.26,75.5,0.076,0.073,0.163,21059.24051
305,13.08,70.9,0.076,0.062,0.174,21065.31646
306,12.78,71.8,0.078,0.055,0.156,20980.25316
307,12.34,73.2,0.076,0.088,0.152,20931.64557
308,12.2,73.8,0.072,0.084,0.178,20658.22785
309,12.04,74.7,0.077,0.077,0.134,20226.83544
310,11.96,74.9,0.077,0.07,0.137,20220.75949
311,11.92,75.2,0.077,0.059,0.182,20330.12658
312,11.85,75.1,0.083,0.077,0.156,20202.53165
313,11.66,75.5,0.084,0.07,0.156,19989.87342
314,11.55,76.1,0.077,0.055,0.156,19989.87342
315,11.6,75.1,0.081,0.073,0.137,20087.08861
316,11.17,77.8,0.076,0.077,0.148,20129.62025
317,10.83,78.5,0.076,0.066,0.148,20299.74684
318,10.74,78.6,0.077,0.048,0.189,20390.88608
319,10.57,80.3,0.081,0.04,0.152,20500.25316
320,10.49,80.3,0.08,0.084,0.141,20755.44304
321,10.69,81.6,0.078,0.073,0.074,20986.32911
322,11.22,80,0.076,0.077,0.171,21454.17722
