Logistic Regression

Logistic regression is the go-to linear classification algorithm for two-class problems.

It is easy to implement, easy to understand, and gets good results on a wide variety of problems, even when the expectations the method has of your data are violated.

In this tutorial, you will discover how to implement logistic regression with stochastic gradient descent from scratch with Python.

After completing this tutorial, you will know:

How to make predictions with a logistic regression model.
How to estimate coefficients using stochastic gradient descent.
How to apply logistic regression to a real prediction problem.

Description

Logistic regression is named for the function used at the core of the method: the logistic function.

Logistic regression uses an equation as its representation, very much like linear regression. Input values (X) are combined linearly using weights or coefficient values to predict an output value (y).

A key difference from linear regression is that the output value being modeled is a binary value (0 or 1) rather than a numeric value.

yhat = 1.0 / (1.0 + e^(-(b0 + b1 * x1)))

where e is the base of the natural logarithm (Euler's number), yhat is the predicted output, b0 is the bias or intercept term, and b1 is the coefficient for the single input value (x1).

The yhat prediction is a real value between 0 and 1 that needs to be rounded to an integer value and mapped to a predicted class value.
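As a quick check of this transfer, we can evaluate the logistic function directly and round the result. The coefficient and input values below are arbitrary illustrative numbers, not fitted ones:

from math import exp

# Hypothetical, unfitted coefficients chosen only for illustration
b0, b1 = -0.5, 1.2
x1 = 2.0

# Logistic (sigmoid) transform of the linear combination
yhat = 1.0 / (1.0 + exp(-(b0 + b1 * x1)))
print(yhat)         # ~0.870, a real value between 0 and 1
print(round(yhat))  # 1, the mapped class value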

Each column in your input data has an associated b coefficient (a constant real value) that must be learned from your training data. The actual representation of the model that you would store in memory or in a file is the coefficients in the equation (the beta values, or b's).

The coefficients of the logistic regression algorithm must be estimated from your training data.

Stochastic Gradient Descent

Gradient descent is the process of minimizing a function by following the gradient of a cost function.

This involves knowing the form of the cost as well as its derivative, so that from a given point you know the gradient and can move in that direction, e.g., downhill toward the minimum value.
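As a minimal sketch of that idea (separate from the tutorial's own code), here is plain gradient descent minimizing the simple cost f(b) = b**2, whose derivative is 2*b:

# Gradient descent on f(b) = b**2; the derivative 2*b gives the gradient.
# A toy illustration only, assuming a known cost function and derivative.
b = 5.0             # arbitrary starting point
learning_rate = 0.1
for _ in range(50):
    gradient = 2.0 * b                # slope of the cost at the current point
    b = b - learning_rate * gradient  # step downhill toward the minimum
print(b)  # very close to 0.0, the minimum of the cost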

In machine learning, we can use a technique called stochastic gradient descent that evaluates and updates the coefficients every iteration to minimize the error of a model on our training data.

The way this optimization algorithm works is that each training instance is shown to the model one at a time. The model makes a prediction for the training instance, the error is calculated, and the model is updated in order to reduce the error on the next prediction.

This procedure can be used to find the set of coefficients in a model that result in the smallest error for the model on the training data. Each iteration, the coefficients (b) are updated using the equation:

b = b + learning_rate * (y - yhat) * yhat * (1 - yhat) * x

where b is the coefficient or weight being optimized, learning_rate is a learning rate that you must configure (e.g., 0.01), (y - yhat) is the prediction error for the model on the training data attributed to the weight, yhat is the prediction made by the coefficients, and x is the input value.
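Written out with hypothetical numbers, a single update of one coefficient for one training instance looks like this:

# One stochastic gradient descent update for a single coefficient.
# All values are hypothetical, chosen only to show the arithmetic.
learning_rate = 0.01
b = 0.0      # current coefficient value
x = 2.78     # input value paired with this coefficient on the row
y = 0.0      # expected class label for the row
yhat = 0.5   # prediction made by the current coefficients

b = b + learning_rate * (y - yhat) * yhat * (1.0 - yhat) * x
print(b)  # about -0.003475: the coefficient moves to reduce the error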

Tutorial

This tutorial is broken down into 3 parts.

Making Predictions.
Estimating Coefficients.
Diabetes Prediction.
This will provide the foundation you need to implement and apply logistic regression with stochastic gradient descent to your own predictive modeling problems.

Making Predictions

The first step is to develop a function that can make predictions.

This will be needed both when evaluating candidate coefficient values in stochastic gradient descent and after the model is finalized, when we wish to start making predictions on test data or new data.

Below is a function named predict() that predicts an output value for a row given a set of coefficients.

The first coefficient is always the intercept, also called the bias or b0, as it is standalone and not responsible for a specific input value.

from random import seed
from random import randrange
from csv import reader
from math import exp
import pandas as pd
import matplotlib.pyplot as plt
%matplotlib inline

# Make a prediction with coefficients
def predict(row, coefficients):
    yhat = coefficients[0]
    for i in range(len(row)-1):
        yhat += coefficients[i + 1] * row[i]
    return 1.0 / (1.0 + exp(-yhat))

# Test predictions on a small contrived dataset
dataset = [[2.7810836,2.550537003,0],
        [1.465489372,2.362125076,0],
        [3.396561688,4.400293529,0],
        [1.38807019,1.850220317,0],
        [3.06407232,3.005305973,0],
        [7.627531214,2.759262235,1],
        [5.332441248,2.088626775,1],
        [6.922596716,1.77106367,1],
        [8.675418651,-0.242068655,1],
        [7.673756466,3.508563011,1]]
coef = [-0.406605464, 0.852573316, -1.104746259]
df = pd.DataFrame(dataset)
df.plot.scatter(x=0,y=1,c=2,colormap='viridis')
for row in dataset:
    yhat = predict(row, coef)
    print("Expected=%.3f, Predicted=%.3f [%d]" % (row[-1], yhat, round(yhat)))
Expected=0.000, Predicted=0.299 [0]
Expected=0.000, Predicted=0.146 [0]
Expected=0.000, Predicted=0.085 [0]
Expected=0.000, Predicted=0.220 [0]
Expected=0.000, Predicted=0.247 [0]
Expected=1.000, Predicted=0.955 [1]
Expected=1.000, Predicted=0.862 [1]
Expected=1.000, Predicted=0.972 [1]
Expected=1.000, Predicted=0.999 [1]
Expected=1.000, Predicted=0.905 [1]

[Figure: scatter plot of the contrived dataset, colored by class]

Estimating Coefficients

We can estimate the coefficient values for our training data using stochastic gradient descent.

Stochastic gradient descent requires two parameters:

Learning Rate: Used to limit the amount each coefficient is corrected each time it is updated.
Epochs: The number of times to run through the training data while updating the coefficients.
These, along with the training data, will be the arguments to the function.

There are 3 loops we need to perform in the function:

Loop over each epoch.
Loop over each row in the training data for an epoch.
Loop over each coefficient and update it for a row in an epoch.

Coefficients are updated based on the error the model made. The error is calculated as the difference between the expected output value and the prediction made with the candidate coefficients.

There is one coefficient to weight each input attribute, and these are updated in a consistent way, for example:
b1(t+1) = b1(t) + learning_rate * (y(t) - yhat(t)) * yhat(t) * (1 - yhat(t)) * x1(t)
The special coefficient at the beginning of the list, also called the intercept, is updated in a similar way, except without an input, as it is not associated with a specific input value:
b0(t+1) = b0(t) + learning_rate * (y(t) - yhat(t)) * yhat(t) * (1 - yhat(t))

# Estimate logistic regression coefficients using stochastic gradient descent
def coefficients_sgd(train, l_rate, n_epoch):
    coef = [0.0 for i in range(len(train[0]))]
    for epoch in range(n_epoch):
        sum_error = 0
        for row in train:
            yhat = predict(row, coef)
            error = row[-1] - yhat
            sum_error += error**2
            coef[0] = coef[0] + l_rate * error * yhat * (1.0 - yhat)
            for i in range(len(row) - 1):
                coef[i+1] = coef[i+1] + l_rate * error * yhat * (1.0 - yhat) * row[i]
        print('>epoch=%d, lrate=%.3f, error=%.3f' % (epoch, l_rate, sum_error))
    return coef
l_rate = 0.3
n_epoch = 100
coef = coefficients_sgd(dataset,l_rate,n_epoch)
print(coef)
>epoch=0, lrate=0.300, error=2.217
...
>epoch=98, lrate=0.300, error=0.023
>epoch=99, lrate=0.300, error=0.022
[-0.8596443546618897, 1.5223825112460005, -2.218700210565016]

Diabetes Prediction

In this section, we will train a logistic regression model using stochastic gradient descent on the diabetes dataset.

The example assumes that a CSV copy of the dataset is in the current working directory with the filename pima-indians-diabetes.data.csv.

The dataset is first loaded, the string values converted to numeric, and each column normalized to values in the range of 0 to 1. This is achieved with the helper functions load_csv() and str_column_to_float() to load and prepare the dataset, and dataset_minmax() and normalize_dataset() to normalize it.

We will use k-fold cross-validation to estimate the performance of the learned model on unseen data. This means that we will construct and evaluate k models and estimate the performance as the mean model performance. Classification accuracy will be used to evaluate each model. These behaviors are provided in the cross_validation_split(), accuracy_metric() and evaluate_algorithm() helper functions.

We will use the predict() and coefficients_sgd() functions created above, plus a new logistic_regression() function, to train the model.

Below is the complete example.

# Load a CSV file
def load_csv(filename):
    dataset = list()
    with open(filename,'r') as file:
        csv_reader = reader(file)
        for row in csv_reader:
            if not row:
                continue
            dataset.append(row)
        return dataset

# Convert a string column to float
def str_column_to_float(dataset,column):
    for row in dataset:
        row[column] = float(row[column].strip())

# Find the min and max values for each column
def dataset_minmax(dataset):
    minmax = list()
    for i in range(len(dataset[0])):
        col_values = [row[i] for row in dataset]
        value_min = min(col_values)
        value_max = max(col_values)
        minmax.append([value_min,value_max])
    return minmax

# Rescale dataset columns to the range 0-1
def normalize_dataset(dataset,minmax):
    for row in dataset:
        for i in range(len(row)):
            row[i] = (row[i] - minmax[i][0]) / (minmax[i][1] - minmax[i][0])

# Split a dataset into k folds
def cross_validation_split(dataset, n_folds):
    dataset_split = list()
    dataset_copy = list(dataset)
    fold_size = int(len(dataset) / n_folds)
    for i in range(n_folds):
        fold = list()
        while len(fold) < fold_size:
            index = randrange(len(dataset_copy))
            fold.append(dataset_copy.pop(index))
        dataset_split.append(fold)
    return dataset_split

# Calculate accuracy percentage
def accuracy_metric(actual, predicted):
    correct = 0
    for i in range(len(actual)):
        if actual[i] == predicted[i]:
            correct += 1
    return correct / float(len(actual)) * 100.0

# Evaluate an algorithm using a cross validation split
def evaluate_algorithm(dataset, algorithm, n_folds, *args):
    folds = cross_validation_split(dataset, n_folds)
    scores = list()
    for fold in folds:
        train_set = list(folds)
        train_set.remove(fold)
        train_set = sum(train_set, [])
        test_set = list()
        for row in fold:
            row_copy = list(row)
            test_set.append(row_copy)
            row_copy[-1] = None
        predicted = algorithm(train_set, test_set, *args)
        actual = [row[-1] for row in fold]
        accuracy = accuracy_metric(actual, predicted)
        scores.append(accuracy)
    return scores

# Estimate logistic regression coefficients using stochastic gradient descent
def coefficients_sgd(train, l_rate, n_epoch):
    coef = [0.0 for i in range(len(train[0]))]
    for epoch in range(n_epoch):
        sum_error = 0
        for row in train:
            yhat = predict(row, coef)
            error = row[-1] - yhat
            sum_error += error**2
            coef[0] = coef[0] + l_rate * error * yhat * (1.0 - yhat)
            for i in range(len(row)-1):
                coef[i + 1] = coef[i + 1] + l_rate * error * yhat * (1.0 - yhat) * row[i]
#         print('>epoch=%d, lrate=%.3f, error=%.3f' % (epoch, l_rate, sum_error))
    return coef
    


# Make a prediction with coefficients
def predict(row, coefficients):
    yhat = coefficients[0]
    for i in range(len(row)-1):
        yhat += coefficients[i + 1] * row[i]
    return 1.0 / (1.0 + exp(-yhat))

        
# Logistic regression algorithm with stochastic gradient descent
def logistic_regression(train, test, l_rate, n_epoch):
    predictions = list()
    coef = coefficients_sgd(train, l_rate, n_epoch)
    for row in test:
        yhat = predict(row, coef)
        yhat = round(yhat)
        predictions.append(yhat)
    return predictions

# Load and prepare data
filename = 'pima-indians-diabetes.data.csv'
dataset = load_csv(filename)
for i in range(len(dataset[0])):
    str_column_to_float(dataset,i)
# Normalize
minmax = dataset_minmax(dataset)
normalize_dataset(dataset, minmax)
# Evaluate the algorithm
seed(1)
n_folds = 7
l_rate = 0.1
n_epoch = 100
scores = evaluate_algorithm(dataset, logistic_regression, n_folds, l_rate, n_epoch)
print('Scores: %s' % scores)
print('Mean Accuracy: %.3f%%' % (sum(scores)/float(len(scores))))
Scores: [77.06422018348624, 74.31192660550458, 81.65137614678899, 79.81651376146789, 75.22935779816514, 77.98165137614679, 77.06422018348624]
Mean Accuracy: 77.588%