Python regression prediction model: after training a random forest regression model in Python, how do I make predictions? Why are the predicted values so small, and why do many identical anomalous values appear?

```python
from sklearn.datasets import load_boston  # deprecated in scikit-learn 1.0, removed in 1.2
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestRegressor, ExtraTreesRegressor, GradientBoostingRegressor
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error
import numpy as np

# Random forest regression

# 1 Prepare the data
# Load the Boston housing dataset
boston = load_boston()
# print("boston:", boston)
# Inspect the data description
# print(boston.DESCR)  # 506 Boston housing records, each with 13 numeric features and a target price
# Inspect the spread of the target
# print("Max price:", np.max(boston.target))    # 50
# print("Min price:", np.min(boston.target))    # 5
# print("Mean price:", np.mean(boston.target))  # 22.532806324110677
x = boston.data
y = boston.target
print("x.shape:", x.shape)
print("y.shape:", y.shape)

# 2 Split into training and test data
# Randomly sample 25% for testing, 75% for training
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.25, random_state=33)
print("x_train.shape:", x_train.shape)
print("x_test.shape:", x_test.shape)
print("y_train.shape:", y_train.shape)
print("y_test.shape:", y_test.shape)

# 3 Standardize the training and test data
ss_x = StandardScaler()
x_train = ss_x.fit_transform(x_train)
x_test = ss_x.transform(x_test)
ss_y = StandardScaler()
y_train = ss_y.fit_transform(y_train.reshape(-1, 1))
y_test = ss_y.transform(y_test.reshape(-1, 1))

# Random forest regression
rfr = RandomForestRegressor()
# Train (y_train is 2-D after the reshape above; sklearn emits a
# DataConversionWarning and expects y_train.ravel() here)
rfr.fit(x_train, y_train)
# Predict and keep the results (in standardized target units)
rfr_y_predict = rfr.predict(x_test)
# Predict on all feature data
# NOTE: x is the *raw*, unstandardized feature matrix, unlike the data the
# model was trained on -- this line is the source of the anomaly asked about
Y_predict = rfr.predict(x)

# Evaluate the random forest regression model
# (newer scikit-learn versions require rfr_y_predict.reshape(-1, 1) in the
# inverse_transform calls below)
print("Default score of the random forest regressor:", rfr.score(x_test, y_test))
print("R-squared of the random forest regressor:", r2_score(y_test, rfr_y_predict))
print("MSE of the random forest regressor:", mean_squared_error(ss_y.inverse_transform(y_test),
                                                                ss_y.inverse_transform(rfr_y_predict)))
print("MAE of the random forest regressor:", mean_absolute_error(ss_y.inverse_transform(y_test),
                                                                 ss_y.inverse_transform(rfr_y_predict)))
print(y)
print(Y_predict)
```

Output:

```
x.shape: (506, 13)
y.shape: (506,)
x_train.shape: (379, 13)
x_test.shape: (127, 13)
y_train.shape: (379,)
y_test.shape: (127,)
Default score of the random forest regressor: 0.8469322253577488
R-squared of the random forest regressor: 0.8469322253577488
MSE of the random forest regressor: 11.869073401574813
MAE of the random forest regressor: 2.229212598425197

[24. 21.6 34.7 33.4 36.2 28.7 22.9 27.1 16.5 18.9 15. 18.9 21.7 20.4
 18.2 19.9 23.1 17.5 20.2 18.2 13.6 19.6 15.2 14.5 15.6 13.9 16.6 14.8
 18.4 21. 12.7 14.5 13.2 13.1 13.5 18.9 20. 21. 24.7 30.8 34.9 26.6
 25.3 24.7 21.2 19.3 20. 16.6 14.4 19.4 19.7 20.5 25. 23.4 18.9 35.4
 24.7 31.6 23.3 19.6 18.7 16. 22.2 25. 33. 23.5 19.4 22. 17.4 20.9
 24.2 21.7 22.8 23.4 24.1 21.4 20. 20.8 21.2 20.3 28. 23.9 24.8 22.9
 23.9 26.6 22.5 22.2 23.6 28.7 22.6 22. 22.9 25. 20.6 28.4 21.4 38.7
 43.8 33.2 27.5 26.5 18.6 19.3 20.1 19.5 19.5 20.4 19.8 19.4 21.7 22.8
 18.8 18.7 18.5 18.3 21.2 19.2 20.4 19.3 22. 20.3 20.5 17.3 18.8 21.4
 15.7 16.2 18. 14.3 19.2 19.6 23. 18.4 15.6 18.1 17.4 17.1 13.3 17.8
 14. 14.4 13.4 15.6 11.8 13.8 15.6 14.6 17.8 15.4 21.5 19.6 15.3 19.4
 17. 15.6 13.1 41.3 24.3 23.3 27. 50. 50. 50. 22.7 25. 50. 23.8
 23.8 22.3 17.4 19.1 23.1 23.6 22.6 29.4 23.2 24.6 29.9 37.2 39.8 36.2
 37.9 32.5 26.4 29.6 50. 32. 29.8 34.9 37. 30.5 36.4 31.1 29.1 50.
 33.3 30.3 34.6 34.9 32.9 24.1 42.3 48.5 50. 22.6 24.4 22.5 24.4 20.
 21.7 19.3 22.4 28.1 23.7 25. 23.3 28.7 21.5 23. 26.7 21.7 27.5 30.1
 44.8 50. 37.6 31.6 46.7 31.5 24.3 31.7 41.7 48.3 29. 24. 25.1 31.5
 23.7 23.3 22. 20.1 22.2 23.7 17.6 18.5 24.3 20.5 24.5 26.2 24.4 24.8
 29.6 42.8 21.9 20.9 44. 50. 36. 30.1 33.8 43.1 48.8 31. 36.5 22.8
 30.7 50. 43.5 20.7 21.1 25.2 24.4 35.2 32.4 32. 33.2 33.1 29.1 35.1
 45.4 35.4 46. 50. 32.2 22. 20.1 23.2 22.3 24.8 28.5 37.3 27.9 23.9
 21.7 28.6 27.1 20.3 22.5 29. 24.8 22. 26.4 33.1 36.1 28.4 33.4 28.2
 22.8 20.3 16.1 22.1 19.4 21.6 23.8 16.2 17.8 19.8 23.1 21. 23.8 23.1
 20.4 18.5 25. 24.6 23. 22.2 19.3 22.6 19.8 17.1 19.4 22.2 20.7 21.1
 19.5 18.5 20.6 19. 18.7 32.7 16.5 23.9 31.2 17.5 17.2 23.1 24.5 26.6
 22.9 24.1 18.6 30.1 18.2 20.6 17.8 21.7 22.7 22.6 25. 19.9 20.8 16.8
 21.9 27.5 21.9 23.1 50. 50. 50. 50. 50. 13.8 13.8 15. 13.9 13.3
 13.1 10.2 10.4 10.9 11.3 12.3 8.8 7.2 10.5 7.4 10.2 11.5 15.1 23.2
 9.7 13.8 12.7 13.1 12.5 8.5 5. 6.3 5.6 7.2 12.1 8.3 8.5 5.
 11.9 27.9 17.2 27.5 15. 17.2 17.9 16.3 7. 7.2 7.5 10.4 8.8 8.4
 16.7 14.2 20.8 13.4 11.7 8.3 10.2 10.9 11. 9.5 14.5 14.1 16.1 14.3
 11.7 13.4 9.6 8.7 8.4 12.8 10.5 17.1 18.4 15.4 10.8 11.8 14.9 12.6
 14.1 13. 13.4 15.2 16.1 17.8 14.9 14.1 12.7 13.5 14.9 20. 16.4 17.7
 19.5 20.2 21.4 19.9 19. 19.1 19.1 20.1 19.9 19.6 23.2 29.8 13.8 13.3
 16.7 12. 14.6 21.4 23. 23.7 25. 21.8 20.6 21.2 19.1 20.6 15.2 7.
 8.1 13.6 20.1 21.8 24.5 23.1 19.7 18.3 21.2 17.5 16.8 22.4 20.6 23.9
 22. 11.9]

[1.22397047 1.17989645 1.17989645 1.22246183 1.22246183 1.22246183
 1.18140509 1.18140509 1.18000421 1.18000421 1.18000421 1.18140509
 1.18140509 1.19993989 1.19993989 1.19993989 1.20845297 1.19530619
 1.19185786 1.19509067 1.19810796 1.19325874 1.19810796 1.20662104
 1.19519843 1.19325874 1.19993989 1.20026317 1.19519843 1.20651328
 1.19810796 1.19724588 1.19724588 1.19810796 1.18269822 1.17246098
 1.17246098 1.17246098 1.17106009 1.18884057 1.20640552 1.17989645
 1.17989645 1.17989645 1.17989645 1.17849556 1.17849556 1.17515499
 1.17515499 1.17849556 1.18884057 1.18884057 1.18884057 1.18884057
 1.18884057 1.22838865 1.22838865 1.22838865 1.18884057 1.18884057
 1.18884057 1.18743968 1.18884057 1.18884057 1.23140594 1.18884057
 1.18884057 1.18884057 1.18884057 1.18884057 1.17989645 1.17989645
 1.17989645 1.17849556 1.17989645 1.17989645 1.17989645 1.17989645
 1.17989645 1.17989645 1.18884057 1.18884057 1.18884057 1.18884057
 1.17989645 1.17989645 1.17989645 1.17989645 1.17246098 1.17246098
 1.17246098 1.17246098 1.18884057 1.18884057 1.18884057 1.17989645
 1.17989645 1.17989645 1.17989645 1.17989645 1.17246098 1.17246098
 1.16771952 1.17106009 1.17246098 1.17246098 1.17106009 1.17246098
 1.17246098 1.16771952 1.17246098 1.17246098 1.17246098 1.17106009
 1.17246098 1.17106009 1.17246098 1.17246098 1.17246098 1.17246098
 1.17246098 1.17246098 1.17246098 1.17246098 1.17246098 1.17106009
 1.17774124 1.16771952 1.17774124 1.19325874 1.17774124 1.19810796
 1.19993989 1.17774124 1.20662104 1.19993989 1.17774124 1.17774124
 1.16771952 1.19993989 1.16987473 1.18269822 1.16610311 1.14832264
 1.16610311 1.16610311 1.17353858 1.16610311 1.16610311 1.16610311
 1.17116785 1.18571551 1.18517671 1.17353858 1.18571551 1.15230977
 1.16610311 1.19810796 1.19724588 1.18571551 1.19810796 1.2026339
 1.18981041 1.19724588 1.17763348 1.17763348 1.18506895 1.18269822
 1.17763348 1.17763348 1.19810796 1.17763348 1.17246098 1.17246098
 1.17246098 1.17246098 1.17246098 1.17246098 1.17246098 1.17246098
 1.17246098 1.17246098 1.17246098 1.17246098 1.17246098 1.17246098
 1.17246098 1.18884057 1.18884057 1.18884057 1.18884057 1.18884057
 1.18884057 1.18884057 1.18884057 1.22849641 1.23140594 1.23140594
 1.23140594 1.23140594 1.23140594 1.23140594 1.23140594 1.18884057
 1.18884057 1.17246098 1.16771952 1.16771952 1.17246098 1.21017713
 1.17106009 1.21017713 1.17106009 1.17246098 1.20231062 1.17106009
 1.17246098 1.17246098 1.17246098 1.17246098 1.17774124 1.17774124
 1.19993989 1.19993989 1.21017713 1.23237579 1.21017713 1.21017713
 1.21017713 1.21017713 1.23237579 1.21017713 1.21114698 1.17774124
 1.17774124 1.17774124 1.19993989 1.19993989 1.18884057 1.18884057
 1.18884057 1.18884057 1.18884057 1.18884057 1.18743968 1.18743968
 1.22655672 1.18743968 1.18743968 1.18743968 1.18884057 1.18743968
 1.18884057 1.22655672 1.18884057 1.18884057 1.18884057 1.20888401
 1.20888401 1.20888401 1.20888401 1.20888401 1.20888401 1.20220286
 1.20888401 1.20414255 1.20241838 1.20888401 1.20888401 1.18884057
 1.22655672 1.18743968 1.18884057 1.18743968 1.18884057 1.18884057
 1.18884057 1.18884057 1.18884057 1.18743968 1.18884057 1.18884057
 1.18884057 1.22838865 1.18884057 1.23140594 1.23140594 1.18884057
 1.18884057 1.18884057 1.18884057 1.18884057 1.18884057 1.17989645
 1.17989645 1.17989645 1.17989645 1.17989645 1.23140594 1.23140594
 1.23140594 1.18884057 1.18884057 1.18884057 1.23140594 1.23140594
 1.23140594 1.23140594 1.17774124 1.17774124 1.17763348 1.19185786
 1.16771952 1.16771952 1.17774124 1.16771952 1.17774124 1.16771952
 1.17774124 1.17774124 1.17106009 1.17106009 1.21017713 1.18323702
 1.21017713 1.17106009 1.21017713 1.16771952 1.17989645 1.17989645
 1.17989645 1.18884057 1.18884057 1.17246098 1.17246098 1.17246098
 1.17246098 1.17246098 1.17246098 1.17246098 1.17246098 1.23140594
 1.21502635 1.18140509 1.18140509 1.17989645 1.17989645 1.18884057
 1.23140594 1.22838865 1.22838865 1.23140594 1.23140594 1.23140594
 1.23140594 1.23140594 1.16567207 1.16384014 1.159853 1.159853
 1.159853 1.16384014 1.16384014 1.159853 1.16384014 1.159853
 1.16384014 1.16567207 1.159853 1.159853 1.15058561 1.16125389
 1.15543483 1.1644867 1.16125389 1.16567207 1.16567207 1.16567207
 1.16567207 1.16567207 1.16567207 1.16567207 1.16567207 1.159853
 1.16567207 1.16567207 1.16567207 1.16567207 1.16567207 1.159853
 1.159853 1.159853 1.16567207 1.159853 1.16567207 1.159853
 1.159853 1.159853 1.16567207 1.16567207 1.16567207 1.16567207
 1.16567207 1.16567207 1.16567207 1.16567207 1.1644867 1.16567207
 1.159853 1.16567207 1.16567207 1.16567207 1.16567207 1.16567207
 1.16567207 1.16567207 1.16567207 1.16567207 1.16567207 1.16567207
 1.16567207 1.159853 1.16567207 1.159853 1.159853 1.16567207
 1.16567207 1.16567207 1.159853 1.16567207 1.159853 1.16567207
 1.159853 1.159853 1.16567207 1.16567207 1.16567207 1.16567207
 1.16567207 1.16567207 1.16567207 1.16567207 1.159853 1.16567207
 1.16567207 1.16567207 1.159853 1.16567207 1.16567207 1.159853
 1.26653585 1.159853 1.159853 1.159853 1.16567207 1.159853
 1.159853 1.159853 1.159853 1.159853 1.159853 1.16384014
 1.159853 1.159853 1.159853 1.17763348 1.16384014 1.159853
 1.16567207 1.16567207 1.159853 1.159853 1.16384014 1.159853
 1.159853 1.159853 1.159853 1.16567207 1.16567207 1.16567207
 1.159853 1.159853 1.159853 1.17763348 1.17763348 1.16384014
 1.159853 1.159853 1.17246098 1.17106009 1.17106009 1.17246098
 1.17246098 1.17106009 1.15080113 1.17106009 1.16987473 1.16771952
 1.16771952 1.17106009 1.17106009 1.17246098 1.17246098 1.17246098
 1.17246098 1.17246098]
```
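
Why the predictions come out so small, with many repeated values: the model was trained on standardized features and a standardized target, but `Y_predict = rfr.predict(x)` is called on the raw, unstandardized feature matrix. Every raw sample lies far outside the value range the trees were grown on, so most samples fall into the same few leaves, which is why the same handful of numbers (all roughly 1.15–1.27) repeats. In addition, the predictions are never passed back through `ss_y.inverse_transform`, so even correctly scaled inputs would yield values in standardized target units rather than house prices. A minimal sketch of the corrected prediction step, reusing the objects fitted in the script above:

```python
# Scale the features with the scaler fitted on the training data, predict in
# standardized target space, then map the result back to the price scale.
x_scaled = ss_x.transform(x)
y_pred_std = rfr.predict(x_scaled)
y_pred = ss_y.inverse_transform(y_pred_std.reshape(-1, 1)).ravel()
print(y_pred[:10])  # now comparable to boston.target, roughly in the 5-50 range
```

Since tree-based models are insensitive to monotonic feature scaling, an even simpler fix is to drop both scalers and fit the regressor on the raw `x_train` and `y_train`; the predictions then come out directly in price units.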

Random forest regression is a decision-tree-based ensemble learning algorithm for regression problems. It builds many decision trees and averages their predictions to produce the final estimate (classification forests use majority voting instead). In weather forecasting, for example, a random forest regression model can predict conditions such as temperature or humidity at a specific point in time. Below is a simple example of random forest regression prediction in Python:

1. Import the required libraries:
```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
```
2. Prepare the dataset:
```python
# Assume a dataset with the features stored in X and the target variable in y
X = ...
y = ...
```
3. Split into training and test sets:
```python
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
```
4. Build the random forest regression model:
```python
rf = RandomForestRegressor(n_estimators=100, random_state=42)
```
Here `n_estimators` is the number of decision trees to build; it can be tuned for the problem at hand.
5. Train the model:
```python
rf.fit(X_train, y_train)
```
6. Make predictions:
```python
y_pred = rf.predict(X_test)
```
7. Evaluate the model:
```python
mse = mean_squared_error(y_test, y_pred)
```
Mean Squared Error (MSE) is a common evaluation metric for regression models that measures the discrepancy between predicted and true values.

This is a simple random forest regression example that you can adapt and optimize for your own use case.
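For reference, mean squared error is defined as

$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2,$$

where $y_i$ are the true values and $\hat{y}_i$ the predictions. To make the seven steps above concrete, here is a self-contained sketch that runs them end to end; the synthetic dataset from `make_regression` is a stand-in assumption, since the placeholder `X = ...` above leaves the real data source open:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for the unspecified dataset (an assumption, not data
# from the original post).
X, y = make_regression(n_samples=500, n_features=13, noise=10.0, random_state=42)

# Steps 3-7 of the walkthrough above.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
rf = RandomForestRegressor(n_estimators=100, random_state=42)
rf.fit(X_train, y_train)
y_pred = rf.predict(X_test)
print("MSE:", mean_squared_error(y_test, y_pred))
```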
