2018-06-13 Python assignment: sklearn exercise

Assignment:

In the second ML assignment you have to compare the performance of three different classification algorithms, namely Naive Bayes, SVM, and Random Forest. For this assignment you need to generate a random binary classification problem, and then train and test (using 10-fold cross validation) the three algorithms. For some algorithms an inner cross validation (5-fold) for choosing the parameters is needed. Then, show the classification performance (per-fold and averaged) in the report, and briefly discuss the results.

Steps:

1. Create a classification dataset (n_samples=1000, n_features=10) 
2. Split the dataset using 10-fold cross validation 
3. Train the algorithms 

  • GaussianNB 
  • SVC (possible C values [1e-02, 1e-01, 1e00, 1e01, 1e02], RBF kernel) 
  • RandomForestClassifier (possible n_estimators values [10, 100, 1000]) 

4. Evaluate the cross-validated performance

  • Accuracy
  • F1-score
  • AUC ROC 

5. Write a short report summarizing the methodology and the results
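The bracketed parameter lists in step 3 are what the inner 5-fold cross validation chooses from. In sklearn this is typically done with GridSearchCV; a minimal sketch for the SVC C parameter (the small toy dataset here is only for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Toy data just to demonstrate the parameter search
X, y = make_classification(n_samples=200, n_features=10, n_classes=2)

# cv=5 performs the inner 5-fold cross validation over the candidate C values
search = GridSearchCV(SVC(kernel='rbf'),
                      param_grid={'C': [1e-02, 1e-01, 1e00, 1e01, 1e02]},
                      cv=5)
search.fit(X, y)
print(search.best_params_)  # the C value picked by the inner CV
```

In the full solution the same idea applies to RandomForestClassifier with `param_grid={'n_estimators': [10, 100, 1000]}`, and the search is refit inside each outer fold.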


from sklearn.datasets import make_classification
from sklearn.model_selection import KFold, GridSearchCV
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn import metrics
import numpy as np

# 1. Create the dataset (n_samples=1000, n_features=10)
X, y = make_classification(n_samples=1000, n_features=10,
                           n_informative=2, n_redundant=2,
                           n_repeated=0, n_classes=2)

# 2. Split the dataset itself (not iris) with 10-fold cross validation
kf = KFold(n_splits=10, shuffle=True)

# One score list per algorithm, so results are not mixed together
scores = {name: {'acc': [], 'f1': [], 'auc': []}
          for name in ('GaussianNB', 'SVC', 'RandomForest')}

def evaluate(name, clf, X_train, y_train, X_test, y_test):
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    scores[name]['acc'].append(metrics.accuracy_score(y_test, pred))
    scores[name]['f1'].append(metrics.f1_score(y_test, pred))
    scores[name]['auc'].append(metrics.roc_auc_score(y_test, pred))

# 3. Train and test all three algorithms inside the outer loop
for train_index, test_index in kf.split(X):
    X_train, X_test = X[train_index], X[test_index]
    y_train, y_test = y[train_index], y[test_index]

    # Naive Bayes (no hyperparameters to tune)
    evaluate('GaussianNB', GaussianNB(), X_train, y_train, X_test, y_test)

    # SVM: choose C with an inner 5-fold cross validation
    svc = GridSearchCV(SVC(kernel='rbf', gamma=0.1),
                       param_grid={'C': [1e-02, 1e-01, 1e00, 1e01, 1e02]},
                       cv=5)
    evaluate('SVC', svc, X_train, y_train, X_test, y_test)

    # Random Forest: choose n_estimators with an inner 5-fold cross validation
    rf = GridSearchCV(RandomForestClassifier(),
                      param_grid={'n_estimators': [10, 100, 1000]},
                      cv=5)
    evaluate('RandomForest', rf, X_train, y_train, X_test, y_test)

# 4. Report the cross-validated performance, per fold and averaged
for name, s in scores.items():
    print(name)
    for metric_name, values in s.items():
        print('  %s per fold: %s' % (metric_name, np.round(values, 3)))
        print('  %s mean:     %.3f' % (metric_name, np.mean(values)))
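One caveat on the AUC numbers above: `roc_auc_score` is fed the hard 0/1 predictions, but ROC AUC is a ranking metric, so scoring the positive-class probability (`predict_proba`, or `decision_function` for SVC) is usually more informative. A minimal sketch:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=1000, n_features=10, n_classes=2)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1)

clf = GaussianNB().fit(X_train, y_train)
# Probability of the positive class, not the hard label
proba = clf.predict_proba(X_test)[:, 1]
auc_proba = roc_auc_score(y_test, proba)
auc_label = roc_auc_score(y_test, clf.predict(X_test))
print(auc_proba, auc_label)  # the probability-based AUC is usually at least as high
```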
