The Spider: An Object-Oriented Machine Learning Toolbox for MATLAB | 丕子

Plugging objects together: e.g., perform cross validation on the following system: greedy backward feature selection on a fast base algorithm, then training on the selected features with an SVM for each output in a one-against-the-rest multi-class scheme, with all hyperparameters chosen by a model selection method.

The interfaces provided by the toolbox all follow this object idea: every method and every piece of data is treated as an object. For example, an svm object is used to train on a data object:

X=rand(50)-0.5; Y=sign(sum(X,2)); d=data(X,Y) % make simple data

[res alg]= train(svm,d) % train a support vector machine

and that's it - the svm is trained! To test it on more data:

[res]= test(alg,d)
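Putting the calls together, a full toy round trip might look like the following. This is a hedged sketch: the final `loss(res)` call is an assumption, inferred from the "Evaluates loss functions" object described in the list below, not a verified signature.

```matlab
% Hedged sketch of a complete train/test run. loss() is an assumed
% call, inferred from the "Evaluates loss functions" object listed
% below; consult the toolbox help if the name differs.
X = rand(50) - 0.5;           % 50 examples with 50 random features in [-0.5, 0.5]
Y = sign(sum(X, 2));          % label each example by the sign of its feature sum
d = data(X, Y);               % wrap inputs and targets in a data object
[res, alg] = train(svm, d);   % train an SVM; alg holds the fitted model
res = test(alg, d);           % predict on (here) the training data
loss(res)                     % report the classification error of the predictions
```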

How many objects are there? The list below covers both the algorithm objects and the data objects:

Basic library objects.

+ Storing input data and output results
+ Implementation of data object that limits memory overhead
+ Generic algorithm object
+ Groups sets of objects together (algorithms or data)
+ Evaluates loss functions
+ Takes mean loss over groups of algs
+ Builds chains of objects: output of one to input of another
+ To train and test different hyperparameters of an object
+ Cross validation using objects given data
+ Evaluates and caches kernel functions
+ Evaluates and caches distance functions
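These basic objects (chains, cross validation, hyperparameter handling) are what make the pipeline described in the introduction possible. As a hedged sketch only, assuming the chaining, cross-validation, grid-search, parameter, and one-against-the-rest objects are named `chain`, `cv`, `gridsel`, `param`, and `one_vs_rest` (names inferred from the descriptions, not verified), and using a hypothetical `backward_sel` placeholder for the greedy backward feature selector:

```matlab
% Hedged sketch only: chain, cv, gridsel, param and one_vs_rest are
% assumed object names inferred from the descriptions above, and
% backward_sel is a hypothetical placeholder for the greedy backward
% feature-selection object. Check the toolbox listing for real names.
base  = one_vs_rest(svm);                        % one-against-the-rest multi-class SVM
tuned = gridsel(param(base, 'C', [0.1 1 10]));   % choose C by model selection
pipe  = chain({backward_sel, tuned});            % feature selection feeds the SVM
[res] = train(cv(pipe), d);                      % cross-validate the whole system
```

The design point is that each stage is just another object, so composition (chaining, wrapping in cv) needs no special-case code.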

Statistical Tests objects.

- Wilcoxon test of statistical significance of results
- Corrected resampled t-test, for dependent trials

Dataset objects.

+ Spiral dataset generator
+ Generator of dataset with only a few relevant features
+ Simple 2d Gaussian problem generator
+ Linear Regression with o outputs and n inputs

Pre-Processing objects.

+ Simple normalization of data
+ General user specified mapping function of data

Density Estimation objects.

+ Parzen's windows kernel density estimator
- Density estimator which assumes feature independence
+ Classifier based on density estimation for each class
+ Normal distribution density estimator

Pattern Recognition objects.

+ Support Vector Machine (svm)
- C4.5 for binary or multi-class
+ k-nearest neighbours
- Conditional Probability estimation for margin classifiers
- Multi-Kernel LP-SVM
- Minimize the a-norm in alpha space using kernels
- Local and Global Consistent Learner
+ Bagging Classifier
+ AdaBoost method
- Hidden Markov Model
- Leave One Out Machine
- Minimize l1 norm of w for a linear separator
- Kernel Dependency Estimation: general input/output machine
- Kernel Perceptron
- Ordinal Regression Perceptron (Shen et al.)
- Splitting Perceptron (Shen et al.)
- Sparse, online Perceptron (Crammer et al.)
- Random Forest Decision Trees (requires WEKA)
- J48 Decision Trees for binary (requires WEKA)

Multi-Class and Multi-label objects.

+ Voting method of one against the rest (also for multi-label)
+ Voting method of one against one
- Multi-class Support Vector Machine by J. Weston
- C4.5 for binary or multi-class
+ k-nearest neighbours

Feature Selection objects.

+ Generic object for feature selection + classifier
- SVM bound-based feature selection
+ Recursive Feature Elimination (also for the non-linear case)
- Dual zero-norm minimization (Weston, Elisseeff)
- Primal zero-norm based feature selection (Mangasarian)
- Fisher criterion feature selection
- Greedy selection algorithm of Friedman
- Multi-class feature selection using spectral clustering
- Mutual Information for feature selection
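Feature selection objects are meant to compose with classifiers through chaining. A hedged sketch, assuming the Recursive Feature Elimination object is named `rfe` and accepts a `'feat=...'` hyperparameter string (both are assumptions, not verified API):

```matlab
% Hedged sketch: rfe and the 'feat=5' string syntax are assumptions;
% the pattern shown is select-features-then-classify via a chain.
p = chain({rfe('feat=5'), svm});  % keep 5 features, then train an SVM on them
[res, alg] = train(p, d);         % fit the whole chain on data object d
res = test(alg, d);               % evaluate the chained system
```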

Regression objects.

+ Support Vector Regression
+ Gaussian Process Regression
- Relevance Vector Machine
+ Ridge regression (possibly multi-dimensional)
- Multivariate Regression via Stiefel Constraints
+ k-nearest neighbours
+ Meta-method for independent multiple-output regression
- Kernel matching pursuit
- Kernel partial least squares
- Least mean squared regression [now obsolete due to multi_rr]
- Radial Basis Function Network (with moving centers)
- Reduced Error Pruning Tree (requires WEKA)
- Structured Output Learning using the Joint Kernel Method

Model Selection objects.

+ Select parameters from a grid of values
- Select SVM parameters by a generalization bound
+ Bayesian parameter selection

Unsupervised objects.

+ One-class SVM
+ K-means clustering
+ Kernel Vector Quantization
+ Kernel Principal Components Analysis
- Probabilistic Principal Component Analysis
- Non-negative Matrix Factorization
- Spectral clustering
- Manifold ranking
- Probabilistic PCA

Reduced Set and Pre-Image objects.

- Calculate pre-images based on multi-dimensional scaling
- Calculate pre-images based on learning and ridge regression
- Bottom-up reduced set: calculates reduced set based on gradient descent
- Bottom-up reduced set: calculates reduced set for RBF with fixed-point iteration schemes
- Top-down reduced set: calculates reduced set with multi-dimensional scaling
- Top-down reduced set: calculates reduced set with ridge regression
- Reduced set selection via L1 penalization
- Reduced set selection via L0 penalization
+ Reduced set selection via matching pursuit
