Support Vector Machine (SVM) Matlab Toolbox 1.0
%-------------------------------------------------------%
1 Notice
Support Vector Machine Matlab Toolbox 1.0
Platform : Matlab6.5 / Matlab7.0
Copyright : LU Zhen-bo, Navy Engineering University, WuHan, HuBei, P.R.China, 430033
E-mail : luzhenbo@yahoo.com.cn
Homepage : http://luzhenbo.88uu.com.cn
Reference : Chih-Chung Chang, Chih-Jen Lin. "LIBSVM: a Library for Support Vector Machines"
Solves the quadratic programming problem with "quadprog.m"
%-------------------------------------------------------%
2 Contents
The toolbox implements two classification algorithms, two regression algorithms, and a one-class SVM:
(1) Main_SVC_C.m --- C_SVC two-class classification
(2) Main_SVC_Nu.m --- Nu_SVC two-class classification
(3) Main_SVM_One_Class.m --- One-Class SVM
(4) Main_SVR_Epsilon.m --- Epsilon_SVR regression
(5) Main_SVR_Nu.m --- Nu_SVR regression
%-------------------------------------------------------%
3 Usage
(1) The files whose names start with Main_ are the entry scripts; open one and press F5 to run it.
(2) All programs were debugged under Matlab6.5; correct behaviour under other Matlab versions is not guaranteed.
%-------------------------------------------------------%
4 Download address
http://luzhenbo.88uu.com.cn/svm.htm
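A typical session (a sketch, assuming the toolbox directory is on the Matlab path; Nu_SVC_Train and Nu_SVC_Sim are the toolbox functions listed below):

```matlab
% Train a Nu-SVC on a toy two-class problem, then classify two new points.
X = [randn(20,2); 4+randn(20,2)];   % 40 training samples in 2-D
Y = [ones(20,1); -ones(20,1)];      % labels +1 / -1
ker = struct('type','gauss','width',1);
svm = Nu_SVC_Train(X,Y,0.2,ker);    % nu = 0.2
Yd  = Nu_SVC_Sim(svm,[0 0; 4 4]);   % predicted labels for two test points
```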
% Support Vector Machine Matlab Toolbox 1.0 - Nu-SVC, Nu two-class classification
% Platform : Matlab6.5 / Matlab7.0
% Copyright : LU Zhen-bo, Navy Engineering University, WuHan, HuBei, P.R.China, 430033
% E-mail : luzhenbo@yahoo.com.cn
% Homepage : http://luzhenbo.88uu.com.cn
% Reference : Chih-Chung Chang, Chih-Jen Lin. "LIBSVM: a Library for Support Vector Machines"
%
% Solves the quadratic programming problem with "quadprog.m"
clc
clear
close all
% ------------------------------------------------------------%
% Define the kernel function and its parameters
nu = 0.2; % nu -> (0,1]; trades off the number of support vectors against the number of misclassified samples
ker = struct('type','linear');
%ker = struct('type','poly','degree',3,'offset',1);
%ker = struct('type','gauss','width',1);
%ker = struct('type','tanh','gamma',1,'offset',0);
% ker - kernel parameters (struct) with the following fields:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
% ------------------------------------------------------------%
% Construct training samples for two classes
n = 50;
randn('state',3);
x1 = randn(n,2);
y1 = ones(n,1);
x2 = 5+randn(n,2);
y2 = -ones(n,1);
figure(2);
plot(x1(:,1),x1(:,2),'bx',x2(:,1),x2(:,2),'k.');
hold on;
X = [x1;x2]; % training samples, n-by-d matrix (n samples, d dimensions)
Y = [y1;y2]; % training labels, n-by-1 vector of +1/-1 values
% ------------------------------------------------------------%
% Train the SVM
tic
svm = Nu_SVC_Train(X,Y,nu,ker);
t_train = toc
% svm - trained SVM (struct) with the following fields:
%    ker - kernel parameters
%    x   - training samples
%    y   - training labels
%    a   - Lagrange multipliers
% ------------------------------------------------------------%
% Find the support vectors
a = svm.a;
epsilon = 1e-8; % values below this threshold are treated as zero
i_sv = find(a>epsilon); % indices of the support vectors
plot(X(i_sv,1),X(i_sv,2),'ro');
% ------------------------------------------------------------%
% Test output
[x1,x2] = meshgrid(-2:0.05:7,-2:0.05:7);
[rows,cols] = size(x1);
nt = rows*cols; % number of test samples
Xt = [reshape(x1,nt,1),reshape(x2,nt,1)];
tic
Yd = Nu_SVC_Sim(svm,Xt); % test output
t_sim = toc
Yd = reshape(Yd,rows,cols);
contour(x1,x2,Yd,[0 0],'m'); % decision boundary
hold off;
function [K] = CalcKernel(ker,x,y)
% Calculate kernel function.
%
% x: input samples, n1-by-d matrix (n1 samples, d dimensions)
% y: input samples, n2-by-d matrix (n2 samples, d dimensions)
%
% ker: kernel parameters (struct) with the following fields:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
%
% ker = struct('type','linear');
% ker = struct('type','poly','degree',d,'offset',c);
% ker = struct('type','gauss','width',s);
% ker = struct('type','tanh','gamma',g,'offset',c);
%
% K: output kernel matrix, n1-by-n2
%-------------------------------------------------------------%
% Transpose so that each column is a sample
x = x';
y = y';
%-------------------------------------------------------------%
switch ker.type
case 'linear'
K = x'*y;
case 'poly'
d = ker.degree;
c = ker.offset;
K = (x'*y+c).^d;
case 'gauss'
s = ker.width;
rows = size(x,2);
cols = size(y,2);
tmp = zeros(rows,cols);
for i = 1:rows
for j = 1:cols
tmp(i,j) = norm(x(:,i)-y(:,j));
end
end
K = exp(-0.5*(tmp/s).^2);
case 'tanh'
g = ker.gamma;
c = ker.offset;
K = tanh(g*x'*y+c);
otherwise
K = 0;
end
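As a quick sanity check of CalcKernel (an illustrative sketch, not part of the original toolbox): the Gaussian kernel of a sample with itself is exactly 1 because norm(x-x) = 0, and the linear kernel reduces to the plain inner product.

```matlab
x = [1 2];                                              % one 2-D sample (rows are samples)
Kg = CalcKernel(struct('type','gauss','width',2),x,x);  % exp(-0.5*(0/2)^2) = 1
Kl = CalcKernel(struct('type','linear'),x,x);           % 1*1 + 2*2 = 5
```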
function svm = Nu_SVC_Train(X,Y,nu,ker)
% Input:
%   X   training samples, n-by-d matrix (n samples, d dimensions)
%   Y   training labels, n-by-1 vector of +1/-1 values
%   nu  trade-off parameter
%   ker kernel parameters (struct) with the following fields:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
% Output:
%   svm - trained SVM (struct) with the following fields:
%      ker - kernel parameters
%      x   - training samples
%      y   - training labels
%      a   - Lagrange multipliers
% ------------------------------------------------------------%
% Solve the quadratic program
n = length(Y);
H = (Y*Y').*CalcKernel(ker,X,X);
f = zeros(n,1);
A = -ones(1,n);
b = -nu;
Aeq = Y';
beq = 0;
lb = zeros(n,1);
ub = ones(n,1)/n;
a0 = zeros(n,1);
options = optimset;
options.LargeScale = 'off';
options.Display = 'off';
[a,fval,exitflag,output,lambda] = quadprog(H,f,A,b,Aeq,beq,lb,ub,a0,options);
exitflag
% ------------------------------------------------------------%
% Output svm
svm.ker = ker;
svm.x = X;
svm.y = Y;
svm.a = a;
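The multipliers returned by quadprog can be sanity-checked against the Nu-SVC constraints encoded above: 0 <= a_i <= 1/n (box), Y'*a = 0 (equality), and sum(a) >= nu (inequality). A sketch of such a check, which could be placed inside Nu_SVC_Train right after the quadprog call (the variable names are those of the function):

```matlab
tol = 1e-6;                                    % solver tolerance (assumed)
ok_box = all(a >= -tol) && all(a <= 1/n+tol);  % lb <= a <= ub
ok_eq  = abs(Y'*a) < tol;                      % Aeq*a = beq, i.e. sum(a_i*y_i) = 0
ok_nu  = sum(a) >= nu-tol;                     % A*a <= b, i.e. sum(a_i) >= nu
```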
function Yd = Nu_SVC_Sim(svm,Xt)
% Input:
%   svm - trained SVM (struct) with the following fields:
%      ker - kernel parameters:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
%      x - training samples
%      y - training labels
%      a - Lagrange multipliers
%
%   Xt test samples, n-by-d matrix (n samples, d dimensions)
% Output:
%   Yd test output, n-by-1 vector of predicted +1/-1 labels
% ------------------------------------------------------------%
ker = svm.ker;
X = svm.x;
Y = svm.y;
a = svm.a;
% ------------------------------------------------------------%
% Compute the bias b
epsilon = 1e-8; % values below this threshold are treated as zero
i_sv = find(a>epsilon); % indices of the support vectors
tmp = (Y.*a)'*CalcKernel(ker,X,X(i_sv,:)); % row vector
b = 1./Y(i_sv)-tmp';
b = mean(b);
% ------------------------------------------------------------%
% Test output
nt = size(Xt,1); % number of test samples
tmp = (Y.*a)'*CalcKernel(ker,X,Xt);
Yd = sign(tmp+b)';
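Combining the two functions gives a quick training-accuracy check (a sketch, reusing X, Y and svm from the demo script above):

```matlab
Yd_train = Nu_SVC_Sim(svm,X);   % re-classify the training samples
acc = mean(Yd_train == Y);      % fraction classified correctly
fprintf('training accuracy: %.2f%%\n',100*acc);
```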
% Support Vector Machine Matlab Toolbox 1.0 - Epsilon-SVR, Epsilon regression
% Platform : Matlab6.5 / Matlab7.0
% Copyright : LU Zhen-bo, Navy Engineering University, WuHan, HuBei, P.R.China, 430033
% E-mail : luzhenbo@yahoo.com.cn
% Homepage : http://luzhenbo.88uu.com.cn
% Reference : Chih-Chung Chang, Chih-Jen Lin. "LIBSVM: a Library for Support Vector Machines"
%
% Solves the quadratic programming problem with "quadprog.m"
clc
clear
close all
% ------------------------------------------------------------%
% Define the kernel function and its parameters
C = 100; % upper bound on the Lagrange multipliers
e = 0.2; % parameter of the epsilon-insensitive loss; a larger epsilon gives fewer support vectors
%ker = struct('type','linear');
%ker = struct('type','poly','degree',3,'offset',1);
ker = struct('type','gauss','width',1);
%ker = struct('type','tanh','gamma',1,'offset',0);
% ker - kernel parameters (struct) with the following fields:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
% ------------------------------------------------------------%
% Construct the training samples
n = 50;
rand('state',42);
X = linspace(-4,4,n)'; % training samples, n-by-d matrix; here d = 1
Ys = (1-X+2*X.^2).*exp(-.5*X.^2);
f = 0.2; % relative noise level
Y = Ys+f*max(abs(Ys))*(2*rand(size(Ys))-1)/2; % training targets, n-by-1 vector of desired outputs
figure(4)
plot(X,Ys,'b-',X,Y,'b*');
hold on;
% ------------------------------------------------------------%
% Train the SVM
tic
svm = Epsilon_SVR_Train(X,Y,C,e,ker);
t_train = toc
% svm - trained SVM (struct) with the following fields:
%    ker - kernel parameters
%    x   - training samples
%    y   - training targets
%    a   - Lagrange multipliers
% ------------------------------------------------------------%
% Find the support vectors
a = svm.a;
epsilon = 1e-8; % values whose absolute value is below this are treated as zero
i_sv = find(abs(a)>epsilon); % indices of the support vectors (note the abs(a) test)
plot(X(i_sv),Y(i_sv),'ro');
% ------------------------------------------------------------%
% Test output
tic
Yd = Epsilon_SVR_Sim(svm,X); % test output
t_sim = toc
plot(X,Yd,'r--',X,[Yd-e,Yd+e],'g:');
hold off;
function svm = Epsilon_SVR_Train(X,Y,C,e,ker)
% Input:
%   X   training samples, n-by-d matrix (n samples, d dimensions)
%   Y   training targets, n-by-1 vector of desired outputs
%   C   upper bound on the Lagrange multipliers
%   e   parameter of the epsilon-insensitive loss; a larger epsilon gives fewer support vectors
%   ker kernel parameters (struct) with the following fields:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
% Output:
%   svm - trained SVM (struct) with the following fields:
%      ker - kernel parameters
%      x   - training samples
%      y   - training targets
%      a   - Lagrange multipliers
% ------------------------------------------------------------%
% Solve the quadratic program
n = length(Y);
Q = CalcKernel(ker,X,X);
H = [Q,-Q;-Q,Q];
f = [e*ones(n,1)-Y;e*ones(n,1)+Y]; % flipping the sign convention flips the decision function; the two are equivalent - see lines 37 and 45 of "Epsilon_SVR_Sim.m"
%f = [e*ones(n,1)+Y;e*ones(n,1)-Y];
A = [];
b = [];
Aeq = [ones(1,n),-ones(1,n)];
beq = 0;
lb = zeros(2*n,1);
ub = C*ones(2*n,1);
a0 = zeros(2*n,1);
% Step 3: call quadprog from the Optimization Toolbox to solve the quadratic program
options = optimset;
options.LargeScale = 'off';
options.Display = 'off';
[a,fval,exitflag,output,lambda] = quadprog(H,f,A,b,Aeq,beq,lb,ub,a0,options);
exitflag
% ------------------------------------------------------------%
% Output svm
svm.ker = ker;
svm.x = X;
svm.y = Y;
svm.a = a(1:n)-a(n+1:end);
function Yd = Epsilon_SVR_Sim(svm,Xt)
% Input:
%   svm - trained SVM (struct) with the following fields:
%      ker - kernel parameters:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
%      x - training samples
%      y - training targets
%      a - Lagrange multipliers
%
%   Xt test samples, n-by-d matrix (n samples, d dimensions)
% Output:
%   Yd test output, n-by-1 vector of predicted values
% ------------------------------------------------------------%
ker = svm.ker;
X = svm.x;
Y = svm.y;
a = svm.a; % actually holds a(1:n)-a(n+1:end) - see line 56 of "Epsilon_SVR_Train.m"
% ------------------------------------------------------------%
% Compute the bias b
epsilon = 1e-8; % values whose absolute value is below this are treated as zero
i_sv = find(abs(a)>epsilon); % indices of the support vectors (note the abs(a) test)
tmp = a'*CalcKernel(ker,X,X(i_sv,:)); % row vector
b = Y(i_sv)-tmp'; % flipping the sign convention flips the decision function; the two are equivalent - see line 33 of "Epsilon_SVR_Train.m"
%b = Y(i_sv)+tmp';
b = mean(b);
% ------------------------------------------------------------%
% Test output
nt = size(Xt,1); % number of test samples
tmp = a'*CalcKernel(ker,X,Xt); % flipping the sign convention flips the decision function; the two are equivalent - see line 33 of "Epsilon_SVR_Train.m"
%tmp = -a'*CalcKernel(ker,X,Xt);
Yd = (tmp+b)';
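A simple figure of merit for the fit is the root-mean-square error of the predictions against the noisy targets (a sketch, reusing X, Y and svm from the demo script above):

```matlab
Yd_train = Epsilon_SVR_Sim(svm,X);     % predictions at the training inputs
rmse = sqrt(mean((Yd_train - Y).^2));  % root-mean-square error
fprintf('training RMSE: %.4f\n',rmse);
```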
% Support Vector Machine Matlab Toolbox 1.0 - Nu-SVR, Nu regression
% Platform : Matlab6.5 / Matlab7.0
% Copyright : LU Zhen-bo, Navy Engineering University, WuHan, HuBei, P.R.China, 430033
% E-mail : luzhenbo@yahoo.com.cn
% Homepage : http://luzhenbo.88uu.com.cn
% Reference : Chih-Chung Chang, Chih-Jen Lin. "LIBSVM: a Library for Support Vector Machines"
%
% Solves the quadratic programming problem with "quadprog.m"
clc
clear
close all
% ------------------------------------------------------------%
% Define the kernel function and its parameters
C = 100; % upper bound on the Lagrange multipliers
nu = 0.01; % nu -> (0,1]; trades off the number of support vectors against the fitting accuracy
%ker = struct('type','linear');
%ker = struct('type','poly','degree',3,'offset',1);
ker = struct('type','gauss','width',1);
%ker = struct('type','tanh','gamma',1,'offset',0);
% ker - kernel parameters (struct) with the following fields:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
% ------------------------------------------------------------%
% Construct the training samples
n = 50;
rand('state',42);
X = linspace(-4,4,n)'; % training samples, n-by-d matrix; here d = 1
Ys = (1-X+2*X.^2).*exp(-.5*X.^2);
f = 0.2; % relative noise level
Y = Ys+f*max(abs(Ys))*(2*rand(size(Ys))-1)/2; % training targets, n-by-1 vector of desired outputs
figure(4)
plot(X,Ys,'b-',X,Y,'b*');
hold on;
% ------------------------------------------------------------%
% Train the SVM
tic
svm = Nu_SVR_Train(X,Y,C,nu,ker);
t_train = toc
% svm - trained SVM (struct) with the following fields:
%    ker - kernel parameters
%    x   - training samples
%    y   - training targets
%    a   - Lagrange multipliers
% ------------------------------------------------------------%
% Find the support vectors
a = svm.a;
epsilon = 1e-8; % values whose absolute value is below this are treated as zero
i_sv = find(abs(a)>epsilon); % indices of the support vectors (note the abs(a) test)
plot(X(i_sv),Y(i_sv),'ro');
% ------------------------------------------------------------%
% Test output
tic
Yd = Nu_SVR_Sim(svm,X); % test output
t_sim = toc
plot(X,Yd,'r--');
hold off;
function svm = Nu_SVR_Train(X,Y,C,nu,ker)
% Input:
%   X   training samples, n-by-d matrix (n samples, d dimensions)
%   Y   training targets, n-by-1 vector of desired outputs
%   C   upper bound on the Lagrange multipliers
%   nu  nu -> (0,1]; trades off the number of support vectors against the fitting accuracy
%   ker kernel parameters (struct) with the following fields:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
% Output:
%   svm - trained SVM (struct) with the following fields:
%      ker - kernel parameters
%      x   - training samples
%      y   - training targets
%      a   - Lagrange multipliers
% ------------------------------------------------------------%
% Solve the quadratic program
n = length(Y);
Q = CalcKernel(ker,X,X);
H = [Q,-Q;-Q,Q];
f = [-Y;+Y]; % flipping the sign convention flips the decision function; the two are equivalent - see lines 37 and 45 of "Nu_SVR_Sim.m"
%f = [+Y;-Y];
A = [];
b = [];
Aeq = [ones(1,n),-ones(1,n);ones(1,2*n)];
beq = [0;C*n*nu];
lb = zeros(2*n,1);
ub = C*ones(2*n,1);
a0 = zeros(2*n,1);
% Step 3: call quadprog from the Optimization Toolbox to solve the quadratic program
options = optimset;
options.LargeScale = 'off';
options.Display = 'off';
[a,fval,exitflag,output,lambda] = quadprog(H,f,A,b,Aeq,beq,lb,ub,a0,options);
exitflag
% ------------------------------------------------------------%
% Output svm
svm.ker = ker;
svm.x = X;
svm.y = Y;
svm.a = a(1:n)-a(n+1:end);
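Unlike Epsilon_SVR_Train, the nu formulation has no fixed epsilon term in f; instead the second equality row, sum(a+) + sum(a-) = C*n*nu, lets the width of the insensitive tube adapt to the data. Per Chang and Lin, nu bounds the fraction of support vectors, which can be checked empirically after training (a sketch, reusing svm from the demo script):

```matlab
sv_frac = mean(abs(svm.a) > 1e-8);  % fraction of training points that are support vectors
% sv_frac is expected to be at least roughly nu (up to solver tolerance)
```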
function Yd = Nu_SVR_Sim(svm,Xt)
% Input:
%   svm - trained SVM (struct) with the following fields:
%      ker - kernel parameters:
% type - linear : k(x,y) = x'*y
% poly : k(x,y) = (x'*y+c)^d
% gauss : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
% tanh : k(x,y) = tanh(g*x'*y+c)
% degree - Degree d of polynomial kernel (positive scalar).
% offset - Offset c of polynomial and tanh kernel (scalar, negative for tanh).
% width - Width s of Gauss kernel (positive scalar).
% gamma - Slope g of the tanh kernel (positive scalar).
% x - 训练样本
% y - 训练方向;
% a - 拉格朗日乘子
%
% Xt 测试样本,n×d的矩阵,n为样本个数,d为样本维数
% 输出参数:
% Yd 测试输出,n×1的矩阵,n为样本个数,值为+1或-1
% ------------------------------------------------------------%
ker = svm.ker;
X = svm.x;
Y = svm.y;
a = svm.a; % 那里实际值为 a(1:n)-a(n+1:end),见文件"Nu_SVR_Train.m"第56行
% ------------------------------------------------------------%
% compute the bias b
epsilon = 1e-8; % multipliers with absolute value below this are treated as 0
i_sv = find(abs(a)>epsilon); % support vector indices (note the abs(a) test)
tmp = a'*CalcKernel(ker,X,X(i_sv,:)); % row vector
b = Y(i_sv)-tmp'; % flipping the sign here just flips the decision function; the two forms are equivalent (see "Nu_SVR_Train.m")
%b = Y(i_sv)+tmp';
b = mean(b);
% ------------------------------------------------------------%
% predict on the test samples
nt = size(Xt,1); % number of test samples
tmp = a'*CalcKernel(ker,X,Xt); % flipping the sign here just flips the decision function; the two forms are equivalent (see "Nu_SVR_Train.m")
%tmp = -a'*CalcKernel(ker,X,Xt);
Yd = (tmp+b)';
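In other words, prediction is f(x) = sum_i a_i*k(x_i,x) + b, with b averaged over the support vectors. A pure-Python sketch of both steps (function names and the plain-list representation are illustrative, not part of the toolbox):

```python
def svr_predict(a, b, X, Xt, kernel):
    """Nu-SVR decision function f(x) = sum_i a_i * k(x_i, x) + b.

    a: combined multipliers (alpha - alpha*), one per training sample;
    b: bias; X: training samples; Xt: test samples;
    kernel: function (x, y) -> scalar.
    """
    return [sum(ai * kernel(xi, xt) for ai, xi in zip(a, X)) + b
            for xt in Xt]

def estimate_bias(a, X, Y, kernel, eps=1e-8):
    """Average b = y_i - f0(x_i) over the support vectors (|a_i| > eps),
    mirroring the b computation in Nu_SVR_Sim."""
    f0 = svr_predict(a, 0.0, X, X, kernel)   # bias-free outputs
    sv = [i for i, ai in enumerate(a) if abs(ai) > eps]
    return sum(Y[i] - f0[i] for i in sv) / len(sv)
```

Averaging over all support vectors, rather than taking a single one, makes b less sensitive to multipliers that sit near the box bound.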
% Support Vector Machine Matlab Toolbox 1.0 - Nu Support Vector Regression
% Platform : Matlab6.5 / Matlab7.0
% Copyright : LU Zhen-bo, Navy Engineering University, WuHan, HuBei, P.R.China, 430033
% E-mail : luzhenbo@yahoo.com.cn
% Homepage : http://luzhenbo.88uu.com.cn
% Reference : Chih-Chung Chang, Chih-Jen Lin. "LIBSVM: a Library for Support Vector Machines"
%
% Solve the quadratic programming problem - "quadprog.m"
clc
clear
close all
% ------------------------------------------------------------%
% choose the kernel function and related parameters
C = 100;   % upper bound on the Lagrange multipliers
nu = 0.01; % nu in (0,1]; trades off the number of support vectors against fitting accuracy
%ker = struct('type','linear');
%ker = struct('type','poly','degree',3,'offset',1);
ker = struct('type','gauss','width',1);
%ker = struct('type','tanh','gamma',1,'offset',0);
% ker - kernel parameters (struct) with the following fields:
%   type   - linear : k(x,y) = x'*y
%            poly   : k(x,y) = (x'*y+c)^d
%            gauss  : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
%            tanh   : k(x,y) = tanh(g*x'*y+c)
%   degree - degree d of the polynomial kernel (positive scalar)
%   offset - offset c of the polynomial and tanh kernels (scalar, negative for tanh)
%   width  - width s of the Gauss kernel (positive scalar)
%   gamma  - slope g of the tanh kernel (positive scalar)
% ------------------------------------------------------------%
% construct the training samples
n = 50;
rand('state',42);
X = linspace(-4,4,n)'; % training samples, n-by-d matrix (n samples, d dimensions); here d = 1
Ys = (1-X+2*X.^2).*exp(-.5*X.^2);
f = 0.2; % relative noise level
Y = Ys+f*max(abs(Ys))*(2*rand(size(Ys))-1)/2; % training targets, n-by-1 vector of desired outputs
figure(4)
plot(X,Ys,'b-',X,Y,'b*');
hold on;
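The demo fits a noisy sampling of y(x) = (1 - x + 2x^2)*exp(-x^2/2) on [-4, 4]. For reference, a small pure-Python sketch of the same data construction (the function name is illustrative, and Python's random stream will not reproduce Matlab's rand('state',42) values):

```python
import math
import random

def make_samples(n=50, f=0.2, seed=42):
    """Noisy samples of y(x) = (1 - x + 2x^2) * exp(-x^2/2), as in the demo.

    f is the relative noise level; returns (X, Ys, Y) where Ys is the
    clean curve and Y adds uniform noise of amplitude f*max|Ys|/2.
    """
    random.seed(seed)
    X = [-4 + 8 * i / (n - 1) for i in range(n)]  # linspace(-4, 4, n)
    Ys = [(1 - x + 2 * x * x) * math.exp(-0.5 * x * x) for x in X]
    amp = f * max(abs(y) for y in Ys)
    Y = [y + amp * (2 * random.random() - 1) / 2 for y in Ys]
    return X, Ys, Y
```

Because the noise term is amp*(2*rand-1)/2, each target deviates from the clean curve by at most half of f*max|Ys|, matching the Matlab expression above.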
% ------------------------------------------------------------%
% train the support vector machine
tic
svm = Nu_SVR_Train(X,Y,C,nu,ker);
t_train = toc
% svm  support vector machine (struct) with the following fields:
%   ker - kernel parameters
%   x   - training samples
%   y   - training targets
%   a   - Lagrange multipliers
% ------------------------------------------------------------%
% locate the support vectors
a = svm.a;
epsilon = 1e-8; % multipliers with absolute value below this are treated as 0
i_sv = find(abs(a)>epsilon); % support vector indices (note the abs(a) test)
plot(X(i_sv),Y(i_sv),'ro');
% ------------------------------------------------------------%
% test output
tic
Yd = Nu_SVR_Sim(svm,X); % predicted outputs
t_sim = toc
plot(X,Yd,'r--');
hold off;
function [K] = CalcKernel(ker,x,y)
% Calculate the kernel (Gram) matrix.
%
% x: input samples, n1-by-d matrix (n1 samples, d dimensions)
% y: input samples, n2-by-d matrix (n2 samples, d dimensions)
%
% ker  kernel parameters (struct) with the following fields:
%   type   - linear : k(x,y) = x'*y
%            poly   : k(x,y) = (x'*y+c)^d
%            gauss  : k(x,y) = exp(-0.5*(norm(x-y)/s)^2)
%            tanh   : k(x,y) = tanh(g*x'*y+c)
%   degree - degree d of the polynomial kernel (positive scalar)
%   offset - offset c of the polynomial and tanh kernels (scalar, negative for tanh)
%   width  - width s of the Gauss kernel (positive scalar)
%   gamma  - slope g of the tanh kernel (positive scalar)
%
% ker = struct('type','linear');
% ker = struct('type','poly','degree',d,'offset',c);
% ker = struct('type','gauss','width',s);
% ker = struct('type','tanh','gamma',g,'offset',c);
%
% K: output kernel matrix, n1-by-n2
%-------------------------------------------------------------%
% transpose so that each column holds one sample
x = x';
y = y';
%-------------------------------------------------------------%
switch ker.type
    case 'linear'
        K = x'*y;
    case 'poly'
        d = ker.degree;
        c = ker.offset;
        K = (x'*y+c).^d;
    case 'gauss'
        s = ker.width;
        rows = size(x,2);
        cols = size(y,2);
        tmp = zeros(rows,cols);
        for i = 1:rows
            for j = 1:cols
                tmp(i,j) = norm(x(:,i)-y(:,j));
            end
        end
        K = exp(-0.5*(tmp/s).^2);
    case 'tanh'
        g = ker.gamma;
        c = ker.offset;
        K = tanh(g*x'*y+c);
    otherwise
        K = 0;
end