转自:http://blog.sina.com.cn/s/blog_6163bdeb0102ee95.html
I previously reposted a blog entry on the derivation of KPCA; its author mainly referenced and translated another post, http://zhanxw.com/blog/2011/02/kernel-pca-原理和演示/, which walks through the whole KPCA process in detail and includes an R implementation. It is well worth saving a local copy, since that site is sometimes unreachable.
% 3.3 Kernel Principal Component Analysis
clc
clear
close all
% generate circle data
X = gencircledata([1;1],5,250,1);
% compute kernel PCA
options.ker = 'rbf'; % use RBF kernel
options.arg = 4; % kernel argument
options.new_dim = 2; % output dimension
model = kpca(X,options);
XR = kpcarec(X,model); % compute reconstructed data
% Visualization
figure;
h1 = ppatterns(X);
h2 = ppatterns(XR, '+r');
legend([h1 h2],'Input vectors','Reconstructed');
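The script above leans on STPRtool's kpca/kpcarec. As a rough, library-free sketch of the same idea (my own minimal NumPy implementation, not STPRtool's code; the circle-data generator only mimics gencircledata([1;1],5,250,1)), kernel PCA with an RBF kernel comes down to: build the kernel matrix, center it in feature space, eigendecompose, and project:

```python
import numpy as np

def rbf_kernel(X, sigma=4.0):
    # X: (n, d) data matrix; K[i, j] = exp(-||x_i - x_j||^2 / (2*sigma^2))
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_pca(X, new_dim=2, sigma=4.0):
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    # Center the kernel matrix in feature space: Kc = K - 1K - K1 + 1K1
    J = np.ones((n, n)) / n
    Kc = K - J @ K - K @ J + J @ K @ J
    # eigh returns eigenvalues in ascending order; take the largest new_dim
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:new_dim]
    # Normalize eigenvectors so projections have unit-variance scaling
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    # Projections of the training points onto the principal components
    return Kc @ alphas

# Noisy circle of radius 5 centered at (1, 1), analogous to gencircledata
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 250)
X = np.c_[1 + 5 * np.cos(t), 1 + 5 * np.sin(t)] + rng.normal(0, 1, (250, 2))
Z = kernel_pca(X, new_dim=2, sigma=4.0)
print(Z.shape)
```

The sigma here plays the role of options.arg in the MATLAB script; the double-centering line is the step that the derivation in the linked post spends most of its effort justifying.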
% 3.3 Kernel Principal Component Analysis
clc
clear
close all
% generate circle data
X0 = gencircledata([1;1],1,250,0.1);
X1 = gencircledata([1;1],3,250,0.1);
X2 = gencircledata([1;1],6,250,0.1);
X0 = X0 - repmat(mean(X0, 2), 1, 250);
X1 = X1 - repmat(mean(X1, 2), 1, 250);
X2 = X2 - repmat(mean(X2, 2), 1, 250);
X = [X0 X1 X2];
y = [ones(1, size(X0, 2)) 2*ones(1, size(X1, 2)) 3*ones(1, size(X2, 2))];
data.X = X;
data.y = y;
figure
ppatterns(data);
% compute kernel PCA
kernelflag = 1;           % 1: RBF, 2: polynomial, otherwise linear
if kernelflag == 1
    options.ker = 'rbf';  % RBF kernel
    options.arg = 2;      % kernel width (a guess; the original branch was empty)
elseif kernelflag == 2
    options.ker = 'poly'; % polynomial kernel
    options.arg = 2;      % polynomial degree (a guess)
else
    options.ker = 'linear'; % linear kernel, i.e. ordinary PCA
end
options.new_dim = 2; % output dimension
model = kpca(data.X, options);
kpca_data.X = kernelproj(data.X, model); % project the data matrix
kpca_data.y = data.y;                    % keep labels for ppatterns
figure
ppatterns(kpca_data);
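The point of this second experiment is that three concentric rings, which no linear projection can separate, become separable after an RBF kernel projection. A self-contained NumPy sketch of the same setup (my own implementation; the radii 1/3/6 and noise 0.1 mirror the MATLAB script, sigma is my choice) also shows the kernelproj step: projecting new points means evaluating the kernel against the training set, centering with the training statistics, and multiplying by the stored coefficients:

```python
import numpy as np

def rbf(A, B, sigma):
    # Cross-kernel matrix: K[i, j] = exp(-||a_i - b_j||^2 / (2*sigma^2))
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def fit_kpca(X, new_dim, sigma):
    n = len(X)
    K = rbf(X, X, sigma)
    J = np.ones((n, n)) / n
    Kc = K - J @ K - K @ J + J @ K @ J      # center in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:new_dim]  # top new_dim components
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return {'X': X, 'K': K, 'alphas': alphas, 'sigma': sigma}

def project(model, Xnew):
    # Analogue of kernelproj: center the cross-kernel with *training* stats
    X, K, a, s = model['X'], model['K'], model['alphas'], model['sigma']
    n, m = len(X), len(Xnew)
    Kt = rbf(Xnew, X, s)
    Jm = np.ones((m, n)) / n
    Jn = np.ones((n, n)) / n
    Ktc = Kt - Jm @ K - Kt @ Jn + Jm @ K @ Jn
    return Ktc @ a

rng = np.random.default_rng(1)
def ring(r, n):  # noisy circle of radius r centered at the origin
    t = rng.uniform(0, 2 * np.pi, n)
    return np.c_[r * np.cos(t), r * np.sin(t)] + rng.normal(0, 0.1, (n, 2))

Xtr = np.vstack([ring(1, 250), ring(3, 250), ring(6, 250)])
model = fit_kpca(Xtr, new_dim=2, sigma=2.0)
Ztr = project(model, Xtr)            # training projections (as kpca_data)
Znew = project(model, ring(3, 10))   # held-out points from the middle ring
print(Ztr.shape, Znew.shape)
```

Plotting Ztr colored by ring (the role ppatterns plays in the MATLAB script) shows the three rings landing in distinct regions of the component space.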