EM Algorithm for Gaussian Mixture Model (EM GMM)
Notes:
This function computes the maximum likelihood estimate of a Gaussian mixture model using the expectation-maximization (EM) algorithm.
It works on data sets of arbitrary dimension. Several techniques are applied to avoid the floating-point underflow problems that often occur when computing probabilities of high-dimensional data. The code is also carefully tuned for efficiency through vectorization and matrix factorization.
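The usual way to avoid the underflow mentioned above is to work in log space with the log-sum-exp trick: subtract each row's maximum before exponentiating so the largest term is exactly 1. A minimal sketch of that idea is below; the variable names (logRho for the n-by-k matrix of log joint densities, logR for log responsibilities) are illustrative and need not match the actual implementation. Implicit expansion requires MATLAB R2016b or later; on older versions use bsxfun.

```matlab
% logRho(i,j) = log( pi_j * N(x_i | mu_j, Sigma_j) ), an n-by-k matrix
y = max(logRho, [], 2);              % per-row maximum, n-by-1
T = logRho - y;                      % shift so the largest entry per row is 0
logSum = y + log(sum(exp(T), 2));    % log sum_j exp(logRho(i,j)), computed stably
logR = logRho - logSum;              % log responsibilities; exp(logR) rows sum to 1
```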
This is a widely used algorithm; the details can be found in many textbooks and online tutorials, for example the Wikipedia page:
http://en.wikipedia.org/wiki/Expectation-maximization_algorithm
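For reference, one EM iteration for a GMM can be sketched as follows. This is an illustrative outline, not the internals of mixGaussEm: X is d-by-n data, R is the n-by-k responsibility matrix produced by the E-step (e.g. via the log-sum-exp computation), and the M-step re-estimates the weights, means, and covariances.

```matlab
% M-step: re-estimate parameters from responsibilities R (n-by-k)
nk = sum(R, 1);                      % effective counts per component, 1-by-k
w  = nk / size(X, 2);                % mixing weights
mu = (X * R) ./ nk;                  % d-by-k means (implicit expansion)
Sigma = zeros(d, d, k);
for j = 1:k
    Xc = X - mu(:, j);               % center data on component j's mean
    Sigma(:, :, j) = (Xc .* R(:, j)') * Xc' / nk(j);  % weighted covariance
end
```

In practice a small regularizer (e.g. adding eps*eye(d) to each covariance) keeps the Cholesky factorization well conditioned.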
The function is robust and efficient, and the code is organized to be easy to read. Try the following code for a demo:
close all; clear;
d = 2;
k = 3;
n = 500;
[X,label] = mixGaussRnd(d,k,n);
plotClass(X,label);
m = floor(n/2);
X1 = X(:,1:m);
X2 = X(:,(m+1):end);
% train
[z1,model,llh] = mixGaussEm(X1,k);
figure;
plot(llh);
figure;
plotClass(X1,z1);
% predict
z2 = mixGaussPred(X2,model);
figure;
plotClass(X2,z2);