Using MATLAB's fitgmdist and the GMM Clustering Algorithm

Author: 凯鲁嘎吉 - 博客园 http://www.cnblogs.com/kailugaji/

For the basic theory of Gaussian mixture models, see my earlier post "Clustering — GMM"; the official MATLAB documentation also has an entry for fitgmdist. I previously wrote a from-scratch MATLAB implementation of GMM clustering ("MATLAB program for the GMM algorithm"). This article instead uses MATLAB's built-in function to do the clustering.

1. Introduction to the fitgmdist function

Calling syntax: gmm = fitgmdist(X,k,Name,Value), where X is the n-by-d data matrix and k is the number of mixture components.

Inputs (Name-Value pairs)

'RegularizationValue', 0. (Values: 0, 0.01, 0.1, ...; a nonnegative regularization coefficient added to the diagonal of each covariance matrix to keep it from becoming singular.)

'CovarianceType', 'full'. (Values: 'full', full (non-diagonal) covariance matrices; 'diagonal', diagonal covariance matrices.)

'Start', 'plus'. (Values: 'randSample', random initialization from the data; 'plus', k-means++ initialization; or a struct S for custom initialization, where S = struct('mu',init_Mu,'Sigma',init_Sigma,'ComponentProportion',init_Components);)

'Options', statset('Display', 'final', 'MaxIter', MaxIter, 'TolFun', TolFun). ('Display' takes three values: 'final' shows only the final result, 'iter' shows each iteration, 'off' suppresses optimization output; 'MaxIter': maximum number of iterations, default 100; 'TolFun': termination tolerance on the objective function, default 1e-6.)
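As a sketch of how these pairs fit together (the concrete numbers and the initial values init_Mu, init_Sigma, and init_Components below are placeholders for illustration, not values from the original post):

```matlab
% Hypothetical initial guesses for k = 2 components in d = 2 dimensions
init_Mu = [1 2; -1 -2];                 % k-by-d matrix of initial means
init_Sigma = repmat(eye(2), 1, 1, 2);   % d-by-d-by-k initial covariances
init_Components = [0.4 0.6];            % 1-by-k initial mixing proportions
S = struct('mu', init_Mu, 'Sigma', init_Sigma, ...
           'ComponentProportion', init_Components);

options = statset('Display', 'final', 'MaxIter', 500, 'TolFun', 1e-8);
gmm = fitgmdist(X, 2, ...
    'RegularizationValue', 0.01, ...
    'CovarianceType', 'full', ...
    'Start', S, ...                     % or 'randSample' / 'plus'
    'Options', options);
```

Passing the struct S instead of 'randSample' or 'plus' lets you reproduce runs exactly or warm-start EM from a previous solution.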

Outputs

gmm.mu: the fitted cluster centers (component means)

gmm.Sigma: the fitted covariance matrices

gmm.ComponentProportion: the fitted mixing proportions

gmm.NegativeLogLikelihood: the negative log-likelihood of the fitted model

gmm.NumIterations: the actual number of iterations performed

gmm.BIC: the Bayesian information criterion, useful for model selection

For more properties, enter properties(gmm) at the command line.
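A short sketch of reading these outputs after fitting (assuming X is an n-by-2 data matrix; the cluster and posterior calls are standard gmdistribution methods):

```matlab
gmm = fitgmdist(X, 2);          % fit a 2-component GMM with default settings
disp(gmm.mu)                    % k-by-d matrix of component means
disp(gmm.ComponentProportion)   % 1-by-k mixing proportions
fprintf('NegLogLik = %.4f after %d iterations, BIC = %.2f\n', ...
    gmm.NegativeLogLikelihood, gmm.NumIterations, gmm.BIC);

idx = cluster(gmm, X);          % hard cluster assignments (n-by-1)
P = posterior(gmm, X);          % soft assignments: n-by-k posterior probabilities
```

Unlike k-means, the posterior probabilities P give each point a degree of membership in every component, which is the main practical benefit of GMM clustering.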

2. A Gaussian mixture model clustering example

generate.m

function data = generate()
% Generate 1000 two-dimensional samples from two Gaussians,
% appending the true class label (1 or 2) as a third column.
mu1 = [1 2];
Sigma1 = [2 0; 0 0.5];
mu2 = [-1 -2];
Sigma2 = [1 0; 0 1];
data = [mvnrnd(mu1,Sigma1,400), ones(400,1); mvnrnd(mu2,Sigma2,600), 2*ones(600,1)];
X = [data(:, 1), data(:, 2)];
figure(1)
plot(X(:,1), X(:,2), 'bo')
title('Scatter Plot')
xlim([min(X(:)) max(X(:))]) % make both axes use the same scale
ylim([min(X(:)) max(X(:))])
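Building on generate.m, a minimal end-to-end sketch (this fitting script is my addition, not part of the original post) fits a 2-component GMM to the generated points and compares the recovered means with the true mu1 and mu2:

```matlab
data = generate();                       % 1000 samples, labels in column 3
X = data(:, 1:2);
gmm = fitgmdist(X, 2, 'RegularizationValue', 1e-5, 'Start', 'plus');
idx = cluster(gmm, X);                   % hard cluster assignments
disp(gmm.mu)                             % should be close to [1 2] and [-1 -2]

figure(2)
gscatter(X(:,1), X(:,2), idx)            % color points by fitted cluster
hold on
plot(gmm.mu(:,1), gmm.mu(:,2), 'kx', 'MarkerSize', 12, 'LineWidth', 2)
title('GMM Clustering Result')
hold off
```

Note that the component order returned by fitgmdist is arbitrary, so cluster 1 may correspond to either of the two true classes.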

Sample data (columns: x, y, label):

-0.752713846442762	2.48140797998545	1
-0.798625575507672	2.14835001132099	1
2.82002920206994	1.97084621196340	1
0.913856576539988	2.24942313999122	1
1.57243525115195	2.68322568351427	1
0.241170005783610	1.89791938743627	1
2.10634746115858	2.20631449410867	1
1.61173455443266	2.69163655587553	1
1.39436249281445	1.28104472307183	1
1.65267727628557	1.85771163664832	1
0.0927741368750946	2.00698799954306	1
2.79887910552062	1.70868183551872	1
0.652907219449091	1.88702134773695	1
3.43327629572431	2.17571612839302	1
1.96527202098605	2.34069549818768	1
0.867813825335363	1.68534433959204	1
2.08252259376894	1.02114736190308	1
0.613863159235077	2.17081564512242	1
0.399552396654452	2.11342763957560	1
2.91186118166440	1.82639334702902	1
0.377524852838774	3.08573945644000	1
1.98881806018168	1.98460069084178	1
2.68223790603780	1.96134300697451	1
-0.622058850926772	3.30007822155342	1
0.275231442237888	1.59279139683080	1
3.34643957013717	1.68055674830698	1
1.03601109129843	2.70541249651070	1
0.752332013138056	2.49601620599903	1
0.508824726005269	2.57436514443626	1
1.73027562632102	2.19502286037081	1
-0.713230300478002	3.73262960861479	1
0.00814731258125789	1.52233440802986	1
4.86084534117603	2.64339300870922	1
2.82851799598685	1.79094004967764	1
0.725784406602703	1.32609248276320	1
0.147485590657029	1.49033710171378	1
0.601129894478657	3.58795933263211	1
1.32767044246750	1.73683867424831	1
1.27723837775742	2.62061783710240	1
0.240726647338080	1.43127112737987	1
0.387110343064517	2.39576239976627	1
-0.735171242460893	2.21884994425795	1
0.821208540380224	2.28230651108960	1
2.84910757299470	2.40958045875721	1
-0.00305938218238255	1.36758622169689	1
0.664256129846708	1.22550082896155	1
-0.178114240634728	2.30408704591301	1
2.00786623874121	2.03129722636611	1
1.20272622506596	2.22475373629028	1
2.50250207292945	2.98770457384530	1
-0.874927698714206	1.71654280942273	1
2.39750117925196	1.84040401876970	1
0.369966892118764	2.74170598529279	1
2.15279953669453	1.58546917422906	1
0.591444584760838	2.29854661364821	1
2.57828408797099	2.37460697014727	1
0.816618335677021	1.11574196519408	1
1.86043373148875	2.20777177264008	1
0.100606202686330	3.21045350847405	1
2.53445397100630	0.854302259627375	1
1.39168420124415	1.49609940831950	1
1.93929865340817	1.15181790512326	1
-0.727792104505467	1.79285231965211	1
0.646252982052485	1.03986431564848	1
1.19673169055003	1.93448155511090	1
1.12984878643876	2.22883744483371	1
4.02666426647567	1.86649614448613	1
1.27117265547696	2.03774514826000	1
1.00533765244964	1.17061188123361	1
-0.414741919400961	2.28828102447677	1
0.495317556413179	1.83288455064957	1
2.76823223857300	2.97934893222715	1
1.08058855707478	2.26487370266768	1
1.38546325331279	1.40278788558597	1
1.52167644900484	2.69183023331115	1
1.67505702182493	2.85222287490608	1
4.10199034200796	2.18231282334728	1
2.40054022033848	2.07878116194682	1
1.49836399169312	1.92754270979332	1
1.52768214037169	0.604144369658773	1
1.48106369736206	1.89716183115311	1
1.88582351601666	2.33794971444330	1
2.26705524676369	1.71381451846972	1
1.60866449353269	1.49135394247519	1
-0.115055715038331	1.12849347288791	1
3.57622395160265	1.10664344270162	1
2.26630255633806	1.22083515972890	1
1.33163191466841	1.44619768873473	1
-1.45842213709207	2.22949005602252	1
1.70411565663116	3.47971193248190	1
1.48213175474953	2.96603725956335	1
1.72101076041395	3.01672973472638	1
-1.35146166439553	2.85558221164039	1
0.380649241141038	2.56729542762187	1
1.88667156332277	2.85630884112937	1
1.97758943297860	1.70558017430484	1
3.24821546740318	1.82804963183289	1
0.843823793198429	2.64758378735128	1
0.0405183480549609	1.71670699774521	1
0.976813370248264	2.49595625636074	1
-0.962694637418230	2.58074576675867	1
1.82963248497625	1.90436660429995	1
-0.0311001170696965	2.55990131230256	1
1.78671558216327	1.30722487774421	1
-0.0804210798437657	1.50783793009072	1
0.592128283430208	0.755659320204709	1
1.95365332837982	2.16200348956491	1
1.36682968081700	1.73744055959892	1
0.979475411390431	2.42632494823554	1
1.39902046293800	1.73826197272758	1
-0.765939069959433	1.80573420327555	1
-0.635816070635523	2.54485465981086	1
0.430871671985252	2.95078063841104	1
1.07190555054187	2.60824897906429	1
0.190820867383636	1.56628799263572	1
-0.877375755503489	2.32098851030493	1
2.18596583418406	1.73505362502330	1
-0.858977422721553	1.67583237334513	1
-0.209010836036135	1.61369208932597	1
0.320019483215927	1.82234886961005	1
2.22606598804851	2.92553777460368	1
-1.34847806650935	1.73833614701202	1
-1.10055386975634	1.73836801433344	1
-0.742975839183661	2.56906224418957	1
0.417690253375689	2.64425612559673	1
2.21840601718359	1.86213274353155	1
-1.80189093986379	1.88663190741782	1
2.18214937211589	2.81206394431873	1
0.0239657716321194	2.37810261200744	1
1.48366333882258	1.88062228076692	1
-1.50485783728876	1.75467615653968	1
-0.156597929943039	2.92699154393160	1
1.00641966523359	1.82300801832677	1
0.229462398563604	0.218216163076399	1
3.08788315084484	2.44977421952063	1
-2.75317240661940	1.47607778387816	1
1.57689796637766	1.14148728120332	1
0.289637818858192	1.37410983329226	1
1.80546824398122	1.11123227831127	1
2.05342064550405	2.59331992761210	1
-0.908793900499392	1.78719275353105	1
0.366987870101896	0.201547399427021	1
0.449481220306850	3.37425663694842	1
1.77118120808653	1.75664778049838	1
-0.418491981240371	1.67286506452260	1
3.52677560506185	2.36946900535883	1
...