This courseware is part of AI lecturer Ye Zi's (叶梓) Artificial Intelligence Fundamentals course and continues from the previous post; for more courses and the lecturer's materials, see the personal homepage.
(Artificial Intelligence Fundamentals courseware, pages 34-36)
Overview of EM iteration
The concrete EM procedure:
- First, compute the distribution of the hidden variables from the currently assumed parameters (the E-step);
- then, use the expectations of the hidden variables to update the parameter estimates (the M-step);
- iterate in this way until convergence (the update formulas for the model used below are sketched right after this list).
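For the specific model used in the code that follows, a two-component Gaussian mixture with a shared, known standard deviation σ and equal mixing weights, the two steps take the following form. This is a sketch that mirrors the code's variables: x_i corresponds to X[0,i], μ_j to Mu[j], and E[z_ij] to Expectations[i,j].

E-step (responsibility of component j for sample i):

$$E[z_{ij}] = \frac{\exp\!\left(-\dfrac{(x_i-\mu_j)^2}{2\sigma^2}\right)}{\sum_{l=1}^{k}\exp\!\left(-\dfrac{(x_i-\mu_l)^2}{2\sigma^2}\right)}$$

M-step (re-estimate each mean as a responsibility-weighted average of the samples):

$$\mu_j = \frac{\sum_{i=1}^{N} E[z_{ij}]\,x_i}{\sum_{i=1}^{N} E[z_{ij}]}$$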
EM implemented in Python
# coding: utf-8
import math
import copy
import numpy as np
import matplotlib.pyplot as plt

isdebug = True

def ini_data(Sigma, Mu1, Mu2, k, N):
    # Draw N samples from a mixture of two Gaussians with means Mu1, Mu2 and common Sigma.
    global X
    global Mu
    global Expectations
    X = np.zeros((1, N))
    Mu = np.random.random(2)           # random initial guesses for the two means
    Expectations = np.zeros((N, k))
    for i in range(0, N):
        if np.random.random(1) > 0.5:
            X[0, i] = np.random.normal() * Sigma + Mu1
        else:
            X[0, i] = np.random.normal() * Sigma + Mu2
    if isdebug:
        print("***********")
        print("Initial observed data X:")
        print(X)

# EM, step 1 (E-step): compute the responsibilities E[z_ij]
def e_step(Sigma, k, N):
    global Expectations
    global Mu
    global X
    for i in range(0, N):
        Denom = 0
        for j in range(0, k):
            Denom += math.exp((-1 / (2 * float(Sigma ** 2))) * float(X[0, i] - Mu[j]) ** 2)
        for j in range(0, k):
            Numer = math.exp((-1 / (2 * float(Sigma ** 2))) * float(X[0, i] - Mu[j]) ** 2)
            Expectations[i, j] = Numer / Denom
    if isdebug:
        print("***********")
        print("Hidden-variable expectations E(Z):")
        print(Expectations)

# EM, step 2 (M-step): re-estimate each mean as a responsibility-weighted average
def m_step(k, N):
    global Expectations
    global X
    for j in range(0, k):
        Numer = 0
        Denom = 0
        for i in range(0, N):
            Numer += Expectations[i, j] * X[0, i]
            Denom += Expectations[i, j]
        Mu[j] = Numer / Denom

def run(Sigma, Mu1, Mu2, k, N, iter_num, Epsilon):
    ini_data(Sigma, Mu1, Mu2, k, N)
    print("Initial <u1, u2>:", Mu)
    for i in range(iter_num):
        Old_Mu = copy.deepcopy(Mu)
        e_step(Sigma, k, N)
        m_step(k, N)
        print(i, Mu)
        if sum(abs(Mu - Old_Mu)) < Epsilon:   # stop once the means barely change
            break

if __name__ == '__main__':
    run(6, 40, 20, 2, 1000, 1000, 0.0001)
    plt.hist(X[0, :], 50)
    plt.show()
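Not part of the original courseware, but as a complementary sketch: the same E-step and M-step can be written with vectorized NumPy operations instead of the explicit double loops. The function names e_step_vec and m_step_vec below are illustrative assumptions, not names from the slides.

import numpy as np

def e_step_vec(X, Mu, Sigma):
    # X: samples of shape (N,); Mu: current means of shape (k,).
    # Returns the (N, k) matrix of responsibilities E[z_ij].
    diff = X[:, None] - Mu[None, :]
    numer = np.exp(-diff ** 2 / (2.0 * Sigma ** 2))
    return numer / numer.sum(axis=1, keepdims=True)

def m_step_vec(X, E):
    # Responsibility-weighted average of the samples for each component.
    return (E * X[:, None]).sum(axis=0) / E.sum(axis=0)

Alternating these two functions on X.ravel() from the script above should reproduce the same sequence of Mu updates as the loop-based version, only faster.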
(To be continued in the next post.) Since the editor's expertise is limited, the lecturer's slides have not been fully edited into an article; readers with some background may download and organize the courseware series themselves.