Conditional Mutual Information (CMI)


Definition

The conditional mutual information $I(X, Y \mid Z)$ is defined as:

$$I(X, Y \mid Z) = \sum_{z \in Z} p_Z(z) \sum_{y \in Y} \sum_{x \in X} p_{X,Y|Z}(x, y \mid z) \log \frac{p_{X,Y|Z}(x, y \mid z)}{p_{X|Z}(x \mid z)\, p_{Y|Z}(y \mid z)}$$

Equivalently, in terms of the joint distribution:

$$I(X, Y \mid Z) = \sum_{z \in Z} \sum_{y \in Y} \sum_{x \in X} p_{X,Y,Z}(x, y, z) \log \frac{p_Z(z)\, p_{X,Y|Z}(x, y \mid z)}{p_{X,Z}(x, z)\, p_{Y,Z}(y, z)}$$
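As a sanity check, the definition can be evaluated directly on a small discrete distribution. The 2×2×2 joint table below is a hypothetical example chosen for illustration (it sums to 1, with X and Y dependent given Z):

```python
import numpy as np

# Hypothetical joint distribution p(x, y, z) over three binary variables;
# axes are (x, y, z). Values are chosen so X and Y are dependent given Z.
p = np.array([[[0.20, 0.05], [0.05, 0.20]],
              [[0.05, 0.20], [0.20, 0.05]]])

p_z = p.sum(axis=(0, 1))   # marginal p(z)
p_xz = p.sum(axis=1)       # marginal p(x, z)
p_yz = p.sum(axis=0)       # marginal p(y, z)

# Second form of the definition:
# I(X, Y | Z) = sum_{x,y,z} p(x,y,z) * log[ p(z) p(x,y,z) / (p(x,z) p(y,z)) ]
# (here p(z) * p(x,y|z) has been rewritten as p(x,y,z))
cmi = 0.0
for x in range(2):
    for y in range(2):
        for z in range(2):
            if p[x, y, z] > 0:
                cmi += p[x, y, z] * np.log2(
                    p_z[z] * p[x, y, z] / (p_xz[x, z] * p_yz[y, z]))

print("I(X, Y | Z) =", cmi)  # ≈ 0.278 bits
```

Since each conditional slice $p(x, y \mid z)$ here is the same 2×2 table, the result equals the ordinary mutual information of that slice, about 0.278 bits.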

See also

集智百科: Conditional Mutual Information (条件互信息)

A more detailed Python implementation of conditional mutual information:

```python
import numpy as np

def entropy(p):
    # Shannon entropy (bits) of a probability distribution
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=10):
    # Estimate I(X; Y) from samples via a 2-D histogram
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = counts / counts.sum()   # joint distribution p(x, y)
    p_x = p_xy.sum(axis=1)         # marginal p(x)
    p_y = p_xy.sum(axis=0)         # marginal p(y)
    return entropy(p_x) + entropy(p_y) - entropy(p_xy)

def conditional_mutual_information(x, y, z, bins=10):
    # Estimate I(X; Y | Z) from samples via a 3-D histogram, using the
    # entropy identity:
    #   I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(Z) - H(X, Y, Z)
    counts, _ = np.histogramdd(np.column_stack((x, y, z)), bins=bins)
    p_xyz = counts / counts.sum()     # joint distribution p(x, y, z)
    p_xz = p_xyz.sum(axis=1)          # marginalize out y
    p_yz = p_xyz.sum(axis=0)          # marginalize out x
    p_z = p_xyz.sum(axis=(0, 1))      # marginal p(z)
    return entropy(p_xz) + entropy(p_yz) - entropy(p_z) - entropy(p_xyz)

# Example usage
x = np.random.normal(0, 1, size=1000)
y = x ** 2 + np.random.normal(0, 0.5, size=1000)
z = np.random.binomial(1, 0.5, size=1000)

cmi = conditional_mutual_information(x, y, z)
print("CMI between x and y conditioned on z:", cmi)
```

This code estimates conditional mutual information: `conditional_mutual_information` computes I(X; Y | Z) from a joint histogram via the entropy identity, `mutual_information` computes I(X; Y), and `entropy` computes the Shannon entropy of a distribution. Running it on the example data prints the estimated CMI.
