Transfer entropy (转移熵)

Transfer entropy (转移熵, also translated as 传递熵) is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes. If the amount of information is measured using Shannon entropy, the transfer entropy from a process X to a process Y can be written as:

[math]\displaystyle{ T_{X\rightarrow Y} = H\left( Y_t \mid Y_{t-1:t-L}\right) - H\left( Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right) }[/math],

where H(X) is the Shannon entropy of X. The above definition of transfer entropy has been extended to other types of entropy measures, such as Rényi entropy.
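
As a concrete reading of this formula, the following is a minimal numpy sketch of a plug-in (count-based) estimate for discrete or pre-binned series with history length L = 1. The function name transfer_entropy and the toy example are illustrative only, not taken from any particular toolbox.

```python
import numpy as np

def transfer_entropy(x, y, base=2):
    """Plug-in estimate of T_{X->Y} with history length L = 1.

    x, y : 1-D integer arrays (discrete or pre-binned series) of equal length.
    Returns H(Y_t | Y_{t-1}) - H(Y_t | Y_{t-1}, X_{t-1}) in the given log base.
    """
    x = np.asarray(x)
    y = np.asarray(y)
    # Align the target Y_t with the one-step histories Y_{t-1} and X_{t-1}.
    y_t, y_p, x_p = y[1:], y[:-1], x[:-1]

    def entropy(*cols):
        # Joint Shannon entropy of the column variables, from empirical counts.
        _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log(p)).sum() / np.log(base)

    # H(Y_t | Y_{t-1}) = H(Y_t, Y_{t-1}) - H(Y_{t-1})
    h_y_given_yp = entropy(y_t, y_p) - entropy(y_p)
    # H(Y_t | Y_{t-1}, X_{t-1}) = H(Y_t, Y_{t-1}, X_{t-1}) - H(Y_{t-1}, X_{t-1})
    h_y_given_yp_xp = entropy(y_t, y_p, x_p) - entropy(y_p, x_p)
    return h_y_given_yp - h_y_given_yp_xp

# Toy example: Y copies X with a one-step delay, so information flows X -> Y only.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)
y = np.roll(x, 1)
print(transfer_entropy(x, y))  # close to 1 bit
print(transfer_entropy(y, x))  # close to 0 bits
```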

Transfer entropy is the conditional mutual information, with the history of the influenced variable Y_{t-1:t-L} in the condition:

[math]\displaystyle{ T_{X\rightarrow Y} = I(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}). }[/math]
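
This is simply the standard decomposition of conditional mutual information into a difference of conditional entropies, which recovers the definition above:

[math]\displaystyle{ I(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}) = H\left( Y_t \mid Y_{t-1:t-L}\right) - H\left( Y_t \mid Y_{t-1:t-L}, X_{t-1:t-L}\right) = T_{X\rightarrow Y}. }[/math]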

For vector auto-regressive processes, transfer entropy reduces to Granger causality.
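
In the jointly Gaussian case this equivalence has a simple computational form: the transfer entropy is half the log-ratio of the residual variances of two nested linear regressions, which is exactly the quantity compared by Granger causality (Barnett et al. 2009). Below is a minimal numpy sketch of this Gaussian, order-1 estimator; the function name gaussian_te and the simulated VAR example are illustrative, not from a specific toolbox.

```python
import numpy as np

def gaussian_te(x, y):
    """T_{X->Y} for jointly Gaussian series, history length 1 (in nats).

    Computed as 0.5 * ln(var_reduced / var_full), i.e. half the Granger
    log-likelihood-ratio statistic.
    """
    y_t, y_p, x_p = y[1:], y[:-1], x[:-1]

    def resid_var(target, *regressors):
        A = np.column_stack([np.ones_like(target)] + list(regressors))
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        return np.var(target - A @ coef)

    var_reduced = resid_var(y_t, y_p)      # regress Y_t on Y_{t-1} only
    var_full = resid_var(y_t, y_p, x_p)    # add X_{t-1} as a regressor
    return 0.5 * np.log(var_reduced / var_full)

# Example: a linear VAR(1)-style system in which X drives Y.
rng = np.random.default_rng(1)
n = 20000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
print(gaussian_te(x, y))  # large and positive
print(gaussian_te(y, x))  # near zero
```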

The probabilities in the entropy formula can be estimated using different approaches, such as binning, nearest neighbors, or, to reduce complexity, non-uniform embedding; a binning sketch is shown below.
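
As an illustration of the simplest of these, a continuous signal can be quantile-binned into a small number of discrete states and then passed to a plug-in estimator such as the transfer_entropy sketch above. The number of bins is a free parameter and directly affects the bias of the estimate; quantile_bin is a hypothetical helper, not a library function.

```python
import numpy as np

def quantile_bin(signal, n_bins=4):
    """Map a continuous signal to integer bin labels with roughly equal occupancy."""
    edges = np.quantile(signal, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(signal, edges)

# x_binned = quantile_bin(x); y_binned = quantile_bin(y)
# te = transfer_entropy(x_binned, y_binned)  # plug-in estimate on the binned data
```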

While it was originally defined for bivariate analysis, transfer entropy has been extended to multivariate forms, either conditioning on other potential source variables or considering transfer from a collection of sources, although these forms require more samples; the conditional form is given below.
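
For example, in the conditional (multivariate) form, the history of another candidate source Z is added to the conditioning set, so that information attributable to Z is not credited to X:

[math]\displaystyle{ T_{X\rightarrow Y \mid Z} = I(Y_t ; X_{t-1:t-L} \mid Y_{t-1:t-L}, Z_{t-1:t-L}). }[/math]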

Transfer entropy has been used to estimate the functional connectivity of neurons and social influence in social networks.

Transfer entropy is a finite version of the directed information, which was defined in 1990 by James Massey as

[math]\displaystyle{ I(X^n\rightarrow Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}) }[/math],

where X^n denotes the vector X_1, X_2, ..., X_n and Y^n denotes Y_1, Y_2, ..., Y_n. Directed information plays an important role in characterizing the fundamental limits (channel capacity) of communication channels with or without feedback, and in gambling with causal side information.

See also

Causality

Causality (physics)

Structural equation modeling

Counterfactual model

References

Schreiber, Thomas (1 July 2000). "Measuring information transfer". Physical Review Letters. 85 (2): 461–464. Bibcode:2000PhRvL..85..461S. doi:10.1103/PhysRevLett.85.461. PMID 10991308.

Hlaváčková-Schindler, Katerina; Palus, M; Vejmelka, M; Bhattacharya, J (1 March 2007). "Causality detection based on information-theoretic approaches in time series analysis". Physics Reports. 441 (1): 1–46. Bibcode:2007PhR...441....1H. doi:10.1016/j.physrep.2006.12.004.

Jizba, Petr; Kleinert, Hagen; Shefaat, Mohammad (2012-05-15). "Rényi's information transfer between financial time series". Physica A: Statistical Mechanics and Its Applications. 391 (10): 2971–2989. Bibcode:2012PhyA..391.2971J. doi:10.1016/j.physa.2011.12.064. ISSN 0378-4371.

Wyner, A. D. (1978). "A definition of conditional mutual information for arbitrary ensembles". Information and Control. 38 (1): 51–59.

Dobrushin, R. L. (1959). "General formulation of Shannon's main theorem in information theory". Uspekhi Mat. Nauk. 14: 3–104.

Barnett, Lionel (1 December 2009). "Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables". Physical Review Letters. 103 (23): 238701. Bibcode:2009PhRvL.103w8701B. doi:10.1103/PhysRevLett.103.238701. PMID 20366183.

Ver Steeg, Greg; Galstyan, Aram (2012). "Information transfer in social media". Proceedings of the 21st International Conference on World Wide Web (WWW '12). ACM. pp. 509–518. arXiv:1110.2724. Bibcode:2011arXiv1110.2724V.

Lungarella, M.; Ishiguro, K.; Kuniyoshi, Y.; Otsu, N. (1 March 2007). "Methods for quantifying the causal structure of bivariate time series". International Journal of Bifurcation and Chaos. 17 (3): 903–921. Bibcode:2007IJBC...17..903L. doi:10.1142/S0218127407017628.

Lizier, Joseph; Prokopenko, Mikhail; Zomaya, Albert (2008). "Local information transfer as a spatiotemporal filter for complex systems". Physical Review E. 77 (2): 026110. Bibcode:2008PhRvE..77b6110L. doi:10.1103/PhysRevE.77.026110. PMID 18352093.

Lizier, Joseph; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail (2011). "Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity". Journal of Computational Neuroscience. 30 (1): 85–107. doi:10.1007/s10827-010-0271-2. PMID 20799057.

Massey, James (1990). "Causality, Feedback and Directed Information". Proceedings of the International Symposium on Information Theory and Its Applications (ISITA-90).

Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. (February 2009). "Finite State Channels With Time-Invariant Deterministic Feedback". IEEE Transactions on Information Theory. 55 (2): 644–662. doi:10.1109/TIT.2008.2009849.

Kramer, G. (January 2003). "Capacity results for the discrete memoryless network". IEEE Transactions on Information Theory. 49 (1): 4–21. doi:10.1109/TIT.2002.806135.

Permuter, Haim H.; Kim, Young-Han; Weissman, Tsachy (June 2011). "Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing". IEEE Transactions on Information Theory. 57 (6): 3248–3259. doi:10.1109/TIT.2011.2136270.

External links

A toolbox, developed in C++ and MATLAB, for computation of transfer entropy between spike trains.

A toolbox, developed in Java and usable in MATLAB, GNU Octave and Python, for computation of transfer entropy and related information-theoretic measures in both discrete and continuous-valued data.

A toolbox, developed in MATLAB, for computation of transfer entropy with different estimators.

Editor recommendations

This course analyzes common coding methods and explains the characteristics and properties of codes, along with the related proof techniques.

This course is a companion video in the "火炬上的深度学习" (Deep Learning with PyTorch) series, explaining in detail the mathematical principles of convolutional neural networks.

This entry was compiled by Henry, reviewed by Vicky, and edited by 不是海绵宝宝; comments are welcome on the talk page.

The content of this entry is derived from Wikipedia and other public sources, under the CC 3.0 license.
