The Origin of Transfer Entropy (TE): Methods Section of the 2000 Top-Journal Paper "Measuring Information Transfer"

Basic information about the paper

The paper "Measuring Information Transfer" was published in 2000 in PHYSICAL REVIEW LETTERS, a top journal in physics.
Citation count: 2147 (Web of Science), as of 2021-12-07 14:32:30.


Excerpts from the Methods Section

Kullback entropy

In order to construct an optimal encoding that uses just as many bits as given by the entropy, it is necessary to know the probability distribution $p(i)$. The excess number of bits that will be coded if a different distribution $q(i)$ is used is given by the Kullback entropy [4] $K_I = \sum_i p(i)\log\left(p(i)/q(i)\right)$. We will later also need the Kullback entropy for conditional probabilities $p(i\mid j)$. For a single state $j$ we have $K_j = \sum_i p(i\mid j)\log\left(p(i\mid j)/q(i\mid j)\right)$. Summation over $j$ with respect to $p(j)$ yields

$$
K_{I\mid J}=\sum_{i,j} p(i, j)\,\log\frac{p(i\mid j)}{q(i\mid j)} \tag{1}
$$
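As a minimal numeric sketch of Eq. (1) (not code from the paper), the conditional Kullback entropy can be computed directly from a joint distribution $p(i, j)$ and a model distribution $q(i\mid j)$; the `conditional_kullback_entropy` function and the toy arrays below are illustrative assumptions, not anything defined in the original work:

```python
import numpy as np

# Sketch of Eq. (1): K_{I|J} = sum_{i,j} p(i,j) * log( p(i|j) / q(i|j) ).
# p_joint and q_cond are made-up toy distributions chosen for illustration.

def conditional_kullback_entropy(p_joint, q_cond, base=2.0):
    """p_joint[i, j] = p(i, j); q_cond[i, j] = q(i | j). Returns K_{I|J} in bits."""
    p_j = p_joint.sum(axis=0)          # marginal p(j)
    p_cond = p_joint / p_j             # p(i | j) = p(i, j) / p(j)
    mask = p_joint > 0                 # terms with p(i, j) = 0 contribute nothing
    return np.sum(p_joint[mask] * np.log(p_cond[mask] / q_cond[mask])) / np.log(base)

# Two states each for I and J; rows index i, columns index j.
p_joint = np.array([[0.3, 0.2],
                    [0.1, 0.4]])
q_cond = np.array([[0.5, 0.5],
                   [0.5, 0.5]])        # a uniform "model" q(i | j)
print(conditional_kullback_entropy(p_joint, q_cond))   # > 0 whenever p(i|j) differs from q(i|j)
```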


One can incorporate dynamical structure by studying transition probabilities rather than static probabilities. Consider a system that may be approximated by a stationary Markov process of order $k$, that is, the conditional probability to find $I$ in state $i_{n+1}$ at time $n+1$ is independent of the state $i_{n-k}$: $p(i_{n+1}\mid i_n,\dots,i_{n-k+1})=p(i_{n+1}\mid i_n,\dots,i_{n-k+1},i_{n-k})$. Henceforth we will use the shorthand notation $i_n^{(k)}=(i_n,\dots,i_{n-k+1})$ for words of length $k$.

The average number of bits needed to encode one additional state of the system if all previous states are known is given by the entropy rate
$$
h_I=-\sum p(i_{n+1}, i_n^{(k)})\,\log p(i_{n+1}\mid i_n^{(k)}) \tag{3}
$$
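For a discrete (symbolic) sequence, Eq. (3) can be estimated by counting words of length $k+1$ and $k$. The sketch below is an illustrative plug-in estimator under that assumption, not the paper's implementation; the i.i.d. binary test sequence is a made-up example for which the estimated entropy rate should be close to 1 bit:

```python
import numpy as np
from collections import Counter

def entropy_rate(symbols, k, base=2.0):
    """Plug-in estimate of h_I = -sum p(i_{n+1}, i_n^{(k)}) log p(i_{n+1} | i_n^{(k)}),
    in bits per symbol, from a sequence of discrete symbols."""
    n = len(symbols)
    joint = Counter(tuple(symbols[t - k:t + 1]) for t in range(k, n))   # (i_n^{(k)}, i_{n+1})
    past = Counter(tuple(symbols[t - k:t]) for t in range(k, n))        # i_n^{(k)}
    total = sum(joint.values())
    h = 0.0
    for word, c in joint.items():
        p_joint = c / total                      # p(i_{n+1}, i_n^{(k)})
        p_cond = c / past[word[:-1]]             # p(i_{n+1} | i_n^{(k)})
        h -= p_joint * np.log(p_cond)
    return h / np.log(base)

rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=10000).tolist()      # i.i.d. fair coin flips
print(entropy_rate(x, k=1))                      # expected to be close to 1 bit
```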


transfer entropy

For the study of the dynamics of shared information between processes it is desirable to generalize the entropy rate, rather than Shannon entropy, to more than one system, since the dynamics of the processes is contained in the transition probabilities. The most straightforward way to construct a mutual information rate by generalizing $h_I$ to two processes $(I, J)$ is again to measure the deviation from independence. However, the corresponding Kullback entropy is still symmetric under the exchange of $I$ and $J$. It is therefore preferable to measure the deviation from the generalized Markov property, $p(i_{n+1}\mid i_n^{(k)}) = p(i_{n+1}\mid i_n^{(k)}, j_n^{(l)})$.

In the absence of information flow from $J$ to $I$, the state of $J$ has no influence on the transition probabilities of system $I$. The incorrectness of this assumption can again be quantified by a Kullback entropy of the form (1), by which we define the transfer entropy:
$$
T_{J \rightarrow I} = \sum p(i_{n+1}, i_n^{(k)}, j_n^{(l)})\,\log\frac{p(i_{n+1}\mid i_n^{(k)}, j_n^{(l)})}{p(i_{n+1}\mid i_n^{(k)})} \tag{4}
$$
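For discrete-valued series, Eq. (4) can likewise be estimated with a plug-in (counting) estimator. The following is a minimal sketch under that assumption, not the code used in the paper (continuous data would first have to be symbolized, e.g., by binning, or handled with density estimators); the function name `transfer_entropy` and its interface are illustrative choices:

```python
import numpy as np
from collections import Counter

def transfer_entropy(i_series, j_series, k=1, l=1, base=2.0):
    """Plug-in estimate of T_{J->I}, Eq. (4), in bits, for two equal-length
    sequences of discrete symbols; k and l are the history lengths."""
    n = len(i_series)
    c_full, c_ij, c_i1i, c_i = Counter(), Counter(), Counter(), Counter()
    for t in range(max(k, l), n - 1):
        ik = tuple(i_series[t - k + 1:t + 1])        # i_n^{(k)}
        jl = tuple(j_series[t - l + 1:t + 1])        # j_n^{(l)}
        inext = i_series[t + 1]                      # i_{n+1}
        c_full[(inext, ik, jl)] += 1
        c_ij[(ik, jl)] += 1
        c_i1i[(inext, ik)] += 1
        c_i[ik] += 1
    total = sum(c_full.values())
    te = 0.0
    for (inext, ik, jl), c in c_full.items():
        p_full = c / total                           # p(i_{n+1}, i_n^{(k)}, j_n^{(l)})
        p_cond_full = c / c_ij[(ik, jl)]             # p(i_{n+1} | i_n^{(k)}, j_n^{(l)})
        p_cond_i = c_i1i[(inext, ik)] / c_i[ik]      # p(i_{n+1} | i_n^{(k)})
        te += p_full * np.log(p_cond_full / p_cond_i)
    return te / np.log(base)
```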

causality

[Figure 2 of the paper: mutual information M and transfer entropy T as functions of the coupling strength]
Figure 2 shows $M$ and $T$ as functions of the coupling strength. Both $M$ and $T$ are able to detect the anisotropy, since the information is consistently larger in the positive direction. The lattice undergoes a number of bifurcations when the coupling is changed. Around $\epsilon=0.18$, the asymptotic state is of temporal and spatial period two. For this case the mutual information is found to be 1 bit. This is correct, although no information is actually produced or exchanged; it merely reflects the static correlation between the sites.

The transfer entropy finds a zero rate of information transport, as desired. Around this periodic window, the mutual information is nonzero in both directions and the signature of the unidirectional coupling is less pronounced. Around $\epsilon=0.82$, the lattice settles to a (spatially inhomogeneous) fixed point state. Here both measures correctly show zero information transfer. The most important finding, however, is that the transfer entropy for the negative direction remains consistent with zero for all couplings, reflecting the causality in the system.
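As a quick sanity check of this directionality property (a toy illustration in the spirit of Fig. 2, not a reproduction of the paper's coupled map lattice), one can reuse the `transfer_entropy` sketch above on a pair of series in which X drives Y with no feedback; the coupling rule and parameters below are made up for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, eps = 20000, 0.5
x = rng.random(n)                                        # i.i.d. driver
y = np.empty(n)
y[0] = rng.random()
for t in range(n - 1):
    y[t + 1] = (1 - eps) * rng.random() + eps * x[t]     # unidirectional coupling X -> Y

# Crude binary symbolization by the median, then the plug-in estimator from above.
xb = (x > np.median(x)).astype(int).tolist()
yb = (y > np.median(y)).astype(int).tolist()
print("T_{X->Y}:", transfer_entropy(yb, xb, k=1, l=1))   # clearly positive
print("T_{Y->X}:", transfer_entropy(xb, yb, k=1, l=1))   # close to zero (finite-sample bias only)
```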
