Python Packages for Information Theory

Preface

Two Python packages whose functions compute mutual information, conditional mutual information, co-information, dual total correlation, residual information, and related quantities, followed by pointers to other information-theoretic measures from the research literature.

dit

This Python package for discrete information theory covers the standard bivariate case and a range of multivariate generalizations:

  1. The basic Shannon mutual information for bivariate distributions

  2. Measures for multivariate distributions

    • Co-Information: the amount of information in which all variables participate
    • Total Correlation: the amount of information the individual variables carry above and beyond the joint entropy
    • Dual Total Correlation: also known as the binding information; the amount of information shared among the variables
    • Cohesion: a family of measures spanning from the total correlation to the dual total correlation
    • CAEKL Mutual Information:
      A generalization defined as the smallest quantity that can be subtracted from the joint entropy, and from each part of a partition of the variables, such that the joint entropy minus this quantity equals the sum of the partition entropies each minus this quantity.
    • Interaction Information: equal in magnitude to the co-information, but with the opposite sign for an odd number of variables
    • DeWeese-like Measures:
      Based on the idea that a local modification of a single variable cannot increase the amount of correlation or dependence it has with the other variables.
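The multivariate measures above can be illustrated with a minimal from-scratch sketch (this is not dit's actual API; the function and variable names here are illustrative). The example uses the 3-bit XOR distribution, where Z = X xor Y, a standard case in which all the dependence is synergistic:

```python
# Minimal sketch of total correlation, dual total correlation, and
# co-information, computed from first principles on the XOR distribution.
from itertools import combinations
from math import log2

# Joint distribution: outcomes (x, y, z) -> probability, with z = x xor y.
xor_dist = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

def entropy_of(subset, dist):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marginal = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in subset)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

n = 3
H = {s: entropy_of(s, xor_dist)
     for k in range(1, n + 1) for s in combinations(range(n), k)}

# Total correlation: sum of marginal entropies minus the joint entropy.
total_correlation = sum(H[(i,)] for i in range(n)) - H[(0, 1, 2)]

# Dual total correlation: joint entropy minus each variable's entropy
# conditioned on all the others, using H(X_i | rest) = H(joint) - H(rest).
dual_total_correlation = H[(0, 1, 2)] - sum(
    H[(0, 1, 2)] - H[tuple(j for j in range(n) if j != i)] for i in range(n))

# Co-information: inclusion-exclusion over all non-empty subsets.
co_information = sum((-1) ** (len(s) + 1) * h for s, h in H.items())

print(total_correlation)       # 1.0
print(dual_total_correlation)  # 2.0
print(co_information)          # -1.0
```

The negative co-information reflects that XOR is purely synergistic: no pair of variables shares information, yet all three together are fully dependent.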

pyitlib

A Python library for information-theoretic methods.

pyitlib exposes estimators for entropy, conditional entropy, mutual information, conditional mutual information, and related measures through its `discrete_random_variable` module.

Other Measures from the Research Community

  1. Part mutual information: a measure based on information theory that accurately quantifies nonlinear direct associations between measured variables. See "Part mutual information for quantifying direct associations".
  2. Mutual information via recursive adaptive partitioning: this paper focuses on estimating mutual information between discrete variables with many categories using recursive adaptive partitioning.
  3. Comparative redundancy calculations: a comparative study of existing redundancy calculations together with a new bivariate redundancy measure. See "A Bivariate Measure of Redundant Information".
  4. Synergistic mutual information: briefly explains how a single PI-region is either redundant, unique, or synergistic. See "Quantifying synergistic mutual information".
  5. Partial Information Decomposition: the redundancy measure proposed by Williams and Beer, which introduces partial information atoms (PI-atoms) to decompose multivariate mutual information into non-negative terms. See "Nonnegative Decomposition of Multivariate Information".
  6. Absolute mutual information: computed using algorithmic (Kolmogorov) complexity.
  7. Pairwise adjusted mutual information.
  8. Partial correlation: can only measure linear direct associations.
  9. Conditional mutual information: quantifies nonlinear direct relationships among variables, ideally for more than two variables.
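Point 9 can be illustrated with a short from-scratch sketch (illustrative names, not any package's API). With Z = X xor Y, the pairwise mutual information I(X;Y) is zero, yet I(X;Y|Z) is 1 bit, so conditioning exposes a direct nonlinear relationship that pairwise measures miss:

```python
# Conditional mutual information I(X;Y|Z) from a joint distribution,
# using I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z).
from math import log2

xor_dist = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

def entropy(dist, idx):
    """Entropy (bits) of the marginal over the variable positions in idx."""
    marg = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

# I(X;Y) = H(X) + H(Y) - H(X,Y): zero, since X and Y are independent.
mi_xy = (entropy(xor_dist, (0,)) + entropy(xor_dist, (1,))
         - entropy(xor_dist, (0, 1)))

# I(X;Y|Z): one full bit once Z is known.
cmi = (entropy(xor_dist, (0, 2)) + entropy(xor_dist, (1, 2))
       - entropy(xor_dist, (2,)) - entropy(xor_dist, (0, 1, 2)))

print(mi_xy)  # 0.0
print(cmi)    # 1.0
```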

Reposted from: https://datascience.stackexchange.com/questions/97775/a-measure-of-redundancy-in-mutual-information
