Preface
Two Python packages whose functions compute mutual information, conditional mutual information, co-information, dual total correlation, residual information, and related quantities, followed by pointers to some other research articles on information theory.
dit
This Python package for discrete information theory covers both the standard bivariate case and multivariate generalizations:
- The basic Shannon measure of mutual information for bivariate distributions
- A family of measures for multivariate distributions
- Co-Information: quantifies the amount of information in which all variables participate
- Total Correlation: the amount of information each individual variable carries above and beyond the joint entropy
- Dual Total Correlation: also known as the binding information; the amount of information shared among the variables
- Cohesion: a parameterized family of measures spanning from the total correlation to the dual total correlation
- CAEKL Mutual Information: a generalization defined as the smallest quantity that can be subtracted from the joint entropy, and from the entropy of each part of a partition of the variables, such that the joint entropy minus this quantity equals the sum of the partition entropies each minus this quantity
- Interaction Information: equal in magnitude to the co-information, but takes the opposite sign for an odd number of variables
- DeWeese-like Measures: a local modification of a single variable cannot increase the amount of correlation or dependence it has with the other variables
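As a quick illustration of these measures, here is a minimal sketch using dit on the three-variable XOR distribution. The imports follow dit's documented dit.multivariate module, but treat the exact names as an assumption to verify against your installed version.

```python
import dit
from dit.multivariate import (
    coinformation,
    total_correlation,
    dual_total_correlation,
    caekl_mutual_information,
    interaction_information,
)

# Three-variable XOR: each bit is the parity of the other two.
d = dit.Distribution(['000', '011', '101', '110'], [1 / 4] * 4)

print(coinformation(d))             # -1.0: the all-variable overlap is negative here
print(total_correlation(d))         #  1.0: sum of marginal entropies minus joint entropy
print(dual_total_correlation(d))    #  2.0: joint entropy minus the per-variable residuals
print(caekl_mutual_information(d))  #  0.5 for this distribution
print(interaction_information(d))   # +1.0: opposite sign of co-information (odd n)
```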
pyitlib
A Python library for information-theoretic methods. Below are the mutual-information-related measures provided by pyitlib:
- Mutual information
- Normalised mutual information (7 variants)
- Variation of information
- Lautum information
- Conditional mutual information
- Co-information
- Interaction information
- Multi-information
- Binding information
- Residual entropy
- Exogenous local information
- Enigmatic information
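For orientation, here is a minimal usage sketch of a few of these measures via pyitlib's discrete_random_variable module. The toy observation arrays are made up for illustration, and the exact signatures should be checked against the library's API reference.

```python
import numpy as np
from pyitlib import discrete_random_variable as drv

# Hypothetical toy data: y is a noisy copy of x, z is an independent coin flip.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=1000)
y = np.where(rng.random(1000) < 0.9, x, 1 - x)
z = rng.integers(0, 2, size=1000)

print(drv.information_mutual(x, y))                 # clearly positive (in bits)
print(drv.information_mutual(x, z))                 # close to zero
print(drv.information_mutual_conditional(x, y, z))  # conditioning on z changes little
print(drv.information_mutual_normalised(x, y))      # one of the 7 normalised variants
```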
Other measures in the research community
- Part mutual information: a new information-theoretic measure that accurately quantifies nonlinear direct associations between measured variables; for more information, see "Part Mutual Information for Quantifying Direct Associations"
- Mutual information via recursive adaptive partitioning: this paper focuses on estimating mutual information between discrete variables with many categories using recursive adaptive partitioning
- Comparative redundancy calculations: a comparative study of existing redundancy calculations alongside a new bivariate redundancy measure; see "A Bivariate Measure of Redundant Information"
- Synergistic mutual information: briefly explains how each single PI-region is either redundant, unique, or synergistic; research paper: "Quantifying Synergistic Mutual Information"
- Partial Information Decomposition: the redundancy measure proposed by Williams and Beer, which introduces partial information atoms (PI-atoms) to decompose multivariate mutual information into non-negative terms; refer to "Nonnegative Decomposition of Multivariate Information"
- Absolute mutual information: this measure is computed using algorithmic (Kolmogorov) complexity
- Pairwise adjusted mutual information
- Partial correlation: also measures direct associations, but only linear ones
- Conditional mutual information: quantifies nonlinear direct relationships among variables, ideally for more than two variables; a plug-in estimator is sketched below
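To make the last point concrete, here is a minimal plug-in (histogram) estimator of the conditional mutual information I(X;Y|Z) for discrete samples in plain NumPy. The function name and the toy data are illustrative assumptions, not part of either package above.

```python
import numpy as np

def conditional_mutual_information(x, y, z):
    """Plug-in estimate of I(X;Y|Z) in bits from discrete 1-D samples.

    I(X;Y|Z) = sum over (x,y,z) of p(x,y,z) * log2(p(z)p(x,y,z) / (p(x,z)p(y,z)))
    """
    xyz = np.stack([x, y, z], axis=1)
    values, counts = np.unique(xyz, axis=0, return_counts=True)
    p_xyz = counts / counts.sum()  # empirical joint probability of each observed triple

    def marginal(cols):
        # Marginal probability, over the given columns, of each unique triple.
        sub, inverse = np.unique(values[:, cols], axis=0, return_inverse=True)
        inverse = inverse.ravel()
        totals = np.zeros(len(sub))
        np.add.at(totals, inverse, p_xyz)
        return totals[inverse]

    p_xz, p_yz, p_z = marginal([0, 2]), marginal([1, 2]), marginal([2])
    return float(np.sum(p_xyz * np.log2(p_z * p_xyz / (p_xz * p_yz))))

# Toy check (hypothetical data): with z = x XOR y, x and y are independent
# unconditionally but fully dependent given z, so I(X;Y|Z) is about 1 bit.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = rng.integers(0, 2, 5000)
print(conditional_mutual_information(x, y, x ^ y))                     # ~1.0
print(conditional_mutual_information(x, y, rng.integers(0, 2, 5000)))  # ~0.0
```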
Reposted from: https://datascience.stackexchange.com/questions/97775/a-measure-of-redundancy-in-mutual-information