Information Theory
Sun Lei, Beijing Institute of Technology
Faculty member, Institute of Circuits and Systems, Beijing Institute of Technology
Entropy of a random variable consisting of two discrete events
For a discrete random variable X taking values in {0, 1}, this program shows how the entropy varies with the probability of each event. (Original post, 2018-09-12)
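The post's program (in MATLAB) sweeps the probability and evaluates the binary entropy function; a minimal Python sketch of the same computation:

```python
import math

def binary_entropy(p):
    """Shannon entropy (in bits) of a binary random variable with P(X=1) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Entropy peaks at 1 bit for p = 0.5 and vanishes at the extremes.
for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}  H = {binary_entropy(p):.4f}")
```

Plotting `binary_entropy` over a grid of p values reproduces the familiar concave curve the post describes.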
Entropy of a discrete random variable
(Original post, 2018-09-12)
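The general case extends the binary formula to any probability mass function, H(X) = -Σᵢ pᵢ log₂ pᵢ. A short Python sketch (the post's own code is MATLAB):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits, of a pmf."""
    assert abs(sum(probs) - 1.0) < 1e-9, "pmf must sum to 1"
    # Outcomes with zero probability contribute nothing (0 * log 0 := 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes has log2(4) = 2 bits of entropy.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```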
Quadratic Renyi entropy estimated from samples
(Original post, 2018-09-12)
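The quadratic Renyi entropy H₂(X) = -log ∫ p(x)² dx can be estimated directly from samples with a Gaussian Parzen window, since the double integral collapses to a pairwise kernel sum (the "information potential"). The post does not show its estimator, so the sketch below uses this standard plug-in form; the bandwidth `sigma` is an illustrative choice, not a value from the post:

```python
import math
import random

def renyi2_entropy(samples, sigma=0.5):
    """Plug-in estimate (in nats) of the quadratic Renyi entropy
    H2(X) = -log integral p(x)^2 dx, using a Gaussian Parzen window.
    sigma is an assumed kernel bandwidth for illustration."""
    n = len(samples)
    # Convolving two width-sigma Gaussian kernels doubles the variance.
    two_var = 2.0 * sigma * sigma
    norm = 1.0 / math.sqrt(2.0 * math.pi * two_var)
    info_potential = sum(
        norm * math.exp(-(xi - xj) ** 2 / (2.0 * two_var))
        for xi in samples for xj in samples
    ) / (n * n)
    return -math.log(info_potential)

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(200)]
print(renyi2_entropy(samples))
```

For a standard normal the true value is (1/2)·log(4π) ≈ 1.27 nats, so the estimate should land in that neighborhood for a reasonable bandwidth.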
Entropy of a RV: Entropy of a binary random variable
MATLAB script plotting the entropy of a binary random variable (header comment: 'Entropy of a binary random variable', (C) bitsunlei@126.com). (Original post, 2018-09-12)
Mutual Information Invariance to Reparameterization
For two random variables X and Y and smooth, invertible maps F and G, it is known that I(X; Y) = I(F(X); G(Y)), which is also known as the reparameterization invariance property of mutual information. To my knowledge, the proof was given in Kraskov 2004 as follows. Let … and … denote the … (Original post, 2018-09-11)
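In the discrete case the invariance is easy to verify numerically: any injective relabeling of X's alphabet leaves the joint counts, and hence the mutual information, unchanged. A small self-contained check (the data and the bijection F(x) = 5 - 3x are illustrative, not from the post):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) in bits, estimated from a list of jointly observed (x, y) pairs."""
    n = len(pairs)
    pxy = Counter(pairs)                   # joint counts
    px = Counter(x for x, _ in pairs)      # marginal counts of X
    py = Counter(y for _, y in pairs)      # marginal counts of Y
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# A dependent pair of binary variables.
data = [(0, 0)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10 + [(1, 1)] * 40

# Reparameterize X through the bijection F(x) = 5 - 3x: MI is unchanged.
reparam = [(5 - 3 * x, y) for x, y in data]
print(mutual_information(data), mutual_information(reparam))
```

For continuous variables the same conclusion holds for smooth invertible maps, because the Jacobian factors in the transformed densities cancel inside the logarithm.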