Fundamentals of Information Theory

This article introduces the fundamental concepts of information theory: self-information (a measure of the uncertainty of a single source symbol), information entropy (the average information content of a source), conditional self-information and conditional entropy (the uncertainty remaining about X after Y is received), mutual information (the amount of information shared between two variables), and joint entropy. It also discusses the relationship between the information rate (the average entropy of a signal) and channel capacity, the latter being the maximum rate at which information can be transmitted reliably.


Self-Information

Self-information measures the uncertainty of an individual source symbol: the more probable a symbol, the smaller its self-information, and vice versa. If a source symbol $s_i$ occurs with probability $p_i$, its self-information is written $I(s_i)$ and defined as

$$I(s_i) = \log\frac{1}{p_i} = -\log p_i$$

(The logarithm may be taken to base 2 or base $e$: base 2 gives units of bits, base $e$ gives units of nats.)
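To make the definition concrete, here is a minimal Python sketch (the helper name `self_information` is our own, not from the original text) that evaluates $I(s_i) = -\log p_i$ in either unit:

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information I(s) = -log(p) of a symbol with probability p.
    base=2 yields bits; base=math.e yields nats."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return -math.log(p, base)

# A symbol of probability 1/8 carries 3 bits of self-information.
print(self_information(1/8))           # 3.0 bits
print(self_information(1/8, math.e))   # ~2.079 nats
```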

Information Entropy

Information entropy is the average information content of the symbols emitted by a source, and thus measures the uncertainty of the source as a whole. It is written $H(S)$ and defined as

$$H(S) = \sum_{i=1}^{n} p_i\, I(s_i) = -\sum_{i=1}^{n} p_i \log p_i$$
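As an illustration (again a minimal sketch, with the hypothetical helper `entropy`), the entropy of a discrete distribution follows directly from the definition; a uniform distribution maximizes it, and any bias toward one symbol lowers the average uncertainty:

```python
import math

def entropy(probs: list[float], base: float = 2.0) -> float:
    """Entropy H(S) = -sum_i p_i * log(p_i); zero-probability terms contribute nothing."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin attains the 1-bit maximum; a biased coin is less uncertain.
print(entropy([0.5, 0.5]))   # 1.0 bit
print(entropy([0.9, 0.1]))   # ~0.469 bits
```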
