An Introductory Tutorial on Information Theory

This article introduces the basics of information theory through a simple example, covering word encoding, the relationship between probability distributions and code length, and information entropy. It shows how a code can be tailored to the distribution of words to compress data, and, by linking information and compression, presents information entropy as a measure of uncertainty and explains its importance in information theory.

In 1948, the American mathematician Claude Shannon published the paper "A Mathematical Theory of Communication", which laid the foundation of information theory.


Today, information theory plays a key role in signal processing, data compression, natural language processing, and many other fields. Although its mathematical formalism can be complex, the core ideas are very simple and require nothing beyond middle-school mathematics.

This article uses one of the simplest possible examples to help you understand information theory.

1. Word Encoding

Xiao Zhang, a good friend of mine, recently moved to the United States.


We keep in touch by email. When Xiao Zhang writes, he uses only four words: dog, cat, fish, bird.


Every letter is simply some combination of these four words. The first letter reads "dog cat fish bird"; the second reads "fish cat bird dog".
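
Since each letter is just a sequence drawn from these four words, the most direct scheme is to give every word a fixed-length binary code: with four words, 2 bits per word suffice (2² = 4). The Python sketch below illustrates this idea; the particular code table (dog → 00, cat → 01, fish → 10, bird → 11) is a hypothetical assignment for illustration, not one given in the article.

```python
# A minimal illustration (hypothetical code table): with only four possible
# words, each word can get a fixed-length 2-bit code, so a letter of n words
# costs exactly 2*n bits.

CODE = {"dog": "00", "cat": "01", "fish": "10", "bird": "11"}

def encode(words):
    """Concatenate the 2-bit code of each word in the letter."""
    return "".join(CODE[w] for w in words)

def decode(bits):
    """Split the bit string into 2-bit chunks and look each chunk up."""
    reverse = {v: k for k, v in CODE.items()}
    return [reverse[bits[i:i + 2]] for i in range(0, len(bits), 2)]

letter = ["dog", "cat", "fish", "bird"]   # the first letter in the example
encoded = encode(letter)
print(encoded)           # 00011011 -> 8 bits for a 4-word letter
print(decode(encoded))   # ['dog', 'cat', 'fish', 'bird']
```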

