Greedy Algorithm correctness proof

Huffman Tree

Introduction

  • Fixed-length coding: Information Entropy (etymology: en- + trope, "a turning"; i.e. disorder)
  • Variable-length coding: Prefix-Free code (no codeword is a prefix of the codeword of another symbol)
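A minimal decoding sketch (the code table and symbols below are assumptions for illustration, not from the post) of why the prefix-free property matters: codewords can be concatenated without separators and still be decoded greedily, because no codeword is a prefix of another.

```python
# A toy prefix-free code over an assumed 4-symbol alphabet;
# more probable symbols get shorter codewords.
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

def encode(text, code):
    """Concatenate codewords; no separators are needed."""
    return "".join(code[s] for s in text)

def decode(bits, code):
    """Greedy decoding works because no codeword is a prefix of another."""
    inverse = {w: s for s, w in code.items()}
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:          # a complete codeword has been read
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

bits = encode("abacad", code)
print(bits, "->", decode(bits, code))   # 01001100111 -> abacad
```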

Proof

Theorem

  • T: a tree for some prefix code over an alphabet A, with some probability distribution p over the symbols.
  • x and y: two leaves
  • T’: the tree obtained by swapping x and y in T.
    Then
    $E_p(T') - E_p(T) = (p(x) - p(y))\,(\mathrm{depth}(y, T) - \mathrm{depth}(x, T))$
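A small numeric check of this identity (a sketch; the leaf depths, symbols, and distribution below are assumed, not from the post): swapping two leaves only exchanges their depths, so the change in expected code length factors exactly as stated.

```python
# Expected code length E_p(T) = sum over leaves of p(s) * depth(s, T).
def expected_depth(depths, p):
    return sum(p[s] * depths[s] for s in p)

# Assumed example: leaf depths of some prefix-code tree and a distribution p.
depths = {"a": 1, "b": 2, "c": 3, "d": 3}
p      = {"a": 0.5, "b": 0.2, "c": 0.2, "d": 0.1}

def swap(depths, x, y):
    """T': the tree obtained by swapping leaves x and y (their depths trade places)."""
    d = dict(depths)
    d[x], d[y] = d[y], d[x]
    return d

x, y = "a", "c"
lhs = expected_depth(swap(depths, x, y), p) - expected_depth(depths, p)
rhs = (p[x] - p[y]) * (depths[y] - depths[x])
print(lhs, rhs)   # both ≈ 0.6 = (0.5 - 0.2) * (3 - 1), up to floating-point rounding
```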

Lemma

(Etymology: lemma = "something received or taken"; di- = "both sides", as in dilemma.)

In an optimal tree, the two symbols with the least probabilities can be taken to be sibling leaves at the lowest level.
Proof: easy. By the theorem above, swapping a least-probable symbol x with a deeper leaf y never increases the expected length, since p(x) - p(y) ≤ 0 while depth(y, T) - depth(x, T) ≥ 0; so the two least-probable symbols can be moved to the lowest level and made siblings without losing optimality.
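The following sketch illustrates (but does not prove) the lemma on a small assumed distribution: the greedy Huffman construction places the two least-probable symbols at the greatest depth, as siblings whose codewords differ only in the last bit.

```python
import heapq

def huffman_codes(p):
    """Greedy Huffman construction: repeatedly merge the two least-probable nodes."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(prob, i, {sym: ""}) for i, (sym, prob) in enumerate(sorted(p.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

# Assumed example distribution (not from the post).
p = {"a": 0.4, "b": 0.25, "c": 0.2, "d": 0.1, "e": 0.05}
codes = huffman_codes(p)
x, y = sorted(p, key=p.get)[:2]          # the two least-probable symbols
assert len(codes[x]) == len(codes[y]) == max(map(len, codes.values()))
assert codes[x][:-1] == codes[y][:-1]    # siblings: same parent prefix
print(codes)
```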

Correctness proof

  • n = 2: obvious (each of the two symbols gets a one-bit codeword).
  • n > 2:
    Let x and y be the two symbols of least probability, and merge them into a new symbol z with p(z) = p(x) + p(y).
    Consider the alphabet A of n-1 letters obtained this way (x and y are removed; their parent z stands in for them).
    Let T be an optimum tree for A (by the induction hypothesis, the tree the greedy algorithm builds for A is one).
    Let A' = A + {x, y} - {z}, and let T' be the tree obtained from T by adding x and y as children of the leaf z.
    Let H' be an optimum tree for A'; by the lemma we may assume x and y are sibling leaves at the lowest level of H'.
    Let H be the tree obtained from H' by removing x and y, so that their parent becomes a leaf for z.
    Then
    $E_p(T) \le E_p(H)$, since T is optimal for A and H is a tree for A;
    $E_p(T') = E_p(T) + p(x) + p(y) \le E_p(H) + p(x) + p(y) = E_p(H')$;
    $E_p(T') \ge E_p(H')$, since H' is optimal for A';
    so $E_p(T') = E_p(H')$, i.e. T' is an optimum tree for A'.
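A sketch of the bookkeeping behind $E_p(T') = E_p(T) + p(x) + p(y)$ (the probabilities and leaf depths below are assumptions for illustration): attaching x and y as children of the leaf z places them one level deeper than z was, so the expected length grows by exactly p(x) + p(y).

```python
# Leaf depths of an assumed optimal tree T for the reduced alphabet A = {z, c, d},
# where z stands for the merged pair {x, y} and p(z) = p(x) + p(y).
p_x, p_y = 0.1, 0.05
depths_T = {"z": 2, "c": 2, "d": 1}
p_A = {"z": p_x + p_y, "c": 0.35, "d": 0.5}

def expected_depth(depths, p):
    return sum(p[s] * depths[s] for s in p)

# T': attach x and y as children of z, i.e. they live one level deeper than z did.
depths_T2 = {"x": depths_T["z"] + 1, "y": depths_T["z"] + 1,
             "c": depths_T["c"], "d": depths_T["d"]}
p_A2 = {"x": p_x, "y": p_y, "c": 0.35, "d": 0.5}

lhs = expected_depth(depths_T2, p_A2)
rhs = expected_depth(depths_T, p_A) + p_x + p_y
print(lhs, rhs)   # both ≈ 1.65: E_p(T') = E_p(T) + p(x) + p(y)
```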
