The Maximum Data Rate of a Channel

In 1924, an AT&T engineer, Henry Nyquist, realized that even a perfect channel has a finite transmission capacity. He derived an equation expressing the maximum data rate for a finite-bandwidth noiseless channel. Nyquist proved that if an arbitrary signal has been run through a low-pass filter of bandwidth B, the filtered signal can be completely reconstructed by making only 2B (exact) samples per second. This result is known as the sampling theorem and underlies the conversion of an analog signal (a continuous function) into a digital signal (a discrete function). Sampling the line faster than 2B times per second is pointless, because the higher-frequency components that such sampling could recover have already been filtered out. If the signal consists of V discrete levels, Nyquist's theorem states:
            maximum data rate = 2B log2 V  bits/sec
in which 2B is the symbol rate (baud rate) and log2 V is the number of bits carried by each symbol.
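As a quick illustration, the Nyquist limit can be evaluated directly. The bandwidth and level count below are illustrative values chosen for the example, not taken from the text (a 3000-Hz telephone-grade channel with 4 signal levels is a common case):

```python
import math

def nyquist_max_rate(bandwidth_hz, levels):
    """Nyquist limit for a noiseless channel: 2B * log2(V) bits/sec."""
    return 2 * bandwidth_hz * math.log2(levels)

# A 3000-Hz noiseless channel carrying 4 discrete levels
# (2 bits per symbol) can transmit at most 12,000 bits/sec.
print(nyquist_max_rate(3000, 4))  # → 12000.0
```

Doubling the number of levels from 4 to 8 adds only one bit per symbol, so the rate grows logarithmically in V but linearly in B.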

      So far we have considered only noiseless channels. If random noise is present, the situation deteriorates rapidly. And there is always random (thermal) noise present due to the motion of the molecules in the system. The amount of thermal noise present is measured by the ratio of the signal power to the noise power, denoted S/N. Then Shannon's major result (from the most important paper in all of information theory) is that the maximum data rate, or capacity, of a noisy channel whose bandwidth is B Hz is given by:

            maximum data rate = B log2 (1 + S/N)  bits/sec
This tells us the best capacity that a real channel can have. Shannon's result was derived from information-theory arguments and applies to any channel subject to thermal noise. Counterexamples should be treated in the same category as perpetual motion machines.
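Shannon's limit can likewise be computed. The figures below are illustrative assumptions: a 3000-Hz channel with a 30-dB signal-to-noise ratio, i.e. S/N = 1000 in linear terms:

```python
import math

def shannon_capacity(bandwidth_hz, snr):
    """Shannon capacity of a noisy channel: B * log2(1 + S/N) bits/sec.

    snr is the linear signal-to-noise power ratio, not decibels.
    """
    return bandwidth_hz * math.log2(1 + snr)

# A 30-dB ratio corresponds to S/N = 10**(30/10) = 1000.
# A 3000-Hz channel at 30 dB can never transmit faster than roughly
# 30,000 bits/sec, no matter how many signal levels are used.
print(round(shannon_capacity(3000, 1000)))  # → 29902
```

Note that unlike the Nyquist formula, Shannon's bound is independent of the number of signal levels: adding levels cannot beat it, because noise limits how finely the levels can be distinguished.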
