Chapter Summary

The chapter begins with the concept of the amount of information and the means to quantify it. It shows how source coding techniques such as Shannon-Fano, Huffman and Lempel-Ziv coding improve transmission efficiency. Shannon's classic theorem and its role in deriving channel capacity are discussed next, along with the bandwidth-SNR tradeoff. The relation of mutual information to channel capacity is described, and rate distortion theory is introduced to show how the required data rate can be reduced when some loss of fidelity is acceptable. Next, error control coding for random error correction is discussed, covering Hadamard, Hamming, cyclic, BCH and other algebraic codes. Burst errors, in which errors occur in clusters, are a common phenomenon in digital transmission; a variety of techniques for burst error correction are discussed, such as block interleaving, convolutional interleaving and Reed-Solomon codes. Convolutional coding and decoding are discussed in detail, showing the use of trellis diagrams and the Viterbi algorithm, and a brief introduction to the now-popular turbo coding is given. An alternative to forward error correction codes, using different types of Automatic-Repeat-Request (ARQ) systems, is then discussed. An optimum system is derived from information theory and compared with popular techniques such as amplitude modulation and frequency modulation. Finally, feedback communication and trellis-coded modulation are discussed and a comparison is presented.
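
As an illustration of the source-coding idea, the sketch below builds a Huffman code for a hypothetical four-symbol source; the symbol probabilities are made up for the example and are not taken from the text:

```python
import heapq

def huffman_code(symbol_probs):
    """Build a Huffman code (a variable-length prefix code) from symbol probabilities."""
    # Each heap entry: (probability, tie-breaker, {symbol: codeword built so far})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(symbol_probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, codes1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, codes2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
codes = huffman_code(probs)
avg_len = sum(probs[s] * len(codes[s]) for s in probs)
print(codes)     # e.g. {'a': '0', 'b': '10', 'd': '110', 'c': '111'}
print(avg_len)   # 1.9 bits/symbol, versus 2 for a fixed-length code (entropy is about 1.85)
```

Frequent symbols receive short codewords, so the average length approaches the source entropy.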
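The bandwidth-SNR tradeoff follows directly from the Shannon-Hartley capacity formula C = B log2(1 + S/N); the channel numbers below (a 3 kHz, 30 dB telephone-grade channel) are assumed purely for illustration:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity of an AWGN channel: C = B * log2(1 + S/N), in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

print(channel_capacity(3000, 1000))      # ~29,900 bit/s for 3 kHz bandwidth at 30 dB SNR
# Tradeoff: halving the bandwidth demands roughly the square of the SNR for the same capacity
print(channel_capacity(1500, 1000**2))   # ~29,900 bit/s again, but now at about 60 dB SNR
```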
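For the algebraic codes, a minimal Hamming (7, 4) encoder and single-error-correcting decoder gives the flavour; the systematic generator and parity-check matrices below are one common choice, assumed here rather than taken from the chapter:

```python
import numpy as np

# Systematic Hamming (7,4): G = [I | P], H = [P^T | I] (one common, assumed layout)
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):
    return msg @ G % 2                 # 4 data bits -> 7-bit codeword

def decode(received):
    syndrome = H @ received % 2        # a nonzero syndrome matches the column of the error
    if syndrome.any():
        err = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
        received = received.copy()
        received[err] ^= 1             # flip the erroneous bit
    return received[:4]                # systematic code: first 4 bits are the data

msg = np.array([1, 0, 1, 1])
noisy = encode(msg)
noisy[5] ^= 1                          # simulate a single channel error
print(decode(noisy))                   # [1 0 1 1] -- the error has been corrected
```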
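Finally, a rate-1/2, constraint-length-3 convolutional encoder shows how the coded stream that a Viterbi decoder later searches over a trellis is generated; the (7, 5) octal generators are the usual textbook example and are assumed here:

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, constraint length 3 (generators 7 and 5 in octal)."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111           # shift the new bit into a 3-bit register
        out.append(bin(state & g1).count("1") % 2)   # first output: parity over the g1 taps
        out.append(bin(state & g2).count("1") % 2)   # second output: parity over the g2 taps
    return out

print(conv_encode([1, 0, 1, 1]))   # [1, 1, 1, 0, 0, 0, 0, 1] -- two coded bits per data bit
```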