European University Institute Library

A student's guide to coding and information theory, Stefan M. Moser, Po-Ning Chen

Language
eng
Bibliography note
Includes bibliographical references and index
Illustrations
illustrations
Index
index present
Literary Form
non fiction
Main title
A student's guide to coding and information theory
Nature of contents
bibliography
Oclc number
778123334
Responsibility statement
Stefan M. Moser, Po-Ning Chen
Summary
This guide provides an introduction to the engineering background of modern communication systems, from mobile phones to data compression and storage. The authors begin with many practical applications in coding, including the repetition code, the Hamming code and the Huffman code. They then explain the corresponding information theory, from entropy and mutual information to channel capacity and the information transmission theorem. Finally, they provide insights into the connections between coding theory and other fields. Many worked examples are given throughout the book, using practical applications to illustrate theoretical definitions. Exercises are also included, enabling readers to double-check what they have learned and gain glimpses into more advanced topics. --Provided by publisher
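As a taste of the book's starting point, the three-times repetition code named in the summary can be sketched in a few lines. This is an illustrative sketch, not code from the book; the function names `encode` and `decode` are my own.

```python
# Illustrative sketch (not from the book): the three-times repetition code
# with majority-vote decoding, which corrects any single bit flip per block.

def encode(bits):
    """Repeat each bit three times: [1, 0] -> [1, 1, 1, 0, 0, 0]."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Recover each bit by majority vote over its block of three."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]
```

For example, `decode([1, 0, 1, 0, 0, 0])` returns `[1, 0]`: the first block `1, 0, 1` had one bit flipped, and the majority vote repairs it at the cost of tripling the transmitted length.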
Table Of Contents
1. Introduction -- Information theory versus coding theory -- Model and basic operations of information processing systems -- Information source -- Encoding a source alphabet -- Octal and hexadecimal codes
2. Error-detecting codes -- Review of modular arithmetic -- Independent errors: white noise -- Single parity-check code -- The ASCII code -- Simple burst error-detecting code -- Alphabet plus number codes: weighted codes -- Trade-off between redundancy and error-detecting capability
3. Repetition and Hamming codes -- Arithmetic in the binary field -- Three-times repetition code -- Hamming code -- Hamming bound: sphere packing
4. Data compression: efficient coding of a random message -- Prefix-free or instantaneous codes -- Trees and codes -- The Kraft Inequality -- Trees with probabilities -- Optimal codes: Huffman code -- Types of codes
5. Entropy and Shannon's source coding theorem -- Motivation -- Uncertainty or entropy -- Binary entropy function -- Information Theory Inequality -- Bounds on the entropy -- Trees revisited -- Bounds on the efficiency of codes -- Fundamental limitations of source coding -- Analysis of the best codes -- Coding Theorem for a single random message -- Coding of an information source -- Appendix: Uniqueness of the definition of entropy
6. Mutual information and channel capacity -- The channel -- The channel relationships -- The binary symmetric channel -- System entropies -- Mutual information -- Definition of channel capacity -- Capacity of the binary symmetric channel -- Uniformly dispersive channel -- Characterization of the capacity-achieving input distribution -- Shannon's Channel Coding Theorem
7. Approaching the Shannon limit by turbo coding -- Information Transmission Theorem -- The Gaussian channel -- Transmission at a rate below capacity -- Transmission at a rate above capacity -- Turbo coding -- Appendix: Why we assume uniform and independent data at the encoder -- Appendix: Definition of concavity
8. Other aspects of coding theory -- Hamming code and projective geometry -- Coding and game theory
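Two quantities that recur across chapters 5 and 6 of the contents note are the binary entropy function and the capacity of the binary symmetric channel. A minimal sketch, assuming the standard definitions h(p) = -p log2 p - (1-p) log2 (1-p) and C = 1 - h(eps); the function names here are my own, not the book's:

```python
# Illustrative sketch (not from the book): binary entropy function and
# capacity of the binary symmetric channel (BSC).
import math

def binary_entropy(p):
    """h(p) in bits; by convention h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity of a BSC with crossover probability eps, in bits per use."""
    return 1.0 - binary_entropy(eps)
```

A fair coin has maximal entropy, `binary_entropy(0.5) == 1.0`, and the corresponding BSC with crossover 0.5 has capacity zero: its output is independent of its input.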