The Resource: A student's guide to coding and information theory, Stefan M. Moser, Po-Ning Chen

Label
A student's guide to coding and information theory
Title
A student's guide to coding and information theory
Statement of responsibility
Stefan M. Moser, Po-Ning Chen
Language
eng
Summary
This guide provides an introduction to the engineering background of modern communication systems, from mobile phones to data compression and storage. The authors begin with many practical applications in coding, including the repetition code, the Hamming code and the Huffman code. They then explain the corresponding information theory, from entropy and mutual information to channel capacity and the information transmission theorem. Finally, they provide insights into the connections between coding theory and other fields. Many worked examples are given throughout the book, using practical applications to illustrate theoretical definitions. Exercises are also included, enabling readers to double-check what they have learned and gain glimpses into more advanced topics.
Assigning source
Provided by Publisher
Cataloging source
NLE
http://library.link/vocab/creatorName
Moser, Stefan M
Illustrations
illustrations
Index
index present
Literary form
non fiction
Nature of contents
bibliography
http://library.link/vocab/relatedWorkOrContributorName
Chen, Po-Ning
http://library.link/vocab/subjectName
  • Coding theory
  • Information theory
Label
A student's guide to coding and information theory, Stefan M. Moser, Po-Ning Chen
Instantiates
Publication
Bibliography note
Includes bibliographical references and index
Carrier category
volume
Carrier category code
  • nc
Carrier MARC source
rdacarrier.
Content category
text
Content type code
  • txt
Content type MARC source
rdacontent.
Contents
  • 1. Introduction -- Information theory versus coding theory -- Model and basic operations of information processing systems -- Information source -- Encoding a source alphabet -- Octal and hexadecimal codes
  • 2. Error-detecting codes -- Review of modular arithmetic -- Independent errors : white noise -- Single parity-check code -- The ASCII code -- Simple burst error-detecting code -- Alphabet plus number codes : weighted codes -- Trade-off between redundancy and error-detecting capability
  • 3. Repetition and Hamming codes -- Arithmetic in the binary field -- Three-times repetition code -- Hamming code -- Hamming bound : sphere packing
  • 4. Data compression: efficient coding of a random message -- Prefix-free or instantaneous codes -- Trees and codes -- The Kraft Inequality -- Trees with probabilities -- Optimal codes : Huffman code -- Types of codes
  • 5. Entropy and Shannon's source coding theorem -- Motivation -- Uncertainty or entropy -- Binary entropy function -- Information Theory Inequality -- Bounds on the entropy -- Trees revisited -- Bounds on the efficiency of codes -- Fundamental limitations of source coding -- Analysis of the best codes -- Coding Theorem for a single random message -- Coding of an information source -- Appendix : Uniqueness of the definition of entropy
  • 6. Mutual information and channel capacity -- The channel -- The channel relationships -- The binary symmetric channel -- System entropies -- Mutual information -- Definition of channel capacity -- Capacity of the binary symmetric channel -- Uniformly dispersive channel -- Characterization of the capacity-achieving input distribution -- Shannon's Channel Coding Theorem
  • 7. Approaching the Shannon limit by turbo coding -- Information Transmission Theorem -- The Gaussian channel -- Transmission at a rate below capacity -- Transmission at a rate above capacity -- Turbo coding -- Appendix : Why we assume uniform and independent data at the encoder -- Appendix : Definition of concavity
  • 8. Other aspects of coding theory -- Hamming code and projective geometry -- Coding and game theory
Control code
FIEb17819593
Dimensions
23 cm.
Extent
xiii, 191 pages
Isbn
9781107015838
Media category
unmediated
Media MARC source
rdamedia.
Media type code
  • n
Other physical details
illustrations
System control number
(OCoLC)778123334

Library Locations

    • Badia Fiesolana
      Via dei Roccettini 9, San Domenico di Fiesole, 50014, IT
      43.803074 11.283055