Title:
Information and coding theory
Personal Author:
Jones, Gareth A.; Jones, J. Mary
Series:
Springer undergraduate mathematics series
Publication Information:
Berlin : Springer, 2000
Physical Description:
xiii, 210 p. : ill. ; 24 cm.
ISBN:
9781852336226

Available:

Item Barcode      Call Number      Material Type       Item Category 1
30000010226229    Q360 J68 2000    Open Access Book    Book
30000010226230    Q360 J68 2000    Open Access Book    Book

Summary

As this Preface is being written, the twentieth century is coming to an end. Historians may perhaps come to refer to it as the century of information, just as its predecessor is associated with the process of industrialisation. Successive technological developments such as the telephone, radio, television, computers and the Internet have had profound effects on the way we live. We can see pictures of the surface of Mars or the early shape of the Universe. The contents of a whole shelf-load of library books can be compressed onto an almost weightless piece of plastic. Billions of people can watch the same football match, or can keep in instant touch with friends around the world without leaving home. In short, massive amounts of information can now be stored, transmitted and processed, with surprising speed, accuracy and economy.

Of course, these developments do not happen without some theoretical basis, and as is so often the case, much of this is provided by mathematics. Many of the first mathematical advances in this area were made in the mid-twentieth century by engineers, often relying on intuition and experience rather than a deep theoretical knowledge to lead them to their discoveries. Soon the mathematicians, delighted to see new applications for their subject, joined in and developed the engineers' practical examples into wide-ranging theories, complete with definitions, theorems and proofs.


Reviews

Choice Review

This attractive book by G. Jones (Univ. of Southampton, UK) and J. Jones (Open Univ., UK) presents two closely related subjects usually taught separately. Information theory, the invention of Claude Shannon, is fundamental to the modern communications industry. So is coding theory, which treats how information is transformed before and after transmission. The topics from information theory discussed here are the fundamental concepts of communication channel, information, and entropy. The most important result is Shannon's fundamental theorem on channel capacity. (The proof is in an appendix.) Most of these ideas are discussed in the context of the binary symmetric channel only. The topics from coding theory revolve around the problem of correcting transmission errors: Hamming distance is introduced, as well as various bounds on code size and the Hamming and Golay linear codes. There is no treatment of data compression or cryptography, the two other major branches of coding theory. Includes an index of symbols and solutions to all problems. Information and Coding Theory is written in a clear but informal style; it is an excellent book for upper-division undergraduates through faculty, as well as for readers with the necessary background in linear algebra and probability theory. M. Henle, Oberlin College
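
To make the review's central notions concrete, here is a minimal Python sketch, not taken from the book, of three quantities the review names: the entropy of a source, the capacity of the binary symmetric channel, and the Hamming distance between words. The function names are illustrative assumptions of ours; the book develops these ideas formally in Chapters 3 to 6.

    from math import log2

    def entropy(probs):
        """Shannon entropy H = -sum(p * log2(p)) of a probability distribution."""
        return -sum(p * log2(p) for p in probs if p > 0)

    def bsc_capacity(p):
        """Capacity C = 1 - H(p) of a binary symmetric channel
        with crossover (bit-flip) probability p."""
        return 1.0 - entropy([p, 1.0 - p])

    def hamming_distance(u, v):
        """Number of positions in which two words of equal length differ."""
        return sum(a != b for a, b in zip(u, v))

    print(entropy([0.5, 0.5]))                   # 1.0 bit: a fair binary source
    print(round(bsc_capacity(0.1), 3))           # ~0.531 bits per symbol
    print(hamming_distance("10110", "11100"))    # 2

The capacity figure above illustrates the point of Shannon's fundamental theorem: even a channel that flips one bit in ten can support reliable communication at any rate below about 0.531 bits per symbol.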


Table of Contents

Preface  p. v
Notes to the Reader  p. xiii
1. Source Coding  p. 1
1.1 Definitions and Examples  p. 1
1.2 Uniquely Decodable Codes  p. 4
1.3 Instantaneous Codes  p. 9
1.4 Constructing Instantaneous Codes  p. 11
1.5 Kraft's Inequality  p. 13
1.6 McMillan's Inequality  p. 14
1.7 Comments on Kraft's and McMillan's Inequalities  p. 16
1.8 Supplementary Exercises  p. 17
2. Optimal Codes  p. 19
2.1 Optimality  p. 19
2.2 Binary Huffman Codes  p. 22
2.3 Average Word-length of Huffman Codes  p. 26
2.4 Optimality of Binary Huffman Codes  p. 27
2.5 r-ary Huffman Codes  p. 28
2.6 Extensions of Sources  p. 30
2.7 Supplementary Exercises  p. 32
3. Entropy  p. 35
3.1 Information and Entropy  p. 35
3.2 Properties of the Entropy Function  p. 40
3.3 Entropy and Average Word-length  p. 42
3.4 Shannon-Fano Coding  p. 45
3.5 Entropy of Extensions and Products  p. 47
3.6 Shannon's First Theorem  p. 48
3.7 An Example of Shannon's First Theorem  p. 49
3.8 Supplementary Exercises  p. 51
4. Information Channels  p. 55
4.1 Notation and Definitions  p. 55
4.2 The Binary Symmetric Channel  p. 60
4.3 System Entropies  p. 62
4.4 System Entropies for the Binary Symmetric Channel  p. 64
4.5 Extension of Shannon's First Theorem to Information Channels  p. 67
4.6 Mutual Information  p. 70
4.7 Mutual Information for the Binary Symmetric Channel  p. 72
4.8 Channel Capacity  p. 73
4.9 Supplementary Exercises  p. 76
5. Using an Unreliable Channel  p. 79
5.1 Decision Rules  p. 79
5.2 An Example of Improved Reliability  p. 82
5.3 Hamming Distance  p. 85
5.4 Statement and Outline Proof of Shannon's Theorem  p. 88
5.5 The Converse of Shannon's Theorem  p. 90
5.6 Comments on Shannon's Theorem  p. 93
5.7 Supplementary Exercises  p. 94
6. Error-correcting Codes  p. 97
6.1 Introductory Concepts  p. 97
6.2 Examples of Codes  p. 100
6.3 Minimum Distance  p. 104
6.4 Hamming's Sphere-packing Bound  p. 107
6.5 The Gilbert-Varshamov Bound  p. 111
6.6 Hadamard Matrices and Codes  p. 114
6.7 Supplementary Exercises  p. 118
7. Linear Codes  p. 121
7.1 Matrix Description of Linear Codes  p. 121
7.2 Equivalence of Linear Codes  p. 127
7.3 Minimum Distance of Linear Codes  p. 131
7.4 The Hamming Codes  p. 133
7.5 The Golay Codes  p. 136
7.6 The Standard Array  p. 141
7.7 Syndrome Decoding  p. 143
7.8 Supplementary Exercises  p. 146
Suggestions for Further Reading  p. 149
Appendix A. Proof of the Sardinas-Patterson Theorem  p. 153
Appendix B. The Law of Large Numbers  p. 157
Appendix C. Proof of Shannon's Fundamental Theorem  p. 159
Solutions to Exercises  p. 165
Bibliography  p. 191
Index of Symbols and Abbreviations  p. 195
Index  p. 201