Introduction to Information Theory & Coding Notes
Topics in our Introduction to Information Theory & Coding Notes PDF
In these “Introduction to Information Theory & Coding Notes PDF”, you will study the basic aspects of information theory and coding. Shannon’s work forms the underlying theme of the course. The construction of finite fields and bounds on the parameters of a linear code are also discussed.
The topics we will cover will be taken from the following list:
Concepts of Information Theory: Communication processes, A model of communication system, A quantitative measure of information, Binary unit of information, A measure of uncertainty, H function as a measure of uncertainty, Sources and binary sources, Measure of information for two-dimensional discrete finite probability schemes.
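The quantitative measure of information listed above can be made concrete with a small sketch (not part of the notes themselves; the function name is illustrative): an event with probability p carries −log₂(p) binary units (bits) of information.

```python
import math

def self_information(p: float) -> float:
    """Information content, in bits, of an event with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A fair coin flip carries exactly one binary unit (bit) of information.
print(self_information(0.5))    # 1.0
# Rarer events carry more information: p = 1/8 gives 3 bits.
print(self_information(0.125))  # 3.0
```

Note how the measure is additive: the information of two independent events is the sum of their individual information contents, which is why the logarithm appears.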
Entropy Function: A sketch of communication network, Entropy, Basic relationship among different entropies, A measure of mutual information, Interpretation of Shannon’s fundamental inequalities; Redundancy, Efficiency and channel capacity, Binary symmetric channel, Binary erasure channel, Uniqueness of the entropy function, Joint entropy and conditional entropy, Relative entropy and mutual information, Chain rules for entropy, Conditional relative entropy and conditional mutual information, Jensen’s inequality and its characterizations, The log sum inequality and its applications.
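A minimal sketch of the entropy function and the binary symmetric channel mentioned above (illustrative code, not taken from the notes): H(X) = −Σ p log₂ p, and the capacity of a binary symmetric channel with crossover probability ε is C = 1 − H(ε).

```python
import math

def entropy(probs) -> float:
    """Shannon entropy H(X) = -sum p * log2(p), in bits.

    Terms with p = 0 contribute nothing (0 * log 0 is taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps: float) -> float:
    """Capacity of a binary symmetric channel: C = 1 - H(eps)."""
    return 1.0 - entropy([eps, 1.0 - eps])

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair binary source is maximally uncertain
print(bsc_capacity(0.0))    # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.5))    # 0.0: crossover 1/2 destroys all information
```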
Concepts of Coding: Block codes, Hamming distance, Maximum likelihood decoding, Levels of error handling, Error correction, Error detection, Erasure correction, Construction of finite fields, Linear codes, Matrix representation of linear codes.
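Hamming distance and maximum likelihood decoding over a binary symmetric channel reduce to nearest-codeword (minimum-distance) decoding, which can be sketched as follows (a toy example; the repetition code and function names are illustrative, not from the notes):

```python
def hamming_distance(u: str, v: str) -> int:
    """Number of positions in which two equal-length words differ."""
    assert len(u) == len(v)
    return sum(a != b for a, b in zip(u, v))

def nearest_codeword(received: str, code) -> str:
    """Minimum-distance decoding: return the codeword closest to `received`."""
    return min(code, key=lambda c: hamming_distance(received, c))

# Toy block code: the length-3 binary repetition code.
code = ["000", "111"]
print(hamming_distance("101", "111"))  # 1
print(nearest_codeword("101", code))   # "111": a single bit error is corrected
```

Since the repetition code has minimum distance 3, it corrects any single error, illustrating the general rule that a code of minimum distance d corrects ⌊(d−1)/2⌋ errors.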
Bounds on Codes: Orthogonality relation, Encoding of linear codes, Decoding of linear codes, Singleton bound and maximum distance separable codes, Sphere-packing bound and perfect codes, Gilbert–Varshamov bound, MacWilliams’ identities.
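The Singleton and sphere-packing bounds above can be checked numerically for given code parameters. A brief sketch (illustrative helper names, not from the notes): an [n, k, d] linear code over a q-ary alphabet must satisfy d ≤ n − k + 1 (Singleton) and Σ_{i=0}^{t} C(n, i)(q−1)^i ≤ q^{n−k} with t = ⌊(d−1)/2⌋ (sphere-packing); codes meeting the latter with equality are perfect.

```python
from math import comb

def satisfies_singleton(n: int, k: int, d: int) -> bool:
    """Singleton bound for an [n, k, d] code: d <= n - k + 1."""
    return d <= n - k + 1

def satisfies_sphere_packing(n: int, k: int, d: int, q: int = 2) -> bool:
    """Sphere-packing (Hamming) bound:
    sum_{i=0}^{t} C(n, i) * (q-1)^i <= q^(n-k), where t = (d-1) // 2."""
    t = (d - 1) // 2
    ball_volume = sum(comb(n, i) * (q - 1) ** i for i in range(t + 1))
    return ball_volume <= q ** (n - k)

# The binary [7, 4, 3] Hamming code: ball volume 1 + 7 = 8 equals 2^(7-4),
# so it meets the sphere-packing bound with equality and is perfect.
print(satisfies_singleton(7, 4, 3))       # True
print(satisfies_sphere_packing(7, 4, 3))  # True
```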