1. Basic notions from probability: probability space, probability measure, conditional, joint, and marginal probability, independent events, random variables and related notions.
2. Basic notions of information theory: entropy, interpretations of entropy, basic properties, joint entropy, conditional entropy.
3. Further notions of information theory: divergence and its applications, mutual information, the asymptotic equipartition property (AEP); a short computational sketch of entropy and mutual information follows the outline.
4. Introduction to generalized information theory: monotone measures and some of their special cases (imprecise probabilities, possibility theory, Dempster-Shafer theory), and uncertainty and information measures for these frameworks.
5. Selected applications of information theory. Optimal codes: uniquely decipherable codes, prefix codes, the Kraft inequality, the McMillan inequality, Shannon's theorem on noiseless coding, block coding, and the Huffman code, its construction and optimality (a construction sketch follows the outline). Decision trees as an application of information theory.
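As an illustration of topics 2 and 3, here is a minimal Python sketch of entropy and mutual information computed from a finite joint distribution. The function names and the dict-based representation of the joint distribution are illustrative choices, not part of the course materials.

```python
import math

def entropy(p):
    """Shannon entropy in bits; 0 * log(0) is taken as 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution
    given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p   # marginal of X
        py[y] = py.get(y, 0.0) + p   # marginal of Y
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# A fair coin carries one bit of entropy; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469

# Two independent fair coins share no information.
joint = {(x, y): 0.25 for x in "HT" for y in "HT"}
print(mutual_information(joint))   # 0.0
```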
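For topic 5, a minimal sketch of the greedy Huffman construction over binary codewords, assuming symbol frequencies are given as a dict; the helper below is a hypothetical illustration, not a reference implementation. The final line checks that the resulting codeword lengths satisfy the Kraft inequality (for a Huffman code the sum equals 1, since the code tree is complete).

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a binary Huffman code from symbol frequencies by
    repeatedly merging the two least frequent subtrees."""
    # Heap entries: (weight, tiebreak, {symbol: codeword-so-far});
    # the tiebreak integer keeps the heap from comparing dicts.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        # Prepend 0 to one subtree's codewords and 1 to the other's.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

freqs = Counter("abracadabra")   # a:5, b:2, r:2, c:1, d:1
code = huffman_code(dict(freqs))
print(code)                      # e.g. a -> '0', the rest -> 3-bit words
print(sum(2 ** -len(w) for w in code.values()))   # Kraft sum: 1.0
```

The greedy merge of the two least frequent subtrees is exactly the step whose optimality among prefix codes the course establishes.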
- Ash R. (1965). Information Theory. Dover, New York.
- Han T. S., Kobayashi K. (2002). Mathematics of Information and Coding. AMS, Providence, Rhode Island.
- Klir G. J. (2006). Uncertainty and Information. Foundations of Generalized Information Theory. J. Wiley, Hoboken, New Jersey.
- Pierce J. R. (1980). An Introduction to Information Theory. Symbols, Signals and Noise. Dover, New York.