6. Rate-Distortion Theory
  Lossy Compression Fundamentals
    Motivation for Lossy Compression
      Limitations of Lossless Methods
      Perceptual Considerations
      Application Requirements
    Distortion Measures
      Hamming Distortion
      Squared Error Distortion
      Absolute Error Distortion
      Perceptual Distortion Measures
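The three standard per-letter distortion measures listed above are easy to state concretely. A minimal sketch (the function names are this edit's own, not from the text), each averaging the per-letter distortion over a block:

```python
import numpy as np

def hamming_distortion(x, x_hat):
    """d(x, x̂) = 0 if x = x̂, else 1; averaged over the block."""
    return np.mean(np.asarray(x) != np.asarray(x_hat))

def squared_error_distortion(x, x_hat):
    """d(x, x̂) = (x - x̂)²; averaged over the block, i.e. the MSE."""
    return np.mean((np.asarray(x, float) - np.asarray(x_hat, float)) ** 2)

def absolute_error_distortion(x, x_hat):
    """d(x, x̂) = |x - x̂|; averaged over the block."""
    return np.mean(np.abs(np.asarray(x, float) - np.asarray(x_hat, float)))
```

Perceptual measures (the fourth item above) have no comparably simple closed form; they weight errors by a model of human perception rather than by magnitude alone.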
  Rate-Distortion Function
    Definition and Properties
      Convexity
      Monotonicity
      Continuity
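For reference, the definition these properties attach to is the standard one (the notation here is supplied by this edit):

```latex
% Rate-distortion function of a source X ~ p(x) under a per-letter
% distortion measure d(x, \hat{x}): the least mutual information over
% all test channels p(\hat{x}|x) meeting the distortion budget D.
R(D) \;=\; \min_{\substack{p(\hat{x}\mid x)\,:\\ \mathbb{E}[d(X,\hat{X})]\,\le\, D}} I(X;\hat{X})
```

Monotonicity and convexity follow directly from this form: enlarging D enlarges the feasible set of test channels, so R(D) is non-increasing, and time-sharing between two test channels shows R(D) is convex; convexity on an interval in turn gives continuity.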
    Computing Rate-Distortion Functions
      Blahut-Arimoto Algorithm
      Analytical Solutions
      Parametric Representations
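The Blahut-Arimoto iteration named above admits a very short implementation. A sketch (function name, iteration count, and the slope parameterization are this edit's assumptions): for a source distribution p(x), distortion matrix d[x, x̂], and slope parameter s < 0, it alternates between the optimal test channel for the current reproduction marginal and the marginal induced by that channel, tracing out one (R, D) point per value of s.

```python
import numpy as np

def blahut_arimoto(p, d, s, iters=500):
    """One point of the R(D) curve at slope parameter s < 0."""
    n_xhat = d.shape[1]
    q = np.full(n_xhat, 1.0 / n_xhat)            # reproduction marginal q(x̂)
    for _ in range(iters):
        # Optimal test channel for current q: Q(x̂|x) ∝ q(x̂)·exp(s·d(x,x̂))
        Q = q * np.exp(s * d)
        Q /= Q.sum(axis=1, keepdims=True)
        q = p @ Q                                 # marginal induced by Q
    D = np.sum(p[:, None] * Q * d)                # expected distortion
    R = np.sum(p[:, None] * Q * np.log2(Q / q))   # mutual information, bits
    return R, D
```

As a sanity check, for a Bernoulli(½) source with Hamming distortion each s < 0 lands on D = e^s/(1 + e^s) with R = 1 - H_b(D), matching the analytical solution.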
    Rate-Distortion for Specific Sources
      Bernoulli Source
      Gaussian Source
      Uniform Source
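Two of the sources above have well-known closed forms: the Bernoulli(p) source under Hamming distortion has R(D) = H(p) − H(D) for 0 ≤ D ≤ min(p, 1−p), and the Gaussian source with variance σ² under squared error has R(D) = ½·log₂(σ²/D) for 0 < D ≤ σ². (The uniform source is typically handled through bounds such as the Shannon lower bound rather than a closed form.) A sketch, with helper names chosen by this edit:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rd_bernoulli(p, D):
    """R(D) = H(p) - H(D) for 0 <= D <= min(p, 1-p), Hamming distortion."""
    if D >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(D)

def rd_gaussian(var, D):
    """R(D) = 0.5 * log2(var / D) for 0 < D <= var, squared-error distortion."""
    if D >= var:
        return 0.0
    return 0.5 * math.log2(var / D)
```

For example, compressing a unit-variance Gaussian to distortion D = 0.25 costs rd_gaussian(1.0, 0.25) = 1 bit per sample.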
  Rate-Distortion Theorem
    Achievability Proof
      Random Coding Argument
      Joint Typicality
      Covering Lemma
    Converse Proof
      Information-Theoretic Bounds
      Operational Significance
  Quantization Theory
    Scalar Quantization
      Uniform Quantizers
      Non-uniform Quantizers
      Lloyd-Max Quantizers
      Companding
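The Lloyd-Max design named above alternates its two optimality conditions: decision boundaries at midpoints between reproduction levels (nearest-neighbor condition) and reproduction levels at the conditional means of their cells (centroid condition). A training-sample sketch (function name, initialization, and iteration count are this edit's choices):

```python
import numpy as np

def lloyd_scalar(samples, levels, iters=100):
    """Design a `levels`-point scalar quantizer on training samples."""
    x = np.sort(np.asarray(samples, float))
    # Initialize reproduction levels at evenly spaced sample quantiles.
    reps = np.quantile(x, (np.arange(levels) + 0.5) / levels)
    for _ in range(iters):
        # Nearest-neighbor condition: boundaries at level midpoints.
        bounds = (reps[:-1] + reps[1:]) / 2
        cells = np.searchsorted(bounds, x)
        # Centroid condition: each level moves to its cell's mean.
        for k in range(levels):
            members = x[cells == k]
            if members.size:
                reps[k] = members.mean()
        reps.sort()
    bounds = (reps[:-1] + reps[1:]) / 2
    mse = np.mean((x - reps[np.searchsorted(bounds, x)]) ** 2)
    return reps, mse
```

Each iteration can only lower (never raise) the mean squared error, which is why the alternation converges to a locally optimal quantizer.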
    Vector Quantization
      Codebook Design
      Nearest Neighbor Encoding
      Centroid Conditions
      LBG Algorithm
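The LBG algorithm ties the items above together: starting from the global centroid, it repeatedly splits every codeword into a perturbed pair and refines the doubled codebook with generalized Lloyd iterations (nearest-neighbor encoding, then the centroid condition). A compact sketch; the perturbation size and iteration count are this edit's own choices:

```python
import numpy as np

def lbg(train, codebook_size, eps=0.01, iters=20):
    """Design a vector-quantizer codebook by splitting and Lloyd refinement."""
    train = np.asarray(train, float)
    codebook = train.mean(axis=0, keepdims=True)   # start from the global centroid
    while len(codebook) < codebook_size:
        # Split every codeword into a perturbed pair, doubling the codebook.
        codebook = np.vstack([codebook + eps, codebook - eps])
        for _ in range(iters):
            # Nearest-neighbor encoding of each training vector.
            d2 = ((train[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
            nearest = d2.argmin(axis=1)
            # Centroid condition: move each codeword to its cell's mean.
            for k in range(len(codebook)):
                cell = train[nearest == k]
                if len(cell):
                    codebook[k] = cell.mean(axis=0)
    return codebook
```

On well-separated training clusters the splitting schedule typically places one codeword per cluster, which is the behavior the centroid and nearest-neighbor conditions jointly characterize as locally optimal.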
    Entropy-Constrained Quantization
      Optimal Quantizer Design
      High-Rate Approximations
      Operational Rate-Distortion Performance
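The high-rate approximations above can be checked numerically: for a small step size Δ, a uniform quantizer's mean squared error approaches Δ²/12, and its output entropy is about h(X) − log₂Δ bits. A sketch under this edit's own choice of source and step size:

```python
import numpy as np

def uniform_quantize(x, delta):
    """Map x to the midpoint of its length-delta quantization cell."""
    return delta * np.floor(x / delta) + delta / 2

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200_000)   # Uniform(0, 1) source, so h(X) = 0
delta = 1 / 64                        # 64 cells on [0, 1): a 6-bit quantizer
mse = np.mean((x - uniform_quantize(x, delta)) ** 2)

# Empirical output entropy; the high-rate prediction is -log2(delta) = 6 bits.
_, counts = np.unique(np.floor(x / delta), return_counts=True)
probs = counts / counts.sum()
entropy_bits = -np.sum(probs * np.log2(probs))
```

Comparing `mse` with Δ²/12 and `entropy_bits` with 6 illustrates the operational rate-distortion performance of a simple entropy-coded uniform quantizer at high rate.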