6. Rate-Distortion Theory
6.1. Lossy Compression Fundamentals
6.1.1. Motivation for Lossy Compression
6.1.1.1. Limitations of Lossless Methods
6.1.1.2. Perceptual Considerations
6.1.1.3. Application Requirements
6.1.2. Distortion Measures
6.1.2.1. Hamming Distortion
6.1.2.2. Squared Error Distortion
6.1.2.3. Absolute Error Distortion
6.1.2.4. Perceptual Distortion Measures
6.2. Rate-Distortion Function
6.2.1. Definition and Properties
6.2.1.1. Convexity
6.2.1.2. Monotonicity
6.2.1.3. Continuity
6.2.2. Computing Rate-Distortion Functions
6.2.2.1. Blahut-Arimoto Algorithm
6.2.2.2. Analytical Solutions
6.2.2.3. Parametric Representations
6.2.3. Rate-Distortion for Specific Sources
6.2.3.1. Bernoulli Source
6.2.3.2. Gaussian Source
6.2.3.3. Uniform Source
6.3. Rate-Distortion Theorem
6.3.1. Achievability Proof
6.3.1.1. Random Coding Argument
6.3.1.2. Joint Typicality
6.3.1.3. Covering Lemma
6.3.2. Converse Proof
6.3.2.1. Information-Theoretic Bounds
6.3.2.2. Operational Significance
6.4. Quantization Theory
6.4.1. Scalar Quantization
6.4.1.1. Uniform Quantizers
6.4.1.2. Non-uniform Quantizers
6.4.1.3. Lloyd-Max Quantizers
6.4.1.4. Companding
6.4.2. Vector Quantization
6.4.2.1. Codebook Design
6.4.2.2. Nearest Neighbor Encoding
6.4.2.3. Centroid Conditions
6.4.2.4. LBG Algorithm
6.4.3. Entropy-Constrained Quantization
6.4.3.1. Optimal Quantizer Design
6.4.3.2. High-Rate Approximations
6.4.3.3. Operational Rate-Distortion Performance