Information Theory in Continuous Settings
Differential Entropy
Definition for Continuous Variables
Relationship to Discrete Entropy
Dependence on Coordinate System
Negative Values
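For reference, with $f$ denoting the density of $X$ and $\Delta$ a quantization step (notation chosen here for illustration), differential entropy and its link to discrete entropy are

$$
h(X) = -\int f(x)\,\ln f(x)\,dx \quad\text{(nats)},
\qquad
H\bigl(X^{\Delta}\bigr) \approx h(X) - \ln \Delta \quad \text{as } \Delta \to 0 .
$$

The second relation is why $h(X)$ is not simply the limit of discrete entropies, why it depends on the coordinate system (rescaling $x$ rescales $\Delta$), and why it can be negative: a uniform density on an interval of width less than one already gives $h(X) = \ln(\text{width}) < 0$.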
Properties and Limitations
Translation Invariance
Scaling Properties
Maximum Differential Entropy
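Compact statements of these properties, for a variable $X$ with variance $\sigma^2$:

$$
h(X + c) = h(X),
\qquad
h(aX) = h(X) + \ln|a|,
\qquad
h(X) \le \tfrac{1}{2}\ln\bigl(2\pi e\,\sigma^2\bigr),
$$

with equality in the bound if and only if $X$ is Gaussian. Under a finite-support constraint the uniform distribution is the maximizer instead, and under a mean constraint on nonnegative variables the exponential distribution is.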
Differential Entropy of Common Distributions
Uniform Distribution
Gaussian Distribution
Exponential Distribution
Laplacian Distribution
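Closed forms for the four distributions above, in nats ($\lambda$ is the exponential rate, $\beta$ the Laplacian scale):

$$
\begin{aligned}
\text{Uniform on }[a,b]: \quad & h = \ln(b-a)\\
\text{Gaussian }\mathcal{N}(\mu,\sigma^2): \quad & h = \tfrac{1}{2}\ln\bigl(2\pi e\,\sigma^2\bigr)\\
\text{Exponential, rate }\lambda: \quad & h = 1 - \ln\lambda\\
\text{Laplacian, scale }\beta: \quad & h = 1 + \ln(2\beta)
\end{aligned}
$$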
Continuous Mutual Information
Definition and Properties
Invariance Under Invertible Transformations
Data Processing Inequality
Chain Rule
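Continuous mutual information keeps the discrete definition and, unlike differential entropy, is coordinate-free:

$$
I(X;Y) = h(X) - h(X \mid Y) = h(X) + h(Y) - h(X,Y) \;\ge\; 0,
\qquad
I\bigl(f(X);\,g(Y)\bigr) = I(X;Y)
$$

for invertible (smooth) maps $f, g$. The data processing inequality $I(X;Z) \le I(X;Y)$ for a Markov chain $X \to Y \to Z$ and the chain rule $I(X;Y,Z) = I(X;Y) + I(X;Z \mid Y)$ carry over from the discrete case unchanged.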
Estimation from Data
Histogram Methods
Kernel Density Estimation
k-Nearest Neighbor Methods
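As an illustration of the k-nearest-neighbor approach, here is a minimal sketch of the Kozachenko-Leonenko estimator in Python; the function name and the Gaussian sanity check are illustrative choices, not prescribed by the outline above.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_differential_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats.

    samples : array of shape (n_samples, n_dims)
    """
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbour; query k+1 points
    # because the closest match returned by the tree is the point itself.
    dist, _ = cKDTree(x).query(x, k=k + 1)
    eps = dist[:, -1]
    # Log-volume of the unit ball in d dimensions: pi^(d/2) / Gamma(d/2 + 1).
    log_unit_ball = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1)
    return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(eps))

# Sanity check: a standard 1-D Gaussian has h = 0.5 * ln(2*pi*e) ≈ 1.419 nats.
rng = np.random.default_rng(0)
print(knn_differential_entropy(rng.normal(size=(5000, 1)), k=3))
```

Histogram and kernel-density estimators follow the same pattern (estimate the density, then plug it into the definition), but the k-NN form avoids choosing a bin width or bandwidth and is often preferred in more than a few dimensions.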
Gaussian Channels and Capacity
AWGN Channel Model
Signal and Noise Characteristics
Power Constraints
Bandwidth Constraints
Capacity Calculations
Single-Input Single-Output Channels
Multiple-Input Multiple-Output Channels
Parallel Gaussian Channels
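The standard formulas behind these headings, stated under common assumptions ($P$ is transmit power, $N$ the noise variance per channel use, $W$ bandwidth in Hz, $N_0/2$ the two-sided noise power spectral density, $\mathbf{H}$ an $N_r \times N_t$ channel matrix with equal power split across the $N_t$ transmit antennas and per-antenna noise variance $\sigma^2$):

$$
C_{\text{SISO}} = \tfrac{1}{2}\log_2\!\Bigl(1 + \tfrac{P}{N}\Bigr)\ \text{bits/use},
\qquad
C_{\text{band-limited}} = W \log_2\!\Bigl(1 + \tfrac{P}{N_0 W}\Bigr)\ \text{bits/s},
\qquad
C_{\text{MIMO}} = \log_2 \det\!\Bigl(\mathbf{I}_{N_r} + \tfrac{P}{N_t \sigma^2}\,\mathbf{H}\mathbf{H}^{\mathsf{H}}\Bigr).
$$

For parallel Gaussian channels the capacity is the sum of per-channel terms $\tfrac{1}{2}\log_2(1 + P_i/N_i)$ with the powers $P_i$ chosen by water-filling, as sketched after the next group of headings.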
Optimal Input Distributions
Gaussian Inputs for AWGN
Water-Filling Solutions
Capacity-Achieving Strategies
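A minimal water-filling sketch for parallel Gaussian sub-channels; the bisection on the water level and the function name are illustrative assumptions rather than a reference implementation.

```python
import numpy as np

def water_filling(noise_powers, total_power, tol=1e-9):
    """Water-filling power allocation for parallel Gaussian sub-channels.

    noise_powers : per-channel noise variances N_i
    total_power  : transmit power budget to distribute, sum(P_i) = total_power
    Returns (per-channel powers P_i, capacity in bits per channel use).
    """
    noise = np.asarray(noise_powers, dtype=float)
    # Bisect on the water level mu; the optimal allocation is P_i = max(mu - N_i, 0).
    lo, hi = noise.min(), noise.max() + total_power
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - noise, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    power = np.maximum(0.5 * (lo + hi) - noise, 0.0)
    capacity = 0.5 * np.sum(np.log2(1.0 + power / noise))
    return power, capacity

# Three sub-channels with unequal noise and a power budget of 10:
# the noisiest channel is left unused at this budget.
alloc, cap = water_filling([1.0, 4.0, 9.0], total_power=10.0)
print(alloc, cap)
```

The same allocation applied to the eigenmodes of $\mathbf{H}\mathbf{H}^{\mathsf{H}}$ yields the capacity-achieving input covariance for a MIMO channel known at the transmitter; for a scalar AWGN channel the optimal input is simply Gaussian at full power.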