Information Theory
1. Foundational Concepts of Information
2. Entropy and Information Measures
3. Source Coding and Lossless Compression
4. Channel Capacity and Noisy Communication
5. Error Control Coding
6. Rate-Distortion Theory
7. Information Theory in Continuous Settings
8. Network Information Theory
9. Advanced Topics and Applications
Channel Capacity and Noisy Communication
Channel Models
Discrete Memoryless Channels
Input and Output Alphabets
Transition Probabilities
Channel Matrix Representation
Specific Channel Types
Binary Symmetric Channel
Binary Erasure Channel
Z-Channel
q-ary Symmetric Channel
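The channel types above are all defined by their transition probability matrices. As a minimal sketch (function names are illustrative), the matrices for the BSC, BEC, and Z-channel can be written out directly, with rows indexing inputs and columns indexing outputs:

```python
import numpy as np

def bsc_matrix(p):
    """Binary symmetric channel: each bit flips with probability p."""
    return np.array([[1 - p, p],
                     [p, 1 - p]])

def bec_matrix(eps):
    """Binary erasure channel: outputs are {0, erasure, 1};
    each input symbol is erased with probability eps."""
    return np.array([[1 - eps, eps, 0.0],
                     [0.0, eps, 1 - eps]])

def z_channel_matrix(p):
    """Z-channel: input 0 passes noiselessly; input 1 flips to 0
    with probability p."""
    return np.array([[1.0, 0.0],
                     [p, 1 - p]])

# Each row is a conditional distribution W(y|x), so every row sums to 1.
```

The row-stochastic property (rows summing to 1) is what makes these valid discrete memoryless channel descriptions.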
Channel Characteristics
Noise Models
Channel Memory
Time-Varying Channels
Channel Capacity
Definition as Maximum Mutual Information
Capacity-Achieving Input Distributions
Computing Capacity
Analytical Methods
Numerical Optimization
Blahut-Arimoto Algorithm
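When no closed form is available, capacity can be computed numerically. A sketch of the Blahut-Arimoto fixed-point iteration is below (the function name and iteration count are illustrative, not from the source):

```python
import numpy as np

def blahut_arimoto(W, iters=200):
    """Capacity (bits per channel use) of a discrete memoryless channel
    with transition matrix W (rows = inputs, columns = outputs)."""
    def divergences(p):
        # D( W(.|x) || q ) in bits for each input x, where q = p W
        q = p @ W
        terms = np.where(W > 0, W * np.log2(np.where(W > 0, W, 1.0) / q), 0.0)
        return terms.sum(axis=1)

    m = W.shape[0]
    p = np.full(m, 1.0 / m)        # start from the uniform input distribution
    for _ in range(iters):
        d = divergences(p)
        p = p * np.exp2(d)         # multiplicative reweighting toward capacity
        p /= p.sum()
    return float(np.sum(p * divergences(p)))   # I(X;Y) at the final p

# BSC with crossover 0.1: capacity is 1 - H2(0.1) ≈ 0.531 bits
W_bsc = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
cap = blahut_arimoto(W_bsc)
```

For symmetric channels the uniform input is already optimal, so the iteration converges immediately; for general channels it converges monotonically to capacity.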
Capacity of Specific Channels
BSC Capacity Calculation
BEC Capacity Calculation
Symmetric Channel Capacity
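The BSC and BEC capacities have well-known closed forms, which can be sketched directly (helper names are illustrative):

```python
import math

def h2(p):
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """BSC with crossover probability p: C = 1 - H2(p),
    achieved by the uniform input distribution."""
    return 1.0 - h2(p)

def bec_capacity(eps):
    """BEC with erasure probability eps: C = 1 - eps,
    since a fraction eps of symbols carries no information."""
    return 1.0 - eps
```

At p = 0.5 the BSC output is independent of the input and capacity drops to zero, while a noiseless BSC (p = 0) carries a full bit per use.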
Noisy-Channel Coding Theorem
Statement and Interpretation
Achievability Proof
Random Coding Argument
Typical Set Decoding
Error Probability Analysis
Converse Proof
Fano's Inequality
Information-Theoretic Bounds
Implications for Communication System Design
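The converse proof rests on Fano's inequality, which bounds the residual uncertainty about the message given the decoder's estimate. A small numeric sketch of the bound (function names are illustrative):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def fano_bound(p_error, alphabet_size):
    """Fano's inequality: H(X | X_hat) <= H2(Pe) + Pe * log2(|X| - 1).
    If the error probability Pe is small, the conditional entropy must
    also be small -- the key step in the converse of the coding theorem."""
    return h2(p_error) + p_error * math.log2(max(alphabet_size - 1, 1))
```

Zero error probability forces the bound to zero, so reliable decoding leaves essentially no uncertainty about the transmitted message.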
Continuous Channels
Additive White Gaussian Noise Channel
Channel Model
Power Constraints
Bandwidth Limitations
Shannon-Hartley Theorem
Capacity Formula
Signal-to-Noise Ratio Trade-offs
Bandwidth-Power Trade-offs
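The Shannon-Hartley formula C = B log2(1 + S/N) can be evaluated directly. A minimal sketch, using an illustrative telephone-line example (3 kHz bandwidth, 30 dB SNR):

```python
import math

def shannon_hartley(bandwidth_hz, snr_linear):
    """AWGN channel capacity in bits per second:
    C = B * log2(1 + S/N), with S/N as a linear power ratio."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# 30 dB SNR corresponds to a linear power ratio of 1000
snr = 10 ** (30 / 10)
c = shannon_hartley(3000.0, snr)   # ≈ 29,902 bits/s
```

Note the trade-off the formula exposes: capacity grows linearly in bandwidth but only logarithmically in signal power.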
Water-Filling Principle
Parallel Gaussian Channels
Optimal Power Allocation
Frequency-Selective Channels
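Water-filling allocates power P_i = max(mu - N_i, 0) across parallel Gaussian subchannels, with the "water level" mu chosen so the allocations use the full power budget. A sketch that finds mu by bisection (function names are illustrative):

```python
import numpy as np

def water_filling(noise_powers, total_power):
    """Optimal power allocation over parallel Gaussian channels:
    P_i = max(mu - N_i, 0), with mu set so that sum(P_i) = total_power."""
    N = np.asarray(noise_powers, dtype=float)
    lo, hi = N.min(), N.max() + total_power   # mu lies in this interval
    for _ in range(100):                      # bisect on the water level
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - N, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - N, 0.0)

def parallel_capacity(P, N):
    """Resulting capacity in bits per (real) channel use:
    sum over subchannels of (1/2) log2(1 + P_i / N_i)."""
    return 0.5 * float(np.sum(np.log2(1.0 + P / np.asarray(N, dtype=float))))

# Three subchannels with noise powers 1, 2, 3 and a total budget of 3:
# the water level settles at mu = 3, giving allocations [2, 1, 0].
P = water_filling([1.0, 2.0, 3.0], 3.0)
```

The noisiest subchannel receives no power at all here, which is the characteristic behavior of water-filling: power is poured into the "deepest" (lowest-noise) channels first.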