Information Theory

  1. Entropy and Information Measures
    1. Shannon Entropy
      1. Definition and Mathematical Formula (see the formula sketch after this outline)
        1. Intuitive Interpretation
          1. Average Uncertainty
          2. Average Information Content
          3. Measure of Randomness
      2. Properties of Entropy
        1. Non-negativity
        2. Maximum Entropy Principle
        3. Concavity
        4. Continuity
        5. Symmetry
      3. Entropy of Common Distributions
        1. Uniform Distribution
        2. Bernoulli Distribution
        3. Geometric Distribution
        4. Poisson Distribution
      4. Entropy in Different Bases
        1. Binary Entropy Function
        2. Natural Entropy
        3. Entropy Rate
    2. Joint and Conditional Entropy
      1. Joint Entropy
        1. Definition for Multiple Variables
        2. Properties and Bounds
        3. Subadditivity
      2. Conditional Entropy
        1. Definition and Interpretation
        2. Chain Rule for Entropy
        3. Properties of Conditional Entropy
        4. Conditioning Reduces Entropy
      3. Entropy Relationships
        1. Decomposition of Joint Entropy
        2. Venn Diagram Representation
        3. Information Diagrams
    3. Relative Entropy and Divergences
      1. Kullback-Leibler Divergence
        1. Definition and Formula
        2. Asymmetry Property
        3. Non-negativity (Gibbs' Inequality)
        4. Applications in Statistics
      2. Cross-Entropy
        1. Definition and Relationship to KL Divergence
        2. Applications in Machine Learning
        3. Cross-Entropy Loss Functions
      3. Jensen-Shannon Divergence
        1. Symmetric Divergence Measure
        2. Relationship to Mutual Information
        3. Bounded Nature
      4. f-Divergences
        1. General Framework
        2. Special Cases
        3. Properties and Applications
  2. Mutual Information
    1. Definition and Interpretation
      1. Information Shared Between Variables
      2. Reduction in Uncertainty
      3. Symmetric Property
    2. Properties of Mutual Information
      1. Non-negativity
      2. Symmetry
      3. Data Processing Inequality
      4. Chain Rule for Mutual Information
    3. Conditional Mutual Information
      1. Definition and Properties
      2. Three-Way Information Decomposition
      3. Interaction Information
    4. Multivariate Mutual Information
      1. Total Correlation
      2. Dual Total Correlation
      3. Information Bottleneck Method
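
For quick reference, here is a minimal formula sketch of the core quantities named in the outline, stated for discrete random variables X and Y with probability mass functions p and q. The symbols, the annotations, and the shared-base convention for the logarithm are assumptions of this sketch, not fixed by the outline.

```latex
% Minimal reference sheet for the quantities named in the outline.
% Assumes discrete X, Y with p.m.f.s p and q; all logarithms share one base
% (base 2 gives bits, base e gives nats).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align*}
  H(X)   &= -\sum_{x} p(x)\log p(x)
         && \text{Shannon entropy}\\
  H(X,Y) &= -\sum_{x,y} p(x,y)\log p(x,y)
         && \text{joint entropy}\\
  H(Y \mid X) &= H(X,Y) - H(X)
         && \text{conditional entropy (chain rule)}\\
  D_{\mathrm{KL}}(p \,\|\, q) &= \sum_{x} p(x)\log\frac{p(x)}{q(x)} \;\ge\; 0
         && \text{relative entropy (Gibbs' inequality)}\\
  H(p,q) &= -\sum_{x} p(x)\log q(x) = H(p) + D_{\mathrm{KL}}(p \,\|\, q)
         && \text{cross-entropy}\\
  \mathrm{JSD}(p \,\|\, q) &= \tfrac12 D_{\mathrm{KL}}(p \,\|\, m)
         + \tfrac12 D_{\mathrm{KL}}(q \,\|\, m), \quad m = \tfrac12(p+q)
         && \text{Jensen--Shannon divergence}\\
  I(X;Y) &= H(X) + H(Y) - H(X,Y)
         = D_{\mathrm{KL}}\big(p(x,y) \,\|\, p(x)\,p(y)\big)
         && \text{mutual information}
\end{align*}
\end{document}
```

The choice of logarithm base only rescales these quantities, which is what the "Entropy in Different Bases" items refer to; the Bernoulli special case of H(X) in base 2 is the binary entropy function.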