Total variation distance of probability measures

In probability theory, the total variation distance is a distance measure for probability distributions. It is an example of a statistical distance metric, and is sometimes called the statistical distance, statistical difference, or variational distance.
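For two discrete distributions, the total variation distance reduces to half the L1 distance between their probability vectors. A minimal sketch, assuming both distributions are given as probability lists over the same finite support:

```python
def total_variation(p, q):
    """TV(P, Q) = (1/2) * sum over x of |P(x) - Q(x)|."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(total_variation(p, q))  # ~0.1
```

Because each term |P(x) - Q(x)| is at most P(x) + Q(x), the result always lies in [0, 1].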

Kullback–Leibler divergence

In mathematical statistics, the Kullback–Leibler divergence (also called relative entropy and I-divergence), denoted D_KL(P ‖ Q), is a type of statistical distance: a measure of how one probability distribution P differs from a second, reference probability distribution Q.
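For discrete distributions, D_KL(P ‖ Q) is the sum of P(x) · log(P(x)/Q(x)) over the support of P. A minimal sketch, assuming Q assigns positive probability wherever P does:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum over x of P(x) * log(P(x) / Q(x)).

    Assumes q[i] > 0 wherever p[i] > 0; terms with p[i] == 0
    contribute 0 by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ~0.5108 nats
```

Using math.log2 instead of math.log would give the result in bits rather than nats.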

Divergence (statistics)

In information geometry, a divergence is a kind of statistical distance: a binary function which establishes the separation from one probability distribution to another on a statistical manifold. Unlike a metric, a divergence is not required to be symmetric or to satisfy the triangle inequality.
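The asymmetry that distinguishes a divergence from a metric is easy to see with the Kullback–Leibler divergence; a minimal sketch for discrete distributions:

```python
import math

def kl(p, q):
    """D_KL(P || Q) for discrete distributions; assumes q[i] > 0 where p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p, q = [0.5, 0.5], [0.9, 0.1]
# A divergence need not be symmetric:
print(kl(p, q))  # ~0.5108
print(kl(q, p))  # ~0.3681
```

Both directions are nonnegative and vanish only when the two distributions coincide, but the two orderings generally give different values.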

Hellinger distance

In probability and statistics, the Hellinger distance (closely related to, although different from, the Bhattacharyya distance) is used to quantify the similarity between two probability distributions. It is a type of f-divergence.
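For discrete distributions, the Hellinger distance is the Euclidean distance between the element-wise square roots of the probability vectors, scaled by 1/√2 so that the result lies in [0, 1]. A minimal sketch:

```python
import math

def hellinger(p, q):
    """H(P, Q) = (1/sqrt(2)) * sqrt(sum over x of (sqrt(P(x)) - sqrt(Q(x)))^2)."""
    return math.sqrt(
        sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))
    ) / math.sqrt(2)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(hellinger(p, q))
```

Unlike the KL divergence, the Hellinger distance is symmetric and satisfies the triangle inequality, so it is a true metric.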

F-divergence

In probability theory, an f-divergence is a function that measures the difference between two probability distributions P and Q. Many common divergences, such as the KL-divergence, Hellinger distance, and total variation distance, are special cases of f-divergence.
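For discrete distributions, an f-divergence has the form D_f(P ‖ Q) = sum over x of Q(x) · f(P(x)/Q(x)), where f is a convex function with f(1) = 0. A minimal sketch showing how different choices of f recover the divergences above, assuming Q is strictly positive on the support:

```python
import math

def f_divergence(p, q, f):
    """D_f(P || Q) = sum over x of Q(x) * f(P(x) / Q(x)); assumes q[i] > 0."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

p, q = [0.5, 0.5], [0.9, 0.1]

# f(t) = t * log(t) recovers the KL divergence:
kl_via_f = f_divergence(p, q, lambda t: t * math.log(t))

# f(t) = |t - 1| / 2 recovers the total variation distance:
tv_via_f = f_divergence(p, q, lambda t: abs(t - 1) / 2)

print(kl_via_f)  # ~0.5108, matches D_KL(P || Q)
print(tv_via_f)  # 0.4, matches (1/2) * sum |p_i - q_i|
```

The convexity of f and f(1) = 0 together guarantee, via Jensen's inequality, that every f-divergence is nonnegative and vanishes when P = Q.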
