Category: Statistical laws

Law of total variance
In probability theory, the law of total variance (also called the variance decomposition formula, the conditional variance formula, or the law of iterated variances, and known informally as Eve's law) states that if X and Y are random variables on the same probability space, and the variance of Y is finite, then Var(Y) = E[Var(Y | X)] + Var(E[Y | X]).
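As a sanity check, the decomposition can be verified exactly for a small discrete case; the distributions below are invented purely for illustration:

```python
# Toy example (made-up numbers): X picks one of two conditional
# distributions for Y; Eve's law recovers Var(Y) from the pieces.
p_x = {0: 0.4, 1: 0.6}                      # marginal distribution of X
p_y_given_x = {0: {1: 0.5, 3: 0.5},         # conditional distributions of Y
               1: {2: 0.25, 6: 0.75}}

def mean(dist):
    return sum(v * p for v, p in dist.items())

def var(dist):
    m = mean(dist)
    return sum((v - m) ** 2 * p for v, p in dist.items())

# Marginal distribution of Y, obtained by mixing the conditionals
p_y = {}
for x, px in p_x.items():
    for y, pyx in p_y_given_x[x].items():
        p_y[y] = p_y.get(y, 0.0) + px * pyx

total_var = var(p_y)                                          # Var(Y)
e_cond_var = sum(p_x[x] * var(p_y_given_x[x]) for x in p_x)   # E[Var(Y|X)]
cond_means = {x: mean(p_y_given_x[x]) for x in p_x}
var_cond_mean = var({cond_means[x]: p_x[x] for x in p_x})     # Var(E[Y|X])

assert abs(total_var - (e_cond_var + var_cond_mean)) < 1e-12
```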
Law of total probability
In probability theory, the law (or formula) of total probability is a fundamental rule relating marginal probabilities to conditional probabilities. It expresses the total probability of an outcome which can be realized via several distinct events.
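A minimal sketch with invented numbers: a part comes from one of three suppliers (a partition of the sample space), each with a different defect rate, and the law gives the overall defect probability.

```python
# Hypothetical supplier shares and defect rates (illustration only)
p_supplier = {"A": 0.5, "B": 0.3, "C": 0.2}           # P(B_i), a partition
p_defect_given = {"A": 0.01, "B": 0.02, "C": 0.05}    # P(defect | B_i)

# Law of total probability: P(defect) = sum_i P(defect | B_i) * P(B_i)
p_defect = sum(p_supplier[s] * p_defect_given[s] for s in p_supplier)
```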
Wike's law of low odd primes
Wike's law of low odd primes is a methodological principle to help design sound experiments in psychology. It is: "If the number of experimental treatments is a low odd prime number, then the experimental design is unbalanced and partially confounded."
Safety in numbers
Safety in numbers is the hypothesis that, by being part of a large physical group or mass, an individual is less likely to be the victim of a mishap, accident, attack, or other bad event.
Taylor's law
Taylor's power law is an empirical law in ecology that relates the variance of the number of individuals of a species per unit area of habitat to the corresponding mean by a power-law relationship. It is named after the ecologist Lionel Roy Taylor, who proposed it in 1961.
Gompertz–Makeham law of mortality
The Gompertz–Makeham law states that the human death rate is the sum of an age-dependent component (the Gompertz function, named after Benjamin Gompertz), which increases exponentially with age, and an age-independent component (the Makeham term, named after William Makeham).
Statistical regularity
Statistical regularity is a notion in statistics and probability theory that random events exhibit regularity when repeated enough times, or that enough sufficiently similar random events exhibit regularity.
Bradford's law
Bradford's law is a pattern first described by Samuel C. Bradford in 1934 that estimates the exponentially diminishing returns of searching for references in science journals. One formulation is that if journals in a field are sorted by number of articles into three groups, each with about one-third of all articles, then the number of journals in each group will be proportional to 1:n:n².
Rank–size distribution
Rank–size distribution is the distribution of size by rank, in decreasing order of size. For example, if a data set consists of items of sizes 5, 100, 5, and 8, the rank–size distribution is 100, 8, 5, 5.
Law of averages
The law of averages is the commonly held belief that a particular outcome or event will, over certain periods of time, occur at a frequency that is similar to its probability. Depending on context or application, it can be considered a valid common-sense observation or a misunderstanding of probability.
Pareto principle
The Pareto principle states that for many outcomes, roughly 80% of consequences come from 20% of causes (the "vital few"). Other names for this principle are the 80/20 rule, the law of the vital few, and the principle of factor sparsity.
Zipf's law
Zipf's law (/zɪf/, German: [ts͡ɪpf]) is an empirical law formulated using mathematical statistics that refers to the fact that for many types of data studied in the physical and social sciences, the rank–frequency distribution is an inverse relation.
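A short sketch of the ideal case: with exponent s = 1, the frequency of the r-th ranked item is proportional to 1/r, so rank times frequency is constant. The vocabulary size of 5 below is arbitrary.

```python
# Normalized Zipfian frequencies over a vocabulary of n_words items.
def zipf_frequencies(n_words: int, s: float = 1.0):
    weights = [1 / r ** s for r in range(1, n_words + 1)]
    total = sum(weights)
    return [w / total for w in weights]

freqs = zipf_frequencies(5)
# For s = 1, rank * frequency is the same for every rank
products = [round((r + 1) * f, 10) for r, f in enumerate(freqs)]
assert len(set(products)) == 1
```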
Law of truly large numbers
The law of truly large numbers (a statistical adage), attributed to Persi Diaconis and Frederick Mosteller, states that with a large enough number of independent samples, any highly implausible (i.e. unlikely in any single sample, but with a constant probability strictly greater than 0 in any sample) result is likely to be observed.
Law of total cumulance
In probability theory and mathematical statistics, the law of total cumulance is a generalization to cumulants of the law of total probability, the law of total expectation, and the law of total variance.
Law of likelihood
The law of likelihood is a principle of statistical evidence: the degree to which observed evidence supports one hypothesis over another is measured by the ratio of the likelihoods the two hypotheses assign to that evidence.
Lotka's law
Lotka's law, named after Alfred J. Lotka, is one of a variety of special applications of Zipf's law. It describes the frequency of publication by authors in any given field. It states that the number of authors making n contributions is about 1/n^a of the number making a single contribution, where a nearly always equals two.
Hellin's law
Hellin's law, also called Hellin–Zeleny's law, is an empirical observation in demography that the approximate rate of multiple births is one n-tuple birth per 89^(n-1) singleton births: twin births occur about once per 89 births, triplets about once per 89^2 births, and quadruplets about once per 89^3 births.
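The rule is a one-line computation; the denominators it predicts are:

```python
# Hellin's law: one n-tuple birth per 89**(n - 1) births,
# for n = 2 (twins), 3 (triplets), 4 (quadruplets).
denominators = {n: 89 ** (n - 1) for n in (2, 3, 4)}
rates = {n: 1 / d for n, d in denominators.items()}
print(denominators)  # {2: 89, 3: 7921, 4: 704969}
```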
Law of total expectation
The proposition in probability theory known as the law of total expectation, the law of iterated expectations (LIE), Adam's law, the tower rule, and the smoothing theorem, among other names, states that if X is a random variable whose expected value E(X) is defined, and Y is any random variable on the same probability space, then E(X) = E(E(X | Y)).
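A minimal sketch with assumed numbers: Y selects one of two conditions, the conditional means of X are given, and the tower rule recovers E(X) as a weighted average.

```python
# Hypothetical two-stage setup (all values invented for illustration)
p_y = {"fair": 0.7, "loaded": 0.3}            # distribution of Y
e_x_given_y = {"fair": 10.0, "loaded": 40.0}  # E(X | Y = y)

# Law of total expectation: E(X) = sum_y P(Y = y) * E(X | Y = y)
e_x = sum(p_y[y] * e_x_given_y[y] for y in p_y)
```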
Twyman's law
Twyman's law states that "Any figure that looks interesting or different is usually wrong", following the principle that "the more unusual or interesting the data, the more likely they are to have been the result of an error of one kind or another".
Littlewood's law
Littlewood's law states that a person can expect to experience events with odds of one in a million (referred to as a "miracle") at the rate of about one per month. It was framed by the British mathematician John Edensor Littlewood.
Long tail
In statistics and business, a long tail of some distributions of numbers is the portion of the distribution having many occurrences far from the "head" or central part of the distribution. The distribution could involve popularities, random numbers of occurrences of events with various probabilities, and so on.
Power law
In statistics, a power law is a functional relationship between two quantities, where a relative change in one quantity results in a proportional relative change in the other quantity, independent of the initial size of those quantities: one quantity varies as a power of another.
Heaps' law
In linguistics, Heaps' law (also called Herdan's law) is an empirical law which describes the number of distinct words in a document (or set of documents) as a function of the document length (the so-called type–token relation).
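The law has the form V(n) = K * n^β, where V is the vocabulary size and n the number of tokens. The parameters below (K = 10, β = 0.5) are illustrative placeholders, not fitted values; for English text K is typically between 10 and 100 and β between 0.4 and 0.6.

```python
# Heaps' law with assumed illustrative parameters
def heaps_vocabulary(n_tokens: int, k: float = 10.0, beta: float = 0.5) -> float:
    """Predicted number of distinct words in a text of n_tokens words."""
    return k * n_tokens ** beta

v = heaps_vocabulary(1_000_000)  # 10 * 1000 = 10000.0 distinct words
```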
Law of total covariance
In probability theory, the law of total covariance, covariance decomposition formula, or conditional covariance formula states that if X, Y, and Z are random variables on the same probability space, and the covariance of X and Y is finite, then cov(X, Y) = E(cov(X, Y | Z)) + cov(E(X | Z), E(Y | Z)).
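As with the variance version, the identity can be checked exactly on a small discrete example; the joint distributions below are invented for illustration.

```python
# Toy check of cov(X, Y) = E[cov(X, Y | Z)] + cov(E[X|Z], E[Y|Z])
p_z = {0: 0.5, 1: 0.5}
p_xy_given_z = {                       # joint distribution of (X, Y) given Z
    0: {(0, 0): 0.5, (1, 1): 0.5},
    1: {(0, 1): 0.25, (1, 0): 0.25, (2, 2): 0.5},
}

def e(dist, f):
    """Expectation of f under a {outcome: probability} distribution."""
    return sum(f(k) * p for k, p in dist.items())

def cov(dist):
    """Covariance of a distribution over (x, y) pairs."""
    mx = e(dist, lambda xy: xy[0])
    my = e(dist, lambda xy: xy[1])
    return e(dist, lambda xy: (xy[0] - mx) * (xy[1] - my))

# Marginal joint distribution of (X, Y)
joint = {}
for z, pz in p_z.items():
    for xy, p in p_xy_given_z[z].items():
        joint[xy] = joint.get(xy, 0.0) + pz * p

lhs = cov(joint)                                            # cov(X, Y)
e_cond_cov = sum(p_z[z] * cov(p_xy_given_z[z]) for z in p_z)
cond_means = {z: (e(p_xy_given_z[z], lambda xy: xy[0]),
                  e(p_xy_given_z[z], lambda xy: xy[1])) for z in p_z}
cov_cond_means = cov({cond_means[z]: p_z[z] for z in p_z})

assert abs(lhs - (e_cond_cov + cov_cond_means)) < 1e-12
```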
Regression toward the mean
In statistics, regression toward the mean (also called reversion to the mean, and reversion to mediocrity) is the fact that if one sample of a random variable is extreme, the next sampling of the same random variable is likely to be closer to its mean.
Benford's law
Benford's law, also known as the Newcomb–Benford law, the law of anomalous numbers, or the first-digit law, is an observation that in many real-life sets of numerical data, the leading digit is likely to be small. In sets that obey the law, the digit 1 appears as the leading significant digit about 30% of the time, while 9 appears as the leading significant digit less than 5% of the time.
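A classic illustration: the leading digits of the powers of 2 track the Benford distribution P(d) = log10(1 + 1/d) closely. The cutoff of 1000 powers is an arbitrary choice.

```python
import math
from collections import Counter

# Count the leading decimal digit of 2**n for n = 1..1000
counts = Counter(int(str(2 ** n)[0]) for n in range(1, 1001))
observed = {d: counts[d] / 1000 for d in range(1, 10)}

# Benford's predicted first-digit probabilities
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

print(observed[1], round(benford[1], 4))  # both close to 0.301
```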
Empirical statistical laws
An empirical statistical law or (in popular terminology) a law of statistics represents a type of behaviour that has been found across a number of datasets and, indeed, across a range of types of data sets.
Law of the unconscious statistician
In probability theory and statistics, the law of the unconscious statistician, or LOTUS, is a theorem used to calculate the expected value of a function g(X) of a random variable X when one knows the probability distribution of X but does not know the distribution of g(X).
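In the discrete case the theorem reads E[g(X)] = Σ g(x) P(X = x); a minimal sketch for a fair six-sided die with g(x) = x²:

```python
# Distribution of X: a fair die
p_x = {x: 1 / 6 for x in range(1, 7)}

def g(x):
    return x ** 2

# LOTUS: sum g(x) * P(X = x) without ever deriving the distribution of g(X)
e_g = sum(g(x) * p for x, p in p_x.items())
# E[X^2] = (1 + 4 + 9 + 16 + 25 + 36) / 6 = 91/6
```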