Kernel-independent component analysis
In statistics, kernel-independent component analysis (kernel ICA) is an efficient algorithm for independent component analysis which estimates source components by optimizing a generalized variance contrast function based on representations in a reproducing kernel Hilbert space.
Expectation–maximization algorithm
In statistics, an expectation–maximization (EM) algorithm is an iterative method to find (local) maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models, where the model depends on unobserved latent variables.
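
As an illustration only, here is a minimal EM sketch in Python for a two-component one-dimensional Gaussian mixture; the initialisation, the iteration count, and the name em_gmm are arbitrary choices, not part of any standard implementation.

    import numpy as np

    def em_gmm(x, n_iter=50):
        x = np.asarray(x, dtype=float)
        # Crude starting values for weight, means, and variances (assumptions).
        w, mu1, mu2, v1, v2 = 0.5, x.min(), x.max(), x.var(), x.var()
        for _ in range(n_iter):
            # E-step: responsibility of component 1 for each observation.
            p1 = w * np.exp(-(x - mu1) ** 2 / (2 * v1)) / np.sqrt(2 * np.pi * v1)
            p2 = (1 - w) * np.exp(-(x - mu2) ** 2 / (2 * v2)) / np.sqrt(2 * np.pi * v2)
            r = p1 / (p1 + p2)
            # M-step: re-estimate the parameters from the weighted data.
            w = r.mean()
            mu1, mu2 = np.sum(r * x) / r.sum(), np.sum((1 - r) * x) / (1 - r).sum()
            v1 = np.sum(r * (x - mu1) ** 2) / r.sum()
            v2 = np.sum((1 - r) * (x - mu2) ** 2) / (1 - r).sum()
        return w, mu1, mu2, v1, v2
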
Banburismus
Banburismus was a cryptanalytic process developed by Alan Turing at Bletchley Park in Britain during the Second World War. It was used by Bletchley Park's Hut 8 to help break German Kriegsmarine (naval) messages enciphered on Enigma machines.
Ziggurat algorithm
The ziggurat algorithm is an algorithm for pseudo-random number sampling. Belonging to the class of rejection sampling algorithms, it relies on an underlying source of uniformly distributed random numbers, typically from a pseudo-random number generator, as well as precomputed tables.
Lander–Green algorithm
The Lander–Green algorithm is an algorithm, due to Eric Lander and Philip Green, for computing the likelihood of observed genotype data given a pedigree. It is appropriate for relatively small pedigrees and a large number of genetic markers.
Elston–Stewart algorithm
The Elston–Stewart algorithm is an algorithm for computing the likelihood of observed data on a pedigree assuming a general model under which specific genetic segregation, linkage and association models can be formulated.
Count-distinct problem
In computer science, the count-distinct problem (also known in applied mathematics as the cardinality estimation problem) is the problem of finding the number of distinct elements in a data stream with repeated elements.
Metropolis–Hastings algorithm
In statistics and statistical physics, the Metropolis–Hastings algorithm is a Markov chain Monte Carlo (MCMC) method for obtaining a sequence of random samples from a probability distribution from which direct sampling is difficult.
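
A minimal random-walk sketch of the idea in Python (the step size and the example target are illustrative assumptions): only the ratio of target densities appears in the acceptance test, so the normalising constant never has to be known.

    import numpy as np

    def metropolis_hastings(log_target, x0, n_samples, step=1.0, rng=None):
        rng = rng or np.random.default_rng()
        x, samples = x0, []
        for _ in range(n_samples):
            xp = x + step * rng.standard_normal()   # symmetric random-walk proposal
            # Accept with probability min(1, pi(x')/pi(x)).
            if np.log(rng.random()) < log_target(xp) - log_target(x):
                x = xp
            samples.append(x)
        return np.array(samples)

    # Example: sample a standard normal from its unnormalised log density.
    draws = metropolis_hastings(lambda t: -0.5 * t * t, x0=0.0, n_samples=10_000)
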
Yamartino method
The Yamartino method is an algorithm for calculating an approximation of the standard deviation of wind direction during a single pass through the incoming data.
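
A sketch of the single-pass computation, assuming directions given in radians (the function and variable names are mine): only running sums of sines and cosines are kept, so the data need never be stored.

    import math

    def yamartino_stdev(directions):
        """Single pass over wind directions given in radians."""
        n, s_sum, c_sum = 0, 0.0, 0.0
        for theta in directions:
            n += 1
            s_sum += math.sin(theta)
            c_sum += math.cos(theta)
        sa, ca = s_sum / n, c_sum / n
        eps = math.sqrt(max(0.0, 1.0 - (sa * sa + ca * ca)))
        # Yamartino's closed-form approximation to the directional stdev.
        return math.asin(eps) * (1.0 + (2.0 / math.sqrt(3.0) - 1.0) * eps ** 3)
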
Odds algorithm
The odds algorithm (or Bruss algorithm) is a mathematical method for computing optimal strategies for a class of problems that belong to the domain of optimal stopping problems. Their solution follows from the odds strategy, and the importance of the odds strategy lies in its optimality.
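
A sketch of the odds strategy in Python, assuming independent events with known success probabilities p[k] < 1, observed in order: sum the odds from the last event backwards until the sum reaches 1, then stop on the first success from that point on.

    def odds_algorithm(p):
        """p[k] < 1 is the success probability of the k-th event, in order."""
        rs = 0.0   # running sum of odds, accumulated from the last event back
        qs = 1.0   # running product of failure probabilities
        s = 0
        for k in range(len(p) - 1, -1, -1):
            q = 1.0 - p[k]
            rs += p[k] / q
            qs *= q
            if rs >= 1.0:
                s = k          # stop on the first success from index s onwards
                break
        win_prob = qs * rs     # probability of winning under the odds strategy
        return s, win_prob
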
Least mean squares filter
Least mean squares (LMS) algorithms are a class of adaptive filter used to mimic a desired filter by finding the filter coefficients that relate to producing the least mean square of the error signal (the difference between the desired and the actual signal).
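
A minimal LMS sketch; the filter length and step size mu are illustrative choices, not prescribed values.

    import numpy as np

    def lms_filter(x, d, n_taps=8, mu=0.01):
        """Adapt weights w so the filtered input x tracks the desired signal d."""
        x, d = np.asarray(x, float), np.asarray(d, float)
        w = np.zeros(n_taps)             # adaptive filter coefficients
        y = np.zeros(len(x))             # filter output
        e = np.zeros(len(x))             # error signal
        for k in range(n_taps, len(x)):
            u = x[k - n_taps:k][::-1]    # most recent input samples first
            y[k] = w @ u
            e[k] = d[k] - y[k]           # desired minus actual output
            w += mu * e[k] * u           # stochastic-gradient weight update
        return y, e, w
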
Buzen's algorithm
In queueing theory, a discipline within the mathematical theory of probability, Buzen's algorithm (or convolution algorithm) is an algorithm for calculating the normalization constant G(N) in the Gordon–Newell theorem.
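
A sketch of the convolution recursion for a closed network of single-server queues, assuming X[m] (the relative utilisation of queue m, i.e. visit ratio divided by service rate) is given:

    def buzen_G(X, N):
        """X[m] is the relative utilisation of queue m; N is the job population."""
        g = [1.0] + [0.0] * N           # column for zero queues: g(0)=1, g(n>0)=0
        for x_m in X:                   # fold in one queue at a time
            for n in range(1, N + 1):
                g[n] += x_m * g[n - 1]  # g(n, m) = g(n, m-1) + X[m] * g(n-1, m)
        return g[N]                     # the normalizing constant G(N)
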
Repeated median regression
In robust statistics, repeated median regression, also known as the repeated median estimator, is a robust linear regression algorithm. The estimator has a breakdown point of 50%. Although it is equivariant under linear transformations of either its explanatory variable or its response variable, it is not equivariant under affine transformations that combine both variables.
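
A brute-force O(n^2) sketch of the estimator (fine for illustration): the slope is the median over points i of the median of pairwise slopes through i.

    import numpy as np

    def repeated_median_line(x, y):
        x, y = np.asarray(x, float), np.asarray(y, float)
        n = len(x)
        inner = []
        for i in range(n):
            # Median over j != i of the slopes through point i.
            slopes = [(y[j] - y[i]) / (x[j] - x[i])
                      for j in range(n) if j != i and x[j] != x[i]]
            inner.append(np.median(slopes))
        slope = float(np.median(inner))              # median of per-point medians
        intercept = float(np.median(y - slope * x))  # robust intercept
        return slope, intercept
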
Iterative proportional fitting
The iterative proportional fitting procedure (IPF or IPFP, also known as biproportional fitting or biproportion in statistics or economics (input-output analysis, etc.), RAS algorithm in economics, raking in survey statistics, and matrix scaling in computer science) is the operation of finding a fitted matrix that is as close as possible to an initial matrix while matching the row and column totals of a target matrix.
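
A minimal sketch of the alternating row/column scaling, assuming a nonnegative seed matrix and margins with equal grand totals; the tolerance and iteration cap are arbitrary.

    import numpy as np

    def ipf(seed, row_totals, col_totals, tol=1e-9, max_iter=1000):
        m = np.asarray(seed, dtype=float).copy()
        row_totals = np.asarray(row_totals, dtype=float)
        col_totals = np.asarray(col_totals, dtype=float)
        for _ in range(max_iter):
            m *= (row_totals / m.sum(axis=1))[:, None]   # scale rows to match
            m *= (col_totals / m.sum(axis=0))[None, :]   # scale columns to match
            if np.allclose(m.sum(axis=1), row_totals, atol=tol):
                break   # both margins have converged
        return m
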
Laplace's approximation
In mathematics, Laplace's approximation fits an un-normalised Gaussian approximation to a (twice differentiable) un-normalised target density. In Bayesian statistical inference this is useful to simultaneously approximate the posterior and the marginal likelihood.
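
A one-dimensional sketch, assuming SciPy is available and the density is unimodal on the given bounds: fit the Gaussian at the mode using the curvature there, and read off an evidence approximation.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def laplace_approx(log_f, bounds=(-10.0, 10.0), h=1e-5):
        # Locate the mode of the (un-normalised) log density.
        mode = minimize_scalar(lambda t: -log_f(t), bounds=bounds, method="bounded").x
        # Curvature at the mode by a central finite difference.
        d2 = (log_f(mode + h) - 2.0 * log_f(mode) + log_f(mode - h)) / h ** 2
        var = -1.0 / d2                             # variance of the fitted Gaussian
        log_z = log_f(mode) + 0.5 * np.log(2.0 * np.pi * var)   # approx. evidence
        return mode, var, log_z

    # Example: for log_f(t) = -t**2/2 this gives mode 0, variance 1,
    # and log_z close to 0.5*log(2*pi).
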
Levenberg–Marquardt algorithm
In mathematics and computing, the Levenberg–Marquardt algorithm (LMA or just LM), also known as the damped least-squares (DLS) method, is used to solve non-linear least squares problems. These minimization problems arise especially in least squares curve fitting.
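
A bare-bones sketch of the damped update (Marquardt's diagonal scaling), assuming user-supplied residual and Jacobian functions; the damping schedule is a common heuristic, not the only one.

    import numpy as np

    def levenberg_marquardt(residual, jac, p0, n_iter=50, lam=1e-3):
        p = np.asarray(p0, dtype=float)
        for _ in range(n_iter):
            r, J = residual(p), jac(p)
            A, g = J.T @ J, J.T @ r
            # Damped normal equations: (J^T J + lam*diag(J^T J)) step = -J^T r.
            step = np.linalg.solve(A + lam * np.diag(np.diag(A)), -g)
            if np.sum(residual(p + step) ** 2) < np.sum(r ** 2):
                p, lam = p + step, lam * 0.5   # good step: move, relax damping
            else:
                lam *= 2.0                     # bad step: damp toward gradient descent
        return p
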
False nearest neighbor algorithm
In nonlinear time series analysis, the false nearest neighbor algorithm is an algorithm for estimating the minimum embedding dimension of a time series. The concept was proposed by Kennel et al. (1992). The main idea is to examine how the number of neighbors of a point along a signal trajectory changes with increasing embedding dimension.
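
A sketch of the distance-ratio test (one of Kennel's criteria; the delay tau and the tolerance r_tol are typical but arbitrary parameters): the smallest dimension at which the returned fraction drops to roughly zero is taken as the embedding dimension.

    import numpy as np

    def false_nn_fraction(x, dim, tau=1, r_tol=10.0):
        x = np.asarray(x, dtype=float)
        n = len(x) - dim * tau
        # Delay embedding of the scalar series in `dim` dimensions.
        emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
        false = 0
        for i in range(n):
            d = np.linalg.norm(emb - emb[i], axis=1)
            d[i] = np.inf
            j = int(np.argmin(d))        # nearest neighbour in dim dimensions
            # The neighbour is "false" if adding the (dim+1)-th coordinate
            # stretches the pair far apart relative to their old distance.
            if abs(x[i + dim * tau] - x[j + dim * tau]) / max(d[j], 1e-12) > r_tol:
                false += 1
        return false / n
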
Wang and Landau algorithm
The Wang and Landau algorithm, proposed by Fugao Wang and David P. Landau, is a Monte Carlo method designed to estimate the density of states of a system. The method performs a non-Markovian random walk to build the density of states by quickly visiting all the available energy spectrum.
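
A toy sketch on a deliberately simple system, the sum of two dice (energies 2 to 12), to show the mechanism: penalise each visited energy by the current modification factor, and shrink that factor whenever the visit histogram is flat. All thresholds here are conventional illustrative choices.

    import math, random

    def wang_landau_dice(ln_f_final=1e-6, flatness=0.9):
        energies = range(2, 13)             # possible sums of two dice
        ln_g = {e: 0.0 for e in energies}   # running log density of states
        ln_f = 1.0                          # modification factor, shrinks toward 0
        state = [1, 1]
        while ln_f > ln_f_final:
            hist = {e: 0 for e in energies}
            flat = False
            while not flat:
                i = random.randrange(2)     # propose re-rolling one die
                old, e_old = state[i], sum(state)
                state[i] = random.randint(1, 6)
                e_new = sum(state)
                # Accept with probability min(1, g(E_old)/g(E_new)).
                if math.log(random.random()) >= ln_g[e_old] - ln_g[e_new]:
                    state[i] = old          # reject: restore the old state
                e = sum(state)
                ln_g[e] += ln_f             # penalise the energy just visited
                hist[e] += 1
                vals = list(hist.values())
                flat = min(vals) > flatness * sum(vals) / len(vals)
            ln_f /= 2.0                     # refine: f -> sqrt(f)
        return ln_g
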
HyperLogLog
HyperLogLog is an algorithm for the count-distinct problem, approximating the number of distinct elements in a multiset. Calculating the exact cardinality of the unique elements of a multiset requires an amount of memory proportional to the cardinality, which is impractical for very large data sets.
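
A minimal sketch with m = 2^b registers; the small-range and large-range corrections of the full algorithm are omitted for brevity, and SHA-1 stands in for whatever hash a real implementation would use.

    import hashlib

    def hll_estimate(items, b=10):
        m = 1 << b                            # number of registers
        reg = [0] * m
        for item in items:
            h = int.from_bytes(hashlib.sha1(str(item).encode()).digest()[:8], "big")
            j = h >> (64 - b)                 # first b bits choose a register
            w = h & ((1 << (64 - b)) - 1)     # remaining 64-b bits
            rho = (64 - b) - w.bit_length() + 1   # position of the leftmost 1-bit
            reg[j] = max(reg[j], rho)
        alpha = 0.7213 / (1 + 1.079 / m)      # standard constant for m >= 128
        return alpha * m * m / sum(2.0 ** -r for r in reg)
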
Pseudo-marginal Metropolis–Hastings algorithm
In computational statistics, the pseudo-marginal Metropolis–Hastings algorithm is a Monte Carlo method to sample from a probability distribution. It is an instance of the popular Metropolis–Hastings algorithm that extends its use to cases where the target density is not available analytically.
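
A sketch of the key trick, assuming a user-supplied function lik_hat that returns a positive unbiased estimate of the intractable likelihood: the estimate is carried along with the chain state, which preserves the correct target distribution.

    import numpy as np

    def pseudo_marginal_mh(lik_hat, log_prior, x0, n_samples, step=0.5, rng=None):
        rng = rng or np.random.default_rng()
        x, L = x0, lik_hat(x0, rng)             # unbiased likelihood estimate at x0
        samples = []
        for _ in range(n_samples):
            xp = x + step * rng.standard_normal()
            Lp = lik_hat(xp, rng)               # fresh estimate at the proposal
            log_a = (np.log(Lp) + log_prior(xp)) - (np.log(L) + log_prior(x))
            if np.log(rng.random()) < log_a:
                x, L = xp, Lp                   # keep the estimate with the state
            samples.append(x)
        return np.array(samples)
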
VEGAS algorithm
The VEGAS algorithm, due to G. Peter Lepage, is a method for reducing error in Monte Carlo simulations by using a known or approximate probability distribution function to concentrate the search in those areas of the integrand that make the greatest contribution to the final integral.
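
Full VEGAS adapts a separable sampling grid iteratively; the sketch below shows only the importance-sampling core it is built on, with a caller-supplied approximating density g (sampler and pdf both assumed).

    import numpy as np

    def importance_estimate(f, sample_g, pdf_g, n=100_000, rng=None):
        rng = rng or np.random.default_rng()
        x = sample_g(n, rng)                  # draws from the approximating density g
        w = f(x) / pdf_g(x)                   # importance weights f/g
        return w.mean(), w.std(ddof=1) / np.sqrt(n)   # estimate and its std error
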
Helmert–Wolf blocking
The Helmert–Wolf blocking (HWB) is a least squares method for solving a sparse block system of linear equations. It was first reported by F. R. Helmert for use in geodesy problems in 1880; H. Wolf published his direct semianalytic solution in 1978.
Chi-square automatic interaction detection
Chi-square automatic interaction detection (CHAID) is a decision tree technique based on adjusted significance testing (Bonferroni correction, Holm–Bonferroni testing). The technique was developed in South Africa and was published in 1980 by Gordon V. Kass.
Farr's laws
Farr's law is a law formulated by Dr. William Farr when he made the observation that epidemic events rise and fall in a roughly symmetrical pattern. The time-evolution behavior could be captured by a single mathematical formula that could be approximated by a normal bell-shaped curve.
Algorithms for calculating variance
Algorithms for calculating variance play a major role in computational statistics. A key difficulty in the design of good algorithms for this problem is that formulas for the variance may involve sums of squares, which can lead to numerical instability as well as to arithmetic overflow when dealing with large values.
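
Welford's online algorithm is one common answer to the instability just described; a sketch (shown as one illustrative choice, not the only stable method):

    def online_variance(data):
        n, mean, m2 = 0, 0.0, 0.0
        for x in data:                # single pass, no stored sums of squares
            n += 1
            delta = x - mean
            mean += delta / n
            m2 += delta * (x - mean)  # note: uses the updated mean
        return m2 / (n - 1) if n > 1 else float("nan")   # sample variance
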
Random sample consensus
Random sample consensus (RANSAC) is an iterative method to estimate parameters of a mathematical model from a set of observed data that contains outliers, when outliers are to be accorded no influence on the values of the estimates.
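
A minimal sketch for 2-D line fitting; the iteration count and inlier threshold are illustrative parameters that a real application would tune.

    import numpy as np

    def ransac_line(x, y, n_iter=200, thresh=1.0, rng=None):
        x, y = np.asarray(x, float), np.asarray(y, float)
        rng = rng or np.random.default_rng()
        best = np.zeros(len(x), dtype=bool)
        for _ in range(n_iter):
            i, j = rng.choice(len(x), size=2, replace=False)
            if x[i] == x[j]:
                continue                  # degenerate minimal sample
            slope = (y[j] - y[i]) / (x[j] - x[i])
            intercept = y[i] - slope * x[i]
            inliers = np.abs(y - (slope * x + intercept)) < thresh
            if inliers.sum() > best.sum():
                best = inliers            # largest consensus set so far
        # Refit by least squares using only the consensus set.
        slope, intercept = np.polyfit(x[best], y[best], 1)
        return slope, intercept, best
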
Gauss–Newton algorithm
The Gauss–Newton algorithm is used to solve non-linear least squares problems, which is equivalent to minimizing a sum of squared function values. It is an extension of Newton's method for finding a minimum of a non-linear function.
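
A bare sketch, assuming caller-supplied functions returning the residual vector and its Jacobian; no step control or convergence test is included.

    import numpy as np

    def gauss_newton(residual, jac, p0, n_iter=20):
        p = np.asarray(p0, dtype=float)
        for _ in range(n_iter):
            # Each step solves the linearised problem min ||J step + r||^2.
            step, *_ = np.linalg.lstsq(jac(p), -residual(p), rcond=None)
            p = p + step
        return p
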