Category: Randomized algorithms

Random self-reducibility
Random self-reducibility (RSR) is the rule that a good algorithm for the average case implies a good algorithm for the worst case. RSR is the ability to solve all instances of a problem by solving a large fraction of the instances.
Morris method
In applied statistics, the Morris method for global sensitivity analysis is a so-called one-at-a-time (OAT) method, meaning that in each run only one input parameter is given a new value. It facilitates a global sensitivity analysis by making a number of local changes at different points of the input space.
Linear partial information
Linear partial information (LPI) is a method of making decisions based on insufficient or fuzzy information. LPI was introduced in 1970 by Polish–Swiss mathematician Edward Kofler (1911–2007) to simplify decision processes.
Monte Carlo algorithm
In computing, a Monte Carlo algorithm is a randomized algorithm whose output may be incorrect with a certain (typically small) probability. Two examples of such algorithms are the Karger–Stein algorithm and the Monte Carlo algorithm for minimum feedback arc set.
Derandomization
No description available.
Monte Carlo method
Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle.
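As an illustration of the idea, here is a minimal Python sketch (function name and sample count are illustrative, not from the source) that estimates π by sampling points uniformly in the unit square and counting how many fall inside the quarter circle.

```python
import random

def estimate_pi(num_samples=1_000_000):
    """Estimate pi by repeated random sampling: the fraction of points
    in the unit square that land inside the quarter circle approaches pi/4."""
    inside = 0
    for _ in range(num_samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / num_samples

if __name__ == "__main__":
    print(estimate_pi())  # converges slowly, at roughly 1/sqrt(num_samples)
```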
Average-case complexity
In computational complexity theory, the average-case complexity of an algorithm is the amount of some computational resource (typically time) used by the algorithm, averaged over all possible inputs.
Expected linear time MST algorithm
The expected linear time MST algorithm is a randomized algorithm for computing the minimum spanning forest of a weighted graph with no isolated vertices. It was developed by David Karger, Philip Klein, and Robert Tarjan.
Property testing
In computer science, a property testing algorithm for a decision problem is an algorithm whose query complexity to its input is much smaller than the instance size of the problem. Typically, property testing algorithms are used to decide whether some mathematical object has a "global" property or is "far" from having this property, using only a small number of "local" queries to the object.
Randomized algorithm
A randomized algorithm is an algorithm that employs a degree of randomness as part of its logic or procedure. The algorithm typically uses uniformly random bits as an auxiliary input to guide its behavior, in the hope of achieving good performance in the "average case" over all possible choices of random bits.
Fisher–Yates shuffle
The Fisher–Yates shuffle is an algorithm for generating a random permutation of a finite sequence—in plain terms, the algorithm shuffles the sequence. The algorithm effectively puts all the elements into a hat; it continually determines the next element by randomly drawing an element from the hat until no elements remain.
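A minimal Python sketch of the modern in-place (Durstenfeld) variant, assuming the standard library's random module; the function name is illustrative.

```python
import random

def fisher_yates_shuffle(items):
    """Shuffle a list in place: walk from the last index down,
    swapping each element with a uniformly chosen element at or before it."""
    for i in range(len(items) - 1, 0, -1):
        j = random.randint(0, i)  # inclusive on both ends
        items[i], items[j] = items[j], items[i]
    return items
```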
Freivalds' algorithm
Freivalds' algorithm (named after Rūsiņš Mārtiņš Freivalds) is a randomized algorithm used to verify matrix multiplication. Given three n × n matrices A, B, and C, a general problem is to verify whether A × B = C.
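A minimal Python sketch of the verification idea, using plain lists of lists rather than a matrix library; the function name and number of rounds are illustrative assumptions.

```python
import random

def freivalds_verify(A, B, C, rounds=10):
    """Probabilistically check whether A x B == C for n x n matrices.
    Each round multiplies by a random 0/1 vector r and compares A(Br) with Cr;
    a wrong C is caught in any single round with probability at least 1/2."""
    n = len(A)
    for _ in range(rounds):
        r = [random.randint(0, 1) for _ in range(n)]
        Br = [sum(B[i][k] * r[k] for k in range(n)) for i in range(n)]
        ABr = [sum(A[i][k] * Br[k] for k in range(n)) for i in range(n)]
        Cr = [sum(C[i][k] * r[k] for k in range(n)) for i in range(n)]
        if ABr != Cr:
            return False   # definitely not equal
    return True            # probably equal; error probability <= 2**-rounds
```

Checking the product this way costs O(n^2) arithmetic per round, versus O(n^3) for recomputing A × B directly.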
Random priority item allocation
Random priority (RP), also called random serial dictatorship (RSD), is a procedure for fair random assignment - dividing indivisible items fairly among people. Suppose n partners have to divide n (or fewer) different items among them.
PCP theorem
In computational complexity theory, the PCP theorem (also known as the PCP characterization theorem) states that every decision problem in the NP complexity class has probabilistically checkable proofs of constant query complexity and logarithmic randomness complexity.
Basin-hopping
In applied mathematics, basin-hopping is a global optimization technique that iterates by performing random perturbation of coordinates, performing local optimization, and accepting or rejecting new coordinates based on the minimized function value.
Approximate counting algorithm
The approximate counting algorithm allows the counting of a large number of events using a small amount of memory. Invented in 1977 by Robert Morris of Bell Labs, it uses probabilistic techniques to increment the counter.
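A minimal Python sketch of a Morris-style counter, assuming the textbook base-2 variant whose estimate is 2^exponent − 1; class and method names are illustrative.

```python
import random

class MorrisCounter:
    """Approximate counter storing only an exponent: on each event the
    exponent is incremented with probability 2**(-exponent), so the
    estimate 2**exponent - 1 equals the true count in expectation."""
    def __init__(self):
        self.exponent = 0

    def increment(self):
        if random.random() < 2.0 ** (-self.exponent):
            self.exponent += 1

    def estimate(self):
        return 2 ** self.exponent - 1
```

The memory used is only the exponent (roughly log log of the true count), at the price of a randomized, approximate answer.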
Multilevel Monte Carlo method
Multilevel Monte Carlo (MLMC) methods in numerical analysis are algorithms for computing expectations that arise in stochastic simulations. Just as Monte Carlo methods, they rely on repeated random sampling, but these samples are taken on different levels of accuracy.
Sipser–Lautemann theorem
In computational complexity theory, the Sipser–Lautemann theorem or Sipser–Gács–Lautemann theorem states that bounded-error probabilistic polynomial (BPP) time is contained in the polynomial time hierarchy, and more specifically in Σ2 ∩ Π2.
Probabilistic Turing machine
In theoretical computer science, a probabilistic Turing machine is a non-deterministic Turing machine that chooses between the available transitions at each point according to some probability distribution.
Randomized algorithms as zero-sum games
Randomized algorithms are algorithms that employ a degree of randomness as part of their logic. These algorithms can be used to give good average-case results (complexity-wise) to problems that are hard to solve deterministically or that display poor worst-case complexity. Viewing the worst-case input as chosen by an adversary and the algorithm's random bits as a mixed strategy, such algorithms can be analyzed as players in a two-person zero-sum game; this is the viewpoint behind Yao's principle.
With high probability
In mathematics, an event that occurs with high probability (often shortened to w.h.p. or WHP) is one whose probability depends on a certain number n and goes to 1 as n goes to infinity, i.e. the probability of the event occurring can be made as close to 1 as desired by making n big enough.
Average performance
No description available.
Las Vegas algorithm
In computing, a Las Vegas algorithm is a randomized algorithm that always gives correct results; that is, it always produces the correct result or it reports failure. However, the runtime of a Las Vegas algorithm differs depending on the input and on the random choices made; typically only its expected value is bounded.
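A toy Las Vegas example (a hypothetical function, not from any particular source): searching a 0/1 list for a position holding a 1 by random probing. The returned answer is always correct; only the number of probes, and hence the running time, is random.

```python
import random

def find_one_index(bits, max_tries=None):
    """Las Vegas search: probe random positions until a 1 is found.
    Assumes at least one 1 is present when max_tries is None.
    If max_tries is exceeded, report failure rather than guess."""
    tries = 0
    while max_tries is None or tries < max_tries:
        i = random.randrange(len(bits))
        if bits[i] == 1:
            return i       # always a correct answer
        tries += 1
    return None            # explicit failure, never an incorrect answer
```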
Karloff–Zwick algorithm
The Karloff–Zwick algorithm, in computational complexity theory, is a randomised approximation algorithm taking an instance of the MAX-3SAT Boolean satisfiability problem as input. If the instance is satisfiable, then the expected weight of the assignment found is at least 7/8 of optimal.
Algorithmic information theory
Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects (as opposed to stochastically generated ones), such as strings or any other data structure.
Baum–Welch algorithm
In electrical engineering, statistical computing and bioinformatics, the Baum–Welch algorithm is a special case of the expectation–maximization algorithm used to find the unknown parameters of a hidden Markov model (HMM).
Entropy compression
In mathematics and theoretical computer science, entropy compression is an information-theoretic method for proving that a random process terminates, originally used by Robin Moser to prove an algorithmic version of the Lovász local lemma.
Probabilistic analysis of algorithms
In analysis of algorithms, probabilistic analysis of algorithms is an approach to estimate the computational complexity of an algorithm or a computational problem. It starts from an assumption about a probabilistic distribution of the set of all possible inputs.
Averaging argument
In computational complexity theory and cryptography, an averaging argument is a standard argument for proving theorems. It usually allows us to convert probabilistic polynomial-time algorithms into non-uniform polynomial-size circuits.
Probabilistically checkable proof
In computational complexity theory, a probabilistically checkable proof (PCP) is a type of proof that can be checked by a randomized algorithm using a bounded amount of randomness and reading a bounded number of bits of the proof.
List update problem
The list update (or list access) problem is a simple model used in the study of competitive analysis of online algorithms. Given a set of items in a list where the cost of accessing an item is proportional to its distance from the head of the list, the goal is to reorder the list after each access so that the total cost of a sequence of accesses is minimized.
Multiplicative weight update method
The multiplicative weights update method is an algorithmic technique most commonly used for decision making and prediction, and also widely deployed in game theory and algorithm design. The simplest use case is the problem of prediction from expert advice, in which a decision maker needs to iteratively decide on an expert whose advice to follow.
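A minimal Python sketch of the multiplicative weights rule for the experts setting; the loss format, learning rate eta, and function name are illustrative assumptions rather than a fixed convention.

```python
def multiplicative_weights(expert_losses, eta=0.1):
    """Run the multiplicative weights update over T rounds.
    expert_losses[t][i] is the loss (in [0, 1]) of expert i at round t.
    Each round, weights are scaled by (1 - eta * loss), so experts that
    perform badly lose influence exponentially fast."""
    n = len(expert_losses[0])
    weights = [1.0] * n
    total_loss = 0.0
    for losses in expert_losses:
        total = sum(weights)
        probs = [w / total for w in weights]              # follow experts in proportion to weight
        total_loss += sum(p * l for p, l in zip(probs, losses))
        weights = [w * (1 - eta * l) for w, l in zip(weights, losses)]
    return weights, total_loss
```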
First-order second-moment method
In probability theory, the first-order second-moment (FOSM) method, also referenced as the mean value first-order second-moment (MVFOSM) method, is a probabilistic method to determine the stochastic moments of a function with random input variables.
Simultaneous perturbation stochastic approximation
Simultaneous perturbation stochastic approximation (SPSA) is an algorithmic method for optimizing systems with multiple unknown parameters. It is a type of stochastic approximation algorithm. As an optimization method, it is notable for estimating the gradient from only two measurements of the objective function per iteration, regardless of the dimension of the parameter space.
Biology Monte Carlo method
Biology Monte Carlo methods (BioMOCA) have been developed at the University of Illinois at Urbana-Champaign to simulate ion transport in an electrolyte environment through ion channels or nano-pores embedded in membranes.
Arthur–Merlin protocol
In computational complexity theory, an Arthur–Merlin protocol, introduced by László Babai in 1985, is an interactive proof system in which the verifier's coin tosses are constrained to be public (i.e. known to the prover too).
Set balancing
The set balancing problem in mathematics is the problem of dividing a set into two subsets that have roughly the same characteristics. It arises naturally in the design of experiments. There is a group of subjects, each with several binary features, and the goal is to split them into two groups so that each feature is represented about equally in both.
Atlantic City algorithm
An Atlantic City algorithm is a probabilistic polynomial-time algorithm that answers correctly at least 75% of the time (or, in some versions, some other value greater than 50%). The term "Atlantic City" alludes to the gambling city, by analogy with Monte Carlo and Las Vegas algorithms.
Random permutation
A random permutation is a random ordering of a set of objects, that is, a permutation-valued random variable. The use of random permutations is often fundamental to fields that use randomized algorithms, such as coding theory, cryptography, and simulation.
Yao's principle
In computational complexity theory, Yao's principle (also called Yao's minimax principle or Yao's lemma) is a way to prove lower bounds on the worst-case performance of randomized algorithms by comparing them to deterministic algorithms: for any fixed distribution over inputs, the worst-case expected cost of a randomized algorithm is at least the expected cost of the best deterministic algorithm for that distribution.
Probabilistic complexity theory
No description available.
Nested sampling algorithm
The nested sampling algorithm is a computational approach to the Bayesian statistics problems of comparing models and generating samples from posterior distributions. It was developed in 2004 by physicist John Skilling.
Mean-field particle methods
Mean-field particle methods are a broad class of interacting-type Monte Carlo algorithms for simulating from a sequence of probability distributions satisfying a nonlinear evolution equation. These flows of probability measures can be interpreted as the distributions of the random states of a Markov process whose transition probabilities depend on the distributions of the current random states.
Reservoir sampling
Reservoir sampling is a family of randomized algorithms for choosing a simple random sample, without replacement, of k items from a population of unknown size n in a single pass over the items. The size of the population n is not known to the algorithm and is typically too large for all n items to fit into main memory.
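A minimal Python sketch of the classic single-pass scheme often called Algorithm R; function and parameter names are illustrative.

```python
import random

def reservoir_sample(stream, k):
    """Algorithm R: keep the first k items, then replace a random reservoir
    slot with the i-th item with probability k / i, which leaves every item
    equally likely to appear in the final sample."""
    reservoir = []
    for i, item in enumerate(stream, start=1):
        if i <= k:
            reservoir.append(item)
        else:
            j = random.randint(1, i)
            if j <= k:
                reservoir[j - 1] = item
    return reservoir
```

Because `stream` can be any iterable, the sketch never needs to know n in advance or hold more than k items in memory.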
Principle of deferred decision
The principle of deferred decisions is a technique used in the analysis of randomized algorithms: random choices are treated as being made only at the moment the algorithm first inspects them, rather than all in advance, which often simplifies probability calculations.
Solovay–Strassen primality test
The Solovay–Strassen primality test, developed by Robert M. Solovay and Volker Strassen in 1977, is a probabilistic test to determine if a number is composite or probably prime. The idea behind the test is Euler's criterion: for an odd prime n and any integer a not divisible by n, a^((n−1)/2) ≡ (a/n) (mod n), where (a/n) denotes the Jacobi symbol; an odd composite n fails this congruence for at least half of the candidate bases a.
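A minimal Python sketch of the test, assuming the standard Jacobi-symbol computation via quadratic reciprocity; function names and the round count are illustrative.

```python
import random

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0, via quadratic reciprocity."""
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:           # pull out factors of 2
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                 # reciprocity step
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def solovay_strassen(n, rounds=20):
    """Return False if n is definitely composite, True if probably prime.
    Each round picks a random base a and checks Euler's criterion;
    a composite n passes a single round with probability at most 1/2."""
    if n < 2 or n % 2 == 0:
        return n == 2
    for _ in range(rounds):
        a = random.randrange(2, n)
        x = jacobi(a, n)
        if x == 0 or pow(a, (n - 1) // 2, n) != x % n:
            return False            # witness found: definitely composite
    return True                     # error probability <= 2**-rounds
```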