Linear partial information

Linear partial information (LPI) is a method of making decisions based on insufficient or fuzzy information. LPI was introduced in 1970 by Polish–Swiss mathematician Edward Kofler (1911–2007) to simpl

Communication channel

A communication channel refers either to a physical transmission medium such as a wire, or to a logical connection over a multiplexed medium such as a radio channel in telecommunications and computer

Channel capacity

Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channe

Channel state information

In wireless communications, channel state information (CSI) is the known channel properties of a communication link. This information describes how a signal propagates from the transmitter to the rece

Gibbs' inequality

In information theory, Gibbs' inequality is a statement about the information entropy of a discrete probability distribution. Several other bounds on the entropy of probability distributions are deriv

Quantum t-design

A quantum t-design is a probability distribution over either pure quantum states or unitary operators which can duplicate properties of the probability distribution over the Haar measure for polynomia

Integrated information theory

Integrated information theory (IIT) attempts to provide a framework capable of explaining why some physical systems (such as human brains) are conscious, why they feel the particular way they do in pa

Rényi entropy

In information theory, the Rényi entropy is a quantity that generalizes various notions of entropy, including Hartley entropy, Shannon entropy, collision entropy, and min-entropy. The Rényi entropy is
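
As a quick numerical sketch (the distribution and orders below are arbitrary examples), the definition can be computed directly:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (in bits) of a discrete distribution p."""
    if alpha == 1:
        # The order-1 case is a limit, which recovers Shannon entropy.
        return -sum(x * math.log2(x) for x in p if x > 0)
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

# For a uniform distribution, every order agrees with the Hartley entropy log2(n).
uniform = [0.25] * 4
print(renyi_entropy(uniform, 0))  # Hartley entropy: 2.0
print(renyi_entropy(uniform, 2))  # collision entropy: 2.0
```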

History of information theory

The decisive event which established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical The

Principle of least privilege

In information security, computer science, and other fields, the principle of least privilege (PoLP), also known as the principle of minimal privilege (PoMP) or the principle of least authority (PoLA)

Tsallis entropy

In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy.
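
A minimal sketch of the formula (with the constant k set to 1; the distribution is a hypothetical example):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k = 1."""
    if q == 1:
        # The limit q -> 1 recovers the standard Boltzmann-Gibbs (natural-log) entropy.
        return -sum(x * math.log(x) for x in p if x > 0)
    return (1 - sum(x ** q for x in p)) / (q - 1)

print(tsallis_entropy([0.5, 0.5], 2))  # 0.5
```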

Hartley function

The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928. If a sample is picked uniformly at random from a finite set A, the information revealed after the outcome is known is given by the Hartley function.
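
The function is simply the logarithm of the set's size; a one-line sketch (the set sizes are illustrative):

```python
import math

def hartley(n, base=2):
    """Hartley function H0(A) = log_b(n) for a set of n equally likely outcomes."""
    return math.log(n, base)

print(hartley(16))      # 4.0 bits to identify one of 16 equally likely outcomes
print(hartley(10, 10))  # 1.0 hartley (the base-10 unit)
```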

Differential entropy

Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy, a measure of averag

Limiting density of discrete points

In information theory, the limiting density of discrete points is an adjustment to the formula of Claude Shannon for differential entropy. It was formulated by Edwin Thompson Jaynes to address defects

Random number generation

Random number generation is a process by which, often by means of a random number generator (RNG), a sequence of numbers or symbols that cannot be reasonably predicted better than by random chance is

Per-user unitary rate control

Per-user unitary rate control (PU2RC) is a multi-user MIMO (multiple-input and multiple-output) scheme. PU2RC uses both transmission pre-coding and multi-user scheduling. By doing that, the network ca

Theil index

The Theil index is a statistic primarily used to measure economic inequality and other economic phenomena, though it has also been used to measure racial segregation. The Theil index T_T is the same as
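
A minimal sketch of the T index, the population mean of (x/mu)·ln(x/mu) (the income lists are made-up examples):

```python
import math

def theil_t(incomes):
    """Theil T index: the mean of (x / mu) * ln(x / mu) over the population."""
    n = len(incomes)
    mu = sum(incomes) / n
    return sum((x / mu) * math.log(x / mu) for x in incomes if x > 0) / n

print(theil_t([10, 10, 10]))  # 0.0 under perfect equality
```

The index is 0 for perfect equality and grows as income concentrates in fewer hands.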

Szemerédi regularity lemma

Szemerédi's regularity lemma is one of the most powerful tools in extremal graph theory, particularly in the study of large dense graphs. It states that the vertices of every large enough graph can be

Oversampling

In signal processing, oversampling is the process of sampling a signal at a sampling frequency significantly higher than the Nyquist rate. Theoretically, a bandwidth-limited signal can be perfectly re

Frank Benford

Frank Albert Benford Jr. (July 10, 1883 – December 4, 1948) was an American electrical engineer and physicist best known for rediscovering and generalizing Benford's Law, a statistical statement about

Typical set

In information theory, the typical set is a set of sequences whose probability is close to two raised to the negative power of the entropy of their source distribution. That this set has total probabi

Berlekamp–Welch algorithm

The Berlekamp–Welch algorithm, also known as the Welch–Berlekamp algorithm, is named for Elwyn R. Berlekamp and Lloyd R. Welch. This is a decoder algorithm that efficiently corrects errors in Reed–Sol

Modulo-N code

Modulo-N code is a lossy compression algorithm used to compress correlated data sources using modular arithmetic.
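
The idea can be sketched as follows, assuming the decoder holds correlated side information (the sensor values and modulus below are hypothetical):

```python
def mod_n_encode(x, n):
    """Transmit only the residue of x mod n, which needs fewer bits than x itself."""
    return x % n

def mod_n_decode(residue, side_info, n):
    """Pick the value congruent to residue (mod n) closest to the correlated
    side information; this recovers x whenever |x - side_info| < n / 2."""
    base = side_info - (side_info % n) + residue
    return min((base - n, base, base + n), key=lambda v: abs(v - side_info))

x, y = 103, 101               # two correlated sources, e.g. nearby sensor readings
r = mod_n_encode(x, 8)        # send 3 bits instead of the full value
print(mod_n_decode(r, y, 8))  # 103
```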

DISCUS

DISCUS, or distributed source coding using syndromes, is a method for distributed source coding. It is a compression algorithm used to compress correlated data sources. The method is designed to achie

Inequalities in information theory

Inequalities are very important in the study of information theory. There are a number of different contexts in which these inequalities appear.

A Mathematical Theory of Communication

"A Mathematical Theory of Communication" is an article by mathematician Claude E. Shannon published in Bell System Technical Journal in 1948. It was renamed The Mathematical Theory of Communication in

MIMO

In radio, multiple-input and multiple-output, or MIMO (/ˈmaɪmoʊ, ˈmiːmoʊ/), is a method for multiplying the capacity of a radio link using multiple transmission and receiving antennas to exploit multi

Error exponents in hypothesis testing

In statistical hypothesis testing, the error exponent of a hypothesis testing procedure is the rate at which the probabilities of Type I and Type II errors decay exponentially with the size of the sample used

Ulam's game

Ulam's game, or the Rényi–Ulam game, is a mathematical game similar to the popular game of twenty questions. In Ulam's game, a player attempts to guess an unnamed object or number by asking yes–no que

Shannon's source coding theorem

In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the limits to possible data compression, and the operational meaning of the Shannon entropy. Named afte

Channel use

Channel use is a quantity used in signal processing or telecommunication related to symbol rate and channel capacity. Capacity is measured in bits per input symbol into the channel (bits per channel u

Directed information

Directed information, denoted I(X^n → Y^n), is an information-theoretic measure that quantifies the information flow from the random process X^n to the random process Y^n. The term directed information was coined by James Massey.

Water filling algorithm

Water filling algorithm is a general name given to the ideas in communication systems design and practice for equalization strategies on communications channels. As the name suggests, just as water fi

Maximal information coefficient

In statistics, the maximal information coefficient (MIC) is a measure of the strength of the linear or non-linear association between two variables X and Y. The MIC belongs to the maximal information-

Fungible information

Fungible information is the information for which the means of encoding is not important. Classical information theorists and computer scientists are mainly concerned with information of this sort. It

Bisection bandwidth

In computer networking, if the network is bisected into two partitions, the bisection bandwidth of a network topology is the bandwidth available between the two partitions. Bisection should be done in

Multiparty communication complexity

In theoretical computer science, multiparty communication complexity is the study of communication complexity in the setting where there are more than 2 players. In the traditional two–party communica

Everything is a file

Everything is a file is an idea that Unix and its derivatives handle input/output to and from resources such as documents, hard drives, modems, keyboards, printers and even some inter-process and net

Generalized minimum-distance decoding

In coding theory, generalized minimum-distance (GMD) decoding provides an efficient algorithm for decoding concatenated codes, which is based on using an errors-and-erasures decoder for the outer code

Information source (mathematics)

In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution. The uncertainty, or entropy rate, of an information source i

Code rate

In telecommunication and information theory, the code rate (or information rate) of a forward error correction code is the proportion of the data-stream that is useful (non-redundant). That is, if the

Maximal entropy random walk

Maximal entropy random walk (MERW) is a popular type of biased random walk on a graph, in which transition probabilities are chosen according to the principle of maximum entropy, which says that the

Gambling and information theory

Statistical inference might be thought of as gambling theory applied to the world around us. The myriad applications for logarithmic information measures tell us precisely how to take the best guess i

Conditional entropy

In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable given that the value of another random variable is known. Here,
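
A minimal sketch computing H(Y|X) from a joint distribution (the joint pmfs below are toy examples):

```python
import math

def conditional_entropy(joint):
    """H(Y|X) in bits, from a joint pmf given as a dict {(x, y): probability}."""
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return -sum(p * math.log2(p / px[x])
                for (x, _), p in joint.items() if p > 0)

# Y fully determined by X: 0 bits remain once X is known.
h_determined = conditional_entropy({(0, 0): 0.5, (1, 1): 0.5})
# Y independent of X: knowing X does not help at all, so H(Y|X) = H(Y) = 1 bit.
h_independent = conditional_entropy({(x, y): 0.25 for x in (0, 1) for y in (0, 1)})
```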

Maximum entropy spectral estimation

Maximum entropy spectral estimation is a method of spectral density estimation. The goal is to improve the spectral quality based on the principle of maximum entropy. The method is based on choosing t

Cooperative MIMO

In radio, cooperative multiple-input multiple-output (cooperative MIMO, CO-MIMO) is a technology that can effectively exploit the spatial domain of mobile fading channels to bring significant performa

Z-channel (information theory)

In coding theory and information theory, a Z-channel (binary asymmetric channel) is a communications channel used to model the behaviour of some data storage systems.

Quantum computing

Quantum computing is a type of computation whose operations can harness the phenomena of quantum mechanics, such as superposition, interference, and entanglement. Devices that perform quantum computat

Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit), establishes that for any given degree of noise contamination of a communication channel, it is

Lovász number

In graph theory, the Lovász number of a graph is a real number that is an upper bound on the Shannon capacity of the graph. It is also known as the Lovász theta function and is commonly denoted by ϑ(G), using a script form of the Greek letter theta.

Information algebra

The term "information algebra" refers to mathematical techniques of information processing. Classical information theory goes back to Claude Shannon. It is a theory of information transmission, lookin

Multicast-broadcast single-frequency network

Multimedia Broadcast multicast service Single Frequency Network (MBSFN) is a communication channel defined in the fourth-generation cellular networking standard called Long-Term Evolution (LTE). The t

Pinsker's inequality

In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullb
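
The bound TV(p, q) ≤ √(D(p ∥ q) / 2) can be checked numerically; a quick sketch (the two distributions are arbitrary examples):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

def total_variation(p, q):
    """Total variation distance between two discrete distributions."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

p, q = [0.7, 0.3], [0.4, 0.6]
# Pinsker's inequality: TV is bounded by sqrt(KL / 2).
assert total_variation(p, q) <= math.sqrt(kl_divergence(p, q) / 2)
```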

Minimum Fisher information

In information theory, the principle of minimum Fisher information (MFI) is a variational principle which, when applied with the proper constraints needed to reproduce empirically known expectation va

EXIT chart

An extrinsic information transfer chart, commonly called an EXIT chart, is a technique to aid the construction of good iteratively-decoded error-correcting codes (in particular low-density parity-chec

Fisher information

In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown p

Zero suppression

Zero suppression is the removal of redundant zeroes from a number. This can be done for storage, page or display space constraints or formatting reasons, such as making a letter more legible.
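
For display formatting, a sketch of the leading-zero case (keeping a lone zero so the numeral stays valid):

```python
def zero_suppress(numeral):
    """Strip redundant leading zeroes from a numeral string, preserving a bare '0'."""
    return numeral.lstrip("0") or "0"

print(zero_suppress("007"))   # "7"
print(zero_suppress("0000"))  # "0"
```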

Information fluctuation complexity

Information fluctuation complexity is an information-theoretic quantity defined as the fluctuation of information about entropy. It is derivable from fluctuations in the predominance of order and chao

Structural information theory

Structural information theory (SIT) is a theory about human perception and in particular about visual perceptual organization, which is a neuro-cognitive process. It has been applied to a wide range o

Index of information theory articles

This is a list of information theory topics, by Wikipedia page.
* A Mathematical Theory of Communication
* algorithmic information theory
* arithmetic coding
* channel capacity
* Communication Th

Nonextensive entropy

Entropy is considered to be an extensive property, i.e., its value depends on the amount of material present. Constantino Tsallis has proposed a nonextensive entropy (Tsallis entropy), which is a

Network throughput

Network throughput (or just throughput, when in context) refers to the rate of message delivery over a communication channel, such as Ethernet or packet radio, in a communication network. The data tha

Self-dissimilarity

Self-dissimilarity is a measure of complexity defined in a series of papers by David Wolpert and William Macready. The degrees of self-dissimilarity between the patterns of a system observed at various scales (e.g. th

Ascendency

Ascendency or ascendancy is a quantitative attribute of an ecosystem, defined as a function of the ecosystem's trophic network. Ascendency is derived using mathematical tools from information theory.

Communication source

A source or sender is one of the basic concepts of communication and information processing. Sources are objects which encode message data and transmit the information, via a channel, to one or more o

Krichevsky–Trofimov estimator

In information theory, given an unknown stationary source π with alphabet A and a sample w from π, the Krichevsky–Trofimov (KT) estimator produces an estimate p_i(w) of the probability of each symbol i ∈ A.

Entropic vector

The entropic vector or entropic function is a concept arising in information theory. It represents the possible values of Shannon's information entropy that subsets of one set of random variables may

Pointwise mutual information

In statistics, probability theory and information theory, pointwise mutual information (PMI), or point mutual information, is a measure of association. It compares the probability of two events occurr
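
A one-line sketch of the comparison, log2 of the observed joint probability against the product expected under independence (the probabilities are toy values):

```python
import math

def pmi(p_xy, p_x, p_y):
    """Pointwise mutual information log2( p(x, y) / (p(x) p(y)) ), in bits."""
    return math.log2(p_xy / (p_x * p_y))

print(pmi(0.25, 0.5, 0.5))  # 0.0: the two events are independent
print(pmi(0.50, 0.5, 0.5))  # 1.0: co-occurrence twice as likely as chance
```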

Glossary of quantum computing

This glossary of quantum computing is a list of definitions of terms and concepts used in quantum computing, its sub-disciplines, and related fields. Bacon–Shor code is a subsystem error correcting code

Statistical manifold

In mathematics, a statistical manifold is a Riemannian manifold, each of whose points is a probability distribution. Statistical manifolds provide a setting for the field of information geometry. The

Interaction information

The interaction information is a generalization of the mutual information for more than two variables. There are many names for interaction information, including amount of information, information co

3G MIMO

3G MIMO describes MIMO techniques which have been considered as 3G standard techniques. MIMO, as the state of the art of intelligent antenna (IA), improves the performance of radio systems by embeddin

Zero-forcing precoding

Zero-forcing (or null-steering) precoding is a method of spatial signal processing by which a multiple antenna transmitter can null the multiuser interference in a multi-user MIMO wireless communicati

Kolmogorov complexity

In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity of an object, such as a piece of text, is the length of a shortest computer program (in a

Informating

Informating is a term coined by Shoshana Zuboff in her book In the Age of the Smart Machine (1988). It is the process that translates descriptions and measurements of activities, events and objects in

Shannon–Hartley theorem

In information theory, the Shannon–Hartley theorem tells the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is
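
The theorem's formula, C = B log2(1 + S/N), in a short sketch (the bandwidth and SNR figures are illustrative, not from the source):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 3 kHz channel at 30 dB SNR (linear ratio 1000).
print(round(channel_capacity(3000, 1000)))  # 29902 bit/s
```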

Quantities of information

The mathematical theory of information is based on probability theory and statistics, and measures information with several quantities of information. The choice of logarithmic base in the following f

Health information-seeking behaviour

Health information-seeking behaviour (HISB), also known as health information seeking, health seeking behaviour or health information behaviour, refers to a series of interactions that reduce uncertainty

Wilson's model of information behavior

Wilson's model of information seeking behaviour was born out of a need to focus the field of information and library science on human use of information, rather than the use of sources. Previous studi

Many antennas

Many antennas is a smart antenna technique which overcomes the performance limitation of single user multiple-input multiple-output (MIMO) techniques. In cellular communication, the maximum number of

Entropy rate

In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in a stochastic process. For
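
For a stationary Markov chain the entropy rate is the stationary-weighted average of the row entropies of the transition matrix; a sketch (the transition matrix is a made-up example whose stationary distribution is [5/6, 1/6]):

```python
import math

def markov_entropy_rate(P, mu):
    """Entropy rate (bits per step) of a stationary Markov chain with
    transition matrix P and stationary distribution mu: sum_i mu_i * H(P[i])."""
    def row_entropy(row):
        return -sum(p * math.log2(p) for p in row if p > 0)
    return sum(m * row_entropy(row) for m, row in zip(mu, P))

P = [[0.9, 0.1], [0.5, 0.5]]     # hypothetical two-state chain
rate = markov_entropy_rate(P, [5 / 6, 1 / 6])  # about 0.557 bits/step
```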

Asymptotic equipartition property

In information theory, the asymptotic equipartition property (AEP) is a general property of the output samples of a stochastic source. It is fundamental to the concept of typical set used in theories

Entropy estimation

In various science/engineering applications, such as independent component analysis, image analysis, genetic analysis, speech recognition, manifold learning, and time delay estimation it is useful to

Generalized entropy index

The generalized entropy index has been proposed as a measure of income inequality in a population. It is derived from information theory as a measure of redundancy in data. In information theory a mea

Human information interaction

Human-information interaction or HII is the formal term for information behavior research in archival science; the term was invented by Nahum Gershon in 1995. HII is not transferable from analog to di

Network performance

Network performance refers to measures of service quality of a network as seen by the customer. There are many different ways to measure the performance of a network, as each network is different in n

Three-process view

The three-process view is a psychological term coined by Janet Davidson and Robert Sternberg. According to this concept, there are three kinds of insight: selective-encoding, selective-comparison, and selective-comb

Information exchange

Information exchange or information sharing means that people or other entities pass information from one to another. This could be done electronically or through certain systems. These are terms that

Interference channel

In information theory, the interference channel is the basic model used to analyze the effect of interference in communication channels. The model consists of two pairs of users communicating through

Bar product

In information theory, the bar product of two linear codes C2 ⊆ C1 is defined as C1 | C2 := {(c1 | c1 + c2) : c1 ∈ C1, c2 ∈ C2}, where (a | b) denotes the concatenation of a and b. If the code words in C1 are of length n, then the code words in C1 | C2 are of length 2n.

Timeline of information theory

A timeline of events related to information theory, quantum information theory and statistical physics, data compression, error correcting codes and related subjects.
* 1872 – Ludwig Boltzmann presen

Shearer's inequality

Shearer's inequality, also known as Shearer's lemma, is an inequality in information theory relating the entropy of a set of variables to the entropies of a collection of subsets. It is named

Effective complexity

Effective complexity is a measure of complexity defined in a 1996 paper by Murray Gell-Mann and Seth Lloyd that attempts to measure the amount of non-random information in a system. It has been critic

Relay channel

In information theory, a relay channel is a probability model of the communication between a sender and a receiver aided by one or more intermediate relay nodes.

Damerau–Levenshtein distance

In information theory and computer science, the Damerau–Levenshtein distance (named after Frederick J. Damerau and Vladimir I. Levenshtein) is a string metric for measuring the edit distance between t
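
A common dynamic-programming sketch is the optimal string alignment (OSA) variant, which restricts the true Damerau–Levenshtein metric by forbidding edits to a substring more than once:

```python
def osa_distance(a, b):
    """Optimal string alignment: restricted Damerau-Levenshtein distance with
    insertions, deletions, substitutions and adjacent transpositions."""
    d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        d[i][0] = i
    for j in range(len(b) + 1):
        d[0][j] = j
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[-1][-1]

print(osa_distance("kitten", "sitting"))  # 3
print(osa_distance("abcd", "acbd"))       # 1: a single adjacent transposition
```

Note the classic divergence between the variants: OSA gives 3 for "ca" vs "abc", while the unrestricted metric gives 2.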

Distributed source coding

Distributed source coding (DSC) is an important problem in information theory and communication. DSC problems regard the compression of multiple correlated information sources that do not communicate

Incompressibility method

In mathematics, the incompressibility method is a proof method like the probabilistic method, the counting method or the pigeonhole principle. To prove that an object in a certain class (on average) s

Karl Küpfmüller

Karl Küpfmüller (6 October 1897 – 26 December 1977) was a German electrical engineer, who was prolific in the areas of communications technology, measurement and control engineering, acoustics, commun

Water-pouring algorithm

The water-pouring algorithm is a technique used in digital communications systems for allocating power among different channels in multicarrier schemes. It was described by R. G. Gallager in 1968 alon

Ideal tasks

Ideal tasks arise during task analysis. Ideal tasks are different from real tasks. They are ideals in the Platonic sense of a circle being an ideal whereas a drawn circle is flawed and real. The study

Observed information

In statistics, the observed information, or observed Fisher information, is the negative of the second derivative (the Hessian matrix) of the "log-likelihood" (the logarithm of the likelihood function

MIMO-OFDM

Multiple-input, multiple-output orthogonal frequency-division multiplexing (MIMO-OFDM) is the dominant air interface for 4G and 5G broadband wireless communications. It combines multiple-input, multip

Exformation

Exformation (originally spelled eksformation in Danish) is a term coined by Danish science writer Tor Nørretranders in his book The User Illusion published in English 1998. It is meant to mean explici

Log sum inequality

The log sum inequality is used for proving theorems in information theory.
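
The inequality states that Σ aᵢ log(aᵢ/bᵢ) ≥ (Σ aᵢ) log(Σ aᵢ / Σ bᵢ) for nonnegative numbers; a quick numerical check on arbitrary values:

```python
import math

# Log sum inequality on arbitrary positive sequences a and b.
a = [1.0, 2.0, 3.0]
b = [2.0, 1.0, 4.0]
lhs = sum(x * math.log(x / y) for x, y in zip(a, b))
rhs = sum(a) * math.log(sum(a) / sum(b))
assert lhs >= rhs
print(lhs, ">=", rhs)
```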

Conditional mutual information

In probability theory, particularly information theory, the conditional mutual information is, in its most basic form, the expected value of the mutual information of two random variables given the va

Map communication model

The Map Communication Model is a theory in cartography that characterizes mapping as a process of transmitting geographic information via the map from the cartographer to the end-user. It was perhaps

Privilege revocation (computing)

Privilege revocation is the act of an entity giving up some or all of the privileges it possesses, or of some authority taking those (privileged) rights away.

Cobham's theorem

Cobham's theorem is a theorem in combinatorics on words that has important connections with number theory, notably transcendental numbers, and automata theory. Informally, the theorem gives the condit

Constant-weight code

In coding theory, a constant-weight code, also called an m-of-n code, is an error detection and correction code where all codewords share the same Hamming weight. The one-hot code and the balanced code

Generalized distributive law

The generalized distributive law (GDL) is a generalization of the distributive property which gives rise to a general message passing algorithm. It is a synthesis of the work of many authors in the in

Information content

In information theory, the information content, self-information, surprisal, or Shannon information is a basic quantity derived from the probability of a particular event occurring from a random varia
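
The quantity is just the negative log-probability of the event; a minimal sketch (the example probabilities are illustrative):

```python
import math

def surprisal(p):
    """Information content (surprisal) of an event of probability p, in bits."""
    return -math.log2(p)

print(surprisal(0.5))     # 1.0 bit: a fair coin flip
print(surprisal(1 / 32))  # 5.0 bits: a 1-in-32 event is more surprising
```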

Algorithmic information theory

Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects (as oppo

Receiver (information theory)

The receiver in information theory is the receiving end of a communication channel. It receives decoded messages/information from the sender, who first encoded them. Sometimes the receiver is modeled

Information dimension

In information theory, information dimension is an information measure for random vectors in Euclidean space, based on the normalized entropy of finely quantized versions of the random vectors. This c

Metcalfe's law

Metcalfe's law states that the value of a telecommunications network is proportional to the square of the number of connected users of the system (n²). First formulated in this form by George Gilder in

Lempel–Ziv complexity

The Lempel–Ziv complexity is a measure that was first presented in the article On the Complexity of Finite Sequences (IEEE Transactions on Information Theory, vol. IT-22, no. 1, 1976), by two Israeli computer scientists, Abraham Lempel a

Jakobson's functions of language

Roman Jakobson defined six functions of language (or communication functions), according to which an effective act of verbal communication can be described. Each of the functions has an associated fac

Sanov's theorem

In mathematics and information theory, Sanov's theorem gives a bound on the probability of observing an atypical sequence of samples from a given probability distribution. In the language of large dev

Logic of information

The logic of information, or the logical theory of information, considers the information content of logical signs and expressions along the lines initially developed by Charles Sanders Peirce. In thi

Entropic gravity

Entropic gravity, also known as emergent gravity, is a theory in modern physics that describes gravity as an entropic force—a force with macro-scale homogeneity but which is subject to quantum-level d

Scale-free ideal gas

The scale-free ideal gas (SFIG) is a physical model assuming a collection of non-interacting elements with a stochastic proportional growth. It is the scale-invariant version of an ideal gas. Some cas

Identity channel

In quantum information theory, the identity channel is a noise-free quantum channel. That is, the channel outputs exactly what was put in. The identity channel is commonly denoted as I or id.

Name collision

In computer programming, a name collision is the nomenclature problem that occurs when the same variable name is used for different things in two separate areas that are joined, merged, or otherwise g

Cycles of Time

Cycles of Time: An Extraordinary New View of the Universe is a science book by mathematical physicist Roger Penrose published by The Bodley Head in 2010. The book outlines Penrose's Conformal Cyclic C

Hyper-encryption

Hyper-encryption is a form of encryption invented by Michael O. Rabin which uses a high-bandwidth source of public random bits, together with a secret key that is shared by only the sender and recipie

Grey relational analysis

Grey relational analysis (GRA) was developed by Deng Julong of Huazhong University of Science and Technology. It is one of the most widely used models of grey system theory. GRA uses a specific concep

Information–action ratio

The information–action ratio was a concept coined by cultural critic Neil Postman (1931–2003) in his work Amusing Ourselves to Death. In short, Postman meant to indicate the relationship between a pie

Information projection

In information theory, the information projection or I-projection of a probability distribution q onto a set of distributions P is p* = arg min_{p ∈ P} D_KL(p || q), where D_KL(p || q) is the Kullback–Leibler divergence from q to p. Viewing the Kul

Bandwidth (computing)

In computing, bandwidth is the maximum rate of data transfer across a given path. Bandwidth may be characterized as network bandwidth, data bandwidth, or digital bandwidth. This definition of bandwidt

Logical depth

Logical depth is a measure of complexity for individual strings devised by Charles H. Bennett based on the computational complexity of an algorithm that can recreate a given piece of information. It d

Phase factor

For any complex number written in polar form (such as r eiθ), the phase factor is the complex exponential factor (eiθ). As such, the term "phase factor" is related to the more general term phasor, whi

Bra–ket notation

In quantum mechanics, bra–ket notation, or Dirac notation, is used ubiquitously to denote quantum states. The notation uses angle brackets, ⟨ and ⟩, and a vertical bar, |, to construct "bras" and "kets". A

Chain rule for Kolmogorov complexity

The chain rule for Kolmogorov complexity is an analogue of the chain rule for information entropy, which states: K(X, Y) = K(X) + K(Y | X) + O(log(K(X, Y))). That is, the combined randomness of two sequences X and Y is the sum of the randomness

Operator grammar

Operator grammar is a mathematical theory of human language that explains how language carries information. This theory is the culmination of the life work of Zellig Harris, with major publications toward the end

Entropy (information theory)

In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variab
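
For a discrete random variable this is H(X) = −Σ pᵢ log2 pᵢ; a minimal sketch (the distributions are toy examples):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(X) = -sum p_i * log2(p_i), in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0: a fair coin
print(shannon_entropy([0.25] * 4))  # 2.0: four equally likely outcomes
```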

Common data model

A common data model (CDM) can refer to any standardised data model which allows for data and information exchange between different applications and data sources. Common data models aim to standardise

Grammatical Man

Grammatical Man: Information, Entropy, Language, and Life is a 1982 book written by the Evening Standard's Washington correspondent, Jeremy Campbell. The book touches on topics of probability, Informa

A Symbolic Analysis of Relay and Switching Circuits

"A Symbolic Analysis of Relay and Switching Circuits" is the title of a master's thesis written by computer science pioneer Claude E. Shannon while attending the Massachusetts Institute of Technology

Maximum entropy thermodynamics

In physics, maximum entropy thermodynamics (colloquially, MaxEnt thermodynamics) views equilibrium thermodynamics and statistical mechanics as inference processes. More specifically, MaxEnt applies in

Spectral efficiency

Spectral efficiency, spectrum efficiency or bandwidth efficiency refers to the information rate that can be transmitted over a given bandwidth in a specific communication system. It is a measure of ho

Gestalt pattern matching

Gestalt pattern matching, also Ratcliff/Obershelp pattern recognition, is a string-matching algorithm for determining the similarity of two strings. It was developed in 1983 by John W. Ratcliff and an
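
Python's standard library ships a SequenceMatcher in the Ratcliff/Obershelp style; its ratio is 2M/T, where M counts matched characters and T is the combined length of the two strings (the example strings are arbitrary):

```python
from difflib import SequenceMatcher

# ratio() returns 2*M / T; here the common block "informati" gives M = 9
# out of T = 22 total characters.
m = SequenceMatcher(None, "information", "informatics")
print(m.ratio())  # 0.8181818...
```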

Harry Nyquist

Harry Nyquist (/ˈnaɪkwɪst/, Swedish: [ˈnŷːkvɪst]; February 7, 1889 – April 4, 1976) was a Swedish-American physicist and electronic engineer who made important contributions to communication theory.

Interactions of actors theory

Interactions of actors theory is a theory developed by Gordon Pask and Gerard de Zeeuw. It is a generalisation of Pask's earlier conversation theory: The chief distinction being that conversation theo

Kullback's inequality

In information theory and statistics, Kullback's inequality is a lower bound on the Kullback–Leibler divergence expressed in terms of the large deviations rate function. If P and Q are probability dis

Surprisal analysis

Surprisal analysis is an information-theoretical analysis technique that integrates and applies principles of thermodynamics and maximal entropy. It is capable of relating the underlying microscopic properties of a system to its macroscopic observables.

Zyablov bound

In coding theory, the Zyablov bound is a lower bound on the rate and relative distance that are achievable by concatenated codes.

Quantum capacity

In the theory of quantum communication, the quantum capacity is the highest rate at which quantum information can be communicated over many independent uses of a noisy quantum channel from a sender to a receiver.

Information theory and measure theory

This article discusses how information theory (a branch of mathematics studying the transmission, processing and storage of information) is related to measure theory (a branch of mathematics concerned with measures and integration).

Shannon–Weaver model

The Shannon–Weaver model is one of the first and most influential models of communication. It was initially published in the 1948 paper A Mathematical Theory of Communication and explains communication in terms of five basic components: a source, a transmitter, a channel, a receiver, and a destination.

Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the amount of information obtained about one random variable by observing the other.
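
For discrete variables, mutual information can be computed directly from a joint distribution; the probabilities below are made up for illustration:

```python
from math import log2

# Hypothetical joint pmf p(x, y) of two binary variables (illustrative).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

# I(X;Y) = sum over x,y of p(x,y) * log2(p(x,y) / (p(x) * p(y))).
mi = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in joint.items())
print(round(mi, 4))  # about 0.278 bits
```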

Graph entropy

In information theory, the graph entropy is a measure of the information rate achievable by communicating symbols over a channel in which certain pairs of values may be confused. This measure was first introduced by Körner in the 1970s.

Rate–distortion theory

Rate–distortion theory is a major branch of information theory which provides the theoretical foundations for lossy data compression; it addresses the problem of determining the minimal number of bits per symbol that must be communicated over a channel so that the source can be reconstructed without exceeding a given distortion.

Pragmatic theory of information

The pragmatic theory of information is derived from Charles Sanders Peirce's general theory of signs and inquiry. Peirce explored a number of ideas about information throughout his career.

One-way quantum computer

The one-way or measurement-based quantum computer (MBQC) is a method of quantum computing that first prepares an entangled resource state, usually a cluster state or graph state, then performs single-qubit measurements on it.

Spatiotemporal pattern

Spatiotemporal patterns are patterns that occur in a wide range of natural phenomena and are characterized by a spatial and a temporal patterning. The general rules of pattern formation hold.

Entropic uncertainty

In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies. It turns out that this sum is bounded below, a statement stronger than the Heisenberg uncertainty principle.

Specific-information

In information theory, specific-information is the generic name given to the family of state-dependent measures that in expectation converge to the mutual information.

Linear network coding

In computer networking, linear network coding is a program in which intermediate nodes transmit data from source nodes to sink nodes by means of linear combinations. Linear network coding may be used to improve a network's throughput, efficiency, and scalability.

Compressed sensing

Compressed sensing (also known as compressive sensing, compressive sampling, or sparse sampling) is a signal processing technique for efficiently acquiring and reconstructing a signal, by finding solutions to underdetermined linear systems.

WSSUS model

The WSSUS (Wide-Sense Stationary Uncorrelated Scattering) model provides a statistical description of the transmission behavior of wireless channels. "Wide-sense stationarity" means that the second-order moments of the channel depend only on time differences rather than on absolute time.

Total correlation

In probability theory and in particular in information theory, total correlation (Watanabe 1960) is one of several generalizations of the mutual information. It is also known as the multivariate constraint (Garner 1962).
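
In the notation usually used for this quantity, the total correlation of n variables is the sum of their marginal entropies minus their joint entropy (a sketch of the standard definition, not quoted from the article):

```latex
C(X_1, \ldots, X_n) \;=\; \sum_{i=1}^{n} H(X_i) \;-\; H(X_1, \ldots, X_n)
```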

Kelly criterion

In probability theory, the Kelly criterion (or Kelly strategy or Kelly bet) is a formula that determines the optimal theoretical size for a bet. It is valid when the expected returns are known.
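
For the simplest case of a binary bet, the criterion reduces to a one-line formula; the function and numbers below are an illustrative sketch, not code from any particular library:

```python
def kelly_fraction(p: float, b: float) -> float:
    """Kelly fraction f* = p - (1 - p) / b for a bet that wins with
    probability p and pays b units per unit staked; never stake a
    negative fraction."""
    return max(0.0, p - (1.0 - p) / b)

# A 60% chance to win at even odds (b = 1) suggests staking 20%
# of the bankroll.
print(kelly_fraction(0.6, 1.0))
```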

Computational irreducibility

Computational irreducibility is one of the main ideas proposed by Stephen Wolfram in his 2002 book A New Kind of Science, although the concept goes back to studies from the 1980s.

Claude Shannon

Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, and cryptographer known as a "father of information theory". As a 21-year-old master's degree student at MIT, he wrote a thesis demonstrating that electrical applications of Boolean algebra could construct any logical numerical relationship.

Homomorphic signatures for network coding

Network coding has been shown to optimally use bandwidth in a network, maximizing information flow, but the scheme is inherently vulnerable to pollution attacks by malicious nodes in the network.

Proebsting's paradox

In probability theory, Proebsting's paradox is an argument that appears to show that the Kelly criterion can lead to ruin. Although it can be resolved mathematically, it raises some interesting issues about the practical application of the Kelly strategy.

Bretagnolle–Huber inequality

In information theory, the Bretagnolle–Huber inequality bounds the total variation distance between two probability distributions P and Q by a concave and bounded function of their Kullback–Leibler divergence.
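
One common statement of the bound, sketched here in standard notation with TV the total variation distance and D_KL the Kullback–Leibler divergence:

```latex
\| P - Q \|_{\mathrm{TV}} \;\le\; \sqrt{1 - e^{-D_{\mathrm{KL}}(P \,\|\, Q)}}
```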

Concatenated error correction code

In coding theory, concatenated codes form a class of error-correcting codes that are derived by combining an inner code and an outer code. They were conceived in 1966 by Dave Forney as a solution to the problem of finding a code that has both exponentially decreasing error probability with increasing block length and polynomial-time decoding complexity.

Shaping codes

In digital communications, shaping codes are a method of encoding that changes the distribution of signals to improve efficiency.

Information continuum

The term information continuum is used to describe the whole set of all information, in connection with information management.

Information theory

Information theory is the scientific study of the quantification, storage, and communication of information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s.

Joint source and channel coding

In information theory, joint source–channel coding is the encoding of a redundant information source for transmission over a noisy channel, and the corresponding decoding, using a single code instead of the more conventional steps of source coding followed by channel coding.

Measure-preserving dynamical system

In mathematics, a measure-preserving dynamical system is an object of study in the abstract formulation of dynamical systems, and ergodic theory in particular. Measure-preserving systems obey the Poincaré recurrence theorem.

Entropy power inequality

In information theory, the entropy power inequality (EPI) is a result that relates to the so-called "entropy power" of random variables. It shows that the entropy power of suitably well-behaved random variables is a superadditive function.

Nyquist–Shannon sampling theorem

The Nyquist–Shannon sampling theorem is a theorem in the field of signal processing which serves as a fundamental bridge between continuous-time signals and discrete-time signals. It establishes a sufficient condition for a sample rate that permits a discrete sequence of samples to capture all the information from a continuous-time signal of finite bandwidth.
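
In symbols, a signal containing no frequencies at or above B hertz is determined by samples taken at any rate f_s > 2B, and can be rebuilt by Whittaker–Shannon interpolation (the standard statement, sketched here):

```latex
f_s > 2B,
\qquad
x(t) = \sum_{n=-\infty}^{\infty} x\!\left(\frac{n}{f_s}\right)
       \operatorname{sinc}\left(f_s t - n\right)
```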

Adjusted mutual information

In probability theory and information theory, adjusted mutual information, a variation of mutual information, may be used for comparing clusterings. It corrects the effect of agreement solely due to chance between clusterings.

Information diagram

An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy and mutual information.

Unicity distance

In cryptography, unicity distance is the length of an original ciphertext needed to break the cipher by reducing the number of possible spurious keys to zero in a brute force attack. That is, after trying every possible key, there should be just one decipherment that makes sense.
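
A back-of-the-envelope version of Shannon's estimate is U ≈ H(K)/D, key entropy divided by plaintext redundancy per character; the 3.2 bits/character figure for English below is a commonly quoted approximation, used here only for illustration:

```python
def unicity_distance(key_entropy_bits: float, redundancy_per_char: float) -> float:
    """Approximate ciphertext length after which only one key
    remains plausible: U ~ H(K) / D."""
    return key_entropy_bits / redundancy_per_char

# A 128-bit key against English plaintext (D ~ 3.2 bits/char).
print(round(unicity_distance(128, 3.2)))  # 40 characters
```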

Redundancy (information theory)

In information theory, redundancy measures the fractional difference between the entropy H(X) of an ensemble X and its maximum possible value, the logarithm of the alphabet size. Informally, it is the amount of wasted "space" used to transmit certain data.
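
With the maximum taken as the log of the alphabet size, the fractional redundancy of a discrete source can be computed directly; the four-symbol distribution below is illustrative:

```python
from math import log2

def entropy(ps):
    # Shannon entropy in bits, skipping zero-probability symbols.
    return -sum(p * log2(p) for p in ps if p > 0)

ps = [0.5, 0.25, 0.125, 0.125]   # hypothetical 4-symbol source
h = entropy(ps)                  # 1.75 bits
h_max = log2(len(ps))            # 2 bits if all 4 symbols were equiprobable
redundancy = 1 - h / h_max
print(redundancy)                # 0.125
```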

IMU Abacus Medal

The IMU Abacus Medal, known before 2022 as the Rolf Nevanlinna Prize, is awarded once every four years at the International Congress of Mathematicians, hosted by the International Mathematical Union (IMU).

Cheung–Marks theorem

In information theory, the Cheung–Marks theorem, named after K. F. Cheung and Robert J. Marks II, specifies conditions where restoration of a signal by the sampling theorem can become ill-posed.

Error exponent

In information theory, the error exponent of a channel code or source code over the block length of the code is the rate at which the error probability decays exponentially with the block length of the code.

Fano's inequality

In information theory, Fano's inequality (also known as the Fano converse and the Fano lemma) relates the average information lost in a noisy channel to the probability of the categorization error.
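
The inequality is usually written with H_b the binary entropy function, P_e the error probability, and |X| the alphabet size (the standard form, sketched here):

```latex
H(X \mid Y) \;\le\; H_b(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr)
```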

Constraint (information theory)

Constraint in information theory is the degree of statistical dependence between or among variables. Garner provides a thorough discussion of various forms of constraint, such as internal and external constraint.

Bandwidth extension

Bandwidth extension of a signal is the deliberate process of expanding the frequency range (bandwidth) over which the signal contains appreciable and useful content.

Information behavior

Information behavior is a field of information science research that seeks to understand the way people search for and use information in various contexts. It can include information seeking and information retrieval.

Multi-user MIMO

Multi-user MIMO (MU-MIMO) is a set of multiple-input and multiple-output (MIMO) technologies for multipath wireless communication, in which multiple users or terminals, each radioing over one or more antennas, communicate with one another.

Shannon capacity of a graph

In graph theory, the Shannon capacity of a graph is a graph invariant defined from the number of independent sets of strong graph products. It is named after American mathematician Claude Shannon.

Spatial multiplexing

Spatial multiplexing or space-division multiplexing (often abbreviated SM, SDM or SMX) is a multiplexing technique in MIMO wireless communication, fibre-optic communication and other communications technologies.

Formation matrix

In statistics and information theory, the expected formation matrix of a likelihood function is the matrix inverse of its Fisher information matrix, while the observed formation matrix is the inverse of the observed information matrix.

Information flow (information theory)

Information flow in an information theoretical context is the transfer of information from a variable x to a variable y in a given process. Not all flows may be desirable; for example, a system should not leak any secret to public observers.

Grammar-based code

Grammar-based codes or grammar-based compression are compression algorithms based on the idea of constructing a context-free grammar (CFG) for the string to be compressed. Examples include universal lossless data compression algorithms.

Uncertainty coefficient

In statistics, the uncertainty coefficient, also called proficiency, entropy coefficient or Theil's U, is a measure of nominal association. It was first introduced by Henri Theil and is based on the concept of information entropy.

Triangular network coding

In coding theory, triangular network coding (TNC) is a network coding based packet coding scheme. Previously, packet coding for network coding was done using linear network coding (LNC).

Outage probability

In information theory, the outage probability of a communication channel is the probability that a given information rate is not supported, because of variable channel capacity. It is defined as the probability that the information rate falls below a required threshold rate.

IEEE Transactions on Information Theory

IEEE Transactions on Information Theory is a monthly peer-reviewed scientific journal published by the IEEE Information Theory Society. It covers information theory and the mathematics of communications.

Dual total correlation

In information theory, dual total correlation (Han 1978), information rate (Dubnov 2006), excess entropy (Olbrich 2008), or binding information (Abdallah and Plumbley 2010) is one of several known non-negative generalizations of mutual information.
