Information theory

Information theory is the scientific study of the quantification, storage, and communication of information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley in the 1920s and Claude Shannon in the 1940s. It lies at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

A key measure in information theory is entropy, which quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than identifying the outcome of a roll of a fair die (six equally likely outcomes). Other important measures include mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.

Applications of fundamental topics of information theory include source coding/data compression (e.g. for ZIP files) and channel coding/error detection and correction (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, and the development of the Internet. The theory has also found applications in other areas, including statistical inference, cryptography, neurobiology, perception, linguistics, the evolution and function of molecular codes (bioinformatics), thermal physics, molecular dynamics, quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection, pattern recognition, anomaly detection, and even art creation. (Wikipedia).
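
To make the coin/die comparison above concrete, here is a minimal Python sketch (the helper name `entropy` is illustrative, not taken from any source cited here) that computes Shannon entropy in bits from a list of outcome probabilities.

```python
import math

def entropy(probs, base=2):
    # Shannon entropy H = -sum_i p_i * log(p_i), in bits when base=2.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Fair coin: two equally likely outcomes -> 1 bit of uncertainty.
print(entropy([0.5, 0.5]))      # 1.0

# Fair six-sided die: six equally likely outcomes -> log2(6) ≈ 2.585 bits.
print(entropy([1/6] * 6))       # ≈ 2.585
```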

Information theory

(IC 1.6) A different notion of "information"

An informal discussion of the distinctions between our everyday usage of the word "information" and the information-theoretic notion of "information". A playlist of these videos is available at: http://www.youtube.com/playlist?list=PLE125425EC837021F Attribution for image of TV static:

From playlist Information theory and Coding

(IC 1.1) Information theory and Coding - Outline of topics

A playlist of these videos is available at: http://www.youtube.com/playlist?list=PLE125425EC837021F Overview of central topics in Information theory and Coding. Compression (source coding) theory: Source coding theorem, Kraft-McMillan inequality, Rate-distortion theorem. Error-correction… (A Kraft-inequality check is sketched after this entry.)

From playlist Information theory and Coding
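
The outline above lists the Kraft-McMillan inequality among the compression topics. As a minimal sketch (the function name `kraft_sum` is illustrative, not from the video), the inequality states that the codeword lengths l_i of any uniquely decodable code over an alphabet of size r must satisfy sum_i r^(-l_i) <= 1, which is easy to check numerically:

```python
def kraft_sum(lengths, alphabet_size=2):
    # Kraft-McMillan sum: sum_i r^(-l_i) for codeword lengths l_i over an
    # alphabet of size r; uniquely decodable codes must have a sum <= 1.
    return sum(alphabet_size ** -length for length in lengths)

# Lengths of the binary prefix code {0, 10, 110, 111}: sum is exactly 1.0.
print(kraft_sum([1, 2, 3, 3]))   # 1.0  (inequality satisfied)

# Lengths [1, 1, 2] give 1.25 > 1: no uniquely decodable binary code exists.
print(kraft_sum([1, 1, 2]))      # 1.25 (inequality violated)
```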

From information theory to learning via Statistical Physics by Florent Krzakala

26 December 2016 to 07 January 2017 VENUE: Madhava Lecture Hall, ICTS Bangalore Information theory and computational complexity have emerged as central concepts in the study of biological and physical systems, in both the classical and quantum realm. The low-energy landscape of classical

From playlist US-India Advanced Studies Institute: Classical and Quantum Information

From information theory to learning via Statistical Physics: Introduction: by Florent Krzakala

26 December 2016 to 07 January 2017 VENUE: Madhava Lecture Hall, ICTS Bangalore Information theory and computational complexity have emerged as central concepts in the study of biological and physical systems, in both the classical and quantum realm. The low-energy landscape of classical

From playlist US-India Advanced Studies Institute: Classical and Quantum Information

From information theory to learning via Statistical Physics: From statistical by Florent Krzakala

26 December 2016 to 07 January 2017 VENUE: Madhava Lecture Hall, ICTS Bangalore Information theory and computational complexity have emerged as central concepts in the study of biological and physical systems, in both the classical and quantum realm. The low-energy landscape of classical

From playlist US-India Advanced Studies Institute: Classical and Quantum Information

Information Theory Meets Quantum Physics: The magic of wave dynamics by Apoorva Patel

26 December 2016 to 07 January 2017 VENUE: Madhava Lecture Hall, ICTS Bangalore Information theory and computational complexity have emerged as central concepts in the study of biological and physical systems, in both the classical and quantum realm. The low-energy landscape of classical

From playlist US-India Advanced Studies Institute: Classical and Quantum Information

Thermodynamics of Information by Juan MR Parrondo (Lecture 4)

26 December 2016 to 07 January 2017 VENUE: Madhava Lecture Hall, ICTS Bangalore Information theory and computational complexity have emerged as central concepts in the study of biological and physical systems, in both the classical and quantum realm. The low-energy landscape of classical

From playlist US-India Advanced Studies Institute: Classical and Quantum Information

Thermodynamics of Information by Juan MR Parrondo (Lecture 1)

26 December 2016 to 07 January 2017 VENUE: Madhava Lecture Hall, ICTS Bangalore Information theory and computational complexity have emerged as central concepts in the study of biological and physical systems, in both the classical and quantum realm. The low-energy landscape of classical

From playlist US-India Advanced Studies Institute: Classical and Quantum Information

Thermodynamics of Information by Juan MR Parrondo (Lecture 2)

26 December 2016 to 07 January 2017 VENUE: Madhava Lecture Hall, ICTS Bangalore Information theory and computational complexity have emerged as central concepts in the study of biological and physical systems, in both the classical and quantum realm. The low-energy landscape of classical

From playlist US-India Advanced Studies Institute: Classical and Quantum Information

IMS Public Lecture: Trends in Wireless Communications

Sergio Verdú, Princeton University

From playlist Public Lectures

Bernard Geoghegan, “The Difficulty of Gift-Giving: Cybernetics and Postwar French Thought”

A historian and theorist of digital media, Geoghegan is a senior lecturer in Media and Communications at Coventry University and a visiting associate professor in Film and Media Studies at Yale University. He also works as a curator and educational programmer for the Anthropocene Project a

From playlist Whitney Humanities Center

Information theory without tears (La théorie de l'information sans peine) - Bourbaphy - 17/11/18

Olivier Rioul (Telecom Paris Tech) / 17.11.2018 Information theory without tears ---------------------------------- You can join us on social media to follow our news. Facebook: https://www.facebook.com/InstitutHenriPoincare/ Twitter: https://twitter.com

From playlist Bourbaphy - 17/11/18 - L'information

Sergio Verdú - Information Theory Today

Founded by Claude Shannon in 1948, information theory has taken on renewed vibrancy with technological advances that pave the way for attaining the fundamental limits of communication channels and information sources. Increasingly playing a role as a design driver, information theory is b

From playlist NOKIA-IHES Workshop

Vigyan Adda Talk: Black Holes and the Reversibility of Time by Suvrat Raju

OUTREACH ACTIVITY VIGYAN ADDA TALK: BLACK HOLES AND THE REVERSIBILITY OF TIME SPEAKER Suvrat Raju (ICTS-TIFR, Bengaluru) WHEN 4:30 pm to 6:00 pm Tuesday, 22 December 2020 WHERE Livestream via the ICTS YouTube channel Abstract: A central tenet of physics is that time evolution is reversibl

From playlist Vigyan Adda

Entanglement entropy, quantum field theory, and holography by Matthew Headrick

26 December 2016 to 07 January 2017 VENUE : Madhava Lecture Hall, ICTS, Bengaluru Information theory and computational complexity have emerged as central concepts in the study of biological and physical systems, in both the classical and quantum realm. The low-energy landscape of classic

From playlist US-India Advanced Studies Institute: Classical and Quantum Information

Constructor Theory: A New Explanation of Fundamental Physics - Chiara Marletto and Marcus du Sautoy

Constructor theory holds promise for revolutionising the way fundamental physics is formulated and for providing essential tools to face existing technological challenges. Chiara's book "The Science of Can and Can't" is available now: https://geni.us/ChiaraMarletto Watch the Q&A: https://y

From playlist Livestreams

Introduction to Quantum Entanglement by Albion Lawrence

26 December 2016 to 07 January 2017 VENUE: Madhava Lecture Hall, ICTS Bangalore Information theory and computational complexity have emerged as central concepts in the study of biological and physical systems, in both the classical and quantum realm. The low-energy landscape of classical

From playlist US-India Advanced Studies Institute: Classical and Quantum Information

Solving the Puzzle of Black Holes: Hawking, Entropy, and a Theory of Everything

With the power of math, scientists are going even further, using equations to “look” inside black holes, peering at the central singularity where general relativity and quantum mechanics collide. PARTICIPANTS: Cumrun Vafa MODERATOR: Brian Greene MORE INFO ABOUT THE PROGRAM AND PARTICIP

From playlist Space & The Cosmos

Related pages

Communication channel | Channel capacity | Hamming distance | Integrated information theory | Rényi entropy | Extractor (mathematics) | History of information theory | Differential entropy | Prior probability | E (mathematical constant) | Information-theoretic security | Cryptography | Ergodic theory | A Mathematical Theory of Communication | Directed information | Bit | Detection theory | Fungible information | Plaintext | Random variable | Charles Sanders Peirce | Gambling and information theory | Conditional entropy | Common logarithm | Quantum computing | Coding theory | Noisy-channel coding theorem | Andrey Kolmogorov | Information algebra | Min-entropy | Byte | Fisher information | Estimation theory | Info-metrics | Information fluctuation complexity | Covert channel | Block cipher | Pearson's chi-squared test | Communication source | Pointwise mutual information | Kullback–Leibler divergence | Likelihood-ratio test | Alan Turing | Posterior probability | Kolmogorov complexity | Shannon–Hartley theorem | Quantities of information | Conditional probability | Multinomial distribution | Entropy rate | Minimum message length | Digital signal processing | Dice | Expected value | Probability theory | Nat (unit) | Timeline of information theory | Code (cryptography) | Bayesian inference | Triangle inequality | Relay channel | Joint entropy | Inductive probability | Statistics | Stochastic process | Conditional mutual information | Information field theory | Information content | Anomaly detection | Algorithmic information theory | Receiver (information theory) | Cryptanalysis | Logic of information | Quantum information science | Cipher | Binary symmetric channel | Cryptographically secure pseudorandom number generator | Philosophy of information | Random seed | Symmetric-key algorithm | Stationary process | Binary logarithm | Information asymmetry | Shannon (unit) | Hartley (unit) | Entropy (information theory) | Public-key cryptography | Error detection and correction | Grammatical Man | Harry Nyquist | Symmetric function | Information theory and measure theory | Mutual information | Rate–distortion theory | Gaussian noise | Algorithmic probability | Boltzmann constant | One-time pad | Claude Shannon | Binary erasure channel | Key (cryptography) | Statistical inference | Unicity distance | Ciphertext | Redundancy (information theory) | Information geometry | Error exponent | Entropy in thermodynamics and information theory | Probability distribution | Quantification (science) | Enigma machine | Cross entropy | Minimum description length | Pseudorandom number generator | Natural logarithm | Probability mass function