
Gaussian process

In probability theory and statistics, a Gaussian process is a stochastic process (a collection of random variables indexed by time or space), such that every finite collection of those random variables has a multivariate normal distribution, i.e. every finite linear combination of them is normally distributed. The distribution of a Gaussian process is the joint distribution of all those (infinitely many) random variables, and as such, it is a distribution over functions with a continuous domain, e.g. time or space. The concept of Gaussian processes is named after Carl Friedrich Gauss because it is based on the notion of the Gaussian distribution (normal distribution). Gaussian processes can be seen as an infinite-dimensional generalization of multivariate normal distributions. Gaussian processes are useful in statistical modelling, benefiting from properties inherited from the normal distribution. For example, if a random process is modelled as a Gaussian process, the distributions of various derived quantities can be obtained explicitly. Such quantities include the average value of the process over a range of times and the error in estimating the average using sample values at a small set of times. While exact models often scale poorly as the amount of data increases, multiple approximation methods have been developed which often retain good accuracy while drastically reducing computation time. (Wikipedia).
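
The defining property above means that evaluating a Gaussian process at any finite set of inputs yields an ordinary multivariate normal. A minimal sketch of this, assuming a squared-exponential covariance function (an illustrative choice, not the only one):

```python
import numpy as np

# Restricting a Gaussian process to a finite grid of inputs gives a
# multivariate normal whose covariance matrix comes from the kernel.
rng = np.random.default_rng(0)

def sq_exp_kernel(x1, x2, length_scale=1.0):
    """Squared-exponential covariance k(s, t) = exp(-(s - t)^2 / (2 l^2))."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

x = np.linspace(0.0, 5.0, 50)
K = sq_exp_kernel(x, x)            # 50 x 50 covariance matrix
K += 1e-9 * np.eye(len(x))         # tiny jitter for numerical stability
sample = rng.multivariate_normal(np.zeros(len(x)), K)  # one draw of f(x)
print(sample.shape)  # (50,)
```

Drawing several such samples and plotting them against x is the standard way to visualize what "a distribution over functions" looks like.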

Gaussian process

(ML 19.1) Gaussian processes - definition and first examples

Definition of a Gaussian process. Elementary examples of Gaussian processes.

From playlist Machine Learning

(ML 19.2) Existence of Gaussian processes

Statement of the theorem on existence of Gaussian processes, and an explanation of what it is saying.

From playlist Machine Learning

(ML 19.3) Examples of Gaussian processes (part 1)

Illustrative examples of several Gaussian processes, and visualization of samples drawn from these Gaussian processes. (Random planes, Brownian motion, squared exponential GP, Ornstein-Uhlenbeck, a periodic GP, and a symmetric GP).

From playlist Machine Learning
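
The processes named in the description are distinguished by their covariance functions. A hedged sketch of four of them (parameter names and scales are illustrative choices):

```python
import math

def squared_exponential(s, t, length_scale=1.0):
    # k(s, t) = exp(-(s - t)^2 / (2 l^2)): very smooth sample paths.
    return math.exp(-((s - t) ** 2) / (2 * length_scale ** 2))

def ornstein_uhlenbeck(s, t, theta=1.0):
    # k(s, t) = exp(-theta |s - t|): rough, mean-reverting paths.
    return math.exp(-theta * abs(s - t))

def brownian(s, t):
    # k(s, t) = min(s, t) for s, t >= 0: Brownian motion.
    return min(s, t)

def periodic(s, t, period=1.0, length_scale=1.0):
    # Periodic kernel built from the sine of the distance.
    return math.exp(-2 * math.sin(math.pi * abs(s - t) / period) ** 2
                    / length_scale ** 2)

print(round(squared_exponential(0.0, 1.0), 3))  # exp(-1/2) -> 0.607
```

Any of these can replace the kernel in a sampling routine; the choice controls the smoothness and structure of the functions drawn.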

(ML 19.4) Examples of Gaussian processes (part 2)

Illustrative examples of several Gaussian processes, and visualization of samples drawn from these Gaussian processes. (Random planes, Brownian motion, squared exponential GP, Ornstein-Uhlenbeck, a periodic GP, and a symmetric GP).

From playlist Machine Learning

What is Gaussian elimination?

A Gaussian elimination example is discussed and the general algorithm is explained. Such ideas are important in the solution of systems of linear equations.

From playlist Intro to Linear Systems
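
The general algorithm the video describes can be sketched in a few lines: forward elimination with partial pivoting, then back substitution. This is a minimal illustration, not the video's own code:

```python
def gaussian_elimination(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Build the augmented matrix [A | b], copying so inputs are untouched.
    M = [row[:] + [b_i] for row, b_i in zip(A, b)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest pivot.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate this column from all rows below.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 3 and x + 3y = 5 have the solution x = 0.8, y = 1.4.
solution = gaussian_elimination([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])
print(solution)  # [0.8, 1.4]
```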

PUSHING A GAUSSIAN TO THE LIMIT

Integrating a Gaussian is everyone's favorite party trick, but it can also be used to describe something else. Link to the Gaussian integral: https://www.youtube.com/watch?v=mcar5MDMd_A Link to my Skype tutoring site: dotsontutoring.simplybook.me, or email dotsontutoring@gmail.com if you have questions.

From playlist Math/Derivation Videos

Gaussian Integral 7 Wallis Way

Welcome to the awesome 12-part series on the Gaussian integral. In this series of videos, I calculate the Gaussian integral in 12 different ways. Which method is the best? Watch and find out! In this video, I calculate the Gaussian integral by using a technique that is very similar to the

From playlist Gaussian Integral
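
Whatever derivation route one takes, the answer can be checked numerically: the integral of exp(-x^2) over the real line equals sqrt(pi). A quick midpoint-rule sketch (the interval and step count are illustrative; the tails beyond [-10, 10] are negligible):

```python
import math

def gaussian_integral(n=100_000, a=-10.0, b=10.0):
    """Midpoint-rule approximation of the integral of exp(-x^2) on [a, b]."""
    h = (b - a) / n
    return sum(math.exp(-(a + (i + 0.5) * h) ** 2) for i in range(n)) * h

approx = gaussian_integral()
print(abs(approx - math.sqrt(math.pi)) < 1e-6)  # True
```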

(PP 6.1) Multivariate Gaussian - definition

Introduction to the multivariate Gaussian (or multivariate Normal) distribution.

From playlist Probability Theory
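
One concrete way to build a multivariate Gaussian is to apply a linear map to independent standard normals; in two dimensions the map is the 2x2 Cholesky factor of the desired covariance. A sketch, with the correlation rho chosen for illustration:

```python
import math
import random

# Construct a correlated bivariate normal from two independent N(0, 1)
# draws, so that Var(x) = Var(y) = 1 and Cov(x, y) = rho.
random.seed(0)
rho = 0.8  # illustrative correlation

def bivariate_normal_sample(rho):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = z1
    y = rho * z1 + math.sqrt(1 - rho ** 2) * z2
    return x, y

pairs = [bivariate_normal_sample(rho) for _ in range(50_000)]
cov = sum(x * y for x, y in pairs) / len(pairs)  # sample covariance
print(round(cov, 1))  # close to 0.8
```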

(ML 19.9) GP regression - introduction

Introduction to the application of Gaussian processes to regression. Bayesian linear regression as a special case of GP regression.

From playlist Machine Learning
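
The core computation in GP regression is a linear solve against the kernel matrix of the training inputs. A minimal sketch, assuming a squared-exponential kernel and a near-noiseless model (so the posterior mean interpolates the training data); this is an illustration, not the video's code:

```python
import numpy as np

def kernel(a, b, ell=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

x_train = np.array([-2.0, 0.0, 1.5])
y_train = np.sin(x_train)                        # toy observations
K = kernel(x_train, x_train) + 1e-9 * np.eye(3)  # jitter for stability
alpha = np.linalg.solve(K, y_train)              # K^{-1} y, done stably

x_test = np.array([0.5])
posterior_mean = kernel(x_test, x_train) @ alpha  # predictive mean at 0.5
print(posterior_mean.shape)  # (1,)
```

Bayesian linear regression arises as the special case where the kernel is the inner product of (finitely many) basis functions.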

ML Tutorial: Gaussian Processes (Richard Turner)

Machine Learning Tutorial at Imperial College London: Gaussian Processes. Richard Turner (University of Cambridge), November 23, 2016.

From playlist Machine Learning Tutorials

06 Inference in Time Series

Slides and more information: https://mml-book.github.io/slopes-expectations.html

From playlist There and Back Again: A Tale of Slopes and Expectations (NeurIPS-2020 Tutorial)

Some thoughts on Gaussian processes for emulation of deterministic computer models: Michael Stein

Uncertainty quantification (UQ) employs theoretical, numerical and computational tools to characterise uncertainty. It is increasingly becoming a relevant tool to gain a better understanding of physical systems and to make better decisions under uncertainty. Realistic physical systems are

From playlist Effective and efficient gaussian processes

03 Numerical Integration

Slides and more information: https://mml-book.github.io/slopes-expectations.html

From playlist There and Back Again: A Tale of Slopes and Expectations (NeurIPS-2020 Tutorial)

(PP 6.3) Gaussian coordinates does not imply (multivariate) Gaussian

An example illustrating the fact that a vector of Gaussian random variables is not necessarily (multivariate) Gaussian.

From playlist Probability Theory
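
A classic counterexample in the spirit of this video (the cutoff c is an illustrative choice): let X be standard normal and set Y = X when |X| <= c, Y = -X otherwise. Both coordinates are marginally N(0, 1) by symmetry, yet (X, Y) is not jointly Gaussian, because the linear combination X + Y has a point mass at 0 without being constant, which no normal variable can have. A numerical sketch:

```python
import random

random.seed(1)
c = 1.0  # illustrative cutoff

def sample_pair():
    # X ~ N(0, 1); Y flips the sign of X outside [-c, c].
    x = random.gauss(0, 1)
    y = x if abs(x) <= c else -x
    return x, y

pairs = [sample_pair() for _ in range(10_000)]
# X + Y is exactly 0 whenever |X| > c, i.e. with probability ~0.317,
# yet it is not identically 0 -- impossible for a (possibly degenerate)
# normal random variable.
frac_zero_sum = sum(1 for x, y in pairs if x + y == 0) / len(pairs)
print(frac_zero_sum > 0)  # True
```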

Related pages

White noise | Wiener process | If and only if | Deep learning | Finite set | Artificial neuron | Andrey Kolmogorov | Gaussian free field | Statistics | Bayesian network | Stochastic process | Kronecker delta | Interpolation | Carl Friedrich Gauss | Prior probability | Multivariate normal distribution | Indexed family | Statistical parameter | Bayes linear statistics | Imaginary unit | Brownian bridge | Periodic function | Fractional Brownian motion | Stationary increments | Statistical model | Student's t-distribution | Gaussian process approximations | Covariance function | Gamma function | Probabilistic numerics | Function (mathematics) | Closed-form expression | Continuous stochastic process | Kernel methods for vector output | Autocovariance | Kriging | Normal distribution | Standard deviation | Marginal likelihood | Linear combination | Artificial neural network | Gauss–Markov process | Isotropy | Random variable | Stationary process | Integration by substitution | Probability theory | Gradient-enhanced kriging | Lacunary function | Ornstein–Uhlenbeck process | Reproducing kernel Hilbert space | Gram matrix | Matrix (mathematics) | Nonlinear mixed-effects model | Smoothness | Bayesian inference | Characteristic function (probability theory) | Covariance