Statistical models

Control function (econometrics)

Control functions (also known as two-stage residual inclusion) are statistical methods used to correct for endogeneity problems by modelling the endogeneity in the error term. The approach thereby differs from instrumental variable methods.

Relative likelihood

In statistics, suppose that we have been given some data, and we are selecting a statistical model for that data. The relative likelihood compares the relative plausibilities of different candidate models.
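
The comparison can be sketched numerically; the example below uses made-up binomial data and hand-picked parameter values, and drops the binomial coefficient since it cancels in the ratio:

```python
import math

def binom_loglik(p, k, n):
    # Log-likelihood of k successes in n Bernoulli(p) trials
    # (the binomial coefficient is constant in p, so it is dropped).
    return k * math.log(p) + (n - k) * math.log(1 - p)

k, n = 7, 10      # illustrative data: 7 successes in 10 trials
p_hat = k / n     # maximum-likelihood estimate of p

def relative_likelihood(p):
    # R(p) = L(p) / L(p_hat); lies in (0, 1], equals 1 at the MLE.
    return math.exp(binom_loglik(p, k, n) - binom_loglik(p_hat, k, n))

print(relative_likelihood(0.7))            # 1.0 at the MLE
print(round(relative_likelihood(0.5), 3))  # 0.439
```

Parameter values whose relative likelihood falls below some cutoff (e.g. 1/8) are sometimes judged implausible in the light of the data.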

Reification (statistics)

In statistics, reification is the use of an idealized model of a statistical process. The model is then used to make inferences connecting model results, which imperfectly represent the actual process, with experimental observations.

Land use regression model

A land use regression model (LUR model) is an algorithm often used for analyzing pollution, particularly in densely populated areas. The model is based on predictable pollution patterns to estimate pollutant concentrations at a given location.

Rubin causal model

The Rubin causal model (RCM), also known as the Neyman–Rubin causal model, is an approach to the statistical analysis of cause and effect based on the framework of potential outcomes, named after Donald Rubin.

Response modeling methodology

Response modeling methodology (RMM) is a general platform for statistical modeling of a linear/nonlinear relationship between a response variable (dependent variable) and a linear predictor (a linear combination of the explanatory variables).

Energy based model

An energy-based model (EBM) is a form of generative model (GM) imported directly from statistical physics to learning. GMs learn an underlying data distribution by analyzing a sample dataset. Once trained, a GM can generate new samples that match the learned distribution.

Generative model

In statistical classification, two main approaches are called the generative approach and the discriminative approach. These compute classifiers by different approaches, differing in the degree of statistical modelling.

Whittle likelihood

In statistics, Whittle likelihood is an approximation to the likelihood function of a stationary Gaussian time series. It is named after the mathematician and statistician Peter Whittle, who introduced it in his PhD thesis in 1951.

Flow-based generative model

A flow-based generative model is a generative model used in machine learning that explicitly models a probability distribution by leveraging normalizing flow, which is a statistical method using the change-of-variables law of probability to transform a simple distribution into a complex one.
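
The change-of-variables rule that underlies normalizing flows can be shown with a toy one-dimensional affine flow; the standard normal base distribution and the parameter values here are illustrative assumptions:

```python
import math

def log_standard_normal(z):
    # Log-density of the base distribution p_z (standard normal).
    return -0.5 * (z * z + math.log(2 * math.pi))

def affine_flow_logpdf(x, a, b):
    # Change of variables for z = f(x) = (x - b) / a:
    #   log p_x(x) = log p_z(f(x)) + log|df/dx| = log p_z((x-b)/a) - log|a|
    z = (x - b) / a
    return log_standard_normal(z) - math.log(abs(a))

# With a = 2, b = 1 this is exactly the N(1, 2^2) log-density.
print(round(affine_flow_logpdf(1.0, 2.0, 1.0), 4))  # -1.6121
```

Deep flows stack many such invertible steps and sum the log-Jacobian terms; the single step above only illustrates the bookkeeping.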

Completely randomized design

In the design of experiments, completely randomized designs are for studying the effects of one primary factor without the need to take other nuisance variables into account. This article describes completely randomized designs that have one primary factor.

All models are wrong

All models are wrong is a common aphorism in statistics; it is often expanded as "All models are wrong, but some are useful". The aphorism acknowledges that statistical models always fall short of the complexities of reality but can still be useful nonetheless.

Statistical model

A statistical model is a mathematical model that embodies a set of statistical assumptions concerning the generation of sample data (and similar data from a larger population). A statistical model represents, often in considerably idealized form, the data-generating process.

Autologistic actor attribute models

Autologistic actor attribute models (ALAAMs) are a family of statistical models used to model the occurrence of node attributes (individual-level outcomes) in network data. They are frequently used with social network data.

Bradley–Terry model

The Bradley–Terry model is a probability model that can predict the outcome of a paired comparison. Given a pair of individuals i and j drawn from some population, it estimates the probability that the pairwise comparison i > j is true.
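
The model's defining formula is easy to state in code; the strength scores below are invented for illustration, not fitted from data:

```python
# Bradley-Terry: P(i beats j) = p_i / (p_i + p_j), where p_i > 0 is a
# latent "strength" score for individual i (illustrative values).
strength = {"alice": 4.0, "bob": 2.0, "carol": 1.0}

def win_prob(i, j):
    return strength[i] / (strength[i] + strength[j])

print(win_prob("alice", "bob"))   # 4 / (4 + 2) = 2/3
print(win_prob("bob", "carol"))   # 2 / (2 + 1) = 2/3
```

In practice the strengths are estimated from observed comparison outcomes, e.g. by maximum likelihood.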

ACE model

The ACE model is a statistical model commonly used to analyze the results of twin and adoption studies. This classic behaviour genetic model aims to partition the phenotypic variance into three categories: additive genetic variance (A), common environmental variance (C), and specific environmental variance (E).
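
A back-of-envelope version of this partition uses Falconer's formulas on twin correlations; real twin analyses fit a structural equation model instead, and the correlations below are made up:

```python
def ace_from_twin_correlations(r_mz, r_dz):
    # Falconer's formulas: partition total phenotypic variance using the
    # correlations of monozygotic (r_mz) and dizygotic (r_dz) twin pairs.
    a2 = 2 * (r_mz - r_dz)   # additive genetic share (A)
    c2 = 2 * r_dz - r_mz     # common/shared environment share (C)
    e2 = 1 - r_mz            # specific/non-shared environment share (E)
    return a2, c2, e2

a2, c2, e2 = ace_from_twin_correlations(0.8, 0.5)
print(round(a2, 3), round(c2, 3), round(e2, 3))  # 0.6 0.2 0.2
```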

Mediation (statistics)

In statistics, a mediation model seeks to identify and explain the mechanism or process that underlies an observed relationship between an independent variable and a dependent variable via the inclusion of a third hypothetical variable, known as a mediator variable.

Predictive modelling

Predictive modelling uses statistics to predict outcomes. Most often the event one wants to predict is in the future, but predictive modelling can be applied to any type of unknown event, regardless of when it occurred.

Impartial culture

Impartial culture (IC) or the culture of indifference is a probabilistic model used in social choice theory for analyzing ranked voting method rules. The model is understood to be unrealistic, and is used as a theoretical baseline rather than a description of real elections.

Moderated mediation

In statistics, moderation and mediation can occur together in the same model. Moderated mediation, also known as conditional indirect effects, occurs when the treatment effect of an independent variable on a dependent variable via a mediator variable differs depending on the level of a moderator variable.

Nonlinear modelling

In mathematics, nonlinear modelling is empirical or semi-empirical modelling which takes at least some nonlinearities into account. Nonlinear modelling in practice therefore means modelling of phenomena in which independent variables can have nonlinear effects on the response.

Rasch model

The Rasch model, named after Georg Rasch, is a psychometric model for analyzing categorical data, such as answers to questions on a reading assessment or questionnaire responses, as a function of the trade-off between the respondent's abilities, attitudes, or personality traits, and the item difficulty.

Exponential dispersion model

In probability and statistics, the class of exponential dispersion models (EDM) is a set of probability distributions that represents a generalisation of the natural exponential family. Exponential dispersion models play an important role in generalized linear models.

Statistical model specification

In statistics, model specification is part of the process of building a statistical model: specification consists of selecting an appropriate functional form for the model and choosing which variables to include.

Parametric model

In statistics, a parametric model or parametric family or finite-dimensional model is a particular class of statistical models. Specifically, a parametric model is a family of probability distributions that has a finite number of parameters.

Marginal structural model

Marginal structural models are a class of statistical models used for causal inference in epidemiology. Such models handle the issue of time-dependent confounding in the evaluation of the efficacy of interventions.

Phenomenological model

A phenomenological model is a scientific model that describes the empirical relationship of phenomena to each other, in a way which is consistent with fundamental theory, but is not directly derived from theory.

Statistical model validation

In statistics, model validation is the task of evaluating whether a chosen statistical model is appropriate or not. Oftentimes in statistical inference, inferences from models that appear to fit their data may be invalid.

Hurdle model

A hurdle model is a class of statistical models in which a random variable is modelled in two parts: the first part models the probability of attaining the value 0, and the second part models the probability of the non-zero values.
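
A minimal numeric sketch, assuming a Bernoulli hurdle combined with a zero-truncated Poisson for the positive part (one common choice; the parameter values are illustrative):

```python
import math

def hurdle_pmf(k, pi0, lam):
    # P(Y = 0) = pi0 (the hurdle); for k > 0, a Poisson(lam) pmf
    # truncated at zero and rescaled by (1 - pi0) so the whole
    # distribution sums to 1.
    if k == 0:
        return pi0
    poisson_k = math.exp(-lam) * lam ** k / math.factorial(k)
    return (1 - pi0) * poisson_k / (1 - math.exp(-lam))

# The probabilities sum to 1 (checked here over a long prefix):
total = sum(hurdle_pmf(k, pi0=0.4, lam=2.0) for k in range(50))
print(round(total, 6))  # 1.0
```

Unlike a zero-inflated model, all zeros come from the hurdle part, which keeps the two components separately estimable.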

Statistical Modelling Society

The Statistical Modelling Society (SMS) is an international society of statisticians, which, according to its statutes, will promote statistical modelling as the general framework for the application of statistical ideas.

Infinitesimal model

The infinitesimal model, also known as the polygenic model, is a widely used statistical model in quantitative genetics. Originally developed in 1918 by Ronald Fisher, it is based on the idea that variation in a quantitative trait is influenced by an infinitely large number of genes, each of which makes an infinitesimally small contribution to the phenotype.