Mean integrated squared error

In statistics, the mean integrated squared error (MISE) is used in density estimation. The MISE of an estimate ƒn of an unknown probability density ƒ is given by MISE(ƒn) = E ∫ (ƒn(x) − ƒ(x))² dx, where ƒ is the unknown density, ƒn is its estimate based on a sample of n independent observations, and the expectation is taken over that sample.
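The definition can be approximated numerically. Below is a rough Monte Carlo sketch (not from the source): the true density ƒ(x) = 2x on [0, 1] and the histogram estimator are arbitrary illustrative choices, and the integral is replaced by a Riemann sum.

```python
import random

def ise(f_hat, f, grid):
    # integrated squared error, approximated by a Riemann sum on a uniform grid over [0, 1]
    dx = grid[1] - grid[0]
    return sum((f_hat(x) - f(x)) ** 2 for x in grid) * dx

def histogram_density(sample, bins=10):
    # simple histogram density estimate on [0, 1]
    n = len(sample)
    counts = [0] * bins
    for x in sample:
        counts[min(int(x * bins), bins - 1)] += 1
    return lambda x: counts[min(int(x * bins), bins - 1)] * bins / n

def mise_estimate(n=200, reps=100, bins=10, seed=0):
    # Monte Carlo approximation of MISE = E ∫ (f_n(x) - f(x))^2 dx:
    # average the ISE of the histogram estimate over many fresh samples
    rng = random.Random(seed)
    f = lambda x: 2 * x  # true density of X = sqrt(U), U ~ Uniform(0, 1)
    grid = [(i + 0.5) / 400 for i in range(400)]
    total = 0.0
    for _ in range(reps):
        sample = [rng.random() ** 0.5 for _ in range(n)]
        total += ise(histogram_density(sample, bins), f, grid)
    return total / reps
```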

Minimum mean square error

In statistics and signal processing, a minimum mean square error (MMSE) estimator is an estimation method which minimizes the mean square error (MSE), a common measure of estimator quality, of the fitted values of a dependent variable. In the Bayesian setting, the MMSE estimator of an unknown quantity is the posterior mean of that quantity given the observations.
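A minimal sketch of the classic Gaussian case: for Y = X + noise, with X and the noise independent zero-mean Gaussians, the MMSE estimator of X given Y = y is a linear shrinkage of y. Function names are illustrative.

```python
import random

def mmse_gaussian(y, var_x, var_n):
    # Posterior mean of X given Y = X + N, with X ~ N(0, var_x), N ~ N(0, var_n):
    # E[X | Y = y] = var_x / (var_x + var_n) * y  (the Wiener shrinkage gain)
    return var_x / (var_x + var_n) * y

def empirical_mse(estimator, var_x=1.0, var_n=1.0, reps=20000, seed=1):
    # Monte Carlo estimate of E[(estimator(Y) - X)^2] under the model above
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        x = rng.gauss(0.0, var_x ** 0.5)
        y = x + rng.gauss(0.0, var_n ** 0.5)
        total += (estimator(y) - x) ** 2
    return total / reps
```

With var_x = var_n = 1, the theoretical minimum MSE is var_x·var_n/(var_x + var_n) = 0.5, while using y directly gives MSE ≈ 1.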

Mean absolute error

In statistics, mean absolute error (MAE) is a measure of errors between paired observations expressing the same phenomenon. Examples of Y versus X include comparisons of predicted versus observed, subsequent time versus initial time, and one technique of measurement versus an alternative technique of measurement. The MAE is the average of the absolute differences |Yi − Xi| over all pairs.
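The computation is direct; a minimal sketch:

```python
def mean_absolute_error(y, x):
    # MAE = (1/n) * sum of |y_i - x_i| over paired observations
    if len(y) != len(x):
        raise ValueError("paired series must have equal length")
    return sum(abs(a - b) for a, b in zip(y, x)) / len(y)
```

For example, mean_absolute_error([2, 3, 5], [1, 3, 7]) gives (1 + 0 + 2) / 3 = 1.0.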

Nash–Sutcliffe model efficiency coefficient

The Nash–Sutcliffe model efficiency coefficient (NSE) is used to assess the predictive skill of hydrological models. It is defined as NSE = 1 − Σt (Qo,t − Qm,t)² / Σt (Qo,t − Q̄o)², where Q̄o is the mean of observed discharges, Qm,t is modeled discharge at time t, and Qo,t is observed discharge at time t.
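The formula translates directly to code; a minimal sketch:

```python
def nash_sutcliffe(observed, modeled):
    # NSE = 1 - sum((Qo_t - Qm_t)^2) / sum((Qo_t - mean(Qo))^2)
    # NSE = 1 means a perfect model; NSE = 0 means the model is no better
    # than predicting the observed mean at every time step
    mean_obs = sum(observed) / len(observed)
    num = sum((o - m) ** 2 for o, m in zip(observed, modeled))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - num / den
```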

Root-mean-square deviation

The root-mean-square deviation (RMSD) or root-mean-square error (RMSE) is a frequently used measure of the differences between values (sample or population values) predicted by a model or an estimator and the values observed. It is the square root of the mean of the squared differences between predicted and observed values.
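A minimal sketch of the computation:

```python
import math

def rmse(predicted, observed):
    # square root of the mean of the squared prediction-observation differences
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)
```

Unlike MAE, squaring before averaging weights large errors more heavily, so RMSE ≥ MAE for any paired series.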

Mean squared prediction error

In statistics, the mean squared prediction error (MSPE) or mean squared error of the predictions of a smoothing or curve fitting procedure is the expected value of the squared difference between the fitted values implied by the predictive function and the values of the (unobservable) true underlying function.
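A rough Monte Carlo sketch, not from the source: the moving-average smoother and the true function g(x) = x(1 − x) are arbitrary illustrative choices; the MSPE is approximated by averaging the squared fit-versus-truth gap over repeated noisy samples.

```python
import random

def running_mean_smoother(ys, window=5):
    # simple symmetric moving-average smoother (a basic curve-fitting procedure)
    half = window // 2
    out = []
    for i in range(len(ys)):
        lo, hi = max(0, i - half), min(len(ys), i + half + 1)
        out.append(sum(ys[lo:hi]) / (hi - lo))
    return out

def estimated_mspe(n=101, reps=200, noise_sd=0.3, seed=2):
    # Monte Carlo estimate of E[(g_hat(x_i) - g(x_i))^2], averaged over a grid,
    # for the smoother above applied to noisy samples of the (assumed) truth g
    rng = random.Random(seed)
    xs = [i / (n - 1) for i in range(n)]
    g = [x * (1 - x) for x in xs]  # true underlying function (illustrative)
    total = 0.0
    for _ in range(reps):
        ys = [gi + rng.gauss(0.0, noise_sd) for gi in g]
        fit = running_mean_smoother(ys)
        total += sum((f - t) ** 2 for f, t in zip(fit, g)) / n
    return total / reps
```

Because the smoother averages roughly 5 noisy points, the estimated MSPE comes out well below the raw noise variance of 0.09.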

Mean absolute scaled error

In statistics, the mean absolute scaled error (MASE) is a measure of the accuracy of forecasts. It is the mean absolute error of the forecast values, divided by the mean absolute error of the in-sample one-step naive forecast.
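A minimal sketch of that ratio (argument names are illustrative):

```python
def mase(forecasts, actuals, train):
    # MAE of the forecasts, divided by the MAE of the in-sample one-step
    # naive forecast (predicting each training value by its predecessor)
    mae_forecast = sum(abs(f - a) for f, a in zip(forecasts, actuals)) / len(actuals)
    naive_mae = sum(abs(train[t] - train[t - 1]) for t in range(1, len(train))) / (len(train) - 1)
    return mae_forecast / naive_mae
```

Values below 1 mean the forecast beats the naive in-sample benchmark on average.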

Mean squared error

In statistics, the mean squared error (MSE) or mean squared deviation (MSD) of an estimator (of a procedure for estimating an unobserved quantity) measures the average of the squares of the errors, that is, the average squared difference between the estimated values and the true value.
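A minimal sketch, including the standard decomposition MSE = variance + bias²:

```python
def mse(estimates, true_value):
    # average of the squared errors between estimates and the true value
    return sum((e - true_value) ** 2 for e in estimates) / len(estimates)

def variance_plus_squared_bias(estimates, true_value):
    # MSE decomposes as the variance of the estimates plus the squared bias
    n = len(estimates)
    mean_e = sum(estimates) / n
    variance = sum((e - mean_e) ** 2 for e in estimates) / n
    bias = mean_e - true_value
    return variance + bias ** 2
```

Both routes give the same number, which is the usual way the bias-variance tradeoff is made concrete.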

Stein's unbiased risk estimate

In statistics, Stein's unbiased risk estimate (SURE) is an unbiased estimator of the mean-squared error of "a nearly arbitrary, nonlinear biased estimator." In other words, it provides an indication of the accuracy of a given estimator, which is useful because the true mean-squared error depends on the unknown parameter being estimated and so cannot be computed directly.
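A sketch under stated assumptions: for y ~ N(mu, sigma²·I) and the simple linear shrinkage estimator mu_hat = a·y (chosen purely for illustration), SURE has a closed form, and averaging it over repeated draws recovers the true risk without ever seeing mu.

```python
import random

def sure_linear_shrinkage(y, a, sigma2):
    # SURE for mu_hat = a*y with y ~ N(mu, sigma2*I):
    #   SURE = -n*sigma2 + ||mu_hat - y||^2 + 2*sigma2 * sum_i d(mu_hat_i)/dy_i,
    # and for mu_hat_i = a*y_i the divergence term is simply n*a
    n = len(y)
    residual = sum(((a - 1.0) * yi) ** 2 for yi in y)
    return -n * sigma2 + residual + 2.0 * sigma2 * n * a

def true_risk_linear_shrinkage(mu, a, sigma2):
    # exact risk E||a*y - mu||^2 = a^2 * n * sigma2 + (1 - a)^2 * ||mu||^2
    return a * a * len(mu) * sigma2 + (1.0 - a) ** 2 * sum(m * m for m in mu)

def mean_sure(mu, a, sigma2, reps=20000, seed=3):
    # Monte Carlo average of SURE over draws of y; unbiasedness means this
    # tracks the true risk even though SURE itself never looks at mu
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        y = [m + rng.gauss(0.0, sigma2 ** 0.5) for m in mu]
        total += sure_linear_shrinkage(y, a, sigma2)
    return total / reps
```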

Pitman closeness criterion

In statistical theory, the Pitman closeness criterion, named after E. J. G. Pitman, is a way of comparing two candidate estimators for the same parameter. Under this criterion, estimator A is preferred to estimator B if A produces estimates that are closer to the true parameter value more than half the time, i.e. if P(|A − θ| < |B − θ|) > 1/2.
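The probability in the criterion is easy to estimate by simulation; a sketch (the mean-versus-median comparison on normal samples is an illustrative choice, not from the source):

```python
import random
import statistics

def pitman_closeness(est_a, est_b, draw_sample, theta, reps=5000, seed=4):
    # Monte Carlo estimate of P(|A - theta| < |B - theta|); A is preferred to B
    # under the Pitman criterion when this probability exceeds 1/2
    rng = random.Random(seed)
    wins = 0
    for _ in range(reps):
        s = draw_sample(rng)
        if abs(est_a(s) - theta) < abs(est_b(s) - theta):
            wins += 1
    return wins / reps
```

For samples from N(0, 1), the sample mean lands closer to 0 than the sample median well over half the time.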

Bias of an estimator

In statistics, the bias of an estimator (or bias function) is the difference between this estimator's expected value and the true value of the parameter being estimated. An estimator or decision rule with zero bias is called unbiased.
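A sketch of a classic example: the maximum-likelihood variance estimator (dividing by n rather than n − 1) has expected value (n − 1)/n · σ², so for n = 5 and σ² = 1 its bias is −0.2, which Monte Carlo recovers.

```python
import random

def var_mle(xs):
    # sample variance dividing by n: biased downward, E[var_mle] = (n-1)/n * sigma^2
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def estimated_bias(estimator, true_value, draw_sample, reps=20000, seed=5):
    # bias = E[estimator] - true_value, approximated by Monte Carlo
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        total += estimator(draw_sample(rng))
    return total / reps - true_value
```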

Least absolute deviations

Least absolute deviations (LAD), also known as least absolute errors (LAE), least absolute residuals (LAR), or least absolute values (LAV), is a statistical optimality criterion and a statistical optimization technique based on minimizing the sum of absolute deviations (the L1 norm of the residuals).
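In the simplest case, estimating a single location parameter, the LAD criterion is minimized by the sample median; a minimal sketch:

```python
import statistics

def sum_abs_dev(xs, c):
    # the LAD objective for a location parameter c: sum of |x_i - c|
    return sum(abs(x - c) for x in xs)

def lad_location(xs):
    # the sample median minimizes the sum of absolute deviations
    return statistics.median(xs)
```

This is the L1 analogue of least squares, where the same role is played by the sample mean; the median's insensitivity to the outlier 10 below is the usual robustness argument for LAD.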

© 2023 Useful Links.