Category: Statistical analysis

InfoQ
Information quality (InfoQ) is the potential of a data set to achieve a specific (scientific or practical) goal using a given empirical analysis method.
Decision curve analysis
Decision curve analysis evaluates a predictor for an event as a probability threshold is varied, typically by showing a graphical plot of net benefit against threshold probability.
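At a given threshold p_t, net benefit weighs true positives against false positives using the odds p_t/(1 − p_t); tracing this over a grid of thresholds gives the decision curve. A minimal sketch with hypothetical risk scores and outcomes:

```python
import numpy as np

def net_benefit(y_true, y_prob, threshold):
    """Net benefit of treating patients whose predicted risk >= threshold:
    TP/n - FP/n * threshold / (1 - threshold)."""
    y_true = np.asarray(y_true)
    treated = np.asarray(y_prob) >= threshold
    n = len(y_true)
    tp = np.sum(treated & (y_true == 1))
    fp = np.sum(treated & (y_true == 0))
    return tp / n - fp / n * threshold / (1 - threshold)

# hypothetical predictor: risk scores for 8 patients, 1 = event occurred
y = [1, 1, 1, 0, 0, 0, 0, 1]
p = [0.9, 0.8, 0.6, 0.4, 0.3, 0.2, 0.1, 0.7]

# evaluate at two example thresholds; a full curve would use a fine grid
curve = {t: net_benefit(y, p, t) for t in (0.2, 0.5)}
```

In practice the curve is compared against the "treat all" and "treat none" strategies to judge whether the predictor adds value.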
Weighting
The process of weighting involves emphasizing the contribution of particular aspects of a phenomenon (or of a set of data) over others to an outcome or result, thereby highlighting those aspects in comparison to others in the analysis.
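As a concrete illustration, a weighted mean emphasizes some observations over others; the weights below are hypothetical:

```python
def weighted_mean(values, weights):
    """Weighted arithmetic mean: sum(w_i * x_i) / sum(w_i)."""
    total_weight = sum(weights)
    return sum(w * x for w, x in zip(weights, values)) / total_weight

# emphasize recent observations over older ones
values = [10.0, 12.0, 14.0]
weights = [1.0, 2.0, 3.0]  # most recent observation counts three times as much

weighted_mean(values, weights)  # = 76/6, pulled toward the heavily weighted 14.0
```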
Election forensics
Election forensics are methods used to determine whether election results are statistically normal or statistically abnormal, which can indicate electoral fraud.
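Digit-distribution tests are one commonly used family of tools in this area. A minimal sketch comparing observed leading-digit frequencies of hypothetical vote counts with Benford's law:

```python
import math
from collections import Counter

def first_digit_frequencies(counts):
    """Empirical frequency of leading digits 1-9 in a list of vote counts."""
    digits = [int(str(abs(c))[0]) for c in counts if c > 0]
    freq = Counter(digits)
    n = len(digits)
    return {d: freq.get(d, 0) / n for d in range(1, 10)}

def benford_expected(d):
    """Benford's law: P(first digit = d) = log10(1 + 1/d)."""
    return math.log10(1 + 1 / d)

def benford_distance(counts):
    """Chi-square-style distance between observed and Benford frequencies;
    large values flag counts worth closer scrutiny, not proof of fraud."""
    obs = first_digit_frequencies(counts)
    return sum((obs[d] - benford_expected(d)) ** 2 / benford_expected(d)
               for d in range(1, 10))
```

Real forensic analyses use formal significance tests (and often second-digit variants) rather than a raw distance, since deviations can have innocent explanations.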
Natural time analysis
Natural time analysis is a statistical method applied to analyze complex time series and critical phenomena, based on event counts as a measure of "time" rather than clock time.
Scorigami
In sports, a scorigami (a portmanteau of score and origami) is a scoring combination that has never happened before in a sport or league's history. The term was coined by sportswriter Jon Bois.
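Detecting a scorigami amounts to checking a new final score against the set of all previously seen score combinations. A minimal sketch with a hypothetical score history:

```python
def is_scorigami(score, history):
    """A final score is a scorigami if its combination (order-insensitive,
    since 17-20 and 20-17 are the same combination) never occurred before."""
    seen = {tuple(sorted(s, reverse=True)) for s in history}
    return tuple(sorted(score, reverse=True)) not in seen

# hypothetical past final scores
history = [(20, 17), (31, 28), (17, 20)]

is_scorigami((20, 17), history)  # False: combination already seen
is_scorigami((73, 0), history)   # True: never happened in this toy history
```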
Sports analytics
Sports analytics is the collection and analysis of relevant historical statistics that can provide a competitive advantage to a team or individual.
Functional data analysis
Functional data analysis (FDA) is a branch of statistics that analyses data providing information about curves, surfaces or anything else varying over a continuum.
Z-factor
The Z-factor is a measure of statistical effect size proposed for use in high-throughput screening, where it is also known as Z-prime and commonly written Z′, to judge whether an assay's response is large enough to warrant further attention.
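The Z-factor is computed from the means and standard deviations of positive and negative controls as Z′ = 1 − 3(σ_p + σ_n)/|μ_p − μ_n|. A minimal sketch with hypothetical control readings:

```python
import statistics

def z_factor(positive, negative):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg| (Zhang et al., 1999)."""
    mu_p, mu_n = statistics.mean(positive), statistics.mean(negative)
    sd_p, sd_n = statistics.stdev(positive), statistics.stdev(negative)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

pos = [100, 102, 98, 101, 99]  # positive-control readings (hypothetical)
neg = [10, 11, 9, 10, 10]      # negative-control readings (hypothetical)

z_factor(pos, neg)  # well-separated controls give a value near 1
```

Values between 0.5 and 1 are conventionally taken to indicate an excellent assay.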
Subgroup analysis
Subgroup analysis refers to repeating the analysis of a study within subgroups of subjects defined by a subgrouping variable (e.g. smoking status defining two subgroups: smokers and non-smokers).
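A minimal sketch of the idea, repeating the same analysis (here just a mean outcome, on hypothetical data) within each subgroup:

```python
from collections import defaultdict
from statistics import mean

# hypothetical study: (smoking_status, outcome_measurement)
subjects = [
    ("smoker", 5.1), ("smoker", 4.8), ("smoker", 5.5),
    ("non-smoker", 3.9), ("non-smoker", 4.2), ("non-smoker", 4.0),
]

# partition subjects by the subgrouping variable
groups = defaultdict(list)
for status, outcome in subjects:
    groups[status].append(outcome)

# repeat the analysis of interest within each subgroup
subgroup_means = {status: mean(values) for status, values in groups.items()}
```

Real subgroup analyses must also guard against multiple-comparison problems, since repeating a test across many subgroups inflates the chance of spurious findings.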
Guided analytics
Guided analytics is a sub-field at the interface of visual analytics and predictive analytics focused on the development of interactive visual interfaces for business intelligence applications. Such i
Clustered standard errors
Clustered standard errors (or Liang-Zeger standard errors) are measurements that estimate the standard error of a regression parameter in settings where observations may be subdivided into smaller-sized groups ("clusters").
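The Liang-Zeger estimator sandwiches a cluster-summed score term between (X′X)⁻¹ "bread" matrices. A minimal NumPy sketch (omitting the finite-sample corrections that production implementations apply):

```python
import numpy as np

def clustered_se(X, y, clusters):
    """Cluster-robust (Liang-Zeger) standard errors for OLS coefficients:
    V = (X'X)^-1 (sum_g X_g' u_g u_g' X_g) (X'X)^-1, summing over clusters g.
    No degrees-of-freedom correction is applied here, for brevity."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    clusters = np.asarray(clusters)
    beta = np.linalg.solve(X.T @ X, X.T @ y)   # OLS fit
    resid = y - X @ beta
    bread = np.linalg.inv(X.T @ X)
    meat = np.zeros((X.shape[1], X.shape[1]))
    for g in np.unique(clusters):
        Xg = X[clusters == g]                  # rows in cluster g
        ug = resid[clusters == g]
        score = Xg.T @ ug                      # within-cluster score vector
        meat += np.outer(score, score)
    V = bread @ meat @ bread
    return beta, np.sqrt(np.diag(V))
```

Libraries such as statsmodels expose the same estimator (with corrections) through their robust-covariance options.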
Analytics (ice hockey)
In ice hockey, analytics is the analysis of the characteristics of hockey players and teams through the use of statistics and other tools to gain a greater understanding of the effects of their performance.
Boolean analysis
Boolean analysis was introduced by Flament (1976). The goal of a Boolean analysis is to detect deterministic dependencies between the items of a questionnaire or similar data structures in observed response patterns.
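A deterministic dependency of the simplest kind is an implication i → j: every respondent who solves item i also solves item j. A minimal sketch over hypothetical 0/1 response patterns:

```python
def implications(responses):
    """Find deterministic dependencies i -> j: every respondent who solves
    item i also solves item j, across all observed response patterns."""
    n_items = len(responses[0])
    found = []
    for i in range(n_items):
        for j in range(n_items):
            if i != j and all(row[j] == 1 for row in responses if row[i] == 1):
                found.append((i, j))
    return found

# rows: respondents; columns: items (1 = solved). Here item 0 implies item 1.
data = [
    [1, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
    [0, 0, 1],
]
```

Full Boolean analysis handles sampling noise and more general logical structures than single implications, but the exhaustive pairwise check above is the core idea.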
Probabilistic genotyping
Probabilistic genotyping is the use of statistical methods and mathematical algorithms in DNA profiling. It may be used instead of manual methods in difficult situations, such as when a DNA sample is degraded or contains DNA from more than one person.
Predictive analytics
Predictive analytics encompasses a variety of statistical techniques from data mining, predictive modeling, and machine learning that analyze current and historical facts to make predictions about future or otherwise unknown events.
Proxy (statistics)
In statistics, a proxy or proxy variable is a variable that is not in itself directly relevant, but that serves in place of an unobservable or immeasurable variable. In order for a variable to be a good proxy, it must have a close correlation, not necessarily linear, with the variable of interest.
Probit
In probability theory and statistics, the probit function is the quantile function associated with the standard normal distribution. It has applications in data analysis and machine learning.
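Since the probit is simply the inverse CDF of the standard normal distribution, it is available directly in Python's standard library:

```python
from statistics import NormalDist

def probit(p):
    """Quantile function (inverse CDF) of the standard normal distribution."""
    return NormalDist().inv_cdf(p)

probit(0.5)    # → 0.0, the median of the standard normal
probit(0.975)  # ≈ 1.96, the familiar two-sided 95% critical value
```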
Structured data analysis (statistics)
Structured data analysis is the statistical data analysis of structured data. This can arise in the form of an a priori structure, such as multiple-choice questionnaires.
Fenwick (statistic)
Fenwick is an advanced statistic used in the National Hockey League to measure shot attempt differential while playing at even strength. It is also known as unblocked shot attempts (USAT) by the NHL.
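The arithmetic is simple: Fenwick counts shots on goal plus missed shots (excluding blocked shots, which the related Corsi statistic includes), and the differential subtracts the opponent's total. A minimal sketch with hypothetical even-strength totals:

```python
def fenwick(shots_on_goal, missed_shots):
    """Fenwick (unblocked shot attempts): shots on goal plus missed shots;
    blocked shots are excluded (the related Corsi statistic includes them)."""
    return shots_on_goal + missed_shots

def fenwick_differential(team_for, team_against):
    """Even-strength shot-attempt differential: Fenwick for minus against."""
    return fenwick(*team_for) - fenwick(*team_against)

# hypothetical even-strength totals: (shots on goal, missed shots)
fenwick_differential((30, 12), (25, 10))  # → 42 - 35 = 7
```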
History index model
In statistical analysis, the standard framework of varying coefficient models (also known as concurrent regression models) models the current value of a response process in dependence on the current value of a predictor process. The history index model extends this framework by allowing the response to depend on the recent history of the predictor process through a weighted history index.
Homogeneity and heterogeneity (statistics)
In statistics, homogeneity and its opposite, heterogeneity, arise in describing the properties of a dataset, or several datasets. They relate to the validity of the often convenient assumption that the statistical properties of any one part of an overall dataset are the same as those of any other part.