Category: Probability assessment

Coherence (statistics)
In probability theory and statistics, coherence can have several different meanings. Coherence in statistics is an indication of the quality of the information, either within a single data set or between similar sets of data.
Risk perception
Risk perception is the subjective judgement that people make about the characteristics and severity of a risk. Risk perceptions often differ from the actual risks, since they are affected by a wide range of affective, cognitive, contextual, and individual factors.
Formal epistemology
Formal epistemology uses formal methods from decision theory, logic, probability theory and computability theory to model and reason about issues of epistemological interest. Work in this area spans several academic fields, including philosophy, computer science, economics, and statistics.
Proper score function
A proper score function (or proper scoring rule) is a scoring rule whose expected value is optimized when the quoted probabilities equal the true probabilities; a strictly proper rule is uniquely optimized there.
Extreme risk
Extreme risks are risks of very bad outcomes or "high consequence", but of low probability. They include the risks of terrorist attack, biosecurity risks such as the invasion of pests, and extreme natural disasters.
Percentage point
A percentage point or percent point is the unit for the arithmetic difference between two percentages. For example, moving up from 40 percent to 44 percent is an increase of 4 percentage points, but a 10-percent increase in the quantity being measured.
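The distinction can be made concrete with a quick calculation, using the 40-to-44 figures from the example above:

```python
# Moving from 40 percent to 44 percent, expressed two ways.
old, new = 40, 44  # values in percent

percentage_point_change = new - old       # arithmetic difference
percent_change = (new - old) / old * 100  # relative change

print(percentage_point_change)  # 4 (percentage points)
print(percent_change)           # 10.0 (percent)
```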
Absolute probability judgement
Absolute probability judgement is a technique used in the field of human reliability assessment (HRA), for the purposes of evaluating the probability of a human error occurring throughout the completion of a specific task.
Rule of succession
In probability theory, the rule of succession is a formula introduced in the 18th century by Pierre-Simon Laplace in the course of treating the sunrise problem. The formula is still used, particularly to estimate underlying probabilities when there are few observations or when events have not been observed to occur at all in sample data.
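As a sketch, the rule estimates the probability of success on the next trial as (s + 1) / (n + 2) after observing s successes in n trials:

```python
def rule_of_succession(successes, trials):
    """Laplace's rule of succession: P(success on next trial) = (s + 1) / (n + 2)."""
    return (successes + 1) / (trials + 2)

# Sunrise problem: after 10 observed sunrises with no failures,
# the estimate is 11/12 rather than a certain 1.
print(rule_of_succession(10, 10))  # 11/12, about 0.917
```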
Prior probability
In Bayesian statistical inference, a prior probability distribution, often simply called the prior, of an uncertain quantity is the probability distribution that would express one's beliefs about this quantity before some evidence is taken into account.
Principle of maximum caliber
The principle of maximum caliber (MaxCal) or maximum path entropy principle, suggested by E. T. Jaynes, can be considered as a generalization of the principle of maximum entropy. It postulates that the most unbiased probability distribution of paths is the one that maximizes their Shannon entropy.
Brier score
The Brier score is a strictly proper score function or strictly proper scoring rule that measures the accuracy of probabilistic predictions. For unidimensional predictions, it is strictly equivalent to the mean squared error as applied to predicted probabilities.
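For binary outcomes, the Brier score is the mean squared difference between forecast probabilities and what actually happened; a minimal sketch with illustrative numbers:

```python
def brier_score(probabilities, outcomes):
    """Mean squared difference between forecast probabilities and 0/1 outcomes (lower is better)."""
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / len(outcomes)

forecasts = [0.9, 0.8, 0.3]  # illustrative forecast probabilities
happened = [1, 1, 0]         # 1 = event occurred, 0 = it did not
print(brier_score(forecasts, happened))  # about 0.047: sharp, accurate forecasts
```

A perfect forecaster who assigns probability 1 to every event that occurs scores exactly 0.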
Risk assessment
Broadly speaking, a risk assessment is the combined effort of (1) identifying and analyzing potential (future) events that may negatively impact individuals, assets, and/or the environment (i.e. hazard analysis), and (2) making judgments about the tolerability of the risk on the basis of a risk analysis while considering influencing factors (i.e. risk evaluation).
Threat assessment
Threat assessment is the practice of determining the credibility and seriousness of a potential threat, as well as the probability that the threat will become a reality. Threat assessment is separate from the more established practice of violence-risk assessment.
Calibrated probability assessment
Calibrated probability assessments are subjective probabilities assigned by individuals who have been trained to assess probabilities in a way that historically represents their uncertainty. For example, when a calibrated assessor says they are "80% confident" in each of 100 predictions, about 80 of those predictions should turn out to be correct.
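A calibration check groups predictions by their stated confidence and compares each group's observed hit rate against that confidence. A minimal sketch, using made-up prediction data for illustration:

```python
from collections import defaultdict

# Illustrative (made-up) predictions: (stated confidence, came_true) pairs.
predictions = [
    (0.8, True), (0.8, True), (0.8, False), (0.8, True), (0.8, True),
    (0.6, True), (0.6, False), (0.6, True), (0.6, False), (0.6, True),
]

by_confidence = defaultdict(list)
for stated, came_true in predictions:
    by_confidence[stated].append(came_true)

# A well-calibrated assessor's observed hit rate matches the stated confidence.
for stated, results in sorted(by_confidence.items()):
    observed = sum(results) / len(results)
    print(f"stated {stated:.0%} -> observed {observed:.0%}")
```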
Principle of maximum entropy
The principle of maximum entropy states that the probability distribution which best represents the current state of knowledge about a system is the one with largest entropy, in the context of precisely stated prior data (such as a proposition that expresses testable information).
Knowledge crystal
Knowledge crystals are web-based information objects that are used in scientific information production. In particular, they are used in open assessments designed to support societal decisions.
Open assessment
Open assessment is a method for making impact assessments where anyone can participate and contribute. Most open assessments have been made in Opasnet, which is a wiki-based web-workspace specifically designed for this purpose.
Generalised likelihood uncertainty estimation
Generalized likelihood uncertainty estimation (GLUE) is a statistical method used in hydrology for quantifying the uncertainty of model predictions. The method was introduced by Keith Beven and Andrew Binley in 1992.
Scoring rule
In decision theory, a scoring rule provides a summary measure for the evaluation of probabilistic predictions or forecasts. It is applicable to tasks in which predictions assign probabilities to a set of mutually exclusive outcomes.
Quantile regression averaging
Quantile Regression Averaging (QRA) is a forecast combination approach to the computation of prediction intervals. It involves applying quantile regression to the point forecasts of a small number of individual forecasting models or experts.
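QRA's underlying tool, quantile regression, fits each quantile by minimizing the average pinball (check) loss. The sketch below shows that loss with illustrative numbers; it is not the full QRA procedure:

```python
def pinball_loss(q, actual, predicted):
    """Pinball (check) loss at quantile level q; quantile regression minimizes its average."""
    diff = actual - predicted
    return max(q * diff, (q - 1) * diff)

# At the 0.75 quantile, under-prediction is penalized three times as much
# as over-prediction of the same size, pushing the fit toward an upper bound.
print(pinball_loss(0.75, 10.0, 8.0))   # 1.5
print(pinball_loss(0.75, 10.0, 12.0))  # 0.5
```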
Probabilistic forecasting
Probabilistic forecasting summarizes what is known about, or opinions about, future events. In contrast to single-valued forecasts (such as forecasting that the maximum temperature at a given site on a given day will be 23 °C), probabilistic forecasts assign a probability to each of a number of different outcomes.
Micromort
A micromort (from micro- and mortality) is a unit of risk defined as a one-in-a-million chance of death. Micromorts can be used to measure the riskiness of various day-to-day activities. A microprobability is a one-in-a-million chance of some event; thus, a micromort is the microprobability of death.
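The unit makes small risks easy to compare with simple arithmetic. The figures below are hypothetical, for illustration only:

```python
def micromorts(deaths, exposures):
    """Express a per-exposure death risk in micromorts (1 micromort = 1e-6 chance of death)."""
    return deaths * 1_000_000 / exposures

# Hypothetical activity with 1 death per 200,000 participations:
print(micromorts(1, 200_000))  # 5.0 micromorts per participation
```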
A priori probability
An a priori probability is a probability that is derived purely by deductive reasoning. One way of deriving a priori probabilities is the principle of indifference, which has the character of saying that, if there are N mutually exclusive and exhaustive events that are equally likely, then the probability of a given event occurring is 1/N.
Opasnet
Opasnet is a web-workspace for making open assessments, which are impact assessments where anyone can freely participate and contribute. Opasnet is a wiki website built on the MediaWiki platform.
Probabilistic risk assessment
Probabilistic risk assessment (PRA) is a systematic and comprehensive methodology to evaluate risks associated with a complex engineered technological entity (such as an airliner or a nuclear power plant).
Death Risk Rankings
Death Risk Rankings was a website that approximated the likelihood of a European or American person dying within a twelve-month span, using public data for its calculations.
Good–Turing frequency estimation
Good–Turing frequency estimation is a statistical technique for estimating the probability of encountering an object of a hitherto unseen species, given a set of past observations of objects from different species.
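The simplest Good–Turing estimate assigns total probability N1/N to all unseen species, where N1 is the number of species observed exactly once and N is the sample size. A minimal sketch:

```python
from collections import Counter

def unseen_probability_mass(sample):
    """Simple Good-Turing estimate of total unseen-species probability: N1 / N."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)  # species seen exactly once
    return n1 / len(sample)

# 'c' and 'd' each appear once in the 11-letter sample, so the estimate is 2/11.
print(unseen_probability_mass("abracadabra"))  # about 0.182
```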
Quantitative risk assessment software
Quantitative risk assessment (QRA) software and methodologies give quantitative estimates of risks, given the parameters defining them. They are used in the financial sector, the chemical process industry, and other fields.
Calculus of predispositions
Calculus of predispositions is a basic part of predispositioning theory and belongs to the indeterministic procedures.