ID3 algorithm

In decision tree learning, ID3 (Iterative Dichotomiser 3) is an algorithm invented by Ross Quinlan used to generate a decision tree from a dataset. ID3 is the precursor to the C4.5 algorithm, and is typically used in the machine learning and natural language processing domains.
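
As a rough illustration of the idea rather than Quinlan's exact procedure, the core ID3 loop (choose the attribute with the highest information gain, split on it, and recurse) can be sketched in Python; the toy weather rows and attribute names below are invented for the example:

```python
from collections import Counter
from math import log2

def entropy(rows, target):
    """Shannon entropy (in bits) of the target column over the given rows."""
    n = len(rows)
    counts = Counter(r[target] for r in rows)
    return -sum(c / n * log2(c / n) for c in counts.values())

def id3(rows, attributes, target):
    labels = [r[target] for r in rows]
    if len(set(labels)) == 1:                  # pure node: emit the class
        return labels[0]
    if not attributes:                         # nothing left: majority class
        return Counter(labels).most_common(1)[0][0]

    def gain(attr):                            # information gain of splitting on attr
        n = len(rows)
        parts = {}
        for r in rows:
            parts.setdefault(r[attr], []).append(r)
        return entropy(rows, target) - sum(
            len(p) / n * entropy(p, target) for p in parts.values())

    best = max(attributes, key=gain)           # highest-gain attribute
    rest = [a for a in attributes if a != best]
    return {best: {v: id3([r for r in rows if r[best] == v], rest, target)
                   for v in set(r[best] for r in rows)}}

# Invented toy data: "windy" perfectly predicts "play", "outlook" does not.
data = [
    {"outlook": "sunny", "windy": "no",  "play": "yes"},
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "rainy", "windy": "no",  "play": "yes"},
    {"outlook": "rainy", "windy": "yes", "play": "no"},
]
tree = id3(data, ["outlook", "windy"], "play")   # {'windy': {'no': 'yes', 'yes': 'no'}}
```

Real implementations also handle missing values, continuous attributes (added in C4.5), and stopping criteria.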

Information gain (decision tree)

In information theory and machine learning, information gain is a synonym for Kullback–Leibler divergence: the amount of information gained about a random variable or signal from observing another random variable. In decision tree learning, the term usually refers to the expected reduction in entropy achieved by splitting a dataset on an attribute.
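
Under that decision-tree reading, information gain is straightforward to compute; a minimal sketch in Python (the labels and split below are invented for the example):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(parent_labels, child_groups):
    """Parent entropy minus the size-weighted entropy of the child groups."""
    n = len(parent_labels)
    remainder = sum(len(g) / n * entropy(g) for g in child_groups)
    return entropy(parent_labels) - remainder

# A perfectly separating split recovers the full parent entropy of 1 bit:
parent = ["yes", "yes", "no", "no"]
gain = information_gain(parent, [["yes", "yes"], ["no", "no"]])  # 1.0
```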

Decision tree pruning

Pruning is a data compression technique in machine learning and search algorithms that reduces the size of decision trees by removing sections of the tree that are non-critical and redundant for classifying instances. Pruning reduces the complexity of the final classifier, and hence improves predictive accuracy by reducing overfitting.

Alternating decision tree

An alternating decision tree (ADTree) is a machine learning method for classification. It generalizes decision trees and has connections to boosting. An ADTree consists of an alternation of decision nodes, which specify a predicate condition, and prediction nodes, which contain a single number.

Information gain ratio

In decision tree learning, information gain ratio is the ratio of information gain to the intrinsic information. It was proposed by Ross Quinlan to reduce the bias towards multi-valued attributes by taking the number and size of branches into account when choosing an attribute.
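
A minimal sketch of that correction, with an invented four-label example showing how the ratio penalizes a many-valued split that plain information gain would score just as highly:

```python
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for x in labels:
        counts[x] = counts.get(x, 0) + 1
    return -sum(c / n * log2(c / n) for c in counts.values())

def gain_ratio(labels, groups):
    """Information gain divided by the intrinsic (split) information."""
    n = len(labels)
    gain = entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)
    # Intrinsic information: entropy of the partition sizes themselves.
    split_info = -sum((len(g) / n) * log2(len(g) / n) for g in groups)
    return gain / split_info if split_info > 0 else 0.0

labels = ["yes", "yes", "no", "no"]
# Both splits have gain 1 bit, but the ratio penalizes the four-way split:
two_way = gain_ratio(labels, [["yes", "yes"], ["no", "no"]])       # 1.0
four_way = gain_ratio(labels, [["yes"], ["yes"], ["no"], ["no"]])  # 0.5
```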

Fast-and-frugal trees

In the study of decision-making, a fast-and-frugal tree is a simple graphical structure that categorizes objects by asking one question at a time. These decision trees are used in a range of fields, including psychology, artificial intelligence, and management science.

Random forest

A random forest (or random decision forest) is an ensemble learning method for classification, regression and other tasks that operates by constructing a multitude of decision trees at training time. For classification tasks, the output of the random forest is the class selected by most trees; for regression tasks, the mean prediction of the individual trees is returned.
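
The bagging-plus-voting mechanics can be sketched with one-level trees standing in for full decision trees; the data, feature names, and helper functions below are invented for the illustration:

```python
import random
from collections import Counter

def train_stump(rows, features):
    """Best single equality test over the given features, majority-class leaves."""
    best = None
    for f in features:
        for v in set(r[f] for r in rows):
            left = [r["label"] for r in rows if r[f] == v]
            right = [r["label"] for r in rows if r[f] != v]
            pl = Counter(left).most_common(1)[0][0]
            pr = Counter(right).most_common(1)[0][0] if right else pl
            errs = sum(l != pl for l in left) + sum(l != pr for l in right)
            if best is None or errs < best[0]:
                best = (errs, f, v, pl, pr)
    _, f, v, pl, pr = best
    return lambda r: pl if r[f] == v else pr

def random_forest(rows, features, n_trees=25, seed=0):
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        sample = [rng.choice(rows) for _ in rows]                  # bootstrap replicate
        subset = rng.sample(features, max(1, len(features) // 2))  # random feature subset
        trees.append(train_stump(sample, subset))
    # Classification output: the class selected by most trees.
    return lambda row: Counter(t(row) for t in trees).most_common(1)[0][0]

# Invented data: either feature alone separates the two classes.
rows = ([{"color": "red",  "size": "big",   "label": "spam"}] * 4 +
        [{"color": "blue", "size": "small", "label": "ham"}] * 4)
predict = random_forest(rows, ["color", "size"])
```

Per-tree bootstrap sampling and random feature subsets are what decorrelate the trees so the majority vote outperforms any single one.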

Gradient boosting

Gradient boosting is a machine learning technique used in regression and classification tasks, among others. It gives a prediction model in the form of an ensemble of weak prediction models, which are typically decision trees. When a decision tree is the weak learner, the resulting algorithm is called gradient-boosted trees.
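
For squared-error regression, the idea reduces to repeatedly fitting a small tree to the current residuals (the negative gradient) and adding a learning-rate-scaled copy of it to the ensemble. A minimal sketch with one-split regression stumps on invented 1-D data:

```python
def fit_stump(xs, residuals):
    """One-split regression stump on 1-D inputs: threshold plus two leaf means."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - ml) ** 2 for r in left) +
               sum((r - mr) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def gradient_boost(xs, ys, n_rounds=50, lr=0.1):
    """Each round fits a stump to the residuals and adds a damped copy."""
    base = sum(ys) / len(ys)                   # initial constant prediction
    pred = [base] * len(ys)
    stumps = []
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        s = fit_stump(xs, residuals)
        stumps.append(s)
        pred = [p + lr * s(x) for x, p in zip(xs, pred)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

xs, ys = [1, 2, 3, 4], [1.0, 1.0, 3.0, 3.0]
model = gradient_boost(xs, ys)                 # model(1) is close to 1.0
```

The small learning rate trades more rounds for better generalization; here each round shrinks the training residuals by a factor of 0.9.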

Chi-square automatic interaction detection

Chi-square automatic interaction detection (CHAID) is a decision tree technique based on adjusted significance testing (Bonferroni correction, Holm–Bonferroni testing). The technique was published in 1980 by Gordon V. Kass.

Grafting (decision trees)

Grafting is the process of adding nodes to inferred decision trees to improve predictive accuracy. A decision tree is a graphical model that is used as a support tool for the decision process.

Logistic model tree

In computer science, a logistic model tree (LMT) is a classification model with an associated supervised training algorithm that combines logistic regression (LR) and decision tree learning. Logistic model trees are based on the idea of a model tree: a decision tree with regression functions at its leaves.

C4.5 algorithm

C4.5 is an algorithm developed by Ross Quinlan used to generate a decision tree. C4.5 is an extension of Quinlan's earlier ID3 algorithm. The decision trees generated by C4.5 can be used for classification, and for this reason C4.5 is often referred to as a statistical classifier.

Incremental decision tree

An incremental decision tree algorithm is an online machine learning algorithm that outputs a decision tree. Many decision tree methods, such as C4.5, construct a tree using a complete dataset. Incremental methods instead update an existing tree using only new data instances, without reprocessing past instances.

Decision tree

A decision tree is a decision support tool that uses a tree-like model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility. It is one way to display an algorithm that only contains conditional control statements.
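
Evaluating such a tree comes down to computing the expected utility at each chance node and picking the best branch at each decision node; a minimal sketch with invented probabilities and payoffs:

```python
def expected_value(outcomes):
    """Expected utility of a chance node: sum of probability * payoff."""
    return sum(p * v for p, v in outcomes)

# Invented numbers: a risky "launch" branch versus a safe "hold" branch.
launch = expected_value([(0.6, 100.0), (0.4, -50.0)])  # 0.6*100 - 0.4*50 = 40
hold = expected_value([(1.0, 20.0)])                   # 20
best = max([("launch", launch), ("hold", hold)], key=lambda t: t[1])
```

Deeper trees are evaluated the same way, folding expected values back from the leaves toward the root.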

Decision tree model

In computational complexity theory, the decision tree model is the model of computation in which an algorithm is considered to be basically a decision tree, i.e., a sequence of queries or tests that are done adaptively, so the outcome of previous tests can influence the tests performed next.

Decision stump

A decision stump is a machine learning model consisting of a one-level decision tree. That is, it is a decision tree with one internal node (the root) which is immediately connected to the terminal nodes (its leaves). A decision stump makes a prediction based on the value of just a single input feature.
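
A minimal sketch of fitting such a stump on a single numeric feature, choosing the threshold and orientation that misclassify the fewest training points (data invented for the example):

```python
def fit_stump(xs, ys):
    """One-level tree on one numeric feature: a threshold plus an orientation."""
    best = None
    for t in sorted(set(xs)):
        for sign in (1, -1):             # which side gets the +1 class
            preds = [sign if x <= t else -sign for x in xs]
            errors = sum(p != y for p, y in zip(preds, ys))
            if best is None or errors < best[0]:
                best = (errors, t, sign)
    _, t, sign = best
    return lambda x: sign if x <= t else -sign

xs = [1, 2, 3, 4, 5, 6]
ys = [1, 1, 1, -1, -1, -1]
stump = fit_stump(xs, ys)                # splits at x <= 3
```

Stumps like this are weak learners on their own but are the standard base classifier for boosting methods such as AdaBoost.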

Decision tree learning

Decision tree learning is a supervised learning approach used in statistics, data mining and machine learning. In this formalism, a classification or regression decision tree is used as a predictive model to draw conclusions about a set of observations.

© 2023 Useful Links.