Purpose – The purpose of this paper is to provide a classification confidence value for every individual sample classified by decision trees, and to use this value to combine the classifiers.

Design/methodology/approach – The proposed system is first explained theoretically, and then its use and effectiveness are demonstrated on sample datasets.

Findings – A novel method is proposed to combine decision tree classifiers using calculated classification confidence values. This classification confidence is based on the calculated distance to the relevant decision boundary, (distance-conditional) probability density estimation, and (distance-conditional) classification confidence estimation. It is shown that these values – provided by the individual classification trees – can be integrated to derive a consensus decision.

Research limitations/implications – The proposed method is not limited to axis-parallel trees: it is applicable not only to oblique trees but also to any kind of classifier system that uses hyperplanes to cluster the input space.

Originality/value – A novel method is presented to extend decision-tree-like classifiers with confidence calculation, and a voting system is proposed that uses this confidence information. The proposed system possesses several novelties (e.g. it gives not only class probabilities but also classification confidences) and advantages over previous (traditional) approaches. The voting system does not require an auxiliary combiner or gating network, as in the mixture-of-experts structure, and the method is not limited to decision trees with axis-parallel splits; it is applicable to any kind of classifier that uses hyperplanes to cluster the input space.
International Journal of Intelligent Computing and Cybernetics – Emerald Publishing
Published: Jun 6, 2008
Keywords: Decision trees; Classification schemes; Programming; Density measurement; Bayesian statistical decision theory; Confidence limits
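The confidence-weighted consensus described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the logistic mapping from boundary distance to confidence, the two example hyperplane splits, and all names here are assumptions for demonstration, not the paper's actual (density-based) confidence estimator.

```python
import math

def hyperplane_distance(x, w, b):
    """Signed Euclidean distance from sample x to the hyperplane w.x + b = 0."""
    dot = sum(wi * xi for wi, xi in zip(w, x))
    norm = math.sqrt(sum(wi * wi for wi in w))
    return (dot + b) / norm

def confidence(dist, scale=1.0):
    """Map |distance| to a confidence in (0.5, 1): samples far from the
    decision boundary are classified with higher confidence.
    (A logistic link is an illustrative assumption, not the paper's estimator.)"""
    return 1.0 / (1.0 + math.exp(-scale * abs(dist)))

def consensus(votes):
    """Combine (label, confidence) votes by summing confidences per label.
    No gating network is needed: each classifier supplies its own weight."""
    totals = {}
    for label, conf in votes:
        totals[label] = totals.get(label, 0.0) + conf
    return max(totals, key=totals.get)

if __name__ == "__main__":
    # Two hypothetical trees whose active split hyperplanes disagree on x:
    x = [2.0, 1.0]
    votes = []
    for w, b in [([1.0, 0.0], -1.0), ([0.0, 1.0], -1.5)]:
        d = hyperplane_distance(x, w, b)
        label = "A" if d >= 0 else "B"
        votes.append((label, confidence(d)))
    # Tree 1 votes "A" from farther away (higher confidence) than tree 2
    # votes "B", so the consensus is "A".
    print(consensus(votes))
```

The same voting scheme applies unchanged to oblique trees or any hyperplane-based classifier, since only the signed distance to the active boundary is required from each base classifier.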