Classification confidence weighted majority voting using decision tree classifiers

Publisher
Emerald Publishing
Copyright
Copyright © 2008 Emerald Group Publishing Limited. All rights reserved.
ISSN
1756-378X
DOI
10.1108/17563780810874708

Abstract

Purpose – The purpose of this paper is to provide a classification confidence value for every individual sample classified by decision trees, and to use this value to combine the classifiers.

Design/methodology/approach – The proposed system is first explained theoretically, and then its use and effectiveness are demonstrated on sample datasets.

Findings – A novel method is proposed to combine decision tree classifiers using calculated classification confidence values. The confidence in a classification is based on the distance to the relevant decision boundary, (distance-conditional) probability density estimation and (distance-conditional) classification confidence estimation. It is shown that these values – provided by the individual classification trees – can be integrated to derive a consensus decision.

Research limitations/implications – The proposed method is not limited to axis-parallel trees; it is applicable not only to oblique trees but also to any classifier system that uses hyperplanes to partition the input space.

Originality/value – A novel method is presented to extend decision-tree-like classifiers with confidence calculation, and a voting system is proposed that uses this confidence information. The proposed system offers several novelties (e.g. it gives not only class probabilities but also classification confidences) and advantages over previous approaches: the voting system does not require an auxiliary combiner or gating network, as in the mixture-of-experts structure, and the method is not limited to decision trees with axis-parallel splits, since it applies to any classifier that uses hyperplanes to partition the input space.
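
To make the combination scheme concrete, here is a minimal, hypothetical Python sketch, not the paper's actual algorithm: it approximates each axis-parallel tree's per-sample confidence by the distance from the sample to the nearest split threshold on its decision path, squashes that distance into (0, 1), and lets each tree cast a vote weighted by this confidence. The helper names (`boundary_distance`, `weighted_vote`), the 1 - exp(-d) squashing function, and the bagged iris ensemble are all illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

def boundary_distance(tree, x):
    """Distance from sample x to the nearest axis-parallel split
    threshold along its decision path (a crude confidence proxy)."""
    t = tree.tree_
    node, dist = 0, np.inf
    while t.children_left[node] != -1:              # -1 marks a leaf
        f, thr = t.feature[node], t.threshold[node]
        dist = min(dist, abs(x[f] - thr))           # distance to this split
        node = t.children_left[node] if x[f] <= thr else t.children_right[node]
    return dist

def weighted_vote(trees, x, n_classes):
    """Each tree votes for its predicted class, weighted by a
    confidence derived from its boundary distance."""
    scores = np.zeros(n_classes)
    for tree in trees:
        pred = int(tree.predict(x.reshape(1, -1))[0])
        conf = 1.0 - np.exp(-boundary_distance(tree, x))   # squash to (0, 1)
        scores[pred] += conf
    return int(scores.argmax())

# Small bagged ensemble of shallow trees on iris, purely for illustration.
X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)
trees = []
for _ in range(5):
    idx = rng.integers(0, len(X), len(X))           # bootstrap resample
    trees.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx]))

preds = np.array([weighted_vote(trees, x, n_classes=3) for x in X])
print("confidence-weighted ensemble accuracy:", (preds == y).mean())
```

Under these assumptions, a tree whose sample lies far from every split on its path votes with weight close to 1, while a sample that only barely clears a threshold contributes almost nothing; this mirrors the intuition behind the distance-conditional confidence weighting described in the abstract.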

Journal

International Journal of Intelligent Computing and Cybernetics (Emerald Publishing)

Published: Jun 6, 2008

Keywords: Decision trees; Classification schemes; Programming; Density measurement; Bayesian statistical decision theory; Confidence limits
