A. Asuncion, D.J. Newman. UCI Machine Learning Repository.
F. Provost, P. Domingos (2003). Tree Induction for Probability-Based Ranking. Machine Learning, 52.
J. Kittler, M. Hatef, R. Duin, J. Matas (1996). Combining classifiers. Proceedings of the 13th International Conference on Pattern Recognition, 2.
I. Alvarez, S. Bernard (2005). Ranking Cases with Decision Trees: a Geometric Method that Preserves Intelligibility.
L. Breiman, J. Friedman, R. Olshen, C. Stone (1984). Classification and Regression Trees.
S. Morishita (1998). On Classification and Regression.
C. Ferri, P. Flach, J. Hernández-Orallo (2003). Decision Trees for Ranking: Effect of New Smoothing Methods, New Splitting Criteria and Simple Pruning Methods.
B. Silverman (1987). Density Estimation for Statistics and Data Analysis.
R. Duda, P. Hart, D. Stork (1973). Pattern Classification.
C. Ling, R. Yan (2003). Decision Tree with Better Ranking.
I. Alvarez, S. Bernard, G. Deffuant (2007). Keep the Decision Tree and Estimate the Class Probabilities Using its Decision Boundary.
L. Breiman (1996). Bagging Predictors. Machine Learning, 24.
M. Jordan, R. Jacobs (1993). Hierarchical Mixtures of Experts and the EM Algorithm. Neural Computation, 6.
P. Mahalanobis (1936). On the generalized distance in statistics. Proceedings of the National Institute of Sciences of India, 2.
S. Boyd, L. Vandenberghe (2004). Convex Optimization.
P. Smyth, A. Gray, U. Fayyad (1995). Retrofitting Decision Tree Classifiers Using Kernel Density Estimation.
C. Bishop (1997). Classification and regression.
Purpose – The purpose of this paper is to provide a classification confidence value for every individual sample classified by decision trees, and to use this value to combine the classifiers.
Design/methodology/approach – The proposed system is first explained theoretically; its use and effectiveness are then demonstrated on sample datasets.
Findings – A novel method is proposed to combine decision tree classifiers using calculated classification confidence values. The confidence in a classification is based on distance calculation to the relevant decision boundary, (distance-conditional) probability density estimation and (distance-conditional) classification confidence estimation. It is shown that these values, provided by the individual classification trees, can be integrated to derive a consensus decision.
Research limitations/implications – The proposed method is not limited to axis-parallel trees: it is applicable to oblique trees and, more generally, to any classifier system that uses hyperplanes to cluster the input space.
Originality/value – A novel method is presented to extend decision-tree-like classifiers with confidence calculation, and a voting system is proposed that uses this confidence information. The proposed system possesses several novelties (e.g. it gives not only class probabilities but also classification confidences) and advantages over traditional approaches: the voting system requires no auxiliary combiner or gating network, as in the mixture-of-experts structure, and the method applies to any classifier that uses hyperplanes to cluster the input space.
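The combination scheme described in the abstract can be sketched as follows. This is a minimal illustration with hypothetical names, not the authors' implementation: the paper derives each tree's confidence from density estimation conditioned on the sample's distance to the decision boundary, whereas here the per-tree confidences are taken as given, and only the signed distance to a separating hyperplane and the confidence-summing consensus vote are shown.

```python
import numpy as np

def hyperplane_distance(x, w, b):
    # Signed distance from sample x to the hyperplane w.x + b = 0,
    # i.e. the decision boundary produced by one (possibly oblique) split.
    return (np.dot(w, x) + b) / np.linalg.norm(w)

def consensus_decision(votes):
    # votes: one (predicted_class, confidence) pair per tree.
    # Sum each tree's confidence into its predicted class; the class
    # with the largest total confidence wins, so one confident tree
    # can outvote several uncertain ones.
    scores = {}
    for label, conf in votes:
        scores[label] = scores.get(label, 0.0) + conf
    return max(scores, key=scores.get)

# Two trees vote "A" with low confidence, one votes "B" with high confidence:
consensus_decision([("A", 0.3), ("A", 0.3), ("B", 0.9)])  # -> "B"
```

Because the votes are weighted by per-sample confidence rather than counted, no gating network is needed to arbitrate between the trees, which is the point made under Originality/value.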
International Journal of Intelligent Computing and Cybernetics – Emerald Publishing
Published: Jun 6, 2008
Keywords: Decision trees; Classification schemes; Programming; Density measurement; Bayesian statistical decision theory; Confidence limits