Selection methods for extended least squares support vector machines



Publisher
Emerald Publishing
Copyright
Copyright © 2008 Emerald Group Publishing Limited. All rights reserved.
ISSN
1756-378X
DOI
10.1108/17563780810857130

Abstract

Purpose – The purpose of this paper is to present extended least squares support vector machines (LS‐SVMs), in which data selection methods are used to obtain a sparse LS‐SVM solution, and to overview and compare the most important data selection approaches.

Design/methodology/approach – The selection methods are compared on the basis of their theoretical background and through extensive simulations.

Findings – The paper shows that partial reduction is an efficient way of obtaining a reduced-complexity sparse LS‐SVM solution, while still exploiting the full knowledge contained in the whole training data set. It also shows that the reduction technique based on the reduced row echelon form (RREF) of the kernel matrix is superior to the other data selection approaches.

Research limitations/implications – Data selection for obtaining a sparse LS‐SVM solution can be performed in the different representations of the training data: in the input space, in the intermediate feature space, and in the kernel space. Selection in the kernel space can be achieved by finding an approximate basis of the kernel matrix.

Practical implications – The RREF-based method is a data selection approach with a favorable property: a trade-off tolerance parameter can be used to balance complexity and accuracy.

Originality/value – The paper contributes to the construction of high-performance, moderate-complexity LS‐SVMs.
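The kernel-space selection described in the abstract amounts to finding an approximate basis of the kernel matrix: Gaussian elimination with a tolerance picks out the pivot columns, and the corresponding training points are kept as the reduced support set. The sketch below is only an illustration of that idea, not the paper's exact algorithm; the RBF kernel choice, the function names, and the `tol` parameter are assumptions made for the example.

```python
import numpy as np

def rref_pivot_columns(K, tol=1e-3):
    """Indices of pivot columns of K, found by Gaussian elimination
    with partial pivoting. Columns whose remaining entries all fall
    below `tol` are treated as (numerically) linearly dependent on the
    pivots already chosen and are skipped; `tol` thus trades sparsity
    against accuracy, as in the tolerance-based selection above."""
    A = K.astype(float).copy()
    n_rows, n_cols = A.shape
    pivots = []
    row = 0
    for col in range(n_cols):
        if row >= n_rows:
            break
        # partial pivoting: take the largest remaining entry in this column
        p = row + np.argmax(np.abs(A[row:, col]))
        if np.abs(A[p, col]) <= tol:
            continue  # column adds nothing new within the tolerance
        A[[row, p]] = A[[p, row]]
        A[row] /= A[row, col]
        others = np.arange(n_rows) != row
        A[others] -= np.outer(A[others, col], A[row])
        pivots.append(col)
        row += 1
    return pivots

def rbf_kernel(X, gamma=1.0):
    """Gaussian (RBF) kernel matrix — an illustrative kernel choice."""
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

# Toy data: the first two points coincide, so the kernel matrix is
# rank-deficient and one of them should be dropped from the basis.
X = np.array([[0.0], [0.0], [1.0], [2.0]])
K = rbf_kernel(X)
selected = rref_pivot_columns(K, tol=1e-6)
print(selected)  # the duplicate point (index 1) is not selected
```

A sparse LS‐SVM would then be trained using only the kernel columns in `selected`, while the least-squares fit can still run over all training targets, which is the sense in which partial reduction keeps the full information of the training set.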

Journal

International Journal of Intelligent Computing and Cybernetics, Emerald Publishing

Published: Mar 28, 2008

Keywords: Information systems; Systems theory; Neural nets; Algorithms
