
Relaxing support vectors for classification





Publisher
Springer Journals
Copyright
Copyright © 2012 by Springer Science+Business Media, LLC
Subject
Economics / Management Science; Operations Research/Decision Theory; Combinatorics; Theory of Computation
ISSN
0254-5330
eISSN
1572-9338
DOI
10.1007/s10479-012-1193-3

Abstract

We introduce a novel modification to standard support vector machine (SVM) formulations based on a limited amount of penalty-free slack, which reduces the influence of misclassified samples or outliers. We show that free slack relaxes support vectors and pushes them towards their respective classes; hence we name our method relaxed support vector machines (RSVM). We present theoretical properties of the RSVM formulation and develop its dual formulation for nonlinear classification via kernels. We show the connection between the dual RSVM and the dual of the standard SVM formulation. We provide error bounds for RSVM, show that it is stable and universally consistent, and show that its error bounds are tighter than those for standard SVM. We also introduce a linear programming version of RSVM, which we call RSVMLP. We apply RSVM and RSVMLP to synthetic data and benchmark binary classification problems, and compare our results with standard SVM classification results. We show that relaxing influential support vectors may lead to better classification results. We develop a two-phase method, RSVM2, for multiple instance classification (MIC) problems, in which RSVM formulations are used as classifiers. We extend the two-phase method to the linear programming case and develop RSVMLP2. We demonstrate the classification characteristics of RSVM2 and RSVMLP2, and compare our results to those obtained by other SVM-based MIC methods on public benchmark datasets. We show that both RSVM2 and RSVMLP2 are faster and produce more accurate classification results.
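The full paper sits behind the publisher's site, so the exact RSVM constraints are not reproduced here. As a minimal sketch of the idea the abstract describes, assuming the penalty-free slack enters as a second slack variable eta with an aggregate budget D (the variable names and the budget-style constraint are assumptions, not the paper's notation):

\[
\begin{aligned}
\min_{w,\,b,\,\xi,\,\eta}\quad & \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_i \\
\text{s.t.}\quad & y_i\bigl(w^{\top}x_i + b\bigr) \ge 1 - \xi_i - \eta_i, \quad i = 1,\dots,n, \\
& \textstyle\sum_{i=1}^{n}\eta_i \le D, \qquad \xi_i \ge 0,\ \eta_i \ge 0 .
\end{aligned}
\]

Setting D = 0 recovers the standard soft-margin SVM; increasing D lets misclassified samples absorb slack without paying the C-penalty, which is what relaxes the corresponding support vectors toward their classes.

In the same hedged spirit, the following toy sketch shows what a linear programming variant in the spirit of RSVMLP might look like: the 1-norm of w replaces the quadratic regularizer so the whole problem stays an LP, and cvxpy is used as a generic solver interface. The function name rsvm_lp and the parameters C and D are illustrative, not the paper's API.

# Hypothetical sketch of an RSVMLP-style linear program: a 1-norm
# soft-margin SVM with an extra pool of penalty-free slack (total
# budget D). The paper's actual constraints may differ.
import numpy as np
import cvxpy as cp

def rsvm_lp(X, y, C=1.0, D=1.0):
    """Fit a linear classifier with penalized slack xi and free slack eta.

    X: (n, d) feature matrix; y: (n,) labels in {-1, +1}.
    C: penalty on regular slack; D: total budget of penalty-free slack.
    """
    n, d = X.shape
    w = cp.Variable(d)
    b = cp.Variable()
    xi = cp.Variable(n, nonneg=True)   # penalized slack
    eta = cp.Variable(n, nonneg=True)  # penalty-free slack, shared budget D
    margin = cp.multiply(y, X @ w + b)
    constraints = [margin >= 1 - xi - eta, cp.sum(eta) <= D]
    # 1-norm regularizer keeps the problem a linear program
    objective = cp.Minimize(cp.norm1(w) + C * cp.sum(xi))
    cp.Problem(objective, constraints).solve()
    return w.value, b.value

# Toy usage on two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (20, 2)), rng.normal(1, 1, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w, b = rsvm_lp(X, y, C=1.0, D=2.0)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))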

Journal

Annals of Operations Research (Springer Journals)

Published: Aug 2, 2012
