The asymmetry of misclassification costs is a common problem in many realistic applications. However, most traditional classifiers pursue high recognition accuracy, assuming that different misclassification errors incur uniform cost. This paper proposes two cost-sensitive models based on support vector data description (SVDD) that minimize classification cost while maximizing classification accuracy. The one-class SVDD classifier is extended to two two-class models. Cost information is incorporated to trade off generalization performance between the classes so as to minimize the misclassification cost, and it is also used to build the decision rules. The optimization problems of the two proposed models are solved with the sequential minimal optimization (SMO) algorithm. However, SMO must check all samples to select the working set in each iteration, which is very time-consuming. Since only the support vectors are needed to describe the boundaries, a sample selection approach is proposed that speeds up training and reduces storage requirements by selecting edge and overlapping samples, and that overcomes local overlearning by removing outliers. Experimental results on synthetic and public datasets demonstrate the effectiveness and efficiency of the proposed methods.

Keywords Pattern recognition · Cost-sensitive learning
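The sample-selection idea described above can be illustrated with a simple neighbourhood heuristic. The sketch below is an assumption-laden approximation, not the paper's exact procedure: a sample whose k nearest neighbours all share its label is treated as an interior point and dropped, one whose neighbours all disagree is treated as an outlier and dropped, and only samples with mixed neighbourhoods (edge/overlap candidates, the likely support vectors) are kept for training.

```python
import numpy as np

def select_samples(X, y, k=5):
    """Keep edge/overlapping samples; drop interior points and outliers.

    Illustrative sketch only (not the authors' algorithm). A sample is
    kept when its k nearest neighbours contain BOTH labels, i.e. it lies
    near a class boundary and could become a support vector.
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    n = len(X)
    # pairwise squared Euclidean distances, shape (n, n)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    keep = []
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]      # skip the sample itself
        agree = np.mean(y[nbrs] == y[i])
        if 0.0 < agree < 1.0:                  # mixed neighbourhood
            keep.append(i)                     # edge/overlap sample: keep
        # agree == 1.0 -> interior point, dropped
        # agree == 0.0 -> outlier, dropped (avoids local overlearning)
    return np.array(keep, dtype=int)
```

On a toy 1-D set with two clusters and one mislabeled outlier, only the samples near the class boundary survive, which is exactly the reduced training set the SMO solver would then operate on.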
Applied Intelligence – Springer Journals
Published: Jun 6, 2018