Soft Learning Vector Quantization


Neural Computation, Volume 15 (7) – Jul 1, 2003


References (11)

Publisher
MIT Press
Copyright
© 2003 Massachusetts Institute of Technology
ISSN
0899-7667
eISSN
1530-888X
DOI
10.1162/089976603321891819
PMID
12816567

Abstract

Learning vector quantization (LVQ) is a popular class of adaptive nearest prototype classifiers for multiclass classification, but learning algorithms from this family have so far been proposed on heuristic grounds. Here, we take a more principled approach and derive two variants of LVQ using a gaussian mixture ansatz. We propose an objective function based on a likelihood ratio and derive a learning rule using gradient descent. The new approach provides a way to extend the algorithms of the LVQ family to different distance measures and allows for the design of “soft” LVQ algorithms. Benchmark results show that the new methods lead to better classification performance than LVQ 2.1. An additional benefit of the new method is that model assumptions are made explicit, so that the method can be adapted more easily to different kinds of problems.
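The learning rule the abstract describes — gradient ascent on a likelihood ratio under a Gaussian mixture — can be sketched roughly as follows. This is a minimal RSLVQ-style approximation assuming equal-width isotropic Gaussians and one prototype per class; the function names and hyperparameters are illustrative, not the paper's notation or exact algorithm.

```python
import numpy as np

def soft_lvq_update(x, y, W, c, sigma2=1.0, lr=0.05):
    """One gradient-ascent step on log p(x, y) / p(x) for a Gaussian
    mixture with equal isotropic widths sigma2 (a soft LVQ rule).

    x : (d,) sample, y : its class label
    W : (m, d) prototype matrix, c : (m,) prototype labels
    """
    d2 = np.sum((W - x) ** 2, axis=1)      # squared distances to prototypes
    f = -d2 / (2.0 * sigma2)               # Gaussian log-activations
    f -= f.max()                           # shift for numerical stability
    g = np.exp(f)
    P_all = g / g.sum()                    # posterior P(j | x) over all prototypes
    mask = (c == y)
    P_cls = np.where(mask, g, 0.0)
    P_cls = P_cls / P_cls.sum()            # posterior restricted to class-y prototypes
    # Correct-class prototypes are attracted to x, others are repelled.
    coef = np.where(mask, P_cls - P_all, -P_all)
    W += lr * coef[:, None] * (x - W) / sigma2
    return W

def predict(X, W, c):
    """Nearest-prototype classification, as in standard LVQ."""
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(-1)
    return c[d2.argmin(axis=1)]
```

The soft assignment `P_cls - P_all` replaces the hard window heuristic of LVQ 2.1: every prototype is updated in proportion to how responsible it is for the sample, and the update vanishes smoothly for samples far from the decision boundary.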

Journal

Neural Computation, MIT Press

Published: Jul 1, 2003
