Rademacher Chaos Complexities for Learning the Kernel Problem

Neural Computation, Volume 22 (11) – Nov 1, 2010

Publisher: MIT Press
Copyright: © 2010 Massachusetts Institute of Technology
Subject: Letters
ISSN: 0899-7667
eISSN: 1530-888X
DOI: 10.1162/NECO_a_00028
PMID: 20804384

Abstract

We develop a novel generalization bound for the problem of learning the kernel. First, we show that the generalization analysis of the kernel learning problem reduces to studying the suprema of the Rademacher chaos process of order two over candidate kernels, a quantity we refer to as the Rademacher chaos complexity. Next, we show how to bound the empirical Rademacher chaos complexity using well-established metric entropy integrals and the pseudo-dimension of the set of candidate kernels. Our methodology rests mainly on the theory of U-processes and entropy integrals. Finally, we establish satisfactory excess generalization bounds and misclassification error rates for learning Gaussian kernels and general radial basis kernels.
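
As a point of reference, one plausible formalization of the central quantity named in the abstract is the following; this is a sketch consistent with the standard U-process literature, and the exact normalization used in the paper may differ. For i.i.d. Rademacher variables \varepsilon_1, \dots, \varepsilon_n and a sample x_1, \dots, x_n, the empirical Rademacher chaos complexity of order two over a class \mathcal{K} of candidate kernels can be written as

    \hat{U}_n(\mathcal{K}) = \mathbb{E}_{\varepsilon}\, \sup_{k \in \mathcal{K}} \Big| \frac{1}{n} \sum_{1 \le i < j \le n} \varepsilon_i \varepsilon_j \, k(x_i, x_j) \Big|,

that is, the expected supremum of a degree-two homogeneous Rademacher chaos indexed by the kernel class.

For a concrete feel for this quantity, the sketch below estimates it by Monte Carlo over a finite grid of Gaussian bandwidths, with the finite grid standing in for the supremum over a continuous class. The function name empirical_rademacher_chaos and the 1/n normalization are illustrative assumptions, not the paper's code.

    import numpy as np

    def empirical_rademacher_chaos(X, sigmas, n_mc=200, seed=0):
        # Monte Carlo estimate of the order-2 empirical Rademacher chaos
        # complexity over a finite grid of Gaussian kernel bandwidths.
        # Illustrative sketch only; normalization follows the display above.
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        # Pairwise squared Euclidean distances, shape (n, n).
        sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        grams = []
        for s in sigmas:
            G = np.exp(-sq / (2.0 * s ** 2))  # Gaussian Gram matrix
            np.fill_diagonal(G, 0.0)          # drop the i == j terms
            grams.append(G)
        total = 0.0
        for _ in range(n_mc):
            eps = rng.choice([-1.0, 1.0], size=n)
            # eps @ G @ eps sums eps_i * eps_j * k(x_i, x_j) over i != j;
            # halving counts each pair i < j once, then apply the assumed
            # 1/n normalization.
            total += max(abs(eps @ G @ eps) / 2.0 for G in grams) / n
        return total / n_mc

    # Example: 50 points in R^2, bandwidth grid {0.5, 1.0, 2.0}.
    X = np.random.default_rng(1).normal(size=(50, 2))
    print(empirical_rademacher_chaos(X, sigmas=[0.5, 1.0, 2.0]))

A larger chaos complexity for a richer bandwidth grid is exactly the behavior the paper's entropy-integral and pseudo-dimension bounds are designed to control.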

Journal: Neural Computation, MIT Press

Published: Nov 1, 2010
