Least-Squares Independent Component Analysis

Neural Computation, Volume 23 (1) – Jan 1, 2011

References: 37
Publisher: MIT Press
Copyright: © 2010 Massachusetts Institute of Technology
Subject: Letters
ISSN: 0899-7667
eISSN: 1530-888X
DOI: 10.1162/NECO_a_00062
PMID: 20964543

Abstract

Accurately evaluating statistical independence among random variables is a key element of independent component analysis (ICA). In this letter, we employ a squared-loss variant of mutual information as an independence measure and give its estimation method. Our basic idea is to estimate the ratio of probability densities directly without going through density estimation, thereby avoiding the difficult task of density estimation. In this density ratio approach, a natural cross-validation procedure is available for hyperparameter selection. Thus, all tuning parameters such as the kernel width or the regularization parameter can be objectively optimized. This is an advantage over recently developed kernel-based independence measures and is a highly useful property in unsupervised learning problems such as ICA. Based on this novel independence measure, we develop an ICA algorithm, named least-squares independent component analysis.
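The squared-loss mutual information referred to in the abstract is commonly defined as the Pearson divergence between the joint density and the product of the marginals,

$$\mathrm{SMI}(X, Y) = \frac{1}{2} \iint p_x(x)\, p_y(y) \left( \frac{p_{xy}(x, y)}{p_x(x)\, p_y(y)} - 1 \right)^{2} \mathrm{d}x\, \mathrm{d}y,$$

which vanishes if and only if X and Y are statistically independent. The sketch below illustrates the density-ratio idea on paired one-dimensional samples: the ratio p_{xy}/(p_x p_y) is modeled with Gaussian product kernels, the coefficients are fitted by ridge-regularized least squares, and the kernel width and regularization parameter are selected by cross-validation on the same squared-loss criterion. This is a minimal illustration of the general approach rather than the authors' implementation; the function name `lsmi`, the basis size, and the hyperparameter grids are assumptions made for the example.

```python
import numpy as np

def lsmi(x, y, sigmas=(0.3, 1.0, 3.0), lams=(1e-3, 1e-2, 1e-1),
         n_basis=100, n_folds=5, seed=0):
    """Least-squares estimate of squared-loss mutual information (SMI)
    between paired 1-D samples x and y via direct density-ratio estimation.

    The ratio r(x, y) = p(x, y) / (p(x) p(y)) is modeled as a linear
    combination of Gaussian product kernels centered at a random subset of
    the paired samples; the kernel width and the ridge parameter are chosen
    by cross-validation on the same squared-loss criterion used for fitting.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float).ravel()
    y = np.asarray(y, dtype=float).ravel()
    n = x.size
    b = min(n_basis, n)
    centers = rng.choice(n, size=b, replace=False)
    folds = rng.integers(0, n_folds, size=n)

    def kernels(sigma):
        # Per-variable Gaussian kernel matrices against the chosen centers.
        kx = np.exp(-(x[:, None] - x[centers][None, :]) ** 2 / (2 * sigma ** 2))
        ky = np.exp(-(y[:, None] - y[centers][None, :]) ** 2 / (2 * sigma ** 2))
        return kx, ky

    def terms(kx, ky):
        # H approximates E_{p(x)p(y)}[phi phi^T]; h approximates E_{p(x,y)}[phi],
        # where phi_l(x, y) = K(x, x_l) K(y, y_l) is the product-kernel basis.
        m = kx.shape[0]
        H = (kx.T @ kx) * (ky.T @ ky) / m ** 2
        h = (kx * ky).mean(axis=0)
        return H, h

    # Cross-validate the kernel width and the regularization parameter.
    best_score, best_params = np.inf, None
    for sigma in sigmas:
        kx, ky = kernels(sigma)
        for lam in lams:
            score = 0.0
            for f in range(n_folds):
                tr, te = folds != f, folds == f
                H_tr, h_tr = terms(kx[tr], ky[tr])
                H_te, h_te = terms(kx[te], ky[te])
                alpha = np.linalg.solve(H_tr + lam * np.eye(b), h_tr)
                # Held-out squared-loss criterion (smaller is better).
                score += 0.5 * alpha @ H_te @ alpha - h_te @ alpha
            if score < best_score:
                best_score, best_params = score, (sigma, lam)

    # Refit on all samples with the selected hyperparameters.
    sigma, lam = best_params
    kx, ky = kernels(sigma)
    H, h = terms(kx, ky)
    alpha = np.linalg.solve(H + lam * np.eye(b), h)
    # Plug-in SMI estimate: SMI = (1/2) E_{p(x,y)}[r] - 1/2.
    return 0.5 * h @ alpha - 0.5
```

For independent inputs the returned estimate should be close to zero; within an ICA procedure of the kind described in the abstract, such an estimator would be evaluated between pairs of demixed components and the demixing matrix adjusted to drive the estimates toward zero.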

Journal

Neural Computation, MIT Press

Published: Jan 1, 2011
