Accurately evaluating statistical independence among random variables is a key element of independent component analysis (ICA). In this letter, we employ a squared-loss variant of mutual information as an independence measure and present a method for estimating it. Our basic idea is to estimate the ratio of probability densities directly, without going through density estimation, thereby avoiding that difficult intermediate task. In this density-ratio approach, a natural cross-validation procedure is available for hyperparameter selection, so all tuning parameters, such as the kernel width and the regularization parameter, can be objectively optimized. This is an advantage over recently developed kernel-based independence measures and is a highly useful property in unsupervised learning problems such as ICA. Based on this novel independence measure, we develop an ICA algorithm named least-squares independent component analysis (LICA).
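The density-ratio idea in the abstract can be sketched in a few lines of NumPy: model the ratio r(x, y) = p(x, y) / (p(x)p(y)) as a kernel expansion, fit the coefficients by regularized least squares, and plug the fitted ratio into the squared-loss mutual information. This is only an illustrative sketch in the spirit of the letter; the Gaussian kernel width, the regularization strength, and the number of basis centers below are placeholder assumptions (the letter selects such hyperparameters by cross-validation), not the authors' exact formulation.

```python
import numpy as np

def gauss_kernel(a, b, sigma):
    # Pairwise Gaussian kernel matrix between 1-D sample vectors a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-d**2 / (2.0 * sigma**2))

def lsmi(x, y, sigma=1.0, lam=1e-3, n_basis=100, seed=0):
    """Sketch of squared-loss mutual information estimation via direct
    density-ratio fitting. sigma, lam, and n_basis are illustrative
    defaults, not values from the letter."""
    n = len(x)
    rng = np.random.default_rng(seed)
    centers = rng.choice(n, size=min(n_basis, n), replace=False)
    Kx = gauss_kernel(x, x[centers], sigma)   # (n, b)
    Ky = gauss_kernel(y, y[centers], sigma)   # (n, b)
    # h_l: average of the basis function over paired samples ~ p(x, y).
    h = (Kx * Ky).mean(axis=0)
    # H_{ll'}: average over all n^2 (x_i, y_j) pairs ~ p(x) p(y).
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2
    # Regularized least-squares fit of the ratio coefficients.
    alpha = np.linalg.solve(H + lam * np.eye(H.shape[0]), h)
    # SMI = (1/2) * (E_{p(x,y)}[r(x, y)] - 1), with E[r] estimated by h^T alpha.
    return 0.5 * h @ alpha - 0.5

# The estimate should be near zero for independent inputs and clearly
# positive for strongly dependent ones.
rng = np.random.default_rng(1)
x = rng.standard_normal(200)
y_dep = x + 0.1 * rng.standard_normal(200)
y_ind = rng.standard_normal(200)
smi_dep = lsmi(x, y_dep)
smi_ind = lsmi(x, y_ind)
```

Because the same closed-form least-squares fit can be recomputed cheaply for each candidate (sigma, lam) pair, the cross-validation the abstract describes amounts to scoring held-out squared error of the fitted ratio, which is what makes fully objective hyperparameter tuning practical here.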
Neural Computation – MIT Press
Published: Jan 1, 2011