Probab. Theory Relat. Fields (2018) 170:847–889
On the convergence of the extremal eigenvalues
of empirical covariance matrices with dependence
Djalil Chafaï · Konstantin Tikhomirov
Received: 12 November 2015 / Revised: 15 April 2017 / Published online: 26 April 2017
© Springer-Verlag Berlin Heidelberg 2017
Abstract Consider a sample of a centered random vector with unit covariance matrix.
We show that under certain regularity assumptions, and up to a natural scaling, the
smallest and the largest eigenvalues of the empirical covariance matrix converge,
when the dimension and the sample size both tend to infinity, to the left and right
edges of the Marchenko–Pastur distribution. The assumptions are related to tails of
norms of orthogonal projections. They cover isotropic log-concave random vectors as
well as random vectors with i.i.d. coordinates under almost optimal moment conditions.
The method is a refinement of the rank one update approach used by Srivastava and
Vershynin to produce non-asymptotic quantitative estimates. In other words, we provide
a new proof of the Bai and Yin theorem using basic tools from probability theory and
linear algebra, together with a new extension of this theorem to random matrices with
dependent entries.
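As a minimal numerical illustration of the convergence stated above, the sketch below simulates the simplest case covered by the Bai–Yin theorem, i.i.d. standard Gaussian entries, and compares the extremal eigenvalues of the empirical covariance matrix with the Marchenko–Pastur edges $(1 - \sqrt{y})^2$ and $(1 + \sqrt{y})^2$, where $y = p/n$ is the aspect ratio. The sample sizes are chosen arbitrarily for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 4000, 1000          # sample size n, dimension p
y = p / n                  # aspect ratio y = p/n = 0.25

# Sample of n i.i.d. centered random vectors with identity covariance.
X = rng.standard_normal((n, p))

# Empirical covariance matrix with the natural 1/n scaling.
S = X.T @ X / n
eigvals = np.linalg.eigvalsh(S)

# Marchenko-Pastur edges: (1 - sqrt(y))^2 and (1 + sqrt(y))^2.
left_edge = (1 - np.sqrt(y)) ** 2
right_edge = (1 + np.sqrt(y)) ** 2

print(eigvals.min(), left_edge)    # smallest eigenvalue vs left edge 0.25
print(eigvals.max(), right_edge)   # largest eigenvalue vs right edge 2.25
```

At this scale the extremal eigenvalues already sit within a few percent of the edges; the fluctuations of the largest eigenvalue are of order $n^{-2/3}$.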
Keywords Convex body · Random matrix · Covariance matrix · Singular value ·
Operator norm · Sherman–Morrison formula · Thin-shell inequality · Log-concave
distribution · Dependence
Mathematics Subject Classification 52A23 · 15B52 · 60B20 · 62J10
Dedicated to Alain Pajor & Nicole Tomczak-Jaegermann.
CEREMADE, PSL, IUF, Université Paris-Dauphine, Paris, France
Department of Mathematical and Statistical Sciences, University of Alberta, Edmonton, Canada