On variance reduction for stochastic smooth convex optimization with multiplicative noise
Alejandro Jofré · Philip Thompson
Math. Program., Ser. B. https://doi.org/10.1007/s10107-018-1297-x
Full Length Paper. Received: 24 May 2017 / Accepted: 11 May 2018
© Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society 2018

Abstract: We propose dynamic sampled stochastic approximation (SA) methods for stochastic optimization with a heavy-tailed distribution (with finite second moment). The objective is the sum of a smooth convex function and a convex regularizer. Typically, an oracle with a uniform upper bound σ on its variance (OUBV) is assumed. Differently, we assume an oracle with multiplicative noise. This rarely addressed setup is more aggressive but realistic: the variance may not be uniformly bounded. Our methods achieve optimal iteration complexity and (near) optimal oracle complexity. For the smooth convex class, we use an accelerated SA method à la FISTA which achieves, given a tolerance ε > 0, the optimal iteration complexity of O(ε^{-1/2}) with a near-optimal oracle complexity of O(ε^{-2})[ln(ε^{-1/2})]^2. This improves upon Ghadimi and Lan (Math Program 156:59–99, 2016), where an OUBV is assumed. For the strongly convex class, our method achieves optimal iteration complexity of O(ln(ε^{-1})) and
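The dynamic sampling idea described in the abstract, an accelerated proximal step whose stochastic gradient is averaged over a mini-batch that grows with the iteration count, can be sketched as follows. This is a minimal illustration, not the paper's exact method: the test problem, the step size 1/L, and the cubic batch schedule `batch0 * k**3` are assumptions chosen for the sketch, and `grad_sample` stands in for any stochastic first-order oracle, including one with multiplicative noise.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dynamic_sampled_fista(grad_sample, prox, x0, L, n_iters=20, batch0=2, rng=None):
    """Accelerated proximal SA (FISTA-style) with growing mini-batches.

    grad_sample(y, rng) returns one stochastic gradient at y; averaging
    batch0 * k**3 samples at iteration k shrinks the gradient variance
    fast enough for the accelerated scheme to keep converging.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    y, t = x, 1.0
    for k in range(1, n_iters + 1):
        batch = batch0 * k**3  # dynamic (growing) mini-batch schedule
        g = np.mean([grad_sample(y, rng) for _ in range(batch)], axis=0)
        x_new = prox(y - g / L, 1.0 / L)                # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x
```

As a usage example, minimizing ½‖x − c‖² + ‖x‖₁ with a gradient oracle whose noise scales multiplicatively with the true gradient (so its variance is not uniformly bounded) recovers the soft-thresholded solution soft_threshold(c, 1).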
Mathematical Programming – Springer Journals
Published: Jun 5, 2018