# On variance reduction for stochastic smooth convex optimization with multiplicative noise



Mathematical Programming, Volume 174 (2) – Jun 5, 2018
40 pages

Publisher
Springer Journals
Copyright © 2018 by Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society
Subject
Mathematics; Calculus of Variations and Optimal Control; Optimization; Mathematics of Computing; Numerical Analysis; Combinatorics; Theoretical, Mathematical and Computational Physics; Mathematical Methods in Physics
ISSN
0025-5610
eISSN
1436-4646
DOI
10.1007/s10107-018-1297-x

### Abstract

We propose dynamically sampled stochastic approximation (SA) methods for stochastic optimization under a heavy-tailed distribution (with finite second moment). The objective is the sum of a smooth convex function and a convex regularizer. Typically, an oracle with a uniform upper bound $$\sigma^2$$ on its variance (OUBV) is assumed. In contrast, we assume an oracle with multiplicative noise. This rarely addressed setting is more demanding but also more realistic: the variance need not be uniformly bounded. Our methods achieve optimal iteration complexity and (near-)optimal oracle complexity. For the smooth convex class, we use an accelerated SA method à la FISTA which, given a tolerance $$\varepsilon>0$$, achieves the optimal iteration complexity of $$\mathscr{O}(\varepsilon^{-\frac{1}{2}})$$ with a near-optimal oracle complexity of $$\mathscr{O}(\varepsilon^{-2}[\ln(\varepsilon^{-\frac{1}{2}})]^2)$$. This improves upon Ghadimi and Lan (Math Program 156:59–99, 2016), where an OUBV is assumed. For the strongly convex class, our method achieves the optimal iteration complexity of $$\mathscr{O}(\ln(\varepsilon^{-1}))$$ and the optimal oracle complexity of $$\mathscr{O}(\varepsilon^{-1})$$. This improves upon Byrd et al. (Math Program 134:127–155, 2012), where an OUBV is assumed. In terms of variance, our bounds are local: they depend on the variance $$\sigma(x^*)^2$$ at solutions $$x^*$$ and on the per-unit-distance multiplicative variance $$\sigma^2_L$$. For the smooth convex class, there exist sampling policies such that our bounds resemble, up to absolute constants, those obtained in the papers above had an OUBV with $$\sigma^2:=\sigma(x^*)^2$$ been assumed. For the strongly convex class, this property holds exactly if the condition number is estimated, and in the limit for better-conditioned problems or for larger initial batch sizes. In any case, if an OUBV is assumed, our bounds are sharper, since typically $$\max\{\sigma(x^*)^2,\sigma_L^2\}\ll\sigma^2$$.
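To illustrate the dynamic-sampling idea the abstract describes, the sketch below runs a FISTA-style accelerated proximal step with mini-batch stochastic gradients whose sample size grows polynomially across iterations, so the gradient variance shrinks as the iterates approach a solution. The problem instance (a random least-squares objective with an $$\ell_1$$ regularizer), the batch schedule, and all parameter values are illustrative assumptions, not the paper's exact sampling policies.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
x_true = np.zeros(d)
x_true[:2] = [1.0, -2.0]

def stoch_grad(x, batch):
    """Mini-batch gradient of f(x) = 0.5 * E[(a^T x - b)^2] with a ~ N(0, I).

    Sampling only `batch` rows A leaves a gradient error with a component
    A^T A (x - x_true) / batch, whose variance scales with ||x - x_true||^2
    (the multiplicative part); the small additive noise on b gives a
    nonzero variance at the solution itself."""
    A = rng.standard_normal((batch, d))
    b = A @ x_true + 0.05 * rng.standard_normal(batch)
    return A.T @ (A @ x - b) / batch

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dynamic_fista(steps=60, lam=0.01, step=0.05, batch0=4):
    x = np.zeros(d)
    y = x.copy()
    t = 1.0
    for k in range(steps):
        batch = batch0 * (k + 1) ** 2        # polynomially growing sample size
        g = stoch_grad(y, batch)
        x_next = soft_threshold(y - step * g, step * lam)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # FISTA momentum
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

x_hat = dynamic_fista()
print(float(np.linalg.norm(x_hat - x_true)))
```

The growing batch is what distinguishes this from plain stochastic FISTA: averaging more samples per iteration reduces the oracle's variance at exactly the rate the acceleration needs, which is how dynamic sampling trades extra oracle calls for the deterministic iteration complexity.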

### Journal

Mathematical Programming, Springer Journals

Published: Jun 5, 2018
