TY - JOUR
AU1 - Yang, Yi-ming
AU2 - Wang, Fu-sheng
AU3 - Li, Jin-xiang
AU4 - Qin, Yuan-yuan
AB - The inexact SARAH (iSARAH) algorithm, a variance-reduction variant of the SARAH algorithm, has recently surged into prominence for solving large-scale optimization problems in machine learning. The performance of iSARAH depends significantly on the choice of step-size sequence. In this paper, we develop a new algorithm called iSARAH-BB, which employs the Barzilai–Borwein (BB) method to compute the step size automatically based on SARAH. By introducing this adaptive step size into the design of the new algorithm, iSARAH-BB takes better advantage of both the iSARAH and BB methods. Finally, we analyze the convergence rate and complexity of the new algorithm under the usual assumptions. Numerical experiments on standard datasets indicate that the proposed iSARAH-BB algorithm is robust to the selection of the initial step size and is more effective and competitive than existing algorithms.
TI - A new inexact stochastic recursive gradient descent algorithm with Barzilai–Borwein step size in machine learning
JF - Nonlinear Dynamics
DO - 10.1007/s11071-022-07987-2
DA - 2023-02-01
UR - https://www.deepdyve.com/lp/springer-journals/a-new-inexact-stochastic-recursive-gradient-descent-algorithm-with-s1S1Wc2ZoQ
SP - 3575
EP - 3586
VL - 111
IS - 4
DP - DeepDyve
ER -