This paper presents a new class of memory gradient methods with inexact line searches for unconstrained minimization problems. At each iteration, the methods use more previous iterative information than other methods to generate a search direction, and an inexact line search to select a step-size. Global convergence of the new methods is proved under mild conditions, and their convergence rate is investigated in some special cases. Numerical experiments show that the new algorithms converge more stably than other line search methods and are effective for solving large-scale unconstrained minimization problems.
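To make the idea concrete, here is a minimal sketch of a generic memory gradient method with an Armijo-type inexact line search. The direction combines the negative gradient with the last `m` stored search directions; the damping weight `beta` and the fallback to steepest descent are illustrative choices for this sketch, not the paper's specific parameter rules.

```python
import numpy as np

def memory_gradient(f, grad, x0, m=3, c1=1e-4, rho=0.5, tol=1e-6, max_iter=500):
    """Generic memory gradient method with Armijo backtracking (a sketch).

    Direction: d_k = -g_k + sum_i beta_i * d_{k-i} over the last m stored
    directions. The beta_i used here is a simple damped weight that keeps
    d_k a descent direction; it is an illustrative choice only.
    """
    x = np.asarray(x0, dtype=float)
    history = []                      # last m search directions
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        d = -g
        for d_prev in history:
            # Damped contribution of a previous direction; the 0.1 factor
            # bounds the total memory term by 0.3*||g|| (with m=3), so the
            # combined direction remains a descent direction.
            beta = 0.1 * gnorm / (np.linalg.norm(d_prev) + 1e-12)
            d = d + beta * d_prev
        if g @ d >= 0.0:
            d = -g                    # safeguard: fall back to steepest descent
        # Armijo (inexact) line search: shrink t until sufficient decrease holds
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + c1 * t * (g @ d):
            t *= rho
        x = x + t * d
        history.append(d)
        if len(history) > m:
            history.pop(0)
    return x
```

For example, on the quadratic `f(x) = ||x||^2` starting from `[3, -4]`, the iterates converge to the minimizer at the origin.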
Journal of Numerical Mathematics – de Gruyter
Published: Apr 1, 2005
Keywords: unconstrained optimization; memory gradient method; inexact line search; convergence