# Convergence Properties of the Regularized Newton Method for the Unconstrained Nonconvex Optimization

Volume 62, Issue 1 – Aug 1, 2010
20 pages

Publisher
Springer-Verlag
Subject
Mathematics; Numerical and Computational Physics; Mathematical Methods in Physics; Theoretical, Mathematical and Computational Physics; Systems Theory, Control; Calculus of Variations and Optimal Control; Optimization
ISSN
0095-4616
eISSN
1432-0606
DOI
10.1007/s00245-009-9094-9

### Abstract

The regularized Newton method (RNM) is one of the efficient solution methods for unconstrained convex optimization. It is well known that the RNM has better convergence properties than the steepest descent method and the pure Newton method. For example, Li, Fukushima, Qi and Yamashita showed that the RNM has a quadratic rate of convergence under the local error bound condition. Recently, Polyak showed that the global complexity bound of the RNM, i.e., the number of iterations required to reach the first iterate x_k satisfying ‖∇f(x_k)‖ ≤ ε, is O(ε⁻⁴), where f is the objective function and ε is a given positive constant. In this paper, we consider an RNM extended to unconstrained “nonconvex” optimization. We show that the extended RNM (E-RNM) has the following properties. (a) The E-RNM is globally convergent under appropriate conditions. (b) The global complexity bound of the E-RNM is O(ε⁻²) if ∇²f is Lipschitz continuous on a certain compact set. (c) The E-RNM has a superlinear rate of convergence under the local error bound condition.
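To make the idea concrete, the following is a minimal sketch of a regularized Newton iteration for possibly nonconvex f. It is not the authors' exact E-RNM: the regularization parameter here is chosen as c₁·Λ_k + c₂·‖∇f(x_k)‖, where Λ_k = max(0, −λ_min(∇²f(x_k))) compensates for Hessian indefiniteness, and the Armijo backtracking rule and all constants are illustrative assumptions.

```python
import numpy as np

def regularized_newton(f, grad, hess, x0, eps=1e-8, c1=1.0, c2=1.0, max_iter=200):
    """Sketch of a regularized Newton iteration for possibly nonconvex f.

    Illustrative only: the regularization rule, constants, and the Armijo
    line search below are assumptions, not the paper's exact scheme.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:       # stop at an approximate stationary point
            break
        H = hess(x)
        lam_min = np.linalg.eigvalsh(H).min()
        Lam = max(0.0, -lam_min)           # "convexity defect" of the Hessian
        mu = c1 * Lam + c2 * np.linalg.norm(g)   # regularization parameter
        # H + mu*I is positive definite whenever g != 0, so d is a descent direction
        d = np.linalg.solve(H + mu * np.eye(len(x)), -g)
        # Armijo backtracking line search for global convergence
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d):
            t *= 0.5
        x = x + t * d
    return x

# Usage: minimize the (nonconvex) Rosenbrock function from a standard start point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400*(x[1] - 3*x[0]**2), -400*x[0]],
                           [-400*x[0], 200.0]])
x_star = regularized_newton(f, grad, hess, np.array([-1.2, 1.0]))
```

Because μ → 0 as ‖∇f(x_k)‖ → 0, the step approaches a pure Newton step near a solution, which is the mechanism behind the fast local rates discussed in the abstract.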

### Journal

Applied Mathematics and Optimization, Springer Journals

Published: Aug 1, 2010
