
A family of preconditioned iteratively regularized methods for nonlinear minimization

The preconditioned iteratively regularized Gauss–Newton algorithm for the minimization of general nonlinear functionals was introduced by Smirnova, Renaut and Khan (Inverse Problems 23: 1547–1563, 2007). In this paper, we establish theoretical convergence results for an extended stabilized family of Generalized Preconditioned Iterative methods, which includes ℳ-times iterated Tikhonov regularization with line search. Numerical schemes illustrating the theoretical results are also presented.

Journal of Inverse and Ill-Posed Problems, de Gruyter
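The ℳ-times iterated Tikhonov regularization mentioned in the abstract can be sketched for the linear special case. This is only an illustrative sketch under simplifying assumptions (a linear model `A x ≈ b`, identity preconditioner, fixed step size); the paper itself treats general nonlinear functionals with preconditioning and line search, and all function and variable names below are invented for the example:

```python
import numpy as np

def iterated_tikhonov(A, b, alpha, M, x0=None):
    """M-times iterated Tikhonov regularization for the linear model A x ≈ b.

    Each sweep solves the regularized normal equations
        (A^T A + alpha I) h = A^T (b - A x)
    and updates x <- x + h. Iterating M times reduces the bias that a
    single Tikhonov step introduces for a given regularization level alpha.
    """
    n = A.shape[1]
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    reg = A.T @ A + alpha * np.eye(n)   # regularized normal matrix (factor once)
    for _ in range(M):
        h = np.linalg.solve(reg, A.T @ (b - A @ x))  # Tikhonov step for the residual
        x = x + h
    return x

# Mildly ill-conditioned toy problem: recover polynomial coefficients from noisy data.
rng = np.random.default_rng(0)
A = np.vander(np.linspace(0.0, 1.0, 20), 5, increasing=True)
x_true = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
b = A @ x_true + 1e-5 * rng.standard_normal(20)
x_rec = iterated_tikhonov(A, b, alpha=1e-6, M=4)
```

For nonlinear functionals, `A` would be replaced by the Fréchet derivative (Jacobian) at the current iterate, recomputed each sweep, and a preconditioner and line search would control the step, as in the family analyzed in the paper.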
