S. Orfanidis (1990). Gram-Schmidt Neural Nets. Neural Computation, 2.
N. Weymaere, J. Martens (1991). A fast and robust learning algorithm for feedforward neural networks. Neural Networks, 4.
S. Chow, J. Mallet-Paret, J. Yorke (1978). Finding zeroes of maps: homotopy methods that are constructive with probability one. Mathematics of Computation, 32.
Tien-Yien Li (1987). Solving polynomial systems. The Mathematical Intelligencer, 9.
D. Rumelhart, G. Hinton, R. Williams (1986). Learning internal representations by error propagation.
When training a feedforward neural network with backpropagation (Rumelhart et al. 1986), local minima are a persistent problem because of the nonlinearity of the system. Several approaches have been used to attack this problem: for example, restarting training from a new initial point, or preprocessing the input data or the network itself. Here, we propose a computationally efficient method for avoiding some local minima.
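The first remedy the abstract mentions, restarting training from a new initial point, can be sketched as a random-restart loop around plain gradient descent. This is an illustrative toy, not the paper's proposed method: the loss function, learning rate, and function names below are assumptions chosen so that a single descent run can get trapped in a shallow local minimum while restarts find the deeper one.

```python
import random

def f(w):
    # Toy nonconvex loss: f(w) = (w^2 - 1)^2 + 0.3*w has a shallow
    # local minimum near w = +1 and a deeper (global) one near w = -1.
    return (w * w - 1.0) ** 2 + 0.3 * w

def grad(w):
    # Derivative of f: 4*w*(w^2 - 1) + 0.3
    return 4.0 * w * (w * w - 1.0) + 0.3

def gradient_descent(w0, lr=0.01, steps=2000):
    # Plain fixed-step gradient descent from initial point w0.
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def restart_training(n_restarts=20, seed=0):
    # Rerun training from fresh random initial points and keep the
    # best result -- the "restart" strategy mentioned in the abstract.
    rng = random.Random(seed)
    best_w, best_loss = None, float("inf")
    for _ in range(n_restarts):
        w0 = rng.uniform(-2.0, 2.0)  # new random initial point
        w = gradient_descent(w0)
        if f(w) < best_loss:
            best_w, best_loss = w, f(w)
    return best_w, best_loss
```

A descent started at, say, `w0 = 0.5` converges to the shallow minimum near `w = +1`, while the restart loop almost surely samples at least one start that descends into the global minimum near `w = -1`. The paper's contribution is a cheaper alternative to this brute-force strategy.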
Neural Computation – MIT Press
Published: May 1, 1993