Neural network learning with global heuristic search

Ivan Jordanov, Antoniya Georgieva
IEEE Transactions on Neural Networks 2007, 18 (3): 937-42
A novel hybrid global optimization (GO) algorithm for supervised learning of feedforward neural networks (NNs) is investigated. The network weights are determined by minimizing the traditional mean squared error function. The optimization technique, called LP(tau)NM, combines a novel global heuristic search based on LPtau low-discrepancy sequences of points with a simplex local search. The proposed method is first tested on multimodal mathematical functions and then applied to training moderate-size NNs on popular benchmark problems. Finally, the results are analyzed, discussed, and compared with those obtained from backpropagation (BP) with Levenberg-Marquardt optimization and from differential evolution.
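The two-phase idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a Halton low-discrepancy sequence stands in for LPtau points, a greedy coordinate search with step shrinking stands in for the Nelder-Mead simplex, and the network is a toy 2-2-1 tanh model trained on XOR. All names and parameter values below are assumptions for illustration.

```python
import math

def radical_inverse(n, base):
    """Van der Corput radical inverse of n in the given base -> value in [0, 1)."""
    inv, f = 0.0, 1.0 / base
    while n > 0:
        inv += (n % base) * f
        n //= base
        f /= base
    return inv

PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23]  # one coprime base per weight dimension

def halton_point(i, dim, lo=-3.0, hi=3.0):
    """i-th Halton point, scaled from [0,1)^dim to the weight box [lo, hi]^dim.
    (A stand-in for the LPtau sequence used in the paper.)"""
    return [lo + (hi - lo) * radical_inverse(i + 1, PRIMES[d]) for d in range(dim)]

def mse(w):
    """Mean squared error of a 2-2-1 tanh network on XOR; w holds the 9 weights."""
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    err = 0.0
    for (x1, x2), t in data:
        h1 = math.tanh(w[0] * x1 + w[1] * x2 + w[2])
        h2 = math.tanh(w[3] * x1 + w[4] * x2 + w[5])
        y = w[6] * h1 + w[7] * h2 + w[8]   # linear output unit
        err += (y - t) ** 2
    return err / len(data)

# Global phase: probe the error surface on a low-discrepancy point set
# and keep the best candidate weight vector.
DIM, N_GLOBAL = 9, 2000
best_w = min((halton_point(i, DIM) for i in range(N_GLOBAL)), key=mse)
global_mse = mse(best_w)

# Local phase: refine the best candidate with a simple greedy coordinate
# search (the paper uses a Nelder-Mead simplex here instead).
w, step = list(best_w), 0.5
for _ in range(200):
    improved = False
    for d in range(DIM):
        for delta in (step, -step):
            cand = list(w)
            cand[d] += delta
            if mse(cand) < mse(w):
                w, improved = cand, True
    if not improved:
        step *= 0.5   # shrink the step when no axis move helps
local_mse = mse(w)
```

Because the local phase only accepts strict improvements, `local_mse` can never exceed `global_mse`; the global low-discrepancy scan is what keeps the local search from starting in a poor basin.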
