Training multilayer neural networks using fast global learning algorithm - Least-squares and penalized optimization methods

Siu Yeung Cho, Tommy W.S. Chow

Research output: Journal Publication › Article › peer-review

43 Citations (Scopus)

Abstract

The major limitations of conventional learning algorithms are local minima and slow convergence. This paper presents a novel heuristic global learning algorithm for neural networks. The proposed algorithm is based on the least-squares (LS) method to maintain a fast convergence speed, and on a penalty (PEN) approach to solve the problem of local minima. The penalty term is superimposed on the error surface, which is likely to provide a way of escaping from local minima when convergence stalls. The choice and adjustment of the penalty factor are also derived to demonstrate the effect of the penalty term and to ensure the convergence of the algorithm. The developed learning algorithm is applied to several classification problems. In all the tested problems, the proposed algorithm outperforms conventional algorithms in terms of convergence speed and the ability to escape from local minima.
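The core idea of the penalty approach can be illustrated on a toy one-dimensional error surface: whenever gradient descent stalls (a likely local minimum), a penalty bump is superimposed on the surface at the stall point so that its gradient pushes the weight away. The sketch below is only an illustration of that idea under assumed choices (a Gaussian-shaped penalty, a hand-picked penalty factor and width, and a fixed restart displacement); the paper itself derives the penalty factor adaptively and combines the penalty with least-squares weight updates, neither of which is reproduced here.

```python
import math

def f(w):
    """Toy error surface: local minimum near w ~ 1.35,
    global minimum near w ~ -1.47 (not from the paper)."""
    return w**4 - 4*w**2 + w

def df(w):
    return 4*w**3 - 8*w + 1

def penalized_descent(w0, lr=0.01, steps=3000, stall_tol=1e-5,
                      pen_strength=6.0, pen_width=0.6):
    """Gradient descent that, whenever convergence stalls,
    superimposes a Gaussian penalty bump on the error surface
    at the stall point so its gradient repels the weight.
    All hyperparameters here are illustrative assumptions."""
    w, centers, w_best = w0, [], w0
    for _ in range(steps):
        g = df(w)
        for c in centers:                 # gradient of each penalty bump
            d = w - c
            g += pen_strength * math.exp(-d*d / (2*pen_width**2)) \
                 * (-d / pen_width**2)
        if abs(g) < stall_tol:            # stalled: likely a local minimum
            centers.append(w)             # penalize this point and restart
            w -= pen_width                # slightly displaced from it
            continue
        w -= lr * g
        if f(w) < f(w_best):
            w_best = w                    # track the best weight seen
    return w_best
```

Starting from w0 = 1.5, plain descent (pen_strength = 0) settles in the local minimum near 1.35, while the penalized version escapes it and reaches the global basin near -1.47, mirroring the behaviour the abstract describes.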

Original language: English
Pages (from-to): 115-131
Number of pages: 17
Journal: Neurocomputing
Volume: 25
Issue number: 1-3
DOIs
Publication status: Published - Apr 1999
Externally published: Yes

Keywords

  • Global learning algorithm
  • Least-squares method
  • Multilayer neural networks
  • Penalized optimization

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
