A Layer-by-Layer Least Squares based Recurrent Networks Training Algorithm: Stalling and Escape

Siu Yeung Cho, Tommy W.S. Chow

Research output: Journal Publication › Article › peer-review

6 Citations (Scopus)

Abstract

The least squares based training algorithm is limited mainly by the stalling problem and by the evaluation error introduced by the transformation matrix, which can lead to an unacceptable solution. This paper presents a new recurrent networks training algorithm, based upon the Layer-by-Layer Least Squares algorithm, that overcomes these problems. In the proposed algorithm, all the weights are evaluated by the least squares method without evaluating the transformation matrix, which speeds up the rate of convergence. A probabilistic mechanism, based upon modified weight-update equations, is introduced to eliminate the stalling problem experienced by pure least squares type computation. As a result, the proposed algorithm is able to escape from local minima and reach a good optimal solution while still maintaining fast convergence.
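
Because the abstract does not give the paper's update equations, the following is only a minimal sketch of the idea it describes: a layer's weights are obtained by a direct least squares solve (with no transformation matrix), and a probabilistic perturbation is applied when the training error stalls. The names (`solve_layer_weights`, `train_step`), the ridge term, the stall test, and the fixed escape probability are all illustrative assumptions, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def solve_layer_weights(H, T_linear, ridge=1e-6):
    """Least squares solve for W such that H @ W ~= T_linear.

    H: (n_samples, n_hidden) inputs to the layer.
    T_linear: (n_samples, n_out) desired pre-activation targets
    (hypothetical: e.g. the inverse activation applied to the
    desired outputs, a common device in least-squares training).
    """
    # Regularised normal equations; the ridge term is an assumption
    # added here purely for numerical stability.
    A = H.T @ H + ridge * np.eye(H.shape[1])
    return np.linalg.solve(A, H.T @ T_linear)

def train_step(W, H, T_linear, prev_error,
               stall_tol=1e-6, noise_scale=1e-2):
    """One hypothetical step: least squares solve, then a
    probabilistic escape move if the error has stalled."""
    W_new = solve_layer_weights(H, T_linear)
    error = np.mean((H @ W_new - T_linear) ** 2)
    if abs(prev_error - error) < stall_tol:
        # Probabilistic escape: with a fixed probability (an
        # assumption; the paper's modified weight-update equations
        # differ), perturb the weights to jump out of the stall.
        if rng.random() < 0.5:
            W_new = W_new + noise_scale * rng.standard_normal(W_new.shape)
    return W_new, error
```

In a full recurrent network, such a solve would be applied layer by layer, with hidden-layer targets derived through the inverse activation; that machinery is omitted from this sketch.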

Original language: English
Pages (from-to): 15-25
Number of pages: 11
Journal: Neural Processing Letters
Volume: 7
Issue number: 1
DOIs
Publication status: Published - 1998
Externally published: Yes

Keywords

  • Convergence stalling
  • Fast convergence speed
  • Layer-by-Layer Least Squares algorithm
  • Recurrent networks

ASJC Scopus subject areas

  • Software
  • General Neuroscience
  • Computer Networks and Communications
  • Artificial Intelligence
