An accelerated recurrent network training algorithm using IIR filter model and recursive least squares method

Tommy W.S. Chow, Siu Yeung Cho

Research output: Journal Publication › Article › peer-review

11 Citations (Scopus)

Abstract

A new training algorithm for fully connected recurrent neural networks, based on digital filter theory, is proposed. Each recurrent neuron is modeled by an infinite impulse response (IIR) filter. The weights of each layer in the network are updated by optimizing the IIR filter coefficients, and the optimization is based on the recursive least squares (RLS) method. Our results indicate that the proposed algorithm provides an extremely fast convergence rate. In this letter, the algorithm is validated on the sunspot time series, the Mackey-Glass time series, and nonlinear function approximation problems. The convergence speed of the RLS-based algorithm is compared with that of other fast algorithms. The obtained results show that the proposed algorithm can be up to 200 times faster than the conventional backpropagation algorithm.
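The abstract's core idea is to treat each neuron's weights as IIR filter coefficients and update them with the standard RLS recursion instead of gradient descent. The sketch below shows only the generic RLS update on a linear model, not the authors' full recurrent-network algorithm; all variable names (`w`, `P`, `lam`) and the demo system are illustrative assumptions.

```python
# Minimal sketch of the recursive least squares (RLS) update that underlies
# the training method described in the abstract. This fits a linear model;
# the paper applies the same recursion to IIR filter coefficients of each
# recurrent neuron (details not reproduced here).
import numpy as np

def rls_update(w, P, x, d, lam=1.0):
    """One RLS step: update weight vector w and inverse-correlation
    matrix P given input x, desired output d, and forgetting factor lam."""
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = d - w @ x                    # a priori output error
    w = w + k * e                    # weight update
    P = (P - np.outer(k, Px)) / lam  # update of the inverse correlation matrix
    return w, P

# Demo: identify a fixed linear system d = w_true . x from noise-free samples.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -1.2, 2.0])
w = np.zeros(3)
P = np.eye(3) * 100.0                # large initial P => fast early adaptation
for _ in range(200):
    x = rng.standard_normal(3)
    w, P = rls_update(w, P, x, w_true @ x)
```

Because RLS uses second-order (correlation) information rather than a raw gradient, it typically converges in far fewer samples than a fixed-step-size method, which is the source of the speedup the abstract reports.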

Original language: English
Pages (from-to): 1082-1086
Number of pages: 5
Journal: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
Volume: 44
Issue number: 11
DOIs
Publication status: Published - 1997
Externally published: Yes

Keywords

  • Accelerated training algorithm
  • Recurrent networks
  • Recursive least squares method

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
