Abstract
A new training algorithm for a fully connected recurrent neural network, based on digital filter theory, is proposed. Each recurrent neuron is modeled as an infinite impulse response (IIR) filter. The weights of each layer in the network are updated by optimizing the IIR filter coefficients, and the optimization is based on the recursive least squares (RLS) method. Our results indicate that the proposed algorithm provides an extremely fast convergence rate. In this letter, the algorithm is validated on the sunspot time series, the Mackey-Glass time series, and nonlinear function approximation problems. The convergence speed of the RLS-based algorithm is compared with that of other fast algorithms. The results show that the proposed algorithm can be up to 200 times faster than the conventional backpropagation algorithm.
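The core of the approach is the standard RLS recursion for filter coefficients. The sketch below is a minimal, generic RLS update applied to a single linear filter on a noiseless identification task; it illustrates the recursion the abstract refers to, not the paper's full per-layer IIR training scheme. The function name `rls_update`, the forgetting factor `lam`, and the toy target weights are illustrative assumptions.

```python
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    """One recursive-least-squares step (illustrative sketch).

    w   -- current coefficient vector
    P   -- inverse input-correlation matrix estimate
    x   -- current input (regressor) vector
    d   -- desired output sample
    lam -- forgetting factor (assumed value, close to 1)
    """
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = d - w @ x                    # a priori estimation error
    w = w + k * e                    # coefficient update
    P = (P - np.outer(k, Px)) / lam  # inverse-correlation update
    return w, P

# Toy example: identify a 2-tap linear filter from noiseless samples.
rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.3])       # hypothetical target coefficients
w = np.zeros(2)
P = np.eye(2) * 100.0                # large initial P (weak prior)
for _ in range(500):
    x = rng.standard_normal(2)
    d = true_w @ x
    w, P = rls_update(w, P, x, d)
```

On noiseless data the coefficients converge to the target in far fewer samples than a gradient (backpropagation-style) update with a comparable per-step cost, which is the convergence advantage the letter quantifies.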
Original language | English |
---|---|
Pages (from-to) | 1082-1086 |
Number of pages | 5 |
Journal | IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications |
Volume | 44 |
Issue number | 11 |
DOIs | |
Publication status | Published - 1997 |
Externally published | Yes |
Keywords
- Accelerated training algorithm
- Recurrent networks
- Recursive least squares method
ASJC Scopus subject areas
- Electrical and Electronic Engineering