An improved algorithm for learning long-term dependency problems in adaptive processing of data structures

Siu Yeung Cho, Zheru Chi, Wan Chi Siu, Ah Chung Tsoi

Research output: Journal Publication › Article › peer-review

37 Citations (Scopus)

Abstract

For the past decade, many researchers have explored the use of neural-network representations for the adaptive processing of data structures. One of the most popular learning formulations for data structure processing is backpropagation through structure (BPTS). The BPTS algorithm has been successfully applied to a number of learning tasks involving structural patterns, such as logo and natural scene classification. The main limitations of the BPTS algorithm are its slow convergence speed and the long-term dependency problem in the adaptive processing of data structures. In this paper, an improved algorithm is proposed to solve these problems. The idea of this algorithm is to optimize the free learning parameters of the neural network in the node representation by using least-squares-based optimization methods in a layer-by-layer fashion. Not only is fast convergence achieved, but the long-term dependency problem is also overcome, since the vanishing of gradient information is avoided when our approach is applied to very deep tree structures.
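
The layer-by-layer least-squares idea can be illustrated with a small sketch. The code below is a simplified illustration, not the authors' exact formulation: it encodes a tree bottom-up with a recursive neural representation and then fits the output-layer weights in closed form with a regularized linear least-squares solve, so no gradient has to be back-propagated through the (possibly very deep) structure for that layer. All names (TreeNode, encode_tree, fit_output_layer, the tanh encoder, the ridge term lam) are illustrative assumptions.

    # Hedged sketch: closed-form least-squares fitting of the output layer
    # for a recursive (tree-structured) encoder. Illustrative only; the
    # paper's actual algorithm and notation may differ.
    import numpy as np

    class TreeNode:
        def __init__(self, label, children=()):
            self.label = label              # node attribute vector, shape (n_in,)
            self.children = list(children)

    def encode_tree(node, W_in, W_child, max_children, state_dim):
        """Bottom-up recursive encoding: a node's state depends on its label
        and the states of its children (zero-padded to max_children)."""
        child_states = [encode_tree(c, W_in, W_child, max_children, state_dim)
                        for c in node.children]
        while len(child_states) < max_children:
            child_states.append(np.zeros(state_dim))
        z = W_in @ node.label + sum(W @ s for W, s in zip(W_child, child_states))
        return np.tanh(z)

    def fit_output_layer(root_states, targets, lam=1e-3):
        """Solve the output weights by regularized least squares:
        C = argmin ||H C - T||^2 + lam ||C||^2, avoiding gradient descent
        (and hence vanishing gradients) for this layer."""
        H = np.vstack(root_states)                 # (n_trees, state_dim)
        T = np.vstack(targets)                     # (n_trees, n_out)
        A = H.T @ H + lam * np.eye(H.shape[1])
        return np.linalg.solve(A, H.T @ T)         # (state_dim, n_out)

    # Toy usage with random parameters, purely illustrative:
    rng = np.random.default_rng(0)
    state_dim, n_in, n_out, max_children = 8, 4, 2, 2
    W_in = rng.normal(size=(state_dim, n_in))
    W_child = [rng.normal(size=(state_dim, state_dim)) for _ in range(max_children)]

    leaf = lambda: TreeNode(rng.normal(size=n_in))
    tree = TreeNode(rng.normal(size=n_in),
                    [leaf(), TreeNode(rng.normal(size=n_in), [leaf()])])

    root = encode_tree(tree, W_in, W_child, max_children, state_dim)
    C = fit_output_layer([root], [np.array([1.0, 0.0])])
    print(C.shape)  # (8, 2)

In this toy setting only the output weights are obtained in closed form; the recursive encoder's weights would still need to be updated by some other scheme, which is where a full layer-by-layer procedure, such as the one described in the abstract, would come in.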

Original language: English
Pages (from-to): 781-793
Number of pages: 13
Journal: IEEE Transactions on Neural Networks
Volume: 14
Issue number: 4
DOIs
Publication status: Published - Jul 2003
Externally published: Yes

Keywords

  • Adaptive processing of data structures
  • Backpropagation through structure (BPTS)
  • Least-squares method
  • Long-term dependency

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
