Fractional Tensor Recurrent Unit (fTRU): A Stable Forecasting Model With Long Memory

Hejia Qiu, Chao Li, Ying Weng, Zhun Sun, Qibin Zhao

Research output: Journal Publication › Article › peer-review

Abstract

The tensor recurrent model is a family of nonlinear dynamical systems whose recurrence relation consists of a $p$-fold (called degree-$p$) tensor product. Although such models appear frequently in advanced recurrent neural networks (RNNs), to date there are few studies of their long memory properties and stability in sequence tasks. In this article, we propose a fractional tensor recurrent model, in which the tensor degree $p$ is extended from the discrete domain to the continuous domain so that it can be learned effectively from various datasets. Theoretically, we prove that a large degree $p$ is essential to achieve the long memory effect in a tensor recurrent model, yet it can also lead to unstable dynamical behavior. Hence, our new model, named the fractional tensor recurrent unit (fTRU), is expected to seek the saddle point between long memory and model stability during training. We experimentally show that the proposed model achieves competitive performance with long memory and stable behavior in several forecasting tasks, compared to various advanced RNNs.
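
To make the idea of a continuous, learnable tensor degree more concrete, the sketch below shows one simplified way a recurrent cell could expose such a parameter. This is not the fTRU construction from the paper: the degree-$p$ interaction is approximated here by a signed element-wise power of the pre-activation, and the class and argument names (FractionalDegreeRecurrentCell, init_degree) are illustrative assumptions only.

    import torch
    import torch.nn as nn

    class FractionalDegreeRecurrentCell(nn.Module):
        """Illustrative recurrent cell with a continuous, trainable 'degree' p.

        NOTE: a minimal sketch, not the fTRU model. The degree-p tensor
        product is emulated by a signed element-wise power |z|^p * sign(z),
        which lets p live in the continuous domain and be learned by
        gradient descent alongside the other weights.
        """

        def __init__(self, input_size, hidden_size, init_degree=2.0):
            super().__init__()
            self.in_proj = nn.Linear(input_size, hidden_size)
            self.rec_proj = nn.Linear(hidden_size, hidden_size)
            # Parameterize p through its log so that p stays positive.
            self.log_degree = nn.Parameter(torch.tensor(init_degree).log())

        def forward(self, x_t, h_prev):
            p = self.log_degree.exp()                      # current degree p > 0
            z = self.in_proj(x_t) + self.rec_proj(h_prev)  # pre-activation
            # Signed element-wise power standing in for a degree-p interaction.
            z_p = torch.sign(z) * z.abs().clamp(min=1e-6).pow(p)
            return torch.tanh(z_p)

In this toy version, a small learned p keeps the update close to a conventional (stable) RNN cell, while a larger p amplifies the nonlinearity of the interaction, loosely mirroring the trade-off between long memory and stability discussed in the abstract.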

Original language: English
Pages (from-to): 1-10
Number of pages: 10
Journal: IEEE Transactions on Neural Networks and Learning Systems
Publication status: Accepted/In press - 2023

Keywords

  • Computational modeling
  • Forecasting
  • Long memory
  • model stability
  • Predictive models
  • Recurrent neural networks
  • recurrent unit
  • Stability criteria
  • Task analysis
  • tensor degree
  • Tensors

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
