Continuous neural network with windowed Hebbian learning

M. Fotouhi, M. Heidari, M. Sharifitabar

Research output: Journal Publication › Article › peer-review

3 Citations (Scopus)

Abstract

We introduce an extension of the classical neural field equation in which the dynamics of the synaptic kernel satisfies the standard Hebbian type of learning (synaptic plasticity). We consider a continuous network in which changes in the weight kernel occur within a specified time window. A novelty of this model is that it admits synaptic weight decrease as well as the usual weight increase resulting from correlated activity. The resulting equation leads to a delay-type rate model, for which the existence and stability of solutions such as the rest state, bumps, and traveling fronts are investigated. Relations between the length of the time window and the bump width are derived. In addition, the effect of the delay parameter on the stability of solutions is shown, and numerical simulations of the solutions and their stability are presented.
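The abstract describes a neural field whose connectivity kernel is itself updated by Hebbian learning averaged over a sliding time window. The paper's exact equations are not reproduced in this record, so the following is only a minimal illustrative sketch under common assumptions: a classical 1-D rate equation du/dt = -u + K*f(u) with a sigmoidal rate function, plus an assumed Hebbian kernel update that correlates current activity with the window-averaged past activity (the window average is what introduces the delay, and the relaxation term lets weights decrease as well as increase). All parameter values and functional forms here are hypothetical.

```python
import numpy as np

def f(u, theta=0.2, beta=10.0):
    """Sigmoidal firing-rate function (an assumed choice, not the paper's)."""
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

def simulate(n=128, L=10.0, dt=0.01, steps=2000, T_window=1.0, eta=0.01):
    """Sketch of a 1-D neural field with windowed Hebbian kernel learning."""
    x = np.linspace(-L / 2, L / 2, n)
    dx = x[1] - x[0]
    # Mexican-hat connectivity as the initial kernel (a common assumption
    # in neural field models supporting bumps).
    d = np.abs(x[:, None] - x[None, :])
    K = np.exp(-d**2) - 0.5 * np.exp(-(d / 2) ** 2)
    u = np.exp(-x**2)                # localized initial activity
    win = max(1, int(T_window / dt)) # window length in time steps
    history = [f(u)] * win           # firing rates over the sliding window
    for _ in range(steps):
        r = f(u)
        # Classical neural field dynamics: du/dt = -u + integral of K * f(u).
        u = u + dt * (-u + dx * (K @ r))
        # Windowed Hebbian update: correlate current rates with the
        # window-averaged past rates; the -K relaxation term allows
        # weight decrease as well as increase.
        r_bar = np.mean(history, axis=0)
        K = K + dt * eta * (np.outer(r, r_bar) - K)
        history.pop(0)
        history.append(r)
    return x, u, K
```

Shrinking `T_window` toward a single time step recovers an instantaneous Hebbian rule, while larger windows make the kernel update depend on increasingly delayed activity, which is the delay effect whose influence on stability the paper investigates.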

Original language: English
Pages (from-to): 321-332
Number of pages: 12
Journal: Biological Cybernetics
Volume: 109
Issue number: 3
DOIs
Publication status: Published - 29 Jun 2015
Externally published: Yes

Keywords

  • Bump
  • Continuous network
  • Delay equation
  • Existence
  • Neural field
  • Stability
  • Traveling front

ASJC Scopus subject areas

  • Biotechnology
  • General Computer Science
