This paper presents a novel Heuristic Global Learning (HER-GBL) algorithm for multilayer neural networks. The algorithm combines the least squares method, which preserves fast convergence, with penalized optimization, which addresses the problem of local minima. The penalty term, defined as a Gaussian-type function of the weights, provides an uphill force that drives the network out of local minima. As a result, training performance is dramatically improved. The proposed HER-GBL algorithm yields excellent results in terms of convergence speed, avoidance of local minima, and quality of solution.
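The escape mechanism described above can be illustrated on a toy problem. The sketch below is not the paper's algorithm (which also incorporates the least squares step); it only shows, under assumed parameter choices, how a Gaussian-type penalty centred on a converged weight creates an uphill force that pushes gradient descent out of a local minimum. The loss function, penalty height, and width here are all hypothetical.

```python
import math

# Toy 1-D error surface standing in for a network's training error:
# local minimum near w = 1.13, global minimum near w = -1.30.
def loss(w):
    return w**4 - 3.0 * w**2 + w

def grad(w):
    return 4.0 * w**3 - 6.0 * w + 1.0

def descend(w, penalty_centers, lr=0.01, steps=2000, height=3.0, width=0.5):
    """Gradient descent on the loss plus Gaussian-type penalty terms.

    The penalty h * exp(-(w - c)^2 / (2 s^2)) is one plausible reading of
    the abstract's "Gaussian-type function of the weight"; its gradient
    pushes the weight uphill, away from the recorded minimum at c.
    """
    for _ in range(steps):
        g = grad(w)
        for c in penalty_centers:
            # Gradient of the Gaussian bump: repels w from the centre c.
            g += height * math.exp(-(w - c) ** 2 / (2 * width**2)) * (-(w - c) / width**2)
        w -= lr * g
    return w

w_plain = descend(2.0, penalty_centers=[])  # converges to the local minimum
# Penalise the stuck point, nudge off the penalty's crest (its gradient
# vanishes exactly at the centre), and descend again.
w_escaped = descend(w_plain - 1e-3, penalty_centers=[w_plain])
print(f"{w_plain:.2f} {w_escaped:.2f}")  # local minimum, then global minimum
```

In a real network the same idea would apply per weight vector, with a penalty added each time training stalls, so the accumulated bumps progressively deform the error surface around previously visited minima.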