Surrogate network-based sparseness hyper-parameter optimization for deep expression recognition

Weicheng Xie, Wenting Chen, Linlin Shen, Jinming Duan, Meng Yang

Research output: Journal Publication › Article › peer-review

15 Citations (Scopus)

Abstract

For facial expression recognition, sparseness constraints on features or weights can improve the generalization ability of a deep network. However, optimizing the hyper-parameters that fuse different sparseness strategies demands substantial computation when traditional gradient-based algorithms are used. In this work, an iterative framework with a surrogate network is proposed to optimize the hyper-parameters that fuse different sparseness strategies. In each iteration, a network of significantly smaller model complexity is fitted to the original large network using four Euclidean losses, and the hyper-parameters are optimized on this surrogate with heuristic optimizers. Since the surrogate network uses the same deep metrics and embeds the same hyper-parameters as the original network, the optimized hyper-parameters are then used to train the original deep network in the next iteration. While the behaviour of the proposed algorithm is first validated with a tiny model, i.e. LeNet on the FER2013 database, the approach achieves competitive performance on six publicly available expression datasets, i.e. FER2013, CK+, Oulu-CASIA, MMI, AFEW and AffectNet.
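The sketch below illustrates the flavour of the outer search loop described in the abstract: a small surrogate network embeds the same sparseness hyper-parameters as the large network, and a heuristic search selects those hyper-parameters on the surrogate before they are handed back to the large network. The architecture, the specific L1 sparseness terms, the random-search heuristic (standing in for the paper's population-based optimizers), and all helper names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch only; the network, sparseness terms, and random search
# are assumptions, not the method or code from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset

class TinyNet(nn.Module):
    """Stand-in surrogate with the sparseness hyper-parameters embedded in its loss."""
    def __init__(self, n_classes=7):
        super().__init__()
        self.features = nn.Sequential(nn.Flatten(), nn.Linear(48 * 48, 64), nn.ReLU())
        self.classifier = nn.Linear(64, n_classes)

def sparse_loss(model, feats, lambdas):
    # Fused sparseness terms: L1 on features and L1 on weights (assumed form).
    return lambdas[0] * feats.abs().mean() + \
           lambdas[1] * sum(p.abs().mean() for p in model.parameters())

def train_one_pass(model, loader, lambdas, lr=1e-2):
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    last = 0.0
    for x, y in loader:
        feats = model.features(x)
        loss = F.cross_entropy(model.classifier(feats), y) + sparse_loss(model, feats, lambdas)
        opt.zero_grad(); loss.backward(); opt.step()
        last = loss.item()
    return last

def heuristic_search(surrogate, loader, n_trials=8):
    # Random search stands in for the heuristic optimizer used in the paper.
    best, best_loss = None, float("inf")
    for _ in range(n_trials):
        lambdas = 10 ** (-4 * torch.rand(2))  # log-uniform sample in [1e-4, 1]
        loss = train_one_pass(surrogate, loader, lambdas)
        if loss < best_loss:
            best, best_loss = lambdas, loss
    return best

# Toy data standing in for FER2013-sized inputs (48x48 grey-scale, 7 classes).
x = torch.randn(64, 1, 48, 48)
y = torch.randint(0, 7, (64,))
loader = DataLoader(TensorDataset(x, y), batch_size=16)

surrogate = TinyNet()
best_lambdas = heuristic_search(surrogate, loader)
# In the full method, these hyper-parameters would now be used to train the
# original large network for the next iteration, after which the surrogate is
# re-fitted to it (via the four Euclidean losses) and the search repeats.
print("selected sparseness weights:", best_lambdas.tolist())
```

Because the surrogate is far cheaper to train than the original network, each trial of the heuristic search costs only a fraction of a full training run, which is the computational saving the abstract points to.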

Original language: English
Article number: 107701
Journal: Pattern Recognition
Volume: 111
DOIs
Publication status: Published - Mar 2021
Externally published: Yes

Keywords

  • Deep sparseness strategies
  • Expression recognition
  • Heuristic optimizer
  • Hyper-parameter optimization
  • Surrogate network

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
