An efficient asymmetric nonlinear activation function for deep neural networks

Enhui Chai, Wei Yu, Tianxiang Cui, Jianfeng Ren, Shusheng Ding

Research output: Journal Publication › Article › peer-review

14 Citations (Scopus)

Abstract

As a key step in endowing a neural network with nonlinearity, the activation function is crucial to network performance. This paper proposes an Efficient Asymmetric Nonlinear Activation Function (EANAF) for deep neural networks. Compared with existing activation functions, the proposed EANAF requires less computational effort, and it is self-regularized, asymmetric, and non-monotonic. These properties underpin its strong performance. To demonstrate its effectiveness in object detection, EANAF is compared with several state-of-the-art activation functions on typical backbone networks such as ResNet and DSPDarkNet. The experimental results demonstrate the superior performance of the proposed EANAF.
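The abstract names the properties the proposed activation is designed to have: asymmetric, non-monotonic, self-regularized, and cheap to compute. The paper's actual EANAF formula is not reproduced in this abstract, so the sketch below uses Swish (x · sigmoid(x)), a well-known activation sharing these qualitative properties, purely as an illustrative stand-in — it is not the proposed EANAF.

```python
import math

def swish(x: float, beta: float = 1.0) -> float:
    """Swish activation: x * sigmoid(beta * x).

    Illustrative stand-in for the properties the abstract describes
    (NOT the paper's EANAF):
      - asymmetric:     swish(-x) != -swish(x) in general
      - non-monotonic:  dips below zero near x ~ -1.28 before rising
      - self-regularized: bounded below, so large negative inputs
        produce small (not exploding) outputs
      - cheap: one exponential, one division, one multiplication
    """
    return x / (1.0 + math.exp(-beta * x))

# Non-monotonicity: the function decreases, then increases, on the negatives.
values = [swish(x) for x in (-2.0, -1.0, 0.0, 1.0)]
```

Checking a few points makes the properties concrete: swish(-1) < swish(-2) < 0 while swish(0) = 0 shows the non-monotonic dip, and swish(1) ≠ -swish(-1) shows the asymmetry about the origin.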
Original language: English
Article number: 1027
Journal: Symmetry
Volume: 14
Issue number: 5
Publication status: Published - 17 May 2022

Keywords

  • neural network
  • activation function
  • asymmetry
  • self-regularization
  • non-monotonicity
  • backbone network

