Local Normalization Based BN Layer Pruning

Yuan Liu, Xi Jia, Linlin Shen, Zhong Ming, Jinming Duan

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)


Compression and acceleration of convolutional neural networks (CNNs) have attracted extensive research interest in the past few years. In this paper, we propose a novel channel-level pruning method based on the gamma (scaling) parameters of Batch Normalization (BN) layers to compress and accelerate CNN models. Local gamma normalization and selection is proposed to address the over-pruning issue and to introduce local information into channel selection. An ablation-based beta (shifting parameter) transfer and knowledge-distillation-based fine-tuning are then applied to further improve the performance of the pruned model. Experimental results on the CIFAR-10, CIFAR-100 and LFW datasets suggest that our approach achieves much more efficient pruning in terms of parameter and FLOP reduction, e.g., 8.64× compression and 3.79× acceleration of VGG on CIFAR, with only slight accuracy loss.
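The abstract does not give the exact normalization formula, but the general idea of channel selection via locally normalized BN scaling factors can be sketched as follows. This is a hypothetical illustration, not the paper's method: the per-layer max normalization, the `threshold` value, and the `min_keep` guard are all assumptions introduced here to show how local normalization can prevent a layer from being pruned away entirely.

```python
# Hypothetical sketch: prune channels by BN gamma magnitude, but normalize
# |gamma| WITHIN each layer before applying a single global threshold, so a
# layer whose gammas are all small in absolute terms is not over-pruned.
import numpy as np

def select_channels(gammas_per_layer, threshold=0.1, min_keep=1):
    """Return a boolean keep-mask per BN layer.

    gammas_per_layer: list of 1-D arrays of BN scaling factors, one per layer.
    threshold: global cut applied to the locally normalized |gamma| values.
    min_keep: lower bound on surviving channels per layer (assumed guard
              against the over-pruning issue mentioned in the abstract).
    """
    masks = []
    for gamma in gammas_per_layer:
        g = np.abs(np.asarray(gamma, dtype=float))
        g_norm = g / g.max()           # local normalization within the layer
        mask = g_norm >= threshold
        if mask.sum() < min_keep:      # keep the strongest channels anyway
            top = np.argsort(g)[::-1][:min_keep]
            mask = np.zeros_like(mask)
            mask[top] = True
        masks.append(mask)
    return masks

# Layer 2 has uniformly tiny gammas; a global threshold on raw |gamma| would
# remove it entirely, but local normalization keeps all of its channels.
masks = select_channels([np.array([0.9, 0.05, 0.4]),
                         np.array([0.01, 0.02, 0.005])])
```

With a global threshold on raw gammas, every channel of the second layer would fall below 0.1; normalizing per layer first is what injects the "local information" into the selection.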

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2019
Subtitle of host publication: Deep Learning - 28th International Conference on Artificial Neural Networks, Proceedings
Editors: Igor V. Tetko, Pavel Karpov, Fabian Theis, Vera Kurková
Publisher: Springer Verlag
Number of pages: 13
ISBN (Print): 9783030304836
Publication status: Published - 2019
Externally published: Yes
Event: 28th International Conference on Artificial Neural Networks, ICANN 2019 - Munich, Germany
Duration: 17 Sept 2019 – 19 Sept 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11728 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 28th International Conference on Artificial Neural Networks, ICANN 2019

Keywords
  • Convolutional neural network (CNN)
  • Knowledge distillation
  • Model compression and acceleration
  • Pruning

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

