Local Normalization Based BN Layer Pruning

Yuan Liu, Xi Jia, Linlin Shen, Zhong Ming, Jinming Duan

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Compression and acceleration of convolutional neural networks (CNNs) have attracted extensive research interest in the past few years. In this paper, we propose a novel channel-level pruning method based on the gamma (scaling) parameters of Batch Normalization layers to compress and accelerate CNN models. Local gamma normalization and selection is proposed to address the over-pruning issue and to introduce local information into channel selection. An ablation-based beta (shifting parameter) transfer and knowledge-distillation-based fine-tuning are then applied to further improve the performance of the pruned model. Experimental results on the CIFAR-10, CIFAR-100 and LFW datasets suggest that our approach achieves much more efficient pruning in terms of the reduction of parameters and FLOPs; for example, 8.64× compression and 3.79× acceleration of VGG were achieved on CIFAR with only a slight loss of accuracy.
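The abstract describes ranking BN channels by their gamma values after normalizing them within each layer, so that no single layer is pruned away entirely. The snippet below is a minimal illustrative sketch of that idea, not the authors' implementation: the function name, the per-layer max normalization, and the pruning ratio are assumptions introduced for demonstration.

```python
# Sketch (assumed, not the paper's code): select channels to prune by
# comparing BN gamma values that have been locally normalized per layer,
# then thresholding the normalized scores globally.
import torch
import torch.nn as nn


def select_channels_to_prune(model: nn.Module, prune_ratio: float = 0.5):
    """Return {bn_layer_name: boolean mask of channels to KEEP}."""
    locally_normalized = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.BatchNorm2d):
            gamma = module.weight.detach().abs()
            # Local normalization: divide by the layer's own max gamma so
            # layers with different gamma scales are compared fairly and
            # no layer is over-pruned by a single global threshold.
            locally_normalized[name] = gamma / (gamma.max() + 1e-12)

    # Global threshold applied to the locally normalized scores.
    all_scores = torch.cat(list(locally_normalized.values()))
    threshold = torch.quantile(all_scores, prune_ratio)
    return {name: scores > threshold for name, scores in locally_normalized.items()}


if __name__ == "__main__":
    # Toy model just to exercise the selection routine.
    model = nn.Sequential(
        nn.Conv2d(3, 16, 3), nn.BatchNorm2d(16), nn.ReLU(),
        nn.Conv2d(16, 32, 3), nn.BatchNorm2d(32), nn.ReLU(),
    )
    for name, keep in select_channels_to_prune(model).items():
        print(name, f"keep {int(keep.sum())}/{keep.numel()} channels")
```

The beta transfer and knowledge-distillation fine-tuning steps mentioned in the abstract would follow this selection stage; they are omitted here since the abstract does not give enough detail to sketch them faithfully.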

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2019
Subtitle of host publication: Deep Learning - 28th International Conference on Artificial Neural Networks, Proceedings
Editors: Igor V. Tetko, Pavel Karpov, Fabian Theis, Vera Kurková
Publisher: Springer Verlag
Pages: 334-346
Number of pages: 13
ISBN (Print): 9783030304836
DOIs
Publication status: Published - 2019
Externally published: Yes
Event: 28th International Conference on Artificial Neural Networks, ICANN 2019 - Munich, Germany
Duration: 17 Sept 2019 – 19 Sept 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11728 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 28th International Conference on Artificial Neural Networks, ICANN 2019
Country/Territory: Germany
City: Munich
Period: 17/09/19 – 19/09/19

Keywords

  • Convolutional neural network (CNN)
  • Knowledge distillation
  • Model compression and acceleration
  • Pruning

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
