Introspective GAN for Meshface Recognition

Wenting Chen, Linlin Shen, Zhihui Lai

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)


The majority of face recognition systems in China can only retrieve protected ID photos from government agencies. These facial images, covered with mesh-like curves, are termed meshfaces. Meshfaces can significantly degrade the performance of face recognition systems. Although some GAN-based methods have been proposed to address this issue by translating meshface images into clean face images, their capacity is limited. In this paper, we introduce introspective modules that encourage the generator to reconstruct face images and accelerate the learning of the discriminator. Both a private meshface dataset and the public LFW dataset are used in the experiments. Quantitative evaluation on both datasets shows that the introspective GAN recovers face images with better quality. Face recognition performance is also significantly improved.

Original language: English
Title of host publication: 2019 IEEE International Conference on Image Processing, ICIP 2019 - Proceedings
Publisher: IEEE Computer Society
Number of pages: 5
ISBN (Electronic): 9781538662496
Publication status: Published - Sept 2019
Externally published: Yes
Event: 26th IEEE International Conference on Image Processing, ICIP 2019 - Taipei, Taiwan, Province of China
Duration: 22 Sept 2019 - 25 Sept 2019

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880


Conference: 26th IEEE International Conference on Image Processing, ICIP 2019
Country/Territory: Taiwan, Province of China


Keywords

  • GAN
  • image translation
  • introspective modules
  • mesh-like curves removal
  • Meshface recognition

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Signal Processing

