Efficient-PrototypicalNet with self knowledge distillation for few-shot learning

Jit Yan Lim, Kian Ming Lim, Shih Yin Ooi, Chin Poo Lee

Research output: Journal Publication › Article › peer-review

29 Citations (Scopus)


Recent few-shot learning research has focused on developing methods that can quickly adapt to unseen tasks with small amounts of data and low computational cost. To achieve high performance on few-shot learning tasks, generalizability is essential: the method must generalize well from seen tasks to unseen tasks given a limited number of samples. In this work, we investigate a new metric-based few-shot learning framework that transfers knowledge from another effective classification model to produce well-generalized embeddings and improve the handling of unseen tasks. The proposed Efficient-PrototypicalNet combines transfer learning, knowledge distillation, and few-shot learning. We employ a pre-trained model as a feature extractor to obtain useful features from tasks and reduce task complexity; these features ease training in few-shot learning and increase performance. We further apply knowledge distillation to the framework, yielding an additional performance gain. The proposed Efficient-PrototypicalNet was evaluated on five benchmark datasets, i.e., Omniglot, miniImageNet, tieredImageNet, CIFAR-FS, and FC100. It achieved state-of-the-art performance on most datasets in the 5-way K-shot image classification task, especially on miniImageNet.
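The abstract names two core ingredients: prototypical-network classification (assign a query to the class whose mean support embedding is nearest) and knowledge distillation (match softened output distributions). The sketch below illustrates both mechanisms on synthetic data; the toy linear "backbone", dimensions, and random episode are all illustrative assumptions — the paper's actual model uses a pre-trained EfficientNet feature extractor and a self-distillation setup not reproduced here.

```python
import numpy as np

def embed(x, W):
    """Toy feature extractor: a fixed linear map standing in for the
    pre-trained backbone (an assumption; the paper uses EfficientNet)."""
    return x @ W

def prototypes(support, labels, W, n_way):
    """Class prototype = mean embedding of that class's support samples."""
    z = embed(support, W)
    return np.stack([z[labels == c].mean(axis=0) for c in range(n_way)])

def classify(query, protos, W):
    """Assign each query to its nearest prototype (squared Euclidean)."""
    zq = embed(query, W)
    d = ((zq[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)

def distill_loss(student_logits, teacher_logits, T=4.0):
    """Generic soft-target distillation loss: KL divergence between
    temperature-softened distributions (a sketch, not the paper's exact
    self-distillation objective)."""
    def softmax(z):
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)
    p = softmax(teacher_logits / T)
    q = softmax(student_logits / T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

# Synthetic 5-way 1-shot episode: each class clustered around its own centre.
rng = np.random.default_rng(0)
n_way, dim = 5, 16
W = rng.normal(size=(dim, 8))          # frozen "backbone" weights
centres = rng.normal(size=(n_way, dim)) * 3.0
support = centres + rng.normal(scale=0.1, size=(n_way, dim))
support_labels = np.arange(n_way)
query = centres + rng.normal(scale=0.1, size=(n_way, dim))

protos = prototypes(support, support_labels, W, n_way)
pred = classify(query, protos, W)
print(pred)
```

With one tight cluster per class, the nearest-prototype rule recovers the query labels; in training, the negative squared distances act as logits, so a distillation term like `distill_loss` can be added on top of the episodic cross-entropy.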

Original language: English
Pages (from-to): 327-337
Number of pages: 11
Publication status: Published - 7 Oct 2021
Externally published: Yes


Keywords

  • EfficientNet
  • Few-shot learning
  • Knowledge distillation
  • Meta learning
  • Prototypical network
  • Transfer learning

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence


