Federated learning algorithm based on knowledge distillation

Donglin Jiang, Chen Shan, Zhihui Zhang

Research output: Contribution to conference › Paper

16 Citations (Scopus)
62 Downloads (Pure)


Federated learning is a new paradigm of distributed machine learning that enables a large number of edge computing devices to jointly learn a shared model without sharing their private data. Because nodes synchronize only their locally trained models rather than the raw data, federated learning provides a degree of privacy and security. However, federated learning faces two heterogeneity challenges: (1) heterogeneous model architectures across devices; and (2) statistical heterogeneity in real federated datasets, which are not independent and identically distributed (non-IID). Both cause traditional federated learning algorithms to perform poorly. To address these problems, this paper proposes FedDistill, a new distributed training method based on knowledge distillation. FedDistill introduces a personalized model on each device; the personalized model improves local performance even when the global model fails to adapt to the local dataset, thereby improving the capability and robustness of the global model. The local performance gain comes from knowledge distillation, which transfers knowledge between heterogeneous networks and in turn guides the improvement of the global model. Experiments show that FedDistill significantly improves classification accuracy and meets the needs of heterogeneous users.
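The abstract does not include pseudocode, so the following is only an illustrative sketch of the kind of distillation loss such a method could use to transfer knowledge between a global (teacher) model and a personalized (student) model on a device. It follows the standard Hinton-style formulation; the function names, temperature `T`, and mixing weight `alpha` are assumptions for illustration, not details from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer probabilities."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distill_loss(student_logits, teacher_logits, true_label, T=2.0, alpha=0.5):
    """Standard distillation objective (an assumed form, not FedDistill's exact loss):
    alpha * T^2 * KL(teacher_soft || student_soft)
      + (1 - alpha) * cross-entropy(student, hard label)."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)))
    hard_probs = softmax(student_logits)  # T = 1 for the hard-label term
    ce = -np.log(hard_probs[true_label])
    return alpha * (T ** 2) * kl + (1 - alpha) * ce
```

On each device, the personalized model would minimize a loss of this shape against the global model's outputs (and vice versa for guiding the global model), which is what allows knowledge transfer even when the two networks have different architectures: only logits, not weights, need to be compatible.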
Original language: English
Publication status: Published Online - 1 Mar 2021
Event: 2020 International Conference on Artificial Intelligence and Computer Engineering (ICAICE)




  • Federated learning
  • Knowledge distillation
  • Non-independent-identical-distribution
  • Heterogeneous network


