Knowledge Distillation based-on Internal/External Correlation Learning

  • Journal of The Korea Society of Computer and Information
  • Abbr : JKSCI
  • 2023, 28(4), pp.31-39
  • DOI : 10.9708/jksci.2023.28.04.031
  • Publisher : The Korean Society Of Computer And Information
  • Research Area : Engineering > Computer Science
  • Received : March 7, 2023
  • Accepted : April 3, 2023
  • Published : April 28, 2023

Hun-Beom Bak 1, Seunghwan Bae 1

1 Inha University

Accredited

ABSTRACT

In this paper, we propose Internal/External Knowledge Distillation (IEKD), which exploits both external correlations between feature maps of heterogeneous models and internal correlations between feature maps of the same model when transferring knowledge from a teacher model to a student model. To this end, we transform feature maps into a sequence format and use a transformer to extract new feature maps suitable for knowledge distillation by considering both internal and external correlations. Distilling the extracted feature maps lets the student learn both kinds of correlation, and using the extracted feature maps for feature matching further improves the student's accuracy. To demonstrate the effectiveness of the proposed method, we evaluated it on CIFAR-100 image classification: with the “ResNet-32×4 / VGG-8” teacher–student combination, IEKD achieved 76.23% Top-1 accuracy, outperforming state-of-the-art knowledge distillation methods.
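The pipeline the abstract describes (feature maps → token sequences → attention over the joint student/teacher sequence → feature-matching loss) can be illustrated with a toy sketch. This is not the authors' implementation: the single-head attention step, the equal channel dimensions, and all function names here are illustrative assumptions; a real heterogeneous teacher/student pair would additionally need learned projections to a common embedding size, and the paper's transformer has trainable parameters that this parameter-free sketch omits.

```python
# Illustrative sketch (NOT the authors' code) of IEKD-style distillation:
# concatenating student and teacher tokens lets one attention pass mix
# "internal" (within-model) and "external" (cross-model) correlations,
# after which a feature-matching loss compares the refined features.
# Assumption: both models expose feature maps with the same channel count.
import numpy as np

def to_sequence(fmap):
    # (C, H, W) feature map -> (H*W, C) token sequence
    c, h, w = fmap.shape
    return fmap.reshape(c, h * w).T

def attention(q, k, v):
    # Plain scaled dot-product attention (parameter-free toy version).
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def iekd_features(student_seq, teacher_seq):
    # Joint sequence: each token can attend to tokens of its own model
    # (internal correlation) and of the other model (external correlation).
    joint = np.concatenate([student_seq, teacher_seq], axis=0)
    mixed = attention(joint, joint, joint)
    n = student_seq.shape[0]
    return mixed[:n], mixed[n:]  # refined student / teacher features

def feature_matching_loss(student_feat, teacher_feat):
    # Simple MSE feature-matching objective on the refined features.
    return float(np.mean((student_feat - teacher_feat) ** 2))

rng = np.random.default_rng(0)
s = to_sequence(rng.standard_normal((16, 4, 4)))  # student map, C=16, 4x4
t = to_sequence(rng.standard_normal((16, 4, 4)))  # teacher map, C=16, 4x4
s_ref, t_ref = iekd_features(s, t)
loss = feature_matching_loss(s_ref, t_ref)
```

In training, a loss like `feature_matching_loss` would be added to the usual task and logit-distillation losses and backpropagated only through the student.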


This paper was written with support from the National Research Foundation of Korea.