
Text Classification Using Heterogeneous Knowledge Distillation

  • Journal of The Korea Society of Computer and Information
  • Abbr : JKSCI
  • 2022, 27(10), pp.29-41
  • DOI : 10.9708/jksci.2022.27.10.029
  • Publisher : The Korean Society Of Computer And Information
  • Research Area : Engineering > Computer Science
  • Received : September 22, 2022
  • Accepted : October 24, 2022
  • Published : October 31, 2022

Yerin Yu¹, Namgyu Kim¹

¹Kookmin University


ABSTRACT

Recently, with the development of deep learning technology, a variety of large models with excellent performance have been devised by pre-training on massive amounts of text data. However, for such models to be applied to real-world services, inference speed must be fast and computational cost low, so model compression techniques are attracting attention. Knowledge distillation, a representative model compression technique, transfers the knowledge already learned by a large teacher model to a relatively small student model, and is attracting attention because it can be applied in a variety of ways. However, knowledge distillation has a limitation: because the teacher model learns only the knowledge needed to solve the given task, and distillation to the student model is performed from that same perspective, the student struggles with problems that have low similarity to the previously learned data. Therefore, we propose a heterogeneous knowledge distillation method in which the teacher model learns a higher-level concept rather than the knowledge required for the task the student model needs to solve, and then distills this knowledge to the student model. Through classification experiments on about 18,000 documents, we confirmed that the heterogeneous knowledge distillation method outperforms traditional knowledge distillation in both learning efficiency and accuracy.
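To make the distillation mechanism the abstract refers to concrete, the sketch below illustrates the standard knowledge distillation objective (in the style of Hinton et al.): the student is trained on a weighted sum of (1) the KL divergence between the teacher's and student's temperature-softened output distributions and (2) the usual cross-entropy with the hard label. This is an illustrative, minimal pure-Python sketch, not the authors' implementation; the function names, temperature `T`, and weight `alpha` are assumptions for the example.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label, T=2.0, alpha=0.5):
    # Soft targets: softened teacher and student distributions.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student), scaled by T^2 so gradients stay comparable
    # across temperatures (as in the original distillation formulation).
    kl = T * T * sum(pt * math.log(pt / ps)
                     for pt, ps in zip(p_teacher, p_student))
    # Hard-label cross-entropy on the student's unsoftened (T = 1) output.
    ce = -math.log(softmax(student_logits)[hard_label])
    # Weighted combination of the soft (distillation) and hard losses.
    return alpha * kl + (1 - alpha) * ce
```

In the heterogeneous variant proposed in the paper, the teacher's logits would come from a model trained on a higher-level concept rather than on the student's own task; the combination of losses above is otherwise the conventional setup.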
