
3D Object Detection via Multi-Scale Feature Knowledge Distillation

  • Journal of The Korea Society of Computer and Information
  • Abbr : JKSCI
  • 2024, 29(10), pp.35-45
  • DOI : 10.9708/jksci.2024.29.10.035
  • Publisher : The Korean Society Of Computer And Information
  • Research Area : Engineering > Computer Science
  • Received : July 16, 2024
  • Accepted : October 4, 2024
  • Published : October 31, 2024

Se-Gwon Cheon 1, Hyuk-Jin Shin 2, Seung-Hwan Bae 1

1 Inha University
2 Artificial Intelligence Convergence Research Center, Inha University

Accredited

ABSTRACT

In this paper, we propose Multi-Scale Feature Knowledge Distillation for 3D Object Detection (M3KD), which extracts knowledge from a teacher model and transfers it to a student model over multi-scale feature maps. To achieve this, we minimize the L2 loss between the feature maps at each pyramid level of the student model and the corresponding level of the teacher model, so that the student can mimic the teacher's backbone information, improving the overall accuracy of the student model. We also apply the class-logit knowledge distillation used in image classification: by having the student mimic the classification logits of the teacher, we guide the student toward higher detection accuracy. On the KITTI (Karlsruhe Institute of Technology and Toyota Technological Institute) dataset, our M3KD student model achieves a 30% inference-speed improvement over the teacher model. In addition, our method improves 3D mean Average Precision (mAP) by an average of 1.08% across all classes and difficulty levels compared to the baseline student model. Furthermore, when combined with recent knowledge distillation methods such as PKD and SemCKD, our approach yields additional 3D mAP gains of 0.42% and 0.52%, respectively.
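The two loss terms described in the abstract — per-pyramid-level L2 feature matching plus classification-logit distillation — can be sketched as follows. This is an illustrative NumPy sketch based only on the abstract; the function names, the temperature, and the (unweighted) combination of the two terms are assumptions, not the paper's actual settings.

```python
import numpy as np

def multiscale_feature_loss(student_feats, teacher_feats):
    """Sum of mean-squared (L2) losses over pyramid levels.

    Each element pairs a student feature map with the teacher map
    at the same pyramid level; shapes at each level must match.
    """
    return sum(np.mean((s - t) ** 2)
               for s, t in zip(student_feats, teacher_feats))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def logit_distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened class distributions,
    in the style of classification knowledge distillation."""
    p_t = softmax(teacher_logits / temperature)
    p_s = softmax(student_logits / temperature)
    return np.mean(np.sum(
        p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1))

# Toy example: 3 pyramid levels with halved spatial size per level,
# and 4 detection classes.
rng = np.random.default_rng(0)
s_feats = [rng.normal(size=(1, 8, 16 // 2**i, 16 // 2**i)) for i in range(3)]
t_feats = [f + 0.1 * rng.normal(size=f.shape) for f in s_feats]
feat_loss = multiscale_feature_loss(s_feats, t_feats)

s_logits = rng.normal(size=(5, 4))
t_logits = s_logits + 0.5 * rng.normal(size=s_logits.shape)
kd_loss = logit_distillation_loss(s_logits, t_logits)

total = feat_loss + kd_loss  # relative weighting of the terms is an assumption
```

In a real training loop these terms would be added to the detector's usual task losses; here the sketch only shows how the teacher's multi-scale backbone features and class logits each contribute a differentiable penalty.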


This paper was written with support from the National Research Foundation of Korea.