Hi-BERT : A Semantic Relation Learning Model for Recruitment Ontology and Academic Papers

  • Journal of The Korea Society of Computer and Information
  • Abbr : JKSCI
  • 2025, 30(6), pp.21~29
  • Publisher : The Korean Society Of Computer And Information
  • Research Area : Engineering > Computer Science
  • Received : May 8, 2025
  • Accepted : May 29, 2025
  • Published : June 30, 2025

Sung-Kwang Song¹, Hyeon-Jeong Mun², Young-Ji Kim², Yong-Tae Woo¹

¹Changwon National University
²HiBrainNet


ABSTRACT

In this study, we propose Hi-BERT, a model that effectively identifies and learns semantic associations between academic papers and a recruitment ontology automatically generated from recruitment information using generative AI. The proposed model learns semantic associations between ontology-transformed sentences, i.e., the structural information of the recruitment ontology converted into natural language, and academic papers. In addition, it learns domain-specific characteristics from recruitment information and academic papers and is optimized with a contrastive learning technique. The proposed model alleviates the limitations of cross-domain knowledge transfer found in existing sentence embedding models such as KLUE and KoSimCSE-BERT. In particular, the training method that utilizes ontology-transformed sentences proposed in this study can provide richer semantic understanding than existing models trained on a simple text corpus. In the future, the proposed model is expected to serve as a base model for developing expert search systems.
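The abstract describes pairing ontology-transformed sentences with academic papers and optimizing the encoder with a contrastive objective. The following is a minimal sketch of that idea using in-batch contrastive fine-tuning of a Korean BERT encoder; the example pairs, the klue/bert-base starting checkpoint, and the specific loss are assumptions for illustration, not Hi-BERT's actual training configuration.

```python
# Sketch: contrastive alignment of ontology-transformed sentences and paper text.
# Assumptions: positive pairs are available, and a SimCSE-style in-batch
# negatives loss stands in for the contrastive technique mentioned in the abstract.
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Hypothetical positive pairs: (ontology structure rendered as natural language,
# related academic paper abstract).
pairs = [
    ("The job posting requires expertise in natural language processing.",
     "We study transformer-based models for Korean semantic textual similarity."),
    ("The position involves large-scale recommender system development.",
     "This paper proposes a graph-based collaborative filtering method."),
]

# Start from a Korean BERT encoder; a mean-pooling head is added automatically.
model = SentenceTransformer("klue/bert-base")

train_examples = [InputExample(texts=[ont_sent, paper]) for ont_sent, paper in pairs]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)

# In-batch negatives: each ontology sentence is pulled toward its paired paper
# and pushed away from the other papers in the same batch.
train_loss = losses.MultipleNegativesRankingLoss(model)

model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1, warmup_steps=10)

# After training, embeddings can be used to rank papers for an ontology sentence,
# e.g., as a retrieval component in an expert search system.
embeddings = model.encode([p[0] for p in pairs] + [p[1] for p in pairs])
```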
