
A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations

  • Journal of The Korea Society of Computer and Information
  • Abbr : JKSCI
  • 2022, 27(11), pp.39-46
  • DOI : 10.9708/jksci.2022.27.11.039
  • Publisher : The Korean Society Of Computer And Information
  • Research Area : Engineering > Computer Science
  • Received : October 6, 2022
  • Accepted : October 27, 2022
  • Published : November 30, 2022

Kim Euhee 1

1 Shinhan University


ABSTRACT

In this paper, we propose a multi-task model that simultaneously performs general-purpose tasks such as part-of-speech tagging, lemmatization, and dependency parsing on the UD Korean Kaist v2.3 corpus. The proposed model combines the self-attention mechanism of BERT with graph-based biaffine attention, and is built by fine-tuning multilingual BERT as well as two Korean-specific models, KR-BERT and KoBERT. The performance of the proposed model is compared and analyzed across the multilingual BERT and the two Korean-specific BERT language models.
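
To illustrate the kind of architecture the abstract describes, the following is a minimal sketch (not the authors' released code) of a shared BERT encoder with a part-of-speech tagging head and a graph-based biaffine arc scorer. The model name, hidden sizes, and tag count are illustrative assumptions, and the lemmatization head is omitted.

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class Biaffine(nn.Module):
    """Scores every (dependent, head) token pair with a biaffine form."""
    def __init__(self, dim: int):
        super().__init__()
        # +1 appends a bias feature to the dependent representation
        self.weight = nn.Parameter(torch.empty(dim + 1, dim))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, dep: torch.Tensor, head: torch.Tensor) -> torch.Tensor:
        # dep, head: (batch, seq_len, dim)
        ones = dep.new_ones(*dep.shape[:-1], 1)
        dep = torch.cat([dep, ones], dim=-1)                  # (B, T, dim+1)
        # arc_scores[b, i, j] = dep_i^T W head_j
        return torch.einsum("bid,de,bje->bij", dep, self.weight, head)

class MultiTaskParser(nn.Module):
    """Shared BERT encoder with task-specific heads: POS tagging and
    graph-based dependency arc scoring (lemmatization head omitted)."""
    def __init__(self, model_name="bert-base-multilingual-cased",
                 n_pos_tags=17, arc_dim=256):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.pos_head = nn.Linear(hidden, n_pos_tags)
        self.arc_dep = nn.Sequential(nn.Linear(hidden, arc_dim), nn.ReLU())
        self.arc_head = nn.Sequential(nn.Linear(hidden, arc_dim), nn.ReLU())
        self.biaffine = Biaffine(arc_dim)

    def forward(self, input_ids, attention_mask):
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        pos_logits = self.pos_head(h)                         # (B, T, n_pos_tags)
        arc_scores = self.biaffine(self.arc_dep(h), self.arc_head(h))  # (B, T, T)
        return pos_logits, arc_scores
```

In a multi-task fine-tuning setup of this kind, the per-task cross-entropy losses (over gold POS tags and gold head indices) would be summed, and the encoder name could be swapped for a KR-BERT or KoBERT checkpoint to compare against multilingual BERT.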


This paper was written with support from the National Research Foundation of Korea.