In this paper, we propose a multi-task model that simultaneously performs general-purpose tasks such as part-of-speech tagging, lemmatization, and dependency parsing on the UD Korean Kaist v2.3 corpus. The proposed model combines the self-attention mechanism of BERT with graph-based biaffine attention, fine-tuning multilingual BERT and two Korean-specific BERT models, KR-BERT and KoBERT. The performance of the proposed model is compared and analyzed across multilingual BERT and the two Korean-specific BERT language models.
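The graph-based biaffine attention mentioned in the abstract scores every (head, dependent) token pair over contextual states produced by the fine-tuned BERT encoder. The sketch below is a minimal PyTorch illustration of such an arc scorer; the class name, layer sizes, and MLP structure are assumptions for illustration only, not the paper's actual implementation.

import torch
import torch.nn as nn


class BiaffineArcScorer(nn.Module):
    """Minimal Dozat-and-Manning-style biaffine arc scorer (illustrative sketch)."""

    def __init__(self, hidden_size: int = 768, arc_dim: int = 512):
        super().__init__()
        # Separate MLPs specialize the shared BERT states into head and dependent views.
        self.head_mlp = nn.Sequential(nn.Linear(hidden_size, arc_dim), nn.ReLU())
        self.dep_mlp = nn.Sequential(nn.Linear(hidden_size, arc_dim), nn.ReLU())
        # Biaffine weight; the extra row adds a bias term on the dependent side.
        self.weight = nn.Parameter(torch.empty(arc_dim + 1, arc_dim))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, bert_states: torch.Tensor) -> torch.Tensor:
        # bert_states: (batch, seq_len, hidden_size) from a fine-tuned BERT encoder.
        head = self.head_mlp(bert_states)                         # (B, T, arc_dim)
        dep = self.dep_mlp(bert_states)                           # (B, T, arc_dim)
        ones = dep.new_ones(dep.shape[:-1] + (1,))
        dep_ext = torch.cat([dep, ones], dim=-1)                  # (B, T, arc_dim + 1)
        # scores[b, i, j] = score of token j being the syntactic head of token i.
        scores = dep_ext @ self.weight @ head.transpose(-1, -2)   # (B, T, T)
        return scores


# Usage with stand-in encoder output (batch of 2 sentences, 10 tokens each):
bert_states = torch.randn(2, 10, 768)
arc_scores = BiaffineArcScorer()(bert_states)   # (2, 10, 10) head scores per token

In the paper's multi-task setting, a scorer like this would share the BERT encoder with the part-of-speech tagging and lemmatization heads, with all tasks trained jointly during fine-tuning.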
@article{ART002899651,
  author  = {Kim Euhee},
  title   = {A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations},
  journal = {Journal of The Korea Society of Computer and Information},
  issn    = {1598-849X},
  year    = {2022},
  volume  = {27},
  number  = {11},
  pages   = {39-46},
  doi     = {10.9708/jksci.2022.27.11.039}
}
TY  - JOUR
AU  - Kim Euhee
TI  - A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations
JO  - Journal of The Korea Society of Computer and Information
PY  - 2022
VL  - 27
IS  - 11
PB  - The Korean Society Of Computer And Information
SP  - 39
EP  - 46
SN  - 1598-849X
AB  - In this paper, we propose a multi-task model that simultaneously performs general-purpose tasks such as part-of-speech tagging, lemmatization, and dependency parsing on the UD Korean Kaist v2.3 corpus. The proposed model combines the self-attention mechanism of BERT with graph-based biaffine attention, fine-tuning multilingual BERT and two Korean-specific BERT models, KR-BERT and KoBERT. The performance of the proposed model is compared and analyzed across multilingual BERT and the two Korean-specific BERT language models.
KW  - Universal Dependency
KW  - Multilingual BERT
KW  - KR-BERT
KW  - KoBERT
KW  - Fine-tuning
KW  - Pre-training
DO  - 10.9708/jksci.2022.27.11.039
ER  -
Kim Euhee. (2022). A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations. Journal of The Korea Society of Computer and Information, 27(11), 39-46.
Kim Euhee. 2022, "A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations", Journal of The Korea Society of Computer and Information, vol.27, no.11, pp.39-46. Available from: doi:10.9708/jksci.2022.27.11.039
Kim Euhee "A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations" Journal of The Korea Society of Computer and Information 27.11 pp.39-46 (2022) : 39.
Kim Euhee. A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations. Journal of The Korea Society of Computer and Information. 2022; 27(11): 39-46. Available from: doi:10.9708/jksci.2022.27.11.039
Kim Euhee. "A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations" Journal of The Korea Society of Computer and Information 27, no.11 (2022) : 39-46.doi: 10.9708/jksci.2022.27.11.039
Kim Euhee. (2022). A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations. Journal of The Korea Society of Computer and Information, 27(11), 39-46. doi: 10.9708/jksci.2022.27.11.039
Kim Euhee. A Multi-task Self-attention Model Using Pre-trained Language Models on Universal Dependency Annotations. Journal of The Korea Society of Computer and Information. 2022; 27(11): 39-46. doi: 10.9708/jksci.2022.27.11.039