@article{ART002580094,
author={Sung-Min Yang and Ok-Ran Jeong},
title={DeNERT: Named Entity Recognition Model using DQN and BERT},
journal={Journal of The Korea Society of Computer and Information},
issn={1598-849X},
year={2020},
volume={25},
number={4},
pages={29-35},
doi={10.9708/jksci.2020.25.04.029}
}
TY - JOUR
AU - Sung-Min Yang
AU - Ok-Ran Jeong
TI - DeNERT: Named Entity Recognition Model using DQN and BERT
JO - Journal of The Korea Society of Computer and Information
PY - 2020
VL - 25
IS - 4
PB - The Korean Society Of Computer And Information
SP - 29
EP - 35
SN - 1598-849X
AB - In this paper, we propose DeNERT, a new named entity recognition model. Recently, research in natural language processing has made active use of language representation models pre-trained on large corpora. In particular, named entity recognition, one of the subfields of natural language processing, typically relies on supervised learning, which requires a large amount of training data and computation. Reinforcement learning is a method that learns through trial-and-error experience without initial data; it is closer to the way humans learn than other machine learning methodologies, but it has not yet been widely applied to natural language processing and is more often used in simulation environments such as Atari games and AlphaGo. BERT is a general-purpose language model developed by Google that is pre-trained on a large corpus with substantial computation; it has recently shown high performance in natural language processing research and high accuracy on many downstream tasks. In this paper, we propose DeNERT, a named entity recognition model that combines two deep learning models, DQN and BERT. The proposed model is trained by constructing a reinforcement learning environment based on the language representations that are the strength of a general-purpose language model. The resulting DeNERT model achieves faster inference and higher performance with a small amount of training data. We also validate the model's named entity recognition performance through experiments.
KW - Natural language processing;Named entity recognition;Reinforcement learning;BERT;DQN;Language model
DO - 10.9708/jksci.2020.25.04.029
ER -
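The abstract does not spell out how DQN and BERT are wired together. The following is a minimal, hypothetical sketch of one way such a combination could look, assuming frozen BERT token embeddings serve as the DQN states and per-token entity tags serve as the actions; the replay buffer and target network of a full DQN are omitted for brevity. All identifiers, the tag set, and the reward scheme below are illustrative assumptions, not the authors' published implementation.

# Hypothetical sketch: BERT token embeddings as DQN states, per-token tag
# decisions as actions. Names, tag set, and reward scheme are assumptions,
# not the published DeNERT implementation.
import random
import torch
import torch.nn as nn
from transformers import BertTokenizerFast, BertModel

TAGS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]  # assumed tag set

class QNetwork(nn.Module):
    """Maps a contextual token embedding (state) to Q-values over tags (actions)."""
    def __init__(self, hidden_size: int = 768, n_actions: int = len(TAGS)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_size, 256), nn.ReLU(), nn.Linear(256, n_actions)
        )

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        return self.net(states)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")
q_net = QNetwork()
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-4)

def tag_sentence(sentence: str, epsilon: float = 0.1) -> list:
    """Epsilon-greedy tag selection over frozen BERT states (one action per token)."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        states = encoder(**inputs).last_hidden_state.squeeze(0)  # (seq_len, 768)
    q_values = q_net(states)                                     # (seq_len, n_tags)
    actions = []
    for q in q_values:
        if random.random() < epsilon:            # explore: random tag
            actions.append(random.randrange(len(TAGS)))
        else:                                    # exploit: highest Q-value
            actions.append(int(q.argmax()))
    return [TAGS[a] for a in actions]

def td_update(states, actions, rewards, gamma: float = 0.99):
    """One-step TD(0) update; reward could be +1 per correctly tagged token."""
    q_pred = q_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        q_next = q_net(states).max(dim=1).values           # no target network here
        target = rewards + gamma * torch.cat([q_next[1:], torch.zeros(1)])
    loss = nn.functional.mse_loss(q_pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

print(tag_sentence("Barack Obama visited Seoul in 2012."))

In this sketch the encoder is kept frozen and only the Q-network is updated, which matches the abstract's claim of training with a small dataset; whether the actual DeNERT model fine-tunes BERT is not stated in the record above.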
Sung-Min Yang and Ok-Ran Jeong. (2020). DeNERT: Named Entity Recognition Model using DQN and BERT. Journal of The Korea Society of Computer and Information, 25(4), 29-35.
Sung-Min Yang and Ok-Ran Jeong. 2020, "DeNERT: Named Entity Recognition Model using DQN and BERT", Journal of The Korea Society of Computer and Information, vol.25, no.4, pp.29-35. Available from: doi:10.9708/jksci.2020.25.04.029
Sung-Min Yang, Ok-Ran Jeong "DeNERT: Named Entity Recognition Model using DQN and BERT" Journal of The Korea Society of Computer and Information 25.4 pp.29-35 (2020) : 29.
Sung-Min Yang, Ok-Ran Jeong. DeNERT: Named Entity Recognition Model using DQN and BERT. Journal of The Korea Society of Computer and Information. 2020; 25(4), 29-35. Available from: doi:10.9708/jksci.2020.25.04.029
Sung-Min Yang and Ok-Ran Jeong. "DeNERT: Named Entity Recognition Model using DQN and BERT" Journal of The Korea Society of Computer and Information 25, no.4 (2020) : 29-35.doi: 10.9708/jksci.2020.25.04.029
Sung-Min Yang; Ok-Ran Jeong. DeNERT: Named Entity Recognition Model using DQN and BERT. Journal of The Korea Society of Computer and Information, 2020, 25(4), 29-35. doi: 10.9708/jksci.2020.25.04.029
Sung-Min Yang; Ok-Ran Jeong. DeNERT: Named Entity Recognition Model using DQN and BERT. Journal of The Korea Society of Computer and Information. 2020; 25(4): 29-35. doi: 10.9708/jksci.2020.25.04.029