
Probing Sentence Embeddings in L2 Learners’ LSTM Neural Language Models Using Adaptation Learning

  • Journal of The Korea Society of Computer and Information
  • Abbr : JKSCI
  • 2022, 27(3), pp.13-23
  • DOI : 10.9708/jksci.2022.27.03.013
  • Publisher : The Korean Society Of Computer And Information
  • Research Area : Engineering > Computer Science
  • Received : February 10, 2022
  • Accepted : March 18, 2022
  • Published : March 31, 2022

Euhee Kim 1

1 Shinhan University


ABSTRACT

In this study, we leveraged a probing method to evaluate how pre-trained L2 LSTM language models represent sentences with relative and coordinate clauses. The probing experiments used models adapted from the pre-trained L2 language models to trace the syntactic properties of their sentence embedding representations. The probing dataset was generated automatically from several templates corresponding to different sentence structures. To classify the syntactic properties of sentences in each probing task, we measured the adaptation effects of the language models using syntactic priming. We then performed linear mixed-effects model analyses to relate the adaptation effects to sentence structure and to reveal how the L2 language models represent the syntactic features of English sentences. When the L2 language models were compared with the baseline L1 Gulordava language models, analogous results were obtained for each probing task. The results also confirmed that the L2 language models encode the syntactic features of relative and coordinate clauses hierarchically in their sentence embedding representations.
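The abstract describes adaptation learning as the probing signal: a pre-trained model is briefly fine-tuned ("adapted") on prime sentences, and the change in its surprisal on target sentences measures how strongly it has picked up the primed structure. The sketch below illustrates that idea only under assumptions of my own (a toy PyTorch LSTM standing in for the L2 model, one-step SGD adaptation, total-surprisal scoring); the paper's actual models, vocabulary, templates, and training details are not given in the abstract.

```python
# Minimal sketch of surprisal-based adaptation probing.
# Assumptions: a toy word-level LSTM LM; the paper's real L2/Gulordava
# models, data, and hyperparameters are not shown in the abstract.
import copy
import torch
import torch.nn as nn

class LSTMLM(nn.Module):
    """Toy word-level LSTM language model (stand-in for the pre-trained L2 LM)."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, ids):
        hidden, _ = self.lstm(self.embed(ids))
        return self.out(hidden)

def surprisal(model, ids):
    """Total surprisal (negative log-likelihood, in nats) of a token sequence."""
    model.eval()
    with torch.no_grad():
        logits = model(ids[:, :-1])
        nll = nn.functional.cross_entropy(
            logits.reshape(-1, logits.size(-1)),
            ids[:, 1:].reshape(-1),
            reduction="sum",
        )
    return nll.item()

def adaptation_effect(model, prime_ids, target_ids, lr=1e-3, steps=1):
    """Adapt a copy of the model on a prime sentence, then return the drop
    in surprisal on the target sentence (positive = adaptation/priming effect)."""
    before = surprisal(model, target_ids)
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=lr)
    adapted.train()
    for _ in range(steps):
        logits = adapted(prime_ids[:, :-1])
        loss = nn.functional.cross_entropy(
            logits.reshape(-1, logits.size(-1)),
            prime_ids[:, 1:].reshape(-1),
        )
        opt.zero_grad()
        loss.backward()
        opt.step()
    after = surprisal(adapted, target_ids)
    return before - after

# Toy usage with random token ids; real experiments would use tokenized
# prime/target pairs generated from the paper's sentence templates.
torch.manual_seed(0)
model = LSTMLM(vocab_size=100)
prime = torch.randint(0, 100, (1, 12))
target = torch.randint(0, 100, (1, 12))
print(adaptation_effect(model, prime, target))
```

The linear mixed-effects analyses mentioned in the abstract would then regress such adaptation scores on whether prime and target share a structure, presumably with random effects for items and models (e.g., an R formula like `adaptation ~ structure_match + (1 | item)`); the exact model specification is not stated here.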


This paper was written with support from the National Research Foundation of Korea.