
Development of a Multiple-Choice Writing Examination Validated with a Cognitive Diagnosis Model

  • Modern English Education
  • Abbr: MEESO
  • 2021, 22(4), pp. 12-23
  • DOI: 10.18095/meeso.2021.22.4.12
  • Publisher: The Modern English Education Society
  • Research Area: Humanities > English Language and Literature > English Language Teaching
  • Received: September 30, 2021
  • Accepted: November 5, 2021
  • Published: November 30, 2021

Sookyung Cho 1, Chanho Park 2

1 Hankuk University of Foreign Studies
2 Keimyung University


ABSTRACT

The aim of this study was to explore the possibility of diagnosing second language (L2) learners’ writing skills through a multiple-choice examination by applying a cognitive diagnosis model (CDM). L2 learners’ writing abilities have mostly been assessed through direct methods such as essay tests. However, such methods not only demand considerable labor from human raters but also raise issues of intra-rater and inter-rater reliability. Essay examinations are also difficult to validate because their internal structure cannot easily be examined (e.g., through factor analysis). To address these limitations, this study developed a multiple-choice examination and validated it with factor analysis and a CDM. The examination consisted of 20 items covering key L2 writing components: content, organization, grammar, vocabulary, and mechanics, and was administered to 109 college students. The CDM analysis showed that the test was valid in terms of response processes and provided substantial information about each test-taker’s individual skills. These results imply that L2 learners’ writing abilities can be assessed indirectly and that the test can be administered to a large body of students at once, providing useful information about their writing abilities.
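For readers unfamiliar with CDMs: this family of models links each item to the attributes it measures through a Q-matrix and then infers each examinee’s mastery profile over those attributes. The sketch below is a minimal illustration of how a DINA-type classification could be computed for the five writing attributes named in the abstract. The Q-matrix, slip and guess values, and response data are illustrative placeholders, not the study’s actual materials or analysis.

```python
# Minimal DINA-type sketch (illustrative only; not the authors' analysis).
# Classifies examinees into attribute-mastery profiles over the five
# writing components mentioned in the abstract.
import itertools
import numpy as np

ATTRIBUTES = ["content", "organization", "grammar", "vocabulary", "mechanics"]
K = len(ATTRIBUTES)          # number of attributes
J = 20                       # number of items, as in the abstract

rng = np.random.default_rng(0)

# Hypothetical Q-matrix: Q[j, k] = 1 if item j requires attribute k.
Q = rng.integers(0, 2, size=(J, K))
Q[Q.sum(axis=1) == 0, 0] = 1              # every item needs at least one attribute

# Hypothetical slip and guess parameters (estimated from data in practice).
slip = np.full(J, 0.1)
guess = np.full(J, 0.2)

# All 2^K candidate attribute profiles.
profiles = np.array(list(itertools.product([0, 1], repeat=K)))    # (2^K, K)

# Ideal response eta[c, j] = 1 iff profile c masters every attribute item j needs.
eta = (profiles @ Q.T == Q.sum(axis=1)).astype(int)               # (2^K, J)

# DINA success probability under each profile: (1 - s)^eta * g^(1 - eta).
p_correct = (1 - slip) ** eta * guess ** (1 - eta)                # (2^K, J)

def classify(responses: np.ndarray) -> np.ndarray:
    """Return the maximum-likelihood attribute profile for each examinee.

    responses: (N, J) matrix of 0/1 item scores.
    """
    loglik = (responses @ np.log(p_correct).T
              + (1 - responses) @ np.log(1 - p_correct).T)        # (N, 2^K)
    return profiles[np.argmax(loglik, axis=1)]

# Toy usage with simulated scores for 109 examinees (the abstract's sample size).
X = rng.integers(0, 2, size=(109, J))
mastery = classify(X)
print(dict(zip(ATTRIBUTES, mastery.mean(axis=0).round(2))))       # mastery rates
```

In an operational analysis, the slip and guess parameters would be estimated (e.g., by EM) rather than fixed, and the Q-matrix would be specified from the test blueprint; the sketch only shows the classification step that yields per-attribute mastery information of the kind the abstract describes.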


This paper was written with support from the National Research Foundation of Korea.