10:00 AM - 10:20 AM
[2R1-OS-10c-04] Linking Method for Writing Tests using Item Response Theory and Neural Automated Scoring Technology
Keywords: Item Response Theory, Deep Learning, Writing Tests
A difficulty of writing tests is that assigned scores depend on rater characteristics, such as severity and consistency. To resolve this problem, the generalized many-facet Rasch model (GMFRM), an item response theory (IRT) model that accounts for such rater characteristics, has been proposed. When applying such an IRT model to datasets comprising the results of multiple writing tests administered to different examinees, test linking is needed to unify the scale of the model parameters estimated from the individual test results. For test linking, test administrators generally need to design the tests so that examinees or raters partially overlap. However, preparing common examinees and raters is often difficult in actual testing environments. Therefore, in this study, we propose a novel method that links the results of writing tests, obtained by applying the GMFRM, using recent neural automated scoring technology. Experimental results show that our method achieves test linking without common examinees or raters.
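The abstract does not detail the linking procedure, but a standard building block in IRT test linking is the mean-sigma method, which places one form's parameter estimates onto a reference scale via a linear transformation. As a minimal sketch (an assumption for illustration, not the paper's actual algorithm), one could imagine the neural scorer acting as a common anchor across the two test forms, with its estimates linked as follows:

```python
import numpy as np

def mean_sigma_link(theta_ref, theta_new):
    """Mean-sigma linking: find A, B so that A * theta_new + B is on
    the scale of theta_ref (a generic IRT linking step, shown here
    only as an illustrative sketch of scale unification)."""
    A = np.std(theta_ref) / np.std(theta_new)
    B = np.mean(theta_ref) - A * np.mean(theta_new)
    return A, B

# Hypothetical ability estimates from two separately calibrated forms;
# in the paper's setting the anchor would come from the neural scorer
# rather than from common examinees or raters.
theta_ref = np.array([0.0, 1.0, 2.0])
theta_new = np.array([0.0, 2.0, 4.0])
A, B = mean_sigma_link(theta_ref, theta_new)
linked = A * theta_new + B  # theta_new rescaled onto the reference scale
```

With these toy values the transformation maps `theta_new` exactly onto `theta_ref`; with real estimates the two sets would only approximately align.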