Files in this item

ncme-2011-culbertson.pdf (446 kB, application/pdf; no description provided)

Title: Is It Wrong? Handling Missing Responses in IRT
Author(s): Culbertson, Michael J.
Subject(s): missing data; Item Response Theory; Massachusetts Comprehensive Assessment System
Abstract: Estimation of examinee ability under Item Response Theory (IRT) is affected by how omitted test items are handled. This paper examines omission patterns in an operational third-grade math assessment and investigates, through simulation, how different treatments of missing test data affect ability estimation. Surprisingly, omission rate showed little correlation with item difficulty, and while lower-ability students omitted short-answer and extended-response items more frequently than high-ability students, high-ability students actually skipped multiple-choice items more frequently than low-ability students did. According to the simulations, treating omitted items as incorrect, though pervasive in practice, significantly biases ability estimates downward, particularly for high-ability examinees. Treating omitted items as if they had not been administered has the smallest effect on ability estimation but could encourage examinees to game the system on high-stakes tests. Treating missing responses as half-correct also performs fairly well but suffers from the same vulnerability to devious examinees. In the end, treating omitted responses as wrong may be the best option for operational tests until further studies clarify when and why students skip test items.
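The contrast the abstract describes can be illustrated with a minimal sketch: a maximum-likelihood ability estimate under a Rasch model for one hypothetical examinee, computed under the three treatments of a single omitted item. The item difficulties, response pattern, and function names below are illustrative assumptions, not taken from the paper; the paper's actual simulations use an operational assessment.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def mle_theta(responses, difficulties, iters=50):
    """Newton-Raphson maximum-likelihood ability estimate.

    `responses` may contain fractional scores (e.g. 0.5 for half-correct);
    omitted items must already be recoded or dropped by the caller.
    """
    theta = 0.0
    for _ in range(iters):
        grad = sum(u - rasch_p(theta, b)
                   for u, b in zip(responses, difficulties))
        info = sum(p * (1 - p)
                   for p in (rasch_p(theta, b) for b in difficulties))
        theta += grad / info
    return theta

# Hypothetical 5-item test; the examinee omitted the 4th item (None).
diffs = [-1.0, -0.5, 0.0, 0.5, 1.0]
raw   = [1, 1, 0, None, 1]

# Treatment 1: score omits as incorrect.
t_wrong = mle_theta([0 if u is None else u for u in raw], diffs)

# Treatment 2: score omits as half-correct.
t_half = mle_theta([0.5 if u is None else u for u in raw], diffs)

# Treatment 3: treat omits as not administered (drop the item).
kept = [(u, b) for u, b in zip(raw, diffs) if u is not None]
t_na = mle_theta([u for u, _ in kept], [b for _, b in kept])

# Scoring the omit as wrong yields the lowest ability estimate,
# half-correct an intermediate one, and not-administered the highest.
print(t_wrong, t_half, t_na)
```

In this toy case the ordering matches the abstract's finding: the omit-as-wrong estimate is pulled downward relative to the other two treatments.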
Issue Date: 2011-04-11
Citation Info: Paper presented at the annual meeting of the National Council on Measurement in Education in New Orleans, LA (April 11, 2011).
Genre: Conference Paper / Presentation
Publication Status: unpublished
Peer Reviewed: is peer reviewed
Date Available in IDEALS: 2011-04-13
