Files in this item

9702509.pdf (application/pdf, 4 MB) - Restricted to U of Illinois
Title: Investigating the instructional sensitivity of distractors in items exhibiting DIF
Author(s): Evans, John Andrew
Doctoral Committee Chair(s): Ackerman, Terry A.
Department / Program: Education
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): Education, Tests and Measurements; Psychology, Psychometrics
Abstract: When examinees from identifiable subgroups have comparable ability but different probabilities of responding correctly to an item, the item is said to exhibit differential item functioning (DIF). If the amount of DIF is practically significant and attributable to features irrelevant to the test construct, the item may bias the ability estimates of one group of examinees.
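In standard IRT notation (a conventional formal statement of this definition; the symbols are supplied here, not taken from the abstract), item i exhibits DIF when, for examinees matched on the intended ability \theta,

    P(U_i = 1 \mid \theta, G = R) \neq P(U_i = 1 \mid \theta, G = F) \quad \text{for some } \theta,

where U_i is the scored item response and G indexes the reference (R) and focal (F) groups.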
The presence of DIF is a necessary but not a sufficient condition for item bias. Simply concluding bias (i.e., observing a larger-than-expected proportion of examinees from a selected group responding incorrectly to a given item) fails to identify the component(s) of the item that are irrelevant to the construct being examined.
Recognizing the need to identify plausible item feature(s) that substantively explain DIF, this analysis coupled established, reliable, dimensionally sensitive IRT procedures with log-linear modeling to provide additional practical interpretation of the dimensionality of an item and its distractors.
The purpose of this research is to propose an empirical methodology for identifying items that have the potential to impair the validity of a single (unidimensional) construct measure. The proposed methodology is an outgrowth of the multidimensional DIF paradigm of Shealy and Stout (1993a). The use of statistical multidimensional assessment procedures to assist in the DIF detection process was proposed by Douglas, Roussos, and Stout (1995).
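For orientation, the Shealy-Stout paradigm is operationalized in the SIBTEST procedure; a standard form of its DIF effect estimator (notation supplied here for illustration) is

    \hat{\beta}_{UNI} = \sum_{k=0}^{K} \hat{p}_k \, (\bar{Y}_{Rk} - \bar{Y}_{Fk}),

where k indexes scores on a valid matching subtest, \hat{p}_k is typically the proportion of focal-group examinees at score k, and \bar{Y}_{Rk}, \bar{Y}_{Fk} are regression-corrected mean scores of the two groups on the studied item(s). The ratio of \hat{\beta}_{UNI} to its estimated standard error is referred to a standard normal distribution to test for DIF.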
This work specifically considers the relationship between relevant (i.e., an examinee's ability and opportunity to learn) and non-relevant (i.e., race) examinee characteristics and their interaction with item option selection. The methodology utilizes both item response theory (IRT) techniques and hierarchical log-linear analysis. This combination presents the practitioner with a methodology for evaluating the validity of an item in order to decide whether to include it in, or exclude it from, the exam as a whole.
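A minimal sketch of the kind of hierarchical log-linear distractor analysis described above, written in Python. The cell counts, strata, and variable names are illustrative assumptions, not the dissertation's data or code; the point is the nested-model comparison that asks whether option choice depends on group once ability is conditioned on.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy.stats import chi2

# Synthetic cell counts: examinees cross-classified by selected option
# (A-D), group (reference/focal), and ability stratum (low/high).
# In practice these counts come from observed response data.
rng = np.random.default_rng(0)
rows = [{"option": o, "group": g, "ability": a,
         "count": int(rng.integers(20, 120))}
        for o in "ABCD" for g in ("ref", "focal") for a in ("low", "high")]
df = pd.DataFrame(rows)

# M1: log-linear model with every two-way association except option x group.
m1 = smf.glm("count ~ C(option)*C(ability) + C(group)*C(ability)",
             data=df, family=sm.families.Poisson()).fit()

# M2: adds option x group -- the term that signals group-dependent
# distractor attractiveness after conditioning on ability.
m2 = smf.glm("count ~ C(option)*C(ability) + C(group)*C(ability)"
             " + C(option)*C(group)",
             data=df, family=sm.families.Poisson()).fit()

# Likelihood-ratio test of the option x group association (df = 3 here).
lr = 2 * (m2.llf - m1.llf)
ddf = int(m2.df_model - m1.df_model)
print(f"LR = {lr:.2f}, df = {ddf}, p = {chi2.sf(lr, ddf):.4f}")

A significant test statistic flags the item for substantive review of its distractors; a nonsignificant one is consistent with option choice depending only on ability, not group membership.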
Issue Date: 1996
Rights Information: Copyright 1996 Evans, John Andrew
Date Available in IDEALS: 2011-05-07
Identifier in Online Catalog: AAI9702509
OCLC Identifier: (UMI)AAI9702509
