Exploring the item features of a science assessment with complex tasks

Collier, Tina; Morell, Linda; Wilson, Mark

Abstract

Item explanatory models have the potential to provide insight into why certain items are easier or more difficult than others. When construct-related item features are selected, this approach can also provide validity evidence for the assessment. This is especially important when designing assessment tasks that address new standards. Using data from the Learning Progressions in Middle School Science Instruction and Assessment (LPS) project, this paper adopts an "item explanatory" approach and investigates whether certain item features can explain differences in item difficulty by applying an extension of the linear logistic test model. Specifically, this paper explores the effects of five features on item difficulty: type (argumentation, content, embedded content), scenario-based context, format (multiple-choice or open-ended), graphics, and academic vocabulary. Interactions between some of these features were also investigated. With the exception of context, all features had a statistically significant effect on difficulty.
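
For context on the modeling approach mentioned above, the linear logistic test model (LLTM) decomposes each Rasch item difficulty into a weighted sum of item-feature effects. The sketch below shows this standard decomposition with an added item-level residual term, one common way the LLTM is extended; the specific extension used in the paper may differ, and the feature examples in parentheses are illustrative only, not the paper's coding scheme.

\Pr(X_{pi} = 1 \mid \theta_p) = \frac{\exp(\theta_p - \beta_i)}{1 + \exp(\theta_p - \beta_i)}, \qquad \beta_i = \sum_{k=1}^{K} q_{ik}\,\eta_k + \varepsilon_i

Here \theta_p is the proficiency of person p, q_{ik} indicates whether item i exhibits feature k (e.g., open-ended format, presence of a graphic), \eta_k is the difficulty contribution of feature k, and \varepsilon_i is an item-specific residual (fixed at zero in the classical LLTM).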

More information

WOS ID: WOS:000418349400003
Journal: MEASUREMENT
Volume: 114
Publisher: ELSEVIER SCI LTD
Publication date: 2018
Start page: 16
End page: 24
DOI: 10.1016/j.measurement.2017.08.039
Notes: ISI