PSYCHOMETRIKA, VOL. 82, NO. 3, 795–819
MODELING OMITTED AND NOT-REACHED ITEMS IN IRT MODELS
NORMAN ROSE
UNIVERSITY OF TÜBINGEN
MATTHIAS VON DAVIER
EDUCATIONAL TESTING SERVICE
UNIVERSITY OF TÜBINGEN
Item nonresponse is a common problem in educational and psychological assessments. The probability of unplanned missing responses due to omitted and not-reached items may stochastically depend on unobserved variables such as the missing responses themselves or latent variables. In such cases, missingness cannot be ignored and needs to be accounted for in the model. Specifically, multidimensional IRT models, latent regression models, and multiple-group IRT models have been suggested for handling nonignorable missing responses in latent trait models. However, the suitability of these models for omitted versus not-reached items has rarely been addressed. Missingness is formalized by response indicators that are modeled jointly with the researcher's target model. We demonstrate that response indicators have different statistical properties depending on whether the items were omitted or not reached. The implications of these differences are used to derive a joint model for nonignorable missing responses that accounts appropriately for both omitted and not-reached items. The performance of the model is demonstrated by means of a small simulation study.
Key words: nonignorable, item nonresponses, omitted and not-reached items, IRT.
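The following is a minimal illustrative sketch, not the authors' model: it simulates Rasch-model responses and generates the two kinds of missingness the abstract distinguishes via response indicators. Omissions are scattered across items and driven by a latent response propensity correlated with ability (which is what makes them nonignorable), whereas not-reached items produce monotone indicators, with every item after a break-off point missing. All variable names and parameter values (correlation, item difficulties, Poisson break-off) are assumptions chosen for illustration.

```python
# Illustrative sketch (not the authors' implementation): Rasch responses
# with nonignorable omissions and monotone not-reached missingness.
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_items = 1000, 20

# Latent ability (theta) and latent response propensity (xi), correlated 0.5;
# this correlation is what makes omissions nonignorable.
cov = [[1.0, 0.5], [0.5, 1.0]]
theta, xi = rng.multivariate_normal([0.0, 0.0], cov, size=n_persons).T
beta = np.linspace(-2.0, 2.0, n_items)  # item difficulties (illustrative)

# Complete item responses under the Rasch model.
p_correct = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
y = (rng.random((n_persons, n_items)) < p_correct).astype(float)

# Response indicators for omissions: the probability of responding
# increases with the latent propensity xi (scattered missingness).
p_respond = 1.0 / (1.0 + np.exp(-(xi[:, None] + 2.0)))
d_omit = rng.random((n_persons, n_items)) < p_respond

# Not-reached items: each person reaches only the first k items, so the
# indicators are monotone (once an item is unreached, all later ones are).
k = np.clip(rng.poisson(18, n_persons), 5, n_items)
d_reach = np.arange(n_items)[None, :] < k[:, None]

# Observed data: a response is seen only if not omitted and reached.
y_obs = np.where(d_omit & d_reach, y, np.nan)
print("omission rate:", round(1 - d_omit.mean(), 3))
print("not-reached rate:", round(1 - d_reach.mean(), 3))
```

The contrast the abstract draws is visible in the indicator matrices: rows of `d_reach` switch from reached to unreached exactly once, while `d_omit` has no such structure, which is why the two mechanisms call for different model components.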
Unplanned missing data are virtually unavoidable in empirical research in social sciences
for several reasons. Especially in low-stakes assessments, test takers tend to omit some items.
On timed tests, it is likely that test takers fail to reach some items. Item responses that cannot be
scored appropriately (not codable) account for an additional source of missing responses.
Although the rate of unintended nonresponses differs between studies, there is clear evidence that it is a major problem: for example, the average percentage of unplanned missing data was 10% in the Programme for International Student Assessment (PISA) 2009 study (OECD, 2009). The total percentage of item nonresponses on a reading test used in the National Educational Panel Study (Blossfeld, Roßbach, & von Maurice, 2011) was 18.83% (5.37% omitted items, 13.46% not-reached items) (Pohl, Haberkorn, Hardt, & Wiegand, 2012). Within studies, the proportion of missing data may also vary considerably across items, subtests, and subpopulations.
Electronic supplementary material The online version of this article (doi:10.1007/s11336-016-9544-7) contains
supplementary material, which is available to authorized users.
Parts of this paper are based on the unpublished dissertation of the first author. We thank Andreas Frey and Rolf
Steyer who served as members on the thesis committee. We thank the anonymous reviewers for their careful reading of
our manuscript and their many insightful comments and suggestions.
Correspondence should be made to Norman Rose, Hector Research Institute of Education Sciences and Psychology,
University of Tübingen, Europastrasse 6, 72072 Tübingen, Germany. Email: email@example.com
© 2016 The Psychometric Society