Several years ago, we introduced the idea of item response curves (IRC), a simple form of item response theory (IRT), to the physics education research community as a way to examine item performance on diagnostic instruments such as the Force Concept Inventory (FCI). We noted that a full-blown IRT analysis would be a logical next step, and several authors have since taken it. In this paper, we show that our simple approach not only yields conclusions about the performance of FCI items similar to those of the more sophisticated IRT analyses but also permits additional insights by characterizing both the correct and incorrect answer choices. Our IRC approach can be applied to a variety of multiple-choice assessments but, when applied to a carefully designed instrument such as the FCI, allows us to probe student understanding as a function of ability level through an examination of each answer choice. We imagine that physics teachers could use IRC analysis to identify prominent misconceptions and tailor their instruction to combat them, fulfilling the FCI authors’ original intentions for its use. Furthermore, IRC analysis can assist test designers in improving their assessments by identifying nonfunctioning distractors that can be replaced with distractors attractive to students at various ability levels.
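The computation behind an IRC is straightforward: for each item, tabulate the fraction of students at each total-score level (a proxy for ability) who selected each answer choice. The following is a minimal sketch of that tabulation; the input format (one response string per student plus an answer key) and the function name are illustrative assumptions, not part of the original analysis.

```python
# Hedged sketch of an item response curve (IRC) tabulation.
# Assumed input format: responses is a list of strings, one per student,
# where responses[s][i] is student s's choice ('A'..'E') on item i;
# key is the string of correct choices, one character per item.
from collections import Counter, defaultdict

def item_response_curves(responses, key):
    """For each item, return the fraction of students at each total
    score who selected each answer choice."""
    n_items = len(key)
    # Total score = number of correct answers, used as the ability proxy.
    scores = [sum(r[i] == key[i] for i in range(n_items)) for r in responses]

    # counts[item][score][choice] -> number of students
    counts = [defaultdict(Counter) for _ in range(n_items)]
    for r, s in zip(responses, scores):
        for i in range(n_items):
            counts[i][s][r[i]] += 1

    # Normalize counts to fractions: curves[item][score][choice]
    curves = []
    for i in range(n_items):
        curves.append({
            s: {c: n / sum(ctr.values()) for c, n in ctr.items()}
            for s, ctr in counts[i].items()
        })
    return curves

# Toy example: three students, two items, answer key "AB".
curves = item_response_curves(["AB", "AC", "BB"], "AB")
```

Plotting each choice's fraction against total score then yields one curve per answer option, so the correct choice and each distractor can be examined separately across ability levels.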

1. D. Hestenes, M. Wells, and G. Swackhamer, “Force concept inventory,” Phys. Teach. 30, 141–158 (1992).
2. G. A. Morris, L. Branum-Martin, N. Harshman, S. D. Baker, E. Mazur, S. Dutta, T. Mzoughi, and V. McCauley, “Testing the test: item response curves and test quality,” Am. J. Phys. 74, 449–453 (2006).
3. R. P. McDonald, Test Theory: A Unified Treatment (Erlbaum, Mahwah, NJ, 1999).
4. J. Wang and L. Bao, “Analyzing force concept inventory with item response theory,” Am. J. Phys. 78, 1064–1070 (2010).
5. D. Thissen and L. Steinberg, “A taxonomy of item response models,” Psychometrika 51, 567–577 (1986).
6. S. E. Embretson and S. P. Reise, Item Response Theory for Psychologists (Erlbaum, Mahwah, NJ, 2000).
7. R. D. Bock, “Estimating item parameters and latent ability when responses are scored in two or more nominal categories,” Psychometrika 37, 29–51 (1972).
8. L. Ding and R. Beichner, “Approaches to data analysis of multiple-choice questions,” Phys. Rev. ST Phys. Educ. Res. 5, 020103 (2009).
9. K. D. Schurmeier, C. H. Atwood, C. G. Shepler, and G. J. Lautenschlager, “Using item response theory to assess changes in student performance based on changes in question wording,” J. Chem. Educ. 87, 1268–1272 (2010).
10. J. A. Marshall, E. A. Hagedorn, and J. O’Connor, “Anatomy of a physics test: validation of the physics items on the Texas Assessment of Knowledge and Skills,” Phys. Rev. ST Phys. Educ. Res. 5, 010104 (2009).
11. F. M. Lord, Applications of Item Response Theory to Practical Testing Problems (Lawrence Erlbaum Associates, Hillsdale, NJ, 1980).
12. M. Planinic, L. Ivanjek, and A. Susac, “Rasch model based analysis of the Force Concept Inventory,” Phys. Rev. ST Phys. Educ. Res. 6, 010103 (2010).