When answering physics questions, students often approach the material from perspectives different from those of physics experts. This difference does not always indicate student misconceptions; it can instead signal that the questions themselves need revision. We conducted student interviews to identify and resolve validity issues that stem from the differing perspectives of students and question designers. After interviewing 35 student volunteers, we selected 14 articulate, detail-oriented individuals for repeated interviews. Rather than using interviews to solicit and confirm students’ incorrect answers, we conducted them as a “student consultation” process that revealed validity issues missed by physics experts. A four-stage response model was used to examine student verbal reports, and validity issues corresponding to each of the response stages were uncovered.

1. D. Hestenes, M. Wells, and G. Swackhamer, “Force concept inventory,” Phys. Teach. 30, 141–158 (1992); R. Beichner, “Testing student interpretation of kinematics graphs,” Am. J. Phys. 62 (8), 750–762 (1994); R. Thornton and D. Sokoloff, “Assessing student learning of Newton’s laws: The force and motion conceptual evaluation and the evaluation of active learning laboratory and lecture curricula,” Am. J. Phys. 66 (4), 338–352 (1998); D. Maloney, T. O’Kuma, C. Hieggelke, and A. Van Heuvelen, “Surveying students’ conceptual knowledge of electricity and magnetism,” Am. J. Phys. 69, S12–S23 (2001); P. Engelhardt and R. Beichner, “Students’ understanding of direct current resistive electrical circuits,” Am. J. Phys. 72 (1), 98–115 (2004); L. Ding, “Designing an energy assessment to evaluate student understanding of energy topics,” Ph.D. thesis, North Carolina State University, 2007.
2. N. Reay, L. Bao, P. Li, R. Warnakulasooriya, and G. Baugh, “Toward the effective use of voting machines in physics lectures,” Am. J. Phys. 73 (6), 554–558 (2005); D. Duncan and E. Mazur, Clickers in the Classroom: How to Enhance Science Teaching Using Classroom Response Systems (Pearson, San Francisco, 2005); E. Suchman, K. Uchiyama, R. Smith, and K. Bender, “Evaluating the impact of a classroom response system in a microbiology course,” Microbiology Educ. 7, 3–11 (2006); I. Beatty, W. Gerace, W. Leonard, and R. Dufresne, “Designing effective questions for classroom response system teaching,” Am. J. Phys. 74 (1), 31–39 (2006); R. Preszler, A. Dawe, C. Shuster, and M. Shuster, “Assessment of the effects of student response systems on student learning and attitudes over a broad range of biology courses,” Life Sci. Educ. 6, 29–41 (2007); J. Caldwell, “Clickers in the large classroom: Current research and best-practice tips,” Life Sci. Educ. 6, 9–20 (2007); N. Reay, P. Li, and L. Bao, “Testing a new voting machine question methodology,” Am. J. Phys. 76 (2), 171–178 (2008); K. Crossgrove and K. Curran, “Using clickers in nonmajors- and majors-level biology courses: Student opinion, learning, and long-term retention of course material,” Life Sci. Educ. 7, 146–154 (2008).
3. P. Kline, A Handbook of Test Construction: Introduction to Psychometric Design (Methuen, London, 1986).
4. L. Ding, R. Chabay, B. Sherwood, and R. Beichner, “Evaluating an electricity and magnetism assessment tool: Brief Electricity and Magnetism Assessment,” Phys. Rev. ST Phys. Educ. Res. 2, 010105-1 (2006).
5. T. M. Haladyna, Developing and Validating Multiple-choice Test Items (Erlbaum, Mahwah, NJ, 2004).
6. L. McDermott and E. Redish, “Resource letter on physics education research,” Am. J. Phys. 67 (9), 755–767 (1999).
7. F. Conrad and J. Blair, “From impressions to data: Increasing the objectivity of cognitive interviews,” in Proceedings of the Section on Survey Research Methods, American Statistical Association (ASA, Alexandria, VA, 1996), p. 1.
8. R. Tourangeau, “Cognitive science and survey methods,” in Cognitive Aspects of Survey Design: Building a Bridge between Disciplines, edited by T. Jabine, M. Straf, J. Tanur, and R. Tourangeau (National Academies Press, Washington, DC, 1984), p. 73.
9. L. Oksenberg and C. Cannell, “Some factors underlying the validity of response in self-report,” Bull. Int. Stat. Inst. 48, 325–346 (1977).
10. S. Norris, “Can we test validity for critical thinking?,” Educ. Res. 18 (9), 21–26 (1989).
11. S. Norris, “Effect of eliciting verbal reports of thinking on critical thinking test performance,” J. Educ. Meas. 27 (1), 41–58 (1990).
12. L. Desimone and K. Le Floch, “Are we asking the right questions? Using cognitive interviews to improve surveys in education research,” Educ. Eval. Policy Anal. 26 (1), 1–22 (2004).
13. G. Willis, P. Royston, and D. Bercini, “The use of verbal report methods in the development and testing of survey questionnaires,” Appl. Cognit. Psychol. 5, 251–267 (1991).
14. L. Hamilton, E. Nussbaum, and R. Snow, “Interview procedures for validating science assessments,” Appl. Meas. Educ. 10 (2), 181–200 (1997).
15. W. Adams, K. Perkins, N. Podolefsky, M. Dubson, N. Finkelstein, and C. Wieman, “New instrument for measuring student beliefs about physics and learning physics: The Colorado Learning Attitudes about Science Survey,” Phys. Rev. ST Phys. Educ. Res. 2, 010101-1 (2006).
16. S. McKagan, K. Perkins, M. Dubson, C. Malley, S. Reid, R. LeMaster, and C. Wieman, “Developing and researching PhET simulations for teaching quantum mechanics,” Am. J. Phys. 76, 406–417 (2008).
17. Reference 1, P. Engelhardt and R. Beichner.
18. D. Clerk and M. Rutherford, “Language as a confounding variable in the diagnosis of misconceptions,” Int. J. Sci. Educ. 22 (7), 703–717 (2000).
19. Reference 2, N. Reay.
20. P. Tao, “Detection of missing and irrelevant information within paper and pencil physics problems,” Res. Sci. Educ. 22, 387–392 (1992).
21. We also found in our subsequent interviews that “the work done by the frictional force on the car” could be interpreted as either cumulative work or instantaneous work. We made further changes, and the final version reads: “…What happens to the cumulative work done on the car by the frictional force, while traveling the 900 meters?”

22. R. Sternberg, Cognitive Psychology, 4th ed. (Thomson/Wadsworth, Belmont, CA, 2006).
23. J. Minstrell, “Facets of students’ knowledge and relevant instruction,” in Research in Physics Learning: Theoretical Issues and Empirical Studies, Proceedings of an International Workshop, edited by R. Duit, F. Goldberg, and H. Niedderer (IPN, Kiel, 1992), p. 110.
24. D. Hammer, “Student resources for learning introductory physics,” Am. J. Phys. 68, S52–S59 (2000).
25. B. Eylon and F. Reif, “Effects of knowledge organization on task performance,” Cogn. Instruct. 1, 5–44 (1984).
26. J. Blair and S. Presser, “Survey procedures for conducting cognitive interviews to pretest questionnaires: A review of theory and practice,” in Proceedings of the Section on Survey Research Methods of the American Statistical Association (ASA, Alexandria, VA, 1993), p. 370.