Multiple-choice tests such as the Force Concept Inventory (FCI) provide useful instruments to probe the distribution of student difficulties on a large scale. However, traditional analysis often relies solely on scores (number of students giving the correct answer). This ignores what can be significant and important information: the distribution of wrong answers given by the class. In this paper we introduce a new method, concentration analysis, to measure how students’ responses on multiple-choice questions are distributed. This information can be used to study if the students have common incorrect models or if the question is effective in detecting student models. When combined with information obtained from qualitative research, the method allows us to identify cleanly what FCI results are telling us about student knowledge.
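The abstract's "concentration analysis" can be made concrete with a short sketch. Assuming the concentration factor is defined as C = [√m/(√m − 1)] · [√(Σᵢ nᵢ²)/N − 1/√m], where m is the number of choices on an item, nᵢ is the number of students selecting choice i, and N is the total number of responses (the function name below is ours, not the authors'), a minimal implementation is:

```python
import math

def concentration_factor(counts):
    """Concentration factor C for one multiple-choice item.

    counts: list of response counts, one entry per answer choice (length m).
    Returns C in [0, 1]: 0 when responses are spread evenly across all
    choices, 1 when every student selects the same choice.
    """
    m = len(counts)
    N = sum(counts)
    if m < 2 or N == 0:
        raise ValueError("need at least two choices and at least one response")
    root_m = math.sqrt(m)
    spread = math.sqrt(sum(n * n for n in counts)) / N
    return root_m / (root_m - 1) * (spread - 1 / root_m)

# All 10 students pick the same choice of 5: C is approximately 1 (fully concentrated).
print(concentration_factor([0, 10, 0, 0, 0]))
# An even spread over 5 choices: C is approximately 0 (no concentration).
print(concentration_factor([2, 2, 2, 2, 2]))
```

A high C on an item with a low score suggests a common incorrect model shared by the class, while a low C suggests responses closer to random, which is the distinction the analysis in the paper exploits.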

1. I. A. Halloun and D. Hestenes, “Common sense concepts about motion,” Am. J. Phys. 53, 1056 (1985); D. P. Maloney and R. S. Siegler, “Conceptual competition in physics learning,” Int. J. Sci. Educ. 15, 283–296 (1993); R. K. Thornton, “Conceptual Dynamics: Changing Student Views of Force and Motion,” Proceedings of the International Conference on Thinking Science for Teaching: The Case of Physics, Rome, September 1994; M. Wittmann, “Making Sense of How Students Come to an Understanding of Physics: An Example from Mechanical Waves,” Ph.D. thesis, University of Maryland, 1999.
2. I. A. Halloun and D. Hestenes, “The initial knowledge state of college physics students,” Am. J. Phys. 53, 1043–1056 (1985); D. Hestenes, M. Wells, and G. Swackhamer, “Force Concept Inventory,” Phys. Teach. 30, 141–158 (1992); D. Hestenes and M. Wells, “A Mechanics Baseline Test,” Phys. Teach. 30, 159–166 (1992); R. J. Beichner, “Testing student interpretation of kinematics graphs,” Am. J. Phys. 62, 750–762 (1994); R. K. Thornton and D. R. Sokoloff, “Assessing student learning of Newton’s laws: The force and motion conceptual evaluation and the evaluation of active learning laboratory and lecture curricula,” Am. J. Phys. 66, 338–352 (1998).
3. R. R. Hake, “Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses,” Am. J. Phys. 66, 64–74 (1998); E. F. Redish, J. M. Saul, and R. N. Steinberg, “On the effectiveness of active-engagement microcomputer-based laboratories,” Am. J. Phys. 65, 45–54 (1997).
4. Lei Bao, “Dynamics of Student Modeling: A Theory, Algorithms, and Application to Quantum Mechanics,” Ph.D. dissertation, University of Maryland, December 1999.
5. Lillian C. McDermott and Edward F. Redish, “Resource Letter PER-1: Physics Education Research,” Am. J. Phys. 67, 755 (1999).
6. See Ref. 4 and Edward F. Redish, “Diagnosing student problems using the results and methods of physics education research,” in Proceedings of the 1999 International Conference of Physics Teachers and Educators, Guilin, China, 18–23 August 1999, edited by Xingkai Luo (to be published); Lei Bao and Edward F. Redish, “Model Analysis: Modeling and Assessing the Dynamics of Student Learning” (unpublished).
7. J. M. Fuster, Memory in the Cerebral Cortex: An Empirical Approach to Neural Networks in the Human and Nonhuman Primate (MIT, Cambridge, 1999); J. R. Anderson and C. Lebiere, The Atomic Components of Thought (Erlbaum, 1998); T. Shallice and P. Burgess, “The domain of supervisory processes and the temporal organization of behavior,” in The Prefrontal Cortex: Executive and Cognitive Functions (Oxford U. P., Oxford, 1998), pp. 22–35.
8. Andrea diSessa, “Toward an Epistemology of Physics,” Cogn. Instruct. 10, 105–225 (1993); J. Minstrell, “Facets of students’ knowledge and relevant instruction,” in Research in Physics Learning: Theoretical Issues and Empirical Studies, Proceedings of an International Workshop, Bremen, Germany, 4–8 March 1991, edited by R. Duit, F. Goldberg, and H. Niedderer (IPN, Kiel, 1992), pp. 110–128; Donald Norman, “Some observations on mental models,” in Mental Models, edited by Dedre Gentner and Albert L. Stevens (Lawrence Erlbaum Associates, Hillsdale, NJ, 1983), pp. 7–14; D. E. Rumelhart, “Schemata: The building blocks of cognition,” in Comprehension and Teaching: Research Reviews (International Reading Association, Newark, DE, 1981), pp. 3–26.
9. More detail is discussed in Ref. 4.
10. Since this case is close to the random situation, where the effect of random variation is large, it is difficult to tell whether an individual response reflects systematic reasoning with one of many different models or simply guessing. Such details can be studied with qualitative methods, e.g., interviews.
11. More discussion of C and Γ can be found in Ref. 4.
12. The topic covered during this semester was Newtonian mechanics. The data were collected by Dr. J. Saul.
13. L. C. McDermott et al., Tutorials in Introductory Physics (Prentice-Hall, New York, 1998). For details on the application of these tutorials at the University of Maryland, see E. F. Redish, J. M. Saul, and R. N. Steinberg, “On the effectiveness of active-engagement microcomputer-based laboratories,” Am. J. Phys. 65, 45–54 (1997).
14.
These naı̈ve conceptions are well documented in the literature. See Refs. 1,2,3,4,5 for more details.
15. D. Hestenes, M. Wells, and G. Swackhamer, “Force Concept Inventory,” Phys. Teach. 30, 141–158 (1992).
16. Specifically, the items are classified as follows: low-performance group: 2, 5, 9, 13, 15, 18, 22, 24, 28; mid-performance group: 3, 6, 7, 8, 11, 12, 14, 16, 17, 20, 21, 23, 25, 26, 29; high-performance group: 1, 4, 10, 19, 27.
17. See Ref. 6.
18. More detail is discussed in Chap. 5 of Ref. 4.