Research-based assessments (RBAs) measure how well a course achieves discipline-specific learning outcomes. Educators can use RBA results to guide instructional choices and to justify requests for resources to implement and sustain instructional transformations. One obstacle to using RBAs, however, is the lack of comparative data, particularly given the skew in the research literature toward calculus-based courses at highly selective institutions. In this article, we provide a large-scale dataset and several tools that educators can use to gauge how well their introductory physics courses foster students' conceptual understanding of Newtonian physics; the dataset and tools are included in the supplemental materials. Educators and administrators often target courses with high drop, withdrawal, and failure rates for transformation to student-centered instructional strategies. RBAs, together with the comparative tools presented here, let educators counter critiques that a transformed course was simply made "easier" by showing how well the course supports physics learning relative to similar courses at other institutions. Educators can also use the tools to track course efficacy over time.

1. Stephen Kanim and Ximena C. Cid, "The demographics of physics education research," Phys. Rev. Phys. Educ. Res. 16, 1–17 (2020).
2. Readers may view these materials at TPT Online, https://doi.org/10.1119/5.0023763, under the Supplemental tab.
3. Xochith Herrera, Jayson M. Nissen, and Benjamin Van Dusen, "Student outcomes across collaborative learning environments," Proc. 2018 Phys. Educ. Res. Conf., 1–4 (2018).
4. Jayson M. Nissen, Robert M. Talbot, Amreen Nasim Thompson, and Ben Van Dusen, "Comparison of normalized gain and Cohen's d for analyzing gains on concept inventories," Phys. Rev. Phys. Educ. Res. 14, 1–12 (2018).
5. Bethany R. Wilcox and Steven J. Pollock, "Investigating students' behavior and performance in online conceptual assessment," Phys. Rev. Phys. Educ. Res. 15, 1–10 (2019).
6. Jayson M. Nissen, Manher Jariwala, Eleanor W. Close, and Ben Van Dusen, "Participation and performance on paper- and computer-based low-stakes assessments," Int. J. STEM Educ. 5, 1–17 (2018).
7. Scott Bonham, "Reliability, compliance, and security in web-based course assessments," Phys. Rev. ST Phys. Educ. Res. 4, 1–8 (2008).
8. Joseph L. Schafer, "Multiple imputation: A primer," Stat. Methods Med. Res. 8, 3–15 (1999).
9. Jayson M. Nissen, Robin Donatello, and Ben Van Dusen, "Missing data and bias in physics education research: A case for using multiple imputation," Phys. Rev. Phys. Educ. Res. 15, 1–15 (2019).
10. Y. Sakamoto, M. Ishiguro, and G. Kitagawa, Akaike Information Criterion Statistics (D. Reidel, Dordrecht, The Netherlands, 1986).
11. Matthew A. Kraft, "Interpreting effect sizes of education interventions," Educ. Res. 49(4), 241–253 (2020).
12. Scott Freeman, Sarah L. Eddy, Miles McDonough, Michelle K. Smith, Nnadozie Okoroafor, Hannah Jordt, and Mary Pat Wenderoth, "Active learning increases student performance in science, engineering, and mathematics," PNAS 111(23), 8410–8415 (2014).
13. Richard R. Hake, "Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses," Am. J. Phys. 66, 64–74 (Jan. 1998).