Research-based assessments (RBAs; e.g., the Force Concept Inventory) that measure student content knowledge, attitudes, or identities have played a major role in transforming physics teaching practices. RBAs offer instructors a standardized method for empirically investigating the efficacy of their instructional practices and documenting the impacts of course transformations. Unlike course exams, the common usage of standardized RBAs across institutions uniquely enables instructors to compare their student outcomes over time or against multi-institutional data sets. While the numbers of RBAs and of RBA-using instructors have increased over the last three decades, barriers to administering RBAs keep many physics instructors from using them.1,2 To mitigate these barriers, we have created full-service online RBA platforms (i.e., the Learning About STEM Student Outcomes [LASSO],3 Colorado Learning Attitudes About Science Survey for Experimental Physics [E-CLASS],4 and Physics Lab Inventory of Critical thinking [PLIC]5 platforms) that host, administer, score, and analyze RBAs. These web-based platforms can make it easier for instructors to use RBAs, especially as many courses have been forced to transition to online instruction.

We hope that this editorial can serve as a guide for instructors considering administering RBAs online. In what follows, we examine common barriers to using RBAs, how online administration can remove those barriers, and the research into online administration of RBAs. In the supplementary material,6 we also include a practical how-to for administering RBAs online and sample student email wording.

Below, we list common reasons instructors give for choosing not to use RBAs in their classes and discuss how online administration addresses each concern.

I can't spare 30+ minutes of class time twice in a semester to give an RBA. Administering RBAs online allows students to complete RBAs either at home or in class. Studies have found that, with sufficient incentives, students' participation rates and scores are the same whether they complete the RBA in class or at home (see the discussion in Sec. III).

I don't have the time or TA power to score an RBA. Administering the RBA online removes the step of scoring scantrons or paper surveys and automatically generates spreadsheets of student responses that can be quickly and easily analyzed. Online RBA platforms (e.g., LASSO, E-CLASS, PLIC, and PhysPort DataExplorer7) can automate the scoring process altogether, providing instructors with both the raw student responses and the scored results.
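As a rough illustration of what this automated scoring does behind the scenes, the following Python sketch scores a spreadsheet of multiple-choice responses against an answer key. The file name, column names, and three-item key are hypothetical and are not part of any platform's actual interface.

```python
import csv

# Hypothetical three-item answer key; real RBAs have many more items.
ANSWER_KEY = {"Q1": "B", "Q2": "D", "Q3": "A"}

def score_responses(path):
    """Return (student_id, fraction correct) pairs from a response CSV."""
    scores = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            n_correct = sum(
                row.get(item, "").strip().upper() == key
                for item, key in ANSWER_KEY.items()
            )
            scores.append((row["student_id"], n_correct / len(ANSWER_KEY)))
    return scores

if __name__ == "__main__":
    for student, fraction in score_responses("posttest_responses.csv"):
        print(f"{student}: {fraction:.0%}")
```

The full-service platforms handle this step (and the export of both raw and scored data) without any instructor effort; the sketch is only meant to show how little processing stands between an online response spreadsheet and usable scores.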

I need an online version of the assessment and can't spare the time to set this up myself. Online RBA platforms already host and administer a wide array of physics RBAs for free.

I don't know what my results mean. Online RBA platforms can automatically generate reports that include visualizations and summary statistics to contextualize student outcome data. This can help instructors make sense of their students' performance and inform concrete changes to their instruction.

I don't have access to any comparison data. Online platforms can automate comparisons with their larger data sets. They can also standardize data formats, making it easy to compare or combine course data. These platforms collect course metadata that can identify appropriate comparison points for a wide range of courses and institutions. They can also automatically aggregate and anonymize data sets to support large-scale, multi-institution investigations.
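To make the aggregation and anonymization step concrete, here is a minimal Python sketch of one way standardized course files could be pooled with student identifiers replaced by salted hashes. The column names, file names, and salt are assumptions for the example, not the actual data format used by any of the platforms.

```python
import csv
import hashlib

def anonymize_and_combine(paths, out_path, salt="replace-with-a-secret"):
    """Hash student identifiers and pool rows from several course files."""
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["anon_id", "course", "prescore", "postscore"])
        for path in paths:
            with open(path, newline="") as f:
                for row in csv.DictReader(f):
                    anon_id = hashlib.sha256(
                        (salt + row["student_id"]).encode()
                    ).hexdigest()[:12]
                    writer.writerow(
                        [anon_id, row["course"], row["prescore"], row["postscore"]]
                    )

# Hypothetical usage: combine two terms of the same course into one pooled file.
anonymize_and_combine(["fall_course.csv", "spring_course.csv"], "combined_anon.csv")
```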

Moving an RBA online brings with it several potential concerns, including student engagement, test security, and use of unauthorized resources. Below, we articulate some of these concerns and summarize research findings that begin to address them.11 

Does giving the test online impact how many and which of my students participate? Low-stakes RBAs administered online have yielded participation rates similar to those for the equivalent paper tests administered in class.2 In an experiment in which researchers randomly assigned students at one institution to take the same RBA either online outside of class or on paper in class, participation rates were comparable when instructors administered the RBAs using the recommended practices described in the supplementary material (also accessible at Ref. 8). Moreover, the participation rates did not differ between online and in-person administration based on gender or final course grade.2 Incentive structures strongly influence participation rates; for example, another study9 found an increase in online participation rates compared to historical norms, which the authors attributed to changes in incentives (explicit credit for participation when administered online).

Does taking the test online impact the scores for my course, and can I compare my scores to previous terms? In the first study described above, researchers found that student performance on the online, computer-based tests was equivalent to performance on the same tests administered on paper during class.2 This result held for both concept inventories and attitudinal surveys, suggesting that instructors can compare results from online and in-person administrations. The second study described above9 found slightly lower online scores relative to historical data sets. The authors attributed this effect to the increased participation of lower-performing students in online assessments compared with in-person assessments. Increasing the participation of lower-performing students has the added benefit of reducing bias in the scores and making them more representative.
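For instructors who want to quantify such a term-to-term comparison themselves, a simple effect size is often enough. The sketch below, with invented score lists and a simplified pooled standard deviation (it assumes roughly equal group sizes), shows one way to compare an online term against a previous paper-based term; it is not the analysis used in the cited studies.

```python
from statistics import mean, stdev

def cohens_d(a, b):
    """Simple effect size for the difference between two groups of scores."""
    pooled_sd = ((stdev(a) ** 2 + stdev(b) ** 2) / 2) ** 0.5
    return (mean(a) - mean(b)) / pooled_sd

# Hypothetical post-test fractions correct for two terms of the same course.
online_term = [0.62, 0.71, 0.55, 0.80, 0.67]
paper_term = [0.60, 0.74, 0.58, 0.77, 0.65]

print(f"online mean = {mean(online_term):.2f}, "
      f"paper mean = {mean(paper_term):.2f}, "
      f"d = {cohens_d(online_term, paper_term):.2f}")
```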

What if my students use the internet to look up the answers to the questions? In a study examining students' behaviors when taking research-based assessments online,9 researchers found that only 10% of students showed direct evidence of copying question text, potentially intending to search for the correct answer online. For tests with solutions readily available online, this behavior correlated with increased performance, while for tests without available solutions, it correlated with lower performance. However, because the proportion of students engaging in these behaviors was small, the impact on the overall course average was also small. These results align with other findings2,10 that administering an assessment online has little impact on performance.

What if my students get distracted and don't take the test seriously? Researchers have used browser focus data (i.e., how often and for how long the assessment tab becomes hidden on the student's screen) to determine how common distraction might be during online RBAs. One study9 found that between half and two-thirds of students lost focus on the assessment at least once, though the majority of these events (two-thirds) lasted less than 1 min. Additionally, neither the number nor the duration of focus-loss events correlated with students' scores. Thus, in that study, there was no apparent negative impact on students' scores due to distraction in the online environment.
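To show the kind of summary such focus-loss logs support, here is a small Python sketch that tallies how many students left the assessment tab and how many of those events were brief. The event durations and student labels are invented for illustration and do not come from the cited study.

```python
# Durations (in seconds) of each focus-loss event, per student; values invented.
focus_loss_events = {
    "student_a": [12, 45],    # two brief losses of focus
    "student_b": [],          # never left the assessment tab
    "student_c": [300],       # one long (5 min) loss of focus
}

n_students = len(focus_loss_events)
n_lost_focus = sum(1 for events in focus_loss_events.values() if events)
all_events = [d for events in focus_loss_events.values() for d in events]
short_events = sum(1 for d in all_events if d < 60)

print(f"{n_lost_focus}/{n_students} students lost focus at least once")
if all_events:
    print(f"{short_events}/{len(all_events)} events lasted under 1 min")
```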

What if my students save the test and post it online? Security of RBAs becomes particularly important when administering the assessments online, and, in practice, the nature of these concerns depends on the assessment in question. For example, widely used introductory assessments such as the Force and Motion Conceptual Evaluation (FMCE) or the Brief Electricity and Magnetism Assessment (BEMA) are already available on paid sites such as Chegg or CourseHero.9 Less widely used or newer assessments do not appear to have worked solutions available online to date. In one study, very few students (1%–2% or fewer) attempted to save the test using print commands during online assessments.9 However, it is likely inevitable that questions (and solutions) will become increasingly available to students over time. This makes it all the more important that faculty keep these assessments low stakes and ungraded and provide appropriate instructions that motivate students to take the assessment in the intended spirit, as a learning tool (see the supplementary material for more details).

RBAs are useful measures of the impact of a course on students' content knowledge, attitudes, and identities and have been a major driver of change in physics education. Many physics instructors, however, do not use RBAs for a variety of reasons, and the transition to online courses has only exacerbated the challenges of administering them. We believe that online administration of RBAs, particularly through full-service RBA platforms, can remove many of the barriers to using RBAs. Researchers have found that instructors can collect similar amounts and quality of RBA data whether they administer the assessments in class or online. Further, research has found minimal impact on student scores from the use of unauthorized resources and little evidence of students compromising assessment security when RBAs are administered online. In addition to being free for instructors, full-service online RBA platforms (e.g., LASSO,3 E-CLASS,4 and PLIC5) also contribute to large-scale investigations. We hope that these resources will support physics instructors and researchers.

1. Bethany R. Wilcox, Benjamin M. Zwickl, Robert D. Hobbs, John M. Aiken, Nathan M. Welch, and H. J. Lewandowski, "Alternative model for administration and analysis of research-based assessments," Phys. Rev. Phys. Educ. Res. 12, 010139 (2016).

2. Jayson M. Nissen, Manher Jariwala, Eleanor W. Close, and Ben Van Dusen, "Participation and performance on paper- and computer-based low-stakes assessments," Int. J. STEM Educ. 5, 21 (2018).

5. Cornell Physics Education Research Lab, <http://cperl.lassp.cornell.edu/PLIC>.

6. See supplementary material at https://www.scitation.org/doi/suppl/10.1119/10.0002888 for a practical how-to for administering RBAs online and sample student email wording.

9. Bethany R. Wilcox and Steven J. Pollock, "Investigating students' behavior and performance in online conceptual assessment," Phys. Rev. Phys. Educ. Res. 15, 020145 (2019).

10. Scott Bonham, "Reliability, compliance, and security in web-based course assessments," Phys. Rev. ST Phys. Educ. Res. 4, 010106 (2008).

11. The findings discussed here represent a snapshot of our current understanding of student engagement with online RBAs; as the use of online tests becomes more common and norms change, these findings may become less generalizable.
