In any science field, including physics, it is important to remain abreast of new assessment methods that cater to the 21st-century student. The purpose of this paper is to argue for a move away from lower-order thinking skills (LOTS) in e-assessment in favor of higher-order thinking skills (HOTS), in line with Bloom’s Revised Taxonomy. This builds on the work of Livingston, who argued that multiple-choice questions, while presenting some advantages, cannot evaluate certain forms of knowledge and skills, particularly those that cannot be accurately assessed when the answer appears within a list of options. Livingston proposed constructed-response questions as a fruitful avenue for addressing this shortcoming. Similarly, as Jones points out, while multiple-choice questions can, with careful design, be used to assess higher-order thinking, they are far more commonly used to test recall of factual information, which requires only LOTS. Schultz echoes this view and used randomized non-multiple-choice assignments, delivered via a Learning Management System (LMS), in a chemistry course. The reported benefits include automatic marking, a reduction in copying among students owing to randomization, and the targeting of higher-order learning outcomes, since students were required to work out the answer rather than choose it from a list. While Schultz did not use the term constructed-response question, Livingston defines such questions as “questions that require the test taker to produce the answer, rather than simply choosing it from a list.” In this paper, we aim to illustrate a range of new possibilities within physics for using constructed-response questions in an increasingly technologically advanced learning and teaching environment. While we use the Sakai LMS, the examples we offer are equally valid for other systems.
The use of constructed-response questions to support physics learning and teaching within Sakai is, however, an under-researched area. This is especially the case for the calculated and numeric response questions, and their complementary tools, that we showcase.
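To make the idea of a calculated question concrete, the following is a minimal sketch in Python of the two mechanisms described above: per-student randomization of the question variables (which discourages copying) and automatic marking of a numeric response within a tolerance. The function names, the projectile example, and the 2% tolerance are illustrative assumptions, not Sakai’s actual question format or grading API.

```python
import random

def make_question(seed):
    """Generate one student's variant of a calculated question.

    Each student (identified by a seed) receives a different launch
    speed v0, and the accepted answer is computed from the same
    physics formula, mirroring how an LMS calculated question
    randomizes its variables. (Illustrative only.)
    """
    rng = random.Random(seed)
    g = 9.8                           # gravitational acceleration, m/s^2
    v0 = rng.uniform(10.0, 30.0)      # launch speed, m/s
    prompt = (f"A ball is thrown straight up at {v0:.1f} m/s. "
              f"How high does it rise, in m?")
    answer = v0 ** 2 / (2 * g)        # h = v0^2 / (2g)
    return prompt, answer

def grade(submitted, answer, tolerance=0.02):
    """Auto-mark a numeric response within a relative tolerance (here 2%)."""
    return abs(submitted - answer) <= tolerance * abs(answer)
```

Because each seed yields different numbers, two students cannot simply share a final answer, yet every variant is marked automatically against the same formula; the tolerance accommodates rounding in the student’s working.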

2. S. A. Livingston, “Constructed-response test questions: Why we use them; how we score them,” R & D Connections 11, 1–8 (2009).
3. S. M. Jones, “Assessing the science knowledge of university students: Perils, pitfalls and possibilities,” J. Learn. Des. 7, 16–27 (2014).
4. M. Schultz, “Sustainable assessment for large science classes: Non-multiple choice, randomized assignments through a Learning Management System,” J. Learn. Des. 4, 50–62 (2011).
5. G. Crisp, Teacher’s Handbook on e-Assessment (Australian Learning and Teaching Council Ltd, 2011), http://transformingassessment.com/sites/default/files/files/Handbook_for_teachers.pdf.
6. K. Mullen and M. Schultz, “Short answer versus multiple choice examination questions for first year chemistry,” Int. J. Innovation Sci. Math. Educ. 20, 1–18 (2012).
7. J. W. Gikandi, D. Morrow, and N. E. Davis, “Online formative assessment in higher education: A review of the literature,” Comput. Educ. 57, 2333–2351 (2011).
8. J. C. González de Sande, “Calculated questions and e-cheating: A case study,” in Education Applications & Developments, edited by M. Carmo (InScience Press, Lisbon, 2015), pp. 92–100.
9. J. Engelbrecht and A. Harding, “Interventions to improve teaching and learning in first year mathematics courses,” Int. J. Math. Educ. Sci. Technol. 46, 1046–1060 (2015).
10. G. Crisp, “Engagement and empowerment: New opportunities for growth in higher education,” EDU-COM 2006 Conf. Proc. 71, 144–153 (Nov. 2006).
11. G. Conole and N. Sclater, “Using evaluation to inform the development of a user-focused assessment engine,” Int. Comput. Assisted Assess. Conf. Proc. 9 (July 2005).
12. Gilly Salmon, Management School, University of Liverpool, http://www.gillysalmon.com/five-stage-model.html.
13. R. L. Kung, “Teaching the concepts of measurement: An example of a concept-based laboratory course,” Am. J. Phys. 73, 771–777 (Aug. 2005).
14. S. Allie, A. Buffler, B. Campbell, F. Lubben, D. Evangelinos, D. Psillos, and O. Valassiades, “Teaching measurement in the introductory physics laboratory,” Phys. Teach. 41, 394 (Oct. 2003).
15. D. G. Taggart, “Implementing online content to improve learning in a large engineering freshman programming course,” ASEE NE 2016 Conf. Proc. (April 2016).