We have developed a complete collection of freely available instructional materials to assist faculty in creating a student-centered quantum mechanics (QM) class that engages students while supporting them in developing both sense-making and calculational skills. Our materials are grounded in research on students' understanding of quantum mechanics and are intended to be adaptable to a variety of instructional settings and faculty styles or preferences. They were designed for a spins-first instructional paradigm and include a set of learning goals, concept (“clicker”) questions, pre-lecture surveys, and homework and exam questions, along with example lecture notes from three instructors at three different institutions. In this work, we describe what active learning can look like in the upper division, present each of the instructional tools along with a few representative examples, and discuss how these materials are used at each of our institutions, illustrating how they may be adapted for use elsewhere.

Quantum mechanics (QM) stands at the core of much of modern physics and constitutes an essential foundation for both the skill set and identity of many physics faculty and students. A middle-division undergraduate course on the fundamentals of quantum mechanics can be a highly sought-after teaching assignment. At the same time, quantum mechanics has a reputation amongst many students, supported in part by popular media, of being an arcane, abstract, unintuitive, and difficult topic. It is not hard to find quotations from famous physicists that support this reputation: “…Those who are not shocked when they first come across quantum theory cannot possibly have understood it” (N. Bohr) or “I think I can safely say that nobody understands quantum mechanics” (R. Feynman). As a result, students may enter a quantum class with a mix of excitement and trepidation.

Given the importance of the topic for students' development, the teaching and learning of undergraduate quantum mechanics has been the subject of a growing body of physics education research (PER). (Note: We have compiled a set of additional references1 for the interested reader in the supplementary material, including many links to helpful quantum course materials.) The bulk of this research has focused on student difficulties with individual topics, identifying challenges and common ideas (including those that may be incorrect and counterproductive, or correct and useful2,3), and has provided curricular materials or other pedagogical tools.4–9 Particular attention has been paid to the transition from classical to quantum ideas, representations,10 and mathematical tools.11,12 There are also studies of sociocultural aspects related to the attitudes and concerns of students13 and a variety of approaches to teaching quantum mechanics.

Our group conducts research in student learning of quantum mechanics in large and small classes across a diverse set of institutions. We have developed a set of learning goals for quantum mechanics instruction in collaboration with many faculty; we have probed student learning and difficulties in QM in different instructional paradigms; and we have conducted research on student understanding of a variety of topics related to those learning goals.14,15 Part of our intent was to develop a set of accessible and flexible materials that are easy for faculty to use and modify in a diverse range of institutions.

The primary goal of this paper is to introduce an assortment of freely available, readily adaptable teaching materials based on research findings. We also provide guidance to help faculty determine how to introduce such teaching materials and practices into their classes. Note that we are not presenting a particular course or curriculum, nor do we prescribe teaching methods. In our experience, the teaching of quantum mechanics is personal and highly dependent on the needs, interests, and backgrounds of the instructor and students. Instead of prescribing what and/or how faculty should teach, our aim is to share a broad and easily modified set of materials to support faculty who want to make research-based modifications to their teaching.

These resources16 were developed based on research into student learning using an iterative design process. While our materials have been field-tested in our classrooms, many have not yet been extensively tested for student learning gains. The adaptable and modular design provides flexibility for instructors to make minor modifications or substantial changes to their teaching. While this allows for different implementations based on the needs of individual institutions and instructors, it makes validation of student learning gains in all environments virtually impossible. However, all our materials promote student engagement with the content, and we have seen the potential for them to be powerful learning tools.

One of the primary and best-researched outcomes from physics education research is that students at all levels benefit from interactive engagement.17 This refers to approaches to teaching and learning that are “designed at least in part to promote conceptual understanding through engagement of students in heads-on (always) and hands-on (usually) activities which yield immediate feedback through discussion with peers and/or instructors.”18 There is no single pedagogy associated with this idea, and the tools and approaches to implement it can vary widely.

In the upper division, we borrow some interactive-engagement techniques popularized in introductory physics.19 However, it is important to consider the differences between upper-division and introductory students. By their third year, physics majors have further matured with respect to mathematical tools, problem solving skills, and background knowledge. They are in smaller classes, and are, as a whole, more motivated and more advanced as learners. As such, we rely on a mix of lecture and interactive engagement strategies with frequent opportunities for students to test themselves and to articulate and refine their thinking. Frequent low-stakes formative assessments (e.g., concept or “clicker” questions) can be helpful, along with the articulation of learning goals, feedback and assessment mechanisms, and opportunities for small-group discussions. Student attitudes and buy-in are important: It can be helpful to explicitly discuss your reasons for using these pedagogical tools and the expected classroom norms around them.

In Sec. V, we describe our transformed course materials. They are built on the physics education research literature and target a selection of known student difficulties. A few concrete examples of these difficulties that we have focused on include: technical fluency in expansion in a basis and change of basis,15,20 calculation and interpretation of expectation values, issues with time evolution,5 calculation and interpretation of measurement in QM,14 and the (often) challenging transition for students from discrete to continuous bases. The materials we have developed can be mixed and matched to fit the style and circumstances of each individual course. Indeed, as described in Sec. VI, the authors implement these materials in different ways in each of our courses. Our lectures are not radically different from what might be considered a traditional lecture. However, the inclusion of interactive elements (clickers, spontaneous Q&A while lecturing, use of simulations, and occasional small-group discussions) results in a course centered on the students and their learning.

All three of the authors' quantum courses are taught from a spins-first perspective using the textbook by McIntyre.21 This is the instructional paradigm we consider in our research and when designing instructional materials. A spins-first paradigm is in contrast to a position-first (or wave-functions-first) approach, in which the time-independent Schrödinger equation is introduced as a differential equation in the context of position-space wave functions. In the spins-first paradigm, the time-independent Schrödinger equation is introduced later, as the energy eigenequation, in the context of operators and spin-1/2 states.

Both perspectives cover mostly the same content but in a different order and with varying degrees of emphasis. In position-first QM, students begin by solving the Schrödinger equation for a variety of one-dimensional potentials before moving on to three dimensions. This treatment generally focuses on the mathematics of second-order differential equations, both ordinary and partial, with time dependence and dynamics generally coming second, and Dirac notation and spin later still. Questions of interpretation and issues of measurement are often left as secondary topics, if considered at all. Student challenges in this domain22 are commonly associated with the abstract interpretation of a wave function and a lack of a conceptual framework to help make sense of unfamiliar terminology and methods. Such an approach also typically does not become experimentally grounded until later in the course, when more realistic physical situations are considered.

In contrast, in the spins-first paradigm, the system in question is a mathematically simple spin-1/2 object and the Schrödinger equation can be solved as a 2 × 2 matrix equation. Instruction foregrounds the postulates of QM, illustrating them in the spin-1/2 context before introducing wave functions in position and momentum space and solving one- and three-dimensional potentials (including hydrogen).
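As a minimal illustration of what this looks like in practice (a sketch of the standard textbook calculation, in our own notation rather than reproduced from our materials): for a spin-1/2 particle in a uniform magnetic field B ẑ, the Hamiltonian is Ĥ = −γB Ŝz, and the energy eigenequation is an ordinary 2 × 2 linear-algebra problem,

\[
\hat{H}\,|\psi\rangle = E\,|\psi\rangle
\;\;\longrightarrow\;\;
-\frac{\gamma B \hbar}{2}
\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}
\begin{pmatrix} a \\ b \end{pmatrix}
= E
\begin{pmatrix} a \\ b \end{pmatrix},
\]

with eigenvalues E± = ∓γBℏ/2 and eigenstates |±⟩, and no differential equations required.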

This approach grounds the postulates of quantum mechanics in experiment, emphasizing the nature of science by focusing on meaning and physical sense-making, not just formalism and symbol manipulation. The language of operators, Dirac notation,23 and relevant (but still reasonably elementary) principles of linear algebra are introduced right away. The calculations are considerably easier for most students than those involving partial differential equations and continuous variables, as they involve only basic algebra and the manipulation of 2 × 2 matrices. The segue from classical to quantum ideas is highlighted early on, providing connections to familiar topics (such as magnetic moments). Opportunities for sense-making are present from the start (e.g., when considering chained Stern-Gerlach experiments, where considerable physical intuition and consistency of outcomes are easy to provide). A variety of Stern-Gerlach simulations can be found online (for example, Refs. 24 and 25) that allow students to “experiment” and further ground their newly developing quantum intuitions. Some challenges to the spins-first approach include the fact that spin is a new, abstract concept and that the transition from discrete, two-state systems to continuous wave functions can be difficult for students. The materials presented below explicitly address these challenges.

Regardless of the instructional approach and content ordering used, there is likely some helpful content within our materials, as most canonical topics are introduced regardless of the starting point. Most of these materials help students develop intuitions and expert habits of mind in the context of quantum mechanics. While we explicitly discuss with students that there are aspects of quantum that might be referred to as weird (i.e., not expected based on classical models and our everyday experiences), it is not that weird! Quantum mechanics is built on a rigorous postulate-driven mathematical framework, and students can readily develop strong conceptual understanding of the principles and methods we commonly use to make sense of results in quantum systems.

The process of developing our materials involved articulating a set of learning goals, researching student understanding as related to these goals, and developing instructional materials. Once these materials were field-tested in our classrooms, we collected more data from students, documenting their effectiveness and revising them as needed. In this section, we will discuss our research and development process in more detail.

The first step to build flexible, research-based instructional materials was to develop consensus learning goals. To that end, we identified faculty who frequently teach or design quantum mechanics courses nationwide and invited them to attend a working meeting. In Spring 2017, over 20 faculty experts met with the aim of articulating a set of learning goals suitable for a range of quantum courses in a variety of different types of institutions and student populations. At the meeting, participants discussed different curricular approaches, student populations, high priority needs, existing physics education research literature, and the desires of faculty in different classroom environments. Then the participants began articulating learning goals at a variety of levels and granularity spanning different textbooks and instructional paradigms.

We later compiled an informal list of consensus learning goals, framed as skills, concepts, and habits of mind that we want students to be able to do or have by the end of the course.16 For instance, one such goal was that our students should be able to “…distinguish between expectation values, allowed values, and most probable value.” An example of a broader goal, admittedly vaguer and harder to operationalize or assess, but highly valued by most faculty, was that students should be better able to “…connect mathematical results to physics.” These learning goals do not span everything covered in our courses—we chose to focus on topical areas that received the most attention from the majority of the faculty participants. In our classes, we share these goals with students and use them to help guide instruction and assessment.

The development of our materials is not a linear process. Following the cycle of curriculum development pioneered by the Physics Education Group at the University of Washington,6 we use an iterative process of research into student thinking, development of instructional materials, and testing of their effectiveness, and we then use those results to refine, and sometimes redesign, our materials.20

Our process generally starts with an idea that comes up in our experience in the classroom. Students will ask a question during class or office hours, a number of students will give intriguing responses to a homework or exam question, or we will prepare to teach a topic and just not feel confident in the pedagogical tools we are currently using. These initial ideas are then discussed during research meetings, and if multiple authors agree, we follow up by collecting data on student thinking by asking targeted questions on homework assignments, ungraded surveys (pre-flights), or exams. For more in-depth investigations into student thinking, we then interview students. Based on the findings, we then begin to design materials for use in the classroom. These materials are often tested with small numbers of students in focused interviews before they are further refined and field-tested in the classroom. Similar data collection techniques are generally used once again to compare student thinking with and without the new materials. Generally, several years of refinement occur before we reach a state of equilibrium with our materials. Often, we share draft versions of our materials with colleagues and collect their feedback to improve the materials further. Some of our materials are well-tested, and their effectiveness is published.10,20

Our philosophy towards student learning in quantum mechanics is that students are important characters in their learning journey. They come into our courses with ideas and continue to refine these ideas while learning new concepts in our courses. Occasionally, these ideas are either incorrect or are applied incorrectly. In these situations, our materials may act to correct them. Other times, students struggle to fit all the ideas together, often evidenced by students not applying the same concepts or methods to questions consistently. In these situations, our materials instead work to provide scaffolding so that students can construct the relevant knowledge structures for themselves. Other materials are built on the idea that students need practice with concepts and applying them in varied contexts. No two instructional materials are designed using the exact same principles, but all are based on the belief that students must do much of the legwork in their learning and we, as the instructors and curriculum designers, act as architects who provide a structure and environment that makes learning possible.

In this section, we describe each type of instructional resource we have developed and give an example (or two) of each. All materials can be found at our PhysPort webpage.16 Each example provided is from the first third of an introductory quantum mechanics course, where the topics of operators, measurements, and expectation values are discussed in detail for spin-1/2 systems. At this early stage of a spins-first QM course, we find many students are naturally still unfamiliar or uncomfortable with Dirac notation, the interpretation and solving of eigenequations, and the basic quantum rules involving probability amplitudes. The examples chosen here were designed to deepen student thinking and ultimately improve their understanding of these ideas. Materials designed for use later in the semester are similar in style but tackle progressively more advanced topics. The materials available for download on our webpage also include additional elements, including lecture notes from the authors' courses and instructor guides and tips, to assist faculty in incorporating these materials into their courses.

Pre-flight questions26 are a small set of questions given to students (typically online) before they attend lecture. Generally, they are not graded for correctness, but students receive a small amount of participation credit for completing them. Pre-flight questions can serve a number of purposes, including: asking students to recall relevant content from a previous course; assessing understanding of recent content; having them describe what they learned from a reading assignment; or learning about student concerns at the beginning or middle of the semester.

For example, very early in the semester it is important for students to remember the complex plane and Euler's formula, which they have nominally learned in a previous course. The pre-flight might ask them some complex number questions to help them prepare for the next lecture. For reading questions, we have asked what they have learned from the reading, what they are confused about, and what questions the reading brought to mind. A common type of pre-flight question reinforces and/or assesses content already covered in lecture, as can be seen in the example in Fig. 1.

Fig. 1. Example of two pre-flight questions that would be given after this content was covered in lecture, but typically before students have had a chance to complete a homework assignment. Students are not graded for correctness but do get feedback on their answers.
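For the complex-number review mentioned above, a representative pre-flight item (our illustrative construction, not one of the actual questions) might ask students to rewrite a complex phase in Cartesian form and find its modulus:

\[
e^{i\theta} = \cos\theta + i\sin\theta,
\qquad\text{so}\qquad
e^{i\pi/4} = \frac{1+i}{\sqrt{2}}
\quad\text{and}\quad
\left|e^{i\pi/4}\right| = 1.
\]

Items like this surface whether students recall that a pure phase has unit modulus before that fact is needed in probability calculations.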

The advantage of asking such questions on the pre-flight instead of on a homework assignment is that it allows students to more immediately test their understanding. (Since they are not graded for correctness, some answers may be given immediately as feedback.) The instructor can look at the submitted answers to quickly get a sense of students' comprehension of the topics and make instructional decisions based on this information. Additionally, the pre-flights act as a low-stakes opportunity for students to assess their own understanding and take corrective actions, such as asking questions in lecture or attending office hours.

A note of caution for implementing pre-flights is that students can view these as highly useful or find them to be an annoying course element, depending on the implementation. In our classes, we make sure to occasionally reference the pre-flight questions in class to underscore their importance. We go over some solutions in class, suggest that students who struggled on the pre-flight might want to come to office hours to discuss the solutions, or provide additional readings or practice problems to help get them up to speed. When we show students that we value these tools as instructors, we have found that student comments on end-of-semester evaluations have been generally positive about the pre-flight assignments.

Concept questions (also known as “clicker questions”) are multiple-choice questions administered throughout lecture, to which students respond either using a “clicker” device or, without technology, using voting cards, small whiteboards, holding up a number of fingers corresponding to the different answer options, or simply raising their hands. Concept questions are a simple way to add a highly interactive element to lectures. Every faculty member who introduces concept questions into their classes inevitably does it in their own way, and there exist many references to support their implementation (see the supplementary material for suggestions1). If one wants to make a class more interactive, we have found that frequent, low-stakes, small-group, conceptually focused discussions of this type are one of the lowest-threshold entry points to interactive-engagement pedagogy available.

We typically ask a handful of concept questions throughout our class periods, encouraging students to discuss the answers in pairs or small groups. More information about our specific implementations can be found in Sec. VI. Our concept questions are intended to help students make meaning of the math, practice techniques, consider additional examples, and/or lead into the next topic. They are referred to as “formative assessment”—scaffolding student understanding, reducing cognitive load by externalizing their thinking, and providing low-stakes, frequent feedback to us and to our students about how they are doing and what we should focus on next in class.

We strive to develop questions with a wide range of difficulties—a concept question that only about half the class initially answers correctly can be highly effective at generating productive conversations, whereas a question that almost everyone gets wrong can alert the class to a challenging idea or common misconception. It may be important in such situations to reassure students that this was the intent! Alternatively, a question that almost all students get correct can be useful for confidence-boosting and letting the instructor know that it's safe to move on to the next topic.

In lecture, some examples of “concept tests” at an early stage of the term might include more formal notational questions, as seen in Fig. 2. In a small class where students can answer individually, the question might be initially framed as open-ended for discussion. In our student population, we typically find that about two-thirds of students get this correct after rather heated peer discussion. Student reasoning shared in the ensuing whole-class discussion typically involves representing these objects as matrices and noting that one cannot multiply a 2 × 1 column by a 2 × 2 matrix on its right.

Fig. 2. Example of a concept question designed to test student understanding of the notation being used for operators and vectors.
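That matrix-dimension reasoning can be summarized compactly (a sketch of the argument students typically give, with Â a generic operator on the two-dimensional spin space):

\[
\underbrace{\hat{A}}_{2\times 2}\,\underbrace{|\psi\rangle}_{2\times 1} \;\rightarrow\; 2\times 1\ \text{(defined)},
\qquad
\underbrace{|\psi\rangle}_{2\times 1}\,\underbrace{\hat{A}}_{2\times 2} \;\rightarrow\; \text{undefined},
\qquad
\underbrace{\langle\psi|}_{1\times 2}\,\underbrace{\hat{A}}_{2\times 2} \;\rightarrow\; 1\times 2\ \text{(defined)}.
\]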

Another in-class question we have used is shown in Fig. 3. By this point in our term, student discussion points focus on squaring the magnitude and how to deal with phase. Scores on this question after discussion are generally high (in our classes, >90%), providing some confidence and skill practice for most students. Follow-up discussion can include asking what happens if one repeats this measurement immediately after or instead follows it with a measurement in the x-direction.

Fig. 3. Example of a concept question designed to be an immediate assessment of whether or not students understand how to calculate probabilities for a state written in the basis of the operator being measured.
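The underlying computation for questions like the one in Fig. 3 is the Born rule applied in the basis of the measured operator; a generic instance (the coefficients here are our own, not those in the figure) is

\[
|\psi\rangle = \tfrac{1}{\sqrt{3}}\,|+\rangle + \sqrt{\tfrac{2}{3}}\,e^{i\pi/4}\,|-\rangle,
\qquad
\mathcal{P}\!\left(S_z = +\tfrac{\hbar}{2}\right) = \left|\langle +|\psi\rangle\right|^2 = \tfrac{1}{3},
\]

where the relative phase e^{iπ/4} drops out of the probability, which is exactly the “how to deal with phase” discussion point noted above.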

As with the rest of our materials, our bank of editable concept questions can be found in Ref. 16. We have found that it is helpful to have an idea of how students will respond to a question, and therefore, we have provided rough estimates for our student populations (where available) along with our answers, our goals for the question, and typical pitfalls or interesting teaching moments from our class experiences.

Our materials include a set of tutorial worksheets, approximately one for every week of instruction, that cover many topics throughout a semester of QM. They are meant to provide students with a deeper conceptual understanding of material, to address common incorrect ideas, and/or to scaffold student development as expert problem solvers by breaking problems up into smaller pieces or by showing multiple ways to attack a problem. Tutorials were also designed to help students develop metacognitive strategies and communication skills. The tutorials are not collected or graded—they are designed somewhat in the tradition of the Tutorials in Introductory Physics6,27 and are intended to be worked on by students in small groups with a high instructor-to-student ratio. Written solutions are not provided, but we will often discuss solutions in class and in office hours.

Our tutorials have been influenced most notably by our interactions with the University of Washington Physics Education Group27 and the Oregon State University Physics Department Paradigms program8,28 and instructional materials, among many others.4,6,7 Each of these influences has a very different pedagogical style, and you will see elements of each in our materials. Tutorials are one of the more challenging of our materials to implement in practice, as they can be logistically difficult, especially in large classes, and require the instructor to “hand over” considerable lecture time to students. We have provided detailed instructor guides to help faculty. As you will see in Sec. VI, each of the authors administers the tutorials in a different way in their classes, largely due to institutional differences. For faculty with space or time limitations (which is almost all of us), our group has begun work on creating online versions of the tutorials that can be assigned as out-of-class homework.

The example questions seen in Fig. 4 come from a tutorial entitled “Quantum Mouse,” which introduces a whimsical quantum mouse with measurable properties such as “eye size.” The quantum mouse was created as a fictional extension of the spin-1/2 context. We have found that students often over-generalize from the spin-1/2 context, where non-commuting observables Ŝz and Ŝx frequently result in 50/50 probability splits. In the mouse example, students must go back to the postulates of quantum mechanics and rely on the formalism to determine the correct answers. This quantum mouse example is mathematically analogous to common problems with “tilted” spin-1/2 states, which is something we focus on in the homework and future lectures.

Fig. 4. Reproduced excerpt from the Quantum Mouse tutorial. These questions form the second page of the (five-page) tutorial, after students are first introduced to the quantum mouse and the eye-size and mood operators. On the actual tutorial worksheet, students are provided with blank space after each question where they can write their responses.

For instance, eye-size is an observable (measure the diameter of the pupil) with corresponding Hermitian operator Ŝ that satisfies the eigenequations

Ŝ|•⟩ = 1 mm |•⟩,  (1)

Ŝ|◉⟩ = 2 mm |◉⟩,  (2)

where the symbols • and ◉ refer to small and wide eyes, respectively. In the tutorial we note

“Being either small-eyed (|•⟩) or wide-eyed (|◉⟩) is totally normal. In fact, let us assume it is orthonormal (and complete).”

Follow-up open-ended questions include asking students to identify the eigenvalues, eigenvectors, and operators in Eqs. (1) and (2) and to explain what “orthonormal” means in this context. We then introduce a second observable, the quantum mood M̂, with eigenstates happy |☺⟩ and sad |☹⟩ (with unitless eigenvalues +1 and −1, respectively). After confirming what measurements of mood can yield, we propose a relationship between the two bases and ask calculational and interpretational questions involving the probabilities of various sequential measurements, as seen in Fig. 4.
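To make the sequential-measurement logic concrete, consider a hypothetical basis relationship (the coefficients below are our own invention; the tutorial's actual relationship appears on the worksheet):

\[
|\text{☺}\rangle = \tfrac{1}{\sqrt{5}}\left(|•\rangle + 2\,|◉\rangle\right),
\qquad
|\text{☹}\rangle = \tfrac{1}{\sqrt{5}}\left(2\,|•\rangle - |◉\rangle\right).
\]

A happy mouse measured for eye size then yields 2 mm with probability |⟨◉|☺⟩|² = 4/5 (not the 50/50 split students often over-generalize from spin-1/2), and the state collapses to |◉⟩, so an immediately repeated eye-size measurement returns 2 mm with certainty, while a subsequent mood measurement returns happy with probability 4/5.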

In general, the quantum mouse tutorial is fun for students. It brings some levity and concreteness to the sometimes formal and abstract learning of Dirac notation. It supports and develops a conceptual understanding of the technical vocabulary and procedures associated with eigenequations. Additionally, as you will see in our example exam question in Sec. V E, the quantum mouse makes a reappearance with new operators that allow us to assess students' proficiency using an operator they have not seen before.

The mouse tutorial focuses on application of ideas learned in class to a new context. Other tutorials are focused instead on, for example, use of multiple representations, qualitative methods for solving problems, practice with important calculational skills, addressing common incorrect ideas, or preparation for future learning. The techniques used in each tutorial are tied to the content and learning needs for that topical area of the course.

Our homework sets are designed to meet a variety of course goals, including supporting conceptual understanding and problem-solving skills. In particular, many of our homework sets seek to improve representational fluency by having students consider questions with a mix of conceptual, mathematical, and notational components. Many of our homework problems are based on end-of-chapter textbook problems but have been modified to address additional learning goals described in Sec. IV A. Many times this can be accomplished by merely adding sense-making components. Typical modifications ask students to: sketch, plot, estimate, find a limit, describe a mathematical solution in words, interpret, define units, check an answer, explain their reasoning, or solve using two methods and compare. Some of our questions connect abstract problems to real-world situations or physical contexts; others ask students to draw on common physicists' tools such as approximations, expansions, and estimations.

In Fig. 5, we show a question that might appear on the fourth or fifth homework of the semester. The question setup comes from the textbook (Problem 3.7 in Ref. 21). We added multiple subparts to encourage students to visualize the setup, think explicitly about limiting cases, connect the mathematical formalism to their classical understanding of precession, and answer qualitative as well as quantitative questions. This example shows the way that a mix of abstract and concrete questions is often grouped together.

Fig. 5. Example of a homework problem (slightly paraphrased here). The setup is a textbook problem with additional elements to match course learning goals beyond just calculation.
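A sketch of the kind of limiting-case and classical-connection reasoning such subparts target (a generic Larmor-precession setup of our own, not the specific problem in Fig. 5): for Ĥ = ω₀Ŝz with the spin initially along +x,

\[
|\psi(t)\rangle = \tfrac{1}{\sqrt{2}}\left(e^{-i\omega_0 t/2}\,|+\rangle + e^{+i\omega_0 t/2}\,|-\rangle\right),
\qquad
\langle \hat{S}_x\rangle(t) = \frac{\hbar}{2}\cos(\omega_0 t),
\]

so the expectation value precesses at the classical Larmor frequency, and the t → 0 limit recovers the initial value ℏ/2, the sort of sanity check students are asked to articulate.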

Exam questions are meant to assess student learning in a more formal and structured way. When designing our exam questions, we strive to create questions that directly address our learning goals and that provide students the opportunity to show us what they are able to do. Our first exam typically includes a chained (sequential) Stern–Gerlach apparatus. These types of questions allow us to assess students' ability to convert between bases as appropriate, collapse the state upon measurement, and interpret expansion coefficients as probability amplitudes. They also require that students distinguish “measured values” (eigenvalues) from resulting states.

A representative exam question is an extension of the quantum mouse tutorial example above. As can be seen in Fig. 6, we introduce a two-state quantum cat with similar properties to the quantum mouse. The use of the novel context prevents students from relying on memorized facts about basis change for spin-1/2 particles and provides them with the opportunity to compute probabilities, distinguish eigenvalues from eigenstates, and perform change-of-basis operations. Part (c) of this exam question is more challenging because it requires that students use the definition of an expectation value to work “backwards” to determine the probability of one of the outcomes.

Fig. 6. Example of an exam question that increases in difficulty for students from parts (a) through (c). Part (a) asks a question that is familiar to students but in an unfamiliar context. Part (b) requires more calculation, while part (c) is a question students likely have not seen before and must use the definition of expectation value to solve. A complete set of sample exam questions is included in the downloadable materials.
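The “backwards” step in part (c) rests on a short piece of algebra (our sketch, for a generic observable with eigenvalues ±1, such as the mood operator above):

\[
\langle \hat{M}\rangle = (+1)\,\mathcal{P}_{+} + (-1)\,\mathcal{P}_{-}
\quad\text{with}\quad
\mathcal{P}_{+} + \mathcal{P}_{-} = 1
\quad\Longrightarrow\quad
\mathcal{P}_{+} = \frac{1 + \langle \hat{M}\rangle}{2},
\]

so a stated expectation value lets students recover the individual outcome probabilities rather than computing them directly from amplitudes.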

Our exam questions are almost always multi-part questions that increase in difficulty with each part. This allows all students to answer portions of the question correctly. Almost always it is possible to fairly grade the later parts of the question even if students answer a previous part incorrectly. While it is not possible to assess everything on a final exam, we strive to assess as many of the learning goals as possible. We try to include a variety of question types—for example, conceptual questions, more traditional algebraic questions, and questions that involve graphs or other representations. Some of the authors make use of two-stage exams29 where students first complete the assessment individually before redoing it in small groups. This provides immediate feedback from their peers and turns the assessment into a collaborative learning opportunity.

Conceptual assessments have become common in physics education at the university level. They provide standard, validated measures of conceptual learning and can be useful for evaluating and comparing pedagogy and curricula. There is a long history of their effective use at the introductory level,18 with increasing interest in assessment at the upper division as well.19,22,30 We use the Quantum Mechanics Conceptual Assessment (QMCA),9,31 typically given at the end of a first semester of quantum mechanics. The QMCA is a multiple-choice instrument, built from earlier open-ended research-based tests, that focuses on measurement, the one-dimensional Schrödinger equation, time evolution, and probability density. It includes a subset of parallel questions that shift the problems' context from wave functions to spin-1/2 systems. The QMCA is typically given either in class or online as a final “participation only” homework assignment (so solutions are not posted). The instrument has been given at many universities, and class averages typically range from 40% to 60%. At a single institution, we generally find that scores are consistent from term to term in a given context (same class, same instructor) unless some significant change in pedagogy is made, making this a helpful tool for faculty seeking to systematically improve instruction. Our students' feedback shows that they value taking the test as a chance to review concepts from earlier in the term and to practice for the upcoming final.

In Sec. V, we have presented the types of materials that we have developed for use in a first semester quantum mechanics course. These materials can be used in many different combinations to create student-centered, interactive courses. In fact, the three authors each teach quantum mechanics using these same materials, but in different ways. Some parameters that affect our decisions include class sizes, number of class meetings per week, whether or not we have a recitation section, and personal stylistic preferences. Additionally, each institution has a unique student population with different demographics, prerequisites, and preparation, which we need to take into account.

One salient difference in our courses is the class size: our courses range from 10 to 100 students. In a class with fewer than 30 students, it's possible to have personal interactions with each student during small group work. The small group environment is very different in a large lecture hall, where concept questions are a more efficient way to engage each student. Even our use of concept questions varies between institutions that use a clicker technology and institutions where that is not the norm and voting happens using colored cards.

The other large difference in our courses is how tutorials are used. Among our three institutions, we have three different structures for tutorials: a separate, required recitation section; an optional recitation section; and no recitation section. In courses with a recitation section, a tutorial is used each week. How that tutorial interfaces with the content varies depending on whether or not the recitation section is required. Without a recitation section, tutorials are used in lecture, and their use is reduced, with only 7–8 tutorials being used throughout the semester (as opposed to 12–15 in the other classes).

We have also needed to carefully consider our institutions and the student populations they serve. One institution is a research-intensive, selective, large public, Ph.D.-granting university. The other two are primarily undergraduate, large, public, Hispanic Serving Institutions. Our student population varies in terms of background preparation and previous courses taken. In considering the specific needs of our students, we may choose, for example, to include more or less content that reinforces mathematical concepts learned in previous courses. To include more background material, we might, for example, add a pre-flight targeting an idea in order to better prepare students for lecture. Where we feel less background/review is needed, we might, for example, omit or quickly recap the first page of a tutorial that reminds students of these concepts.

Additional differences between our courses include the use of PowerPoint slides or handwritten notes on a tablet or chalkboard during class, and the number of pre-flights used each week. However, despite these differences, our classes all have a similar feel to them, where students and their learning are front and center. We are responsive to student progress throughout the semester and routinely adapt our plans based on students' needs.

We hope that these instructional materials and the discussions of our implementations of them will inspire faculty to incorporate some (or many) new active-learning elements into their courses. Our materials are organized so that it's easy to find specific resources, whether an instructor wants to select a single type of material (for example, concept questions or tutorials) or look for tools to teach a specific topic (for example, time evolution). They were designed to be adapted to each individual environment, and we encourage faculty to do just that.

This work was supported by the National Science Foundation via Grant Nos. DUE 1626280, 1626594, and 1626482. The authors want to thank Benjamin Schermerhorn and Giaco Corsiglia as well as several undergraduate student researchers for their contributions to the development of these materials. Our materials have been heavily influenced by our prior experiences in PER and the work of others. All three authors have ties to the University of Washington, and some have worked directly with the QM tutorials there. There is overlap in methods used with the group at Oregon State University and the Paradigms materials,8 and the authors are heavily influenced by David McIntyre's Quantum Mechanics textbook and ancillary materials.21 Additionally, the authors have had productive conversations with others working in the area, including Antje Kohnle from St. Andrews University, Andrew Heckler from Ohio State University (who has in turn been influenced by the QuILTs4 by Chandralekha Singh's group at the University of Pittsburgh), and Charles De Leone from California State University San Marcos. This does not begin to touch on the many researchers whose work has inspired us during conference presentations, informal conversations, or through their journal articles.

The authors have no conflicts to disclose.

1. See supplementary material at https://www.scitation.org/doi/suppl/10.1119/5.0109124 for websites with practical tips and materials as well as a broader variety of PER studies connected to teaching and learning quantum mechanics.
2. C. Singh and E. Marshman, “Review of student difficulties in upper-level quantum mechanics,” Phys. Rev. ST Phys. Educ. Res. 11, 020117 (2015).
3. E. Marshman and C. Singh, “Framework for understanding the patterns of student difficulties in quantum mechanics,” Phys. Rev. ST Phys. Educ. Res. 11, 020119 (2015).
4. C. Singh, “Interactive learning tutorials on quantum mechanics,” Am. J. Phys. 76, 400–405 (2008).
5. P. J. Emigh, G. Passante, and P. S. Shaffer, “Developing and assessing tutorials for quantum mechanics: Time dependence and measurements,” Phys. Rev. Phys. Educ. Res. 14, 020128 (2018).
6. P. J. Emigh, E. Gire, C. A. Manogue, G. Passante, and P. S. Shaffer, “Research-based quantum instruction: Paradigms and Tutorials,” Phys. Rev. Phys. Educ. Res. 16, 020156 (2020).
7. A. Kohnle, I. Bozhinova, D. Browne, M. Everitt, A. Fomins, P. Kok, G. Kulaitis, M. Prokopas, D. Raine, and E. Swinbank, “A new introductory quantum mechanics curriculum,” Eur. J. Phys. 35, 015001 (2013).
8. C. A. Manogue, P. J. Siemens, J. Tate, K. Browne, M. L. Niess, and A. J. Wolfer, “Paradigms in Physics: A new upper-division curriculum,” Am. J. Phys. 69, 978–990 (2001).
9. H. R. Sadaghiani and S. J. Pollock, “Quantum mechanics concept assessment: Development and validation study,” Phys. Rev. ST Phys. Educ. Res. 11, 010110 (2015).
10. G. Passante and A. Kohnle, “Enhancing student visual understanding of the time evolution of quantum systems,” Phys. Rev. Phys. Educ. Res. 15, 010110 (2019).
11. E. Gire and E. Price, “Structural features of algebraic quantum notations,” Phys. Rev. ST Phys. Educ. Res. 11, 020109 (2015).
12. M. Wawro, K. Watson, and W. Christensen, “Students' metarepresentational competence with matrix notation and Dirac notation in quantum mechanics,” Phys. Rev. Phys. Educ. Res. 16, 020112 (2020).
13. A. Johansson, “Undergraduate quantum mechanics: Lost opportunities for engaging motivated students?,” Eur. J. Phys. 39, 025705 (2018).
14. G. Passante, P. J. Emigh, and P. S. Shaffer, “Examining student ideas about energy measurements on quantum states across undergraduate and graduate levels,” Phys. Rev. ST Phys. Educ. Res. 11, 020111 (2015).
15. G. Corsiglia, B. Schermerhorn, H. Sadaghiani, A. Villaseñor, S. Pollock, and G. Passante, “Exploring student ideas on change of basis in quantum mechanics,” Phys. Rev. Phys. Educ. Res. 18, 010144 (2022).
16. “Adaptable curricular exercises for quantum mechanics,” <https://www.physport.org/curricula/ACEQM> (accessed on June 6, 2022).
17. S. V. Chasteen, S. J. Pollock, R. E. Pepper, and K. K. Perkins, “Transforming the junior level: Outcomes from instruction and research in E&M,” Phys. Rev. ST Phys. Educ. Res. 8, 020107 (2012).
18. R. R. Hake, “Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses,” Am. J. Phys. 66, 64–74 (1998).
19. S. Chasteen, B. Wilcox, M. D. Caballero, K. K. Perkins, S. J. Pollock, and C. E. Wieman, “Educational transformation in upper-division physics: The Science Education Initiative model, outcomes, and lessons learned,” Phys. Rev. ST Phys. Educ. Res. 11, 020110 (2015).
20. B. Schermerhorn, G. Corsiglia, H. Sadaghiani, G. Passante, and S. Pollock, “From Cartesian coordinates to Hilbert space: Supporting student understanding of basis in quantum mechanics,” Phys. Rev. Phys. Educ. Res. 18, 010145 (2022).
21. D. H. McIntyre, Quantum Mechanics (Pearson Education, San Francisco, CA, 2012).
22. G. Zhu and C. Singh, “Surveying students' understanding of quantum mechanics in one spatial dimension,” Am. J. Phys. 80, 252–259 (2012).
23. G. Zuccarini, “Analyzing the structure of basic quantum knowledge for instruction,” Am. J. Phys. 88, 385–394 (2020).
24. “The quantum mechanics visualisation project,” <https://www.st-andrews.ac.uk/physics/quvis/> (accessed on June 29, 2022).
25. “Spins laboratory (Stern–Gerlach simulation),” <https://physics.weber.edu/schroeder/software/Spins.html> (accessed on June 29, 2022).
26. G. M. Novak, E. T. Patterson, A. D. Gavrin, and W. Christian, Just-In-Time Teaching: Blending Active Learning with Web Technology (Prentice Hall, Upper Saddle River, NJ, 1999).
27. L. C. McDermott, P. S. Shaffer, and the University of Washington Physics Education Group, Tutorials in Introductory Physics (Pearson Education, San Francisco, CA, 2002).
28. “Paradigms in physics at Oregon State University,” <https://paradigms.oregonstate.edu>.
29. C. E. Wieman, G. W. Rieger, and C. E. Heiner, “Physics exams that promote collaborative learning,” Phys. Teach. 52, 51–53 (2014).
30. E. Marshman and C. Singh, “Validation and administration of a conceptual survey on the formalism and postulates of quantum mechanics,” Phys. Rev. Phys. Educ. Res. 15, 020128 (2019).
31. “PhysPort: Quantum mechanics concept assessment,” <https://www.physport.org/assessments/assessment.cfm?I=33&A=QMCA> (accessed on June 6, 2022).
