Simulations can provide opportunities for engaged exploration in physics teaching and learning.1 Beyond the two-dimensional world of screen-based simulations, abstract concepts like vectors (for example, of electric fields) can frequently be visualized better in a three-dimensional virtual reality (VR) environment.2 These visualizations can be immersive: the user is able to walk around, look around, and intuitively interact with objects in virtual space.3,4 Finally, it has been shown that this bodily acting out of physics scenarios (“embodiment”) can lead to even better learning outcomes, particularly for basic mechanics concepts.4–7

Unfortunately, the challenge of writing these simulations can seem daunting. This article aims to provide an entry into writing VR simulations, using the Unity game-development environment8,9 (free for noncommercial use) in connection with consumer-grade VR gear such as the HTC Vive10 or Oculus11 (at the time of writing, these devices still cost between $400 and $700). Figure 1 shows the HTC gear with the headset, hand controllers, and a Trackable Object discussed later. The example presented in this paper is a basic kinematics simulation that shows, in real time, the velocity and acceleration vectors of a “throwable” ball. Students frequently struggle with these vectors, for example not realizing that the acceleration of a ball on a free-fall trajectory always points down.

Fig. 1. VR headset, hand controllers, and Trackable Object.

VR is different from augmented reality (AR), which overlays information on objects in the real world, for example using smartphones12,13; Unity supports development of both VR and AR. For readers not familiar with this environment, a helpful introduction can be found in its online tutorials14; for the purposes of writing physics simulations, the “Essentials” and “Junior Programmer” tutorial pathways are recommended. This paper demonstrates the incorporation of VR into Unity using the SteamVR package (available for free from the Unity Asset Store), which abstracts away all interfacing with the equipment, including the differences between VR devices from different manufacturers.

Game-development environments traditionally work in terms of two- or three-dimensional Scenes, into which Game Objects (e.g., the floor, walls, trees, enemies, monsters, and gear) are placed. These are objects in the sense of object-oriented programming: they might have some screen manifestation, so they show up, but they also have variables and methods (called Behaviors). For normal screen-based games, an important Game Object is the Main Camera, which “films” what appears on the user’s screen. This camera can be moved around for third-person games, or it can be attached to the user’s character in first-person games, i.e., be the player’s eyes.

The first and most essential step is replacing Unity’s default Main Camera with the Player object from the SteamVR package; this simple swap implements all functionality needed to let the player move and look around in the three-dimensional scene. Subsequently, every game object that the player should interact with using the controllers needs a behavior-script component added to it that binds it to SteamVR’s input system. These scripts govern how the object behaves when the user interacts with it via, for example, the hand controllers. SteamVR provides a library of ready-to-use scripts, the most commonly used of which is likely Throwable: the player can grab the object with the controllers (which appear as gloves), move it around, and intuitively throw it.

The screenshot of Unity’s integrated development environment (IDE) in Fig. 2 illustrates all of this: the Player object appears instead of the Main Camera in the listing of game objects inside the left panel of the development environment, and the script Throwable is attached to the game object called “Ball” shown inside the right panel (the script component Interactable was automatically added alongside Throwable). From now on, Unity can be programmed exactly as for non-VR games; for example, the standard Rigidbody physics component takes care of all mechanics for the object, such as gravity, friction, and linear and angular momentum.
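To give a flavor of what such a behavior script looks like, here is a minimal sketch of a hypothetical C# component (not part of the simulation described in this paper) that makes whatever Game Object it is attached to spin:

using UnityEngine;

// Minimal sketch of a behavior script: attaching this component to a Game
// Object makes Unity call its Update() method once per frame.
public class Spin : MonoBehaviour
{
    // Public variables show up in Unity's Inspector panel for easy tuning.
    public float degreesPerSecond = 90f;

    void Update()
    {
        // Rotate the Game Object this script is attached to about the
        // vertical axis, at a frame-rate-independent speed.
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);
    }
}

Public variables like degreesPerSecond automatically appear in Unity’s Inspector panel, where they can be adjusted without touching the code.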

Fig. 2. Unity’s integrated development environment.

Dressing up the scene is done with the normal IDE: adding a simple cage, so the ball is contained by its walls instead of possibly getting lost; adding texture to the ball, so one can see it roll; and adding Cylinders (a standard Unity Game Object) to visualize the vectors. Also, adding a Physic Material (sic) to the ball, so that it bounces off the walls, floor, and ceiling of the cage, can be accomplished the same way as in screen-based games and simulations. As an aside, the walls of the cage were left invisible so that the vectors remain visible in their full length when the ball bounces.
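For readers who prefer code over the IDE, the same bounce behavior can also be created at runtime; the following is a minimal sketch with illustrative values (the component name Bouncy is hypothetical):

using UnityEngine;

// Sketch: creating and assigning a bouncy Physic Material from code instead
// of in the IDE; the component name and the values are illustrative.
public class Bouncy : MonoBehaviour
{
    void Start()
    {
        PhysicMaterial mat = new PhysicMaterial();
        mat.bounciness = 0.8f;                          // 1 would be perfectly elastic
        mat.bounceCombine = PhysicMaterialCombine.Maximum;
        GetComponent<Collider>().material = mat;        // assign to the ball's collider
    }
}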

All that is left now is to write a behavior script for the vector cylinders that attaches them to the ball and scales and rotates them according to the magnitude and direction of the velocity and acceleration vectors, respectively. Figure 3 shows this state of the simulation development, with the “Ball” set as the “Thing” parameter of the script called “Alignment,” the beginning of which is shown in the overlapping window.
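The following is a minimal sketch of how such an alignment script might look; the actual script in Fig. 3 may differ in detail. Since Unity’s Rigidbody exposes velocity but not acceleration, the sketch estimates the acceleration by finite-differencing the velocity between physics steps; the showAcceleration flag and the scale values are assumptions.

using UnityEngine;

// Minimal sketch of an alignment script (the actual "Alignment" script in
// Fig. 3 may differ). Attached to a vector cylinder, it moves, stretches, and
// rotates the cylinder so that it represents the velocity (or acceleration)
// vector of the object set as "Thing".
public class Alignment : MonoBehaviour
{
    public GameObject Thing;               // set to "Ball" in the Unity Inspector
    public bool showAcceleration = false;  // false: velocity; true: acceleration
    public float scale = 0.1f;             // cylinder length (m) per m/s or m/s^2 (assumed)

    private Rigidbody body;
    private Vector3 previousVelocity;

    void Start()
    {
        body = Thing.GetComponent<Rigidbody>();
        previousVelocity = body.velocity;
    }

    void FixedUpdate()
    {
        Vector3 v = body.velocity;
        // Unity's Rigidbody does not expose acceleration directly, so it is
        // estimated here by finite-differencing the velocity between steps.
        Vector3 a = (v - previousVelocity) / Time.fixedDeltaTime;
        previousVelocity = v;

        Vector3 shown = showAcceleration ? a : v;
        float length = shown.magnitude * scale;

        // A default Unity cylinder is 2 units tall along its local y-axis, so
        // a y-scale of s yields a length of 2s. Put one end of the cylinder at
        // the ball's center and point it along the vector.
        transform.position = Thing.transform.position + shown.normalized * (length / 2f);
        transform.localScale = new Vector3(0.02f, length / 2f, 0.02f);
        transform.rotation = Quaternion.FromToRotation(Vector3.up, shown);
    }
}

In this sketch, two cylinders would carry the component: one with showAcceleration unchecked for the velocity vector and one with it checked for the acceleration vector.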

Fig. 3. Unity’s scripting editor.

Figure 4 shows screenshots of the simulation on the computer screen, which displays a clipped-out section of what the player sees in the VR headset. If the simulation is used in the classroom, this view can be projected for everybody to see, so all students can witness what is happening in virtual space and possibly give directions to the player. Besides the considerable cost of the equipment, the biggest challenge might be finding enough open floor space in the classroom for the player to move.

Fig. 4. Screenshots of the simulation.

Figure 4(a) shows the ball and the two gloves corresponding to the left and the right controller. Figure 4(b) shows a thrown ball in flight: the green velocity vector points up at an angle, as the ball is still on the upward branch of the trajectory, and the red acceleration vector points straight down. Figure 4(c) shows the player’s view while holding the ball with an outstretched arm and doing a pirouette: the green velocity vector points tangentially, and the red acceleration vector points inward, toward the center of the circular motion.

Beyond immersive embodiment limited to the virtual world, mixed scenarios of real and virtual objects are possible. In addition to the controllers and headsets, most VR systems also provide various types of robust Trackable Objects (e.g., the star-shaped device in Fig. 1), which can be attached to almost anything in the real world. Three-dimensional position data from real objects can be gathered and possibly incorporated into the virtual world, including full-body data on human motion.15 As this paper has hopefully demonstrated, programming VR applications like these is greatly facilitated by powerful environments like Unity. While the hype around VR, AR, and mixed realities is once again dying down, we are left with commodity tools that make these technologies more accessible than ever for teaching and learning.
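As a sketch of how such position data might be gathered, the following hypothetical component, attached to a Game Object whose pose is driven by a SteamVR Trackable Object, writes the object’s world position to a CSV file every frame (the component and file names are illustrative):

using System.IO;
using UnityEngine;

// Sketch, assuming this component sits on a Game Object whose pose is driven
// by a SteamVR Trackable Object: log its position each frame to a CSV file.
public class PositionLogger : MonoBehaviour
{
    private StreamWriter log;

    void Start()
    {
        log = new StreamWriter("tracker_positions.csv");
        log.WriteLine("t,x,y,z");
    }

    void Update()
    {
        Vector3 p = transform.position;   // world position in meters
        log.WriteLine($"{Time.time},{p.x},{p.y},{p.z}");
    }

    void OnDestroy()
    {
        if (log != null) log.Close();     // flush the file when the scene ends
    }
}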

1. Noah S. Podolefsky, Katherine K. Perkins, and Wendy K. Adams, “Factors promoting engaged exploration with computer simulations,” Phys. Rev. Spec. Top. Phys. Educ. Res. 6, 020117 (2010).
2. Joel Franklin and Andrew Ryder, “Electromagnetic field visualization in virtual reality,” Am. J. Phys. 87, 153–157 (2019).
3. Chris D. Porter et al., “Using virtual reality in electrostatics instruction: The impact of training,” Phys. Rev. Phys. Educ. Res. 16, 020119 (2020).
4. Yiannis Georgiou, Olia Tsivitanidou, and Andri Ioannou, “Learning experience design with immersive virtual reality in physics education,” Educ. Technol. Res. Dev. 69, 3051–3080 (2021).
5. Yiannis Georgiou, Andri Ioannou, and Marianna Ioannou, “Investigating immersion and learning in a low-embodied versus high-embodied digital educational game: Lessons learned from an implementation in an authentic school classroom,” Multimodal Technol. Interact. 3, 68 (2019).
6. Janice L. Anderson and Steven D. Wall, “Kinecting physics: Conceptualization of motion through visualization and embodiment,” J. Sci. Educ. Technol. 25, 161–173 (2016).
7. Elias Euler, Elmer Rådahl, and Bor Gregorcic, “Embodiment in physics learning: A social-semiotic look,” Phys. Rev. Phys. Educ. Res. 15, 010134 (2019).
8. https://unity.com/, accessed Aug. 2021.
9. Jesús D. González et al., “2D and 3D virtual interactive laboratories of physics on Unity platform,” J. Phys.: Conf. Ser. 935, 012069 (2017).
10. https://www.vive.com/, accessed Aug. 2021.
11. https://www.oculus.com/, accessed Aug. 2021.
12. Mark Buesing and Michael Cook, “Augmented reality comes to physics,” Phys. Teach. 51, 226–228 (2013).
13. Milan Chandrakar and Kaushal Kumar Bhagat, “Development of an augmented reality-based game for projectile motion,” Phys. Teach. 58, 668–669 (2020).
14. https://learn.unity.com/, accessed Jan. 2022.
15. Thomas E. Augenstein et al., “Enhancing mirror therapy via scaling and shared control: A novel open-source virtual reality platform for stroke rehabilitation,” Virtual Reality 26, 525–538 (2021).

Gerd Kortemeyer is the director of Educational Development and Technology at ETH Zurich, Switzerland, and associate professor emeritus of physics at Michigan State University, East Lansing, MI.

Published open access through an agreement with ETH Zürich.