A sound understanding of entropy is essential for any rounded education in thermodynamics and many other disciplines of physical science. However, students with a poor understanding of this important topic may be found across all education levels, often conflating entropy with disorder. Many teachers wish to instill an understanding of the quantitative definition of entropy—a function of the number of accessible energy states (Fig. 1). This demonstration explains entropy in a visually interesting manner, using handheld containers of simple objects to model entropically driven processes in systems of noninteracting particles. Within the containers, a spontaneous increase of entropy results in increased visual order, consistent with an increase in the number of available energy states. A failure of the notion that entropy is disorder is also exposed. Because the demo models microscopic particles directly, no analogy is required to translate conclusions into “real” systems. After activities involving these models, students should understand the difference between a microstate and macrostate, remember Boltzmann’s formula for entropy, and understand how it depends on the number of available microstates. Applying this learning helps reveal why spontaneity is favored by increasing entropy. By experimenting with the model, students analyze the relationship between changes in entropy and disorder, and evaluate whether they are always correlated. The model also provides opportunities to extend learning to other physical environments, challenging students to reason critically about how spontaneity may depend on changes in other thermodynamic variables.

The quantitative definition of entropy is given by Boltzmann's formula,^{1}

*S* = *k*_{B} ln *W*,   (1)

where *S* is the entropy, *k*_{B} is the Boltzmann constant, and the number of microstates available to the system is denoted by *W* to emphasize "the number of *ways*."^{2} Despite the fact that Eq. (1) says nothing about "disorder," "randomness," or "chaos," the "disorder paradigm" gained popularity.^{3,4} This approach claims that entropy is a measure of disorder or, worse, that entropy *is* disorder, and it has been used as a basis from which students should answer conceptual questions about shuffled cards and messy rooms. Not only are such questions often inherently problematic,^{5} but the disorder paradigm can create a conceptual disability in students, leading to incorrect interpretations of entropic phenomena^{6} and fictitious paradoxes when order arises spontaneously.^{7,8}

The second law of thermodynamics (Δ*S*_{universe} ≥ 0, where *S*_{universe} = *S*_{system} + *S*_{surroundings}) does not necessarily mean that the disorder in the universe is always increasing (or holding steady), but rather that the total energy in the universe is "spreading out" over the available places it can be stored.^{9} Unless a system is thermodynamically isolated from its surroundings, the second law alone does not guarantee where an entropy change will take place. These considerations have prompted numerous publications imploring more accurate instruction about entropy, which have seen some success.^{5,6,10–12} Nonetheless, the most common answer I receive (anecdotally) from students across all levels to the question, "What is entropy?" is still some version of the disorder paradigm.^{13} To reinforce the strides being taken to teach entropy more accurately, the present model provides a palpable illustration of the dispersal of energy throughout the degrees of freedom in a physical system.
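Eq. (1) is simple enough to evaluate directly. The short Python sketch below (an illustration for instructors, not part of the demo itself) highlights the logarithmic character of Boltzmann's formula: doubling the number of accessible microstates adds the same fixed increment, *k*_{B} ln 2, no matter how large *W* already is.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def boltzmann_entropy(w: int) -> float:
    """Eq. (1): S = k_B ln W, with W the number of accessible microstates."""
    return K_B * math.log(w)

# Doubling W adds k_B ln 2 regardless of the starting value of W.
small_jump = boltzmann_entropy(2) - boltzmann_entropy(1)
large_jump = boltzmann_entropy(2 * 10**6) - boltzmann_entropy(10**6)
```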

The abstract nature of entropy, being defined in terms of abstract “energy states,” can make it difficult for students to develop a functional understanding of how it relates to surrounding concepts such as microstates and macrostates, probability and spontaneity, etc. And there seem to be relatively few creative models for teaching entropy.^{9,14–17} This model guides students in learning what entropy is (and is not) by understanding how entropy and macrostate probability are dependent on the number of accessible microstates, and how microstates are dependent on energetic degrees of freedom. Polyhedral objects are used to simulate microscopic particles, creating a macroscopic model of a microscopic system. Because particle degrees of freedom are directly observable in this model, no analogy is required to apply the demo’s insights to atoms or molecules.
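The link between degrees of freedom and microstate count can be previewed with a standard counting exercise (a generic textbook example, not specific to this demo): if *q* indistinguishable quanta of energy are shared among *N* degrees of freedom, stars-and-bars counting gives *W* = C(*q* + *N* − 1, *q*) ways, so adding degrees of freedom at fixed total energy raises *W* and hence the entropy.

```python
from math import comb, log

def ways(quanta: int, dof: int) -> int:
    """Number of ways to distribute indistinguishable energy quanta
    among distinguishable degrees of freedom (stars and bars)."""
    return comb(quanta + dof - 1, quanta)

# Same total energy (10 quanta), but more degrees of freedom:
w_few = ways(10, 3)   # 66 microstates
w_more = ways(10, 6)  # 3003 microstates

# More microstates means more entropy, since ln is increasing.
entropy_gain_kb = log(w_more) - log(w_few)  # in units of k_B
```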

The demo consists of deriving the relationship between spontaneity and changes in entropy for a specific system and drawing observations from a hands-on model of such a system. I also discuss some likely misunderstandings of the demo and a few caveats. A sample lab handout from Ensemble Interactives is included in the supplementary materials with permission.^{18,19}

## Thermodynamic basis

We seek to devise a system in which a spontaneous process gives rise to a change in entropy. However, because spontaneous processes are those that minimize free energy rather than those that maximize entropy, knowledge of spontaneity is ordinarily insufficient to determine a change in entropy. Therefore, this demo uses known conditions of a carefully chosen system that permit the unambiguous association of a spontaneous process with a change in entropy.

Consider the change in the Gibbs free energy, *G*, during a thermodynamic process at constant volume in an isolated system involving only hard particles:

Δ*G* = Δ*U* − *T*Δ*S*,   (2)

where *U* is the "internal" or total energy (the sum of the kinetic and potential energies of the particles of the system), *T* is the temperature, and *S* is the entropy. The "expansion work" term, *P*Δ*V*, has already been omitted due to the stipulation of constant volume.^{20} Because the average kinetic energy of a (classical) system is determined by its temperature, the constant *T* in Eq. (2) implies that there is no change in kinetic energy during the process. "Hard particles" have excluded volume but no interaction potential between themselves and any other particle, and only translational and rotational degrees of freedom, so there is also no change in the potential energy. This is true regardless of the hard-particle shape or system density, because the potential energy of any allowed configuration is zero. Thus, for processes in such a system of hard particles, Δ*U* = 0, and the previous expression of free energy reduces to

Δ*G* = −*T*Δ*S*.   (3)
Because spontaneous events reduce the free energy (Δ*G* < 0) and *T* > 0, Eq. (3) implies that any spontaneous process in this special environment is guaranteed to have resulted from an increase in entropy (Δ*S* > 0).
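The sign argument is a one-line rearrangement of Eq. (3): Δ*S* = −Δ*G*/*T*, so Δ*G* < 0 together with *T* > 0 forces Δ*S* > 0. A tiny sketch (with illustrative numbers only) makes the logic explicit:

```python
def entropy_change(delta_g: float, temperature: float) -> float:
    """Rearrange Eq. (3), dG = -T dS, to get dS = -dG / T."""
    if temperature <= 0:
        raise ValueError("Eq. (3) assumes a positive absolute temperature")
    return -delta_g / temperature

# Any spontaneous process (dG < 0) at positive T yields dS > 0.
assert entropy_change(-5.0, 300.0) > 0
assert entropy_change(-1e-9, 0.1) > 0
```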

Equation (1) reminds us that an increase in entropy means that there are more energy states (microstates) available in the final state than in the initial. Because these particles can only possess kinetic energy, a greater number of available energy states entails an increase in the number of rotational or translational degrees of freedom throughout the system. If the system is dense enough, particles will trade rotational freedom for extra translational freedom, transitioning from an orientationally disordered system to a more ordered, higher-entropy state. As described by Frenkel, "one type of entropy decreases and another kind of entropy increases such that the total entropy becomes larger."^{21} In other words, the number of units of energy present in the system (its internal energy) stays the same, but the number of ways the energy can be distributed throughout the system (its entropy) increases. While the existence of disorder-to-order transitions that increase entropy is well documented,^{12,21,22} this paper describes simple conditions within which this phenomenon can be observed tangibly and at a macroscopic level. This derivation offers a chance for students to practice qualitative reasoning from physics equations, without merely "plugging in" values to obtain a numerical answer.
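The rotational-for-translational trade can be caricatured with hypothetical microstate counts (the numbers below are illustrative assumptions, not measurements). Because independent contributions multiply, *W* = *W*_{trans} × *W*_{rot}, and so *S*/*k*_{B} = ln *W*_{trans} + ln *W*_{rot}; a large enough translational gain outweighs a rotational loss.

```python
from math import log

def entropy_kb(w_trans: int, w_rot: int) -> float:
    """S/k_B = ln(W_trans * W_rot) for independent contributions."""
    return log(w_trans) + log(w_rot)

# Hypothetical counts: the jammed, disordered state retains orientational
# freedom but has little room to translate; the ordered packing frees up
# translation at the cost of some rotation.
s_disordered = entropy_kb(w_trans=10, w_rot=100)  # ln 1000
s_ordered = entropy_kb(w_trans=200, w_rot=20)     # ln 4000
```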

## Visual demonstration

### Materials

A system obeying Eq. (3) is instantiated merely by enclosing a set of hard particles in a nondeformable box. Simple particles with some degree of symmetry (rods, disks, few-sided polyhedra, etc.) and clear containers work best (Figs. 2 and 3). Ready-made kits with uniform particles such as those in Fig. 3 are available through Ensemble Interactives^{®}.^{18} On the physical scale of the classroom, there is no discernible interaction between the particles; viz., they do not attract or repel one another. Shaking the filled box models the (constant) temperature via particle motions.

### Directions

To prepare the demo, the container is filled randomly with particles, ideally with no extended order apparent (Fig. 4, "initial" column). The lid must be held securely in place throughout the demo to maintain a constant volume. If the container is overfilled such that the lid rests on particles rather than on the container itself, particles are removed one at a time until the lid closes properly with the greatest possible number of particles inside.

The filled container is shaken and rotated in all directions until the particles’ configuration transitions from the initial, disordered state, into a final, stable state (Fig. 4, “final” column). The visual transition is usually also accompanied by an audible change: noticeably more rattling ensues as the initially jammed particles “settle into place.” While the transition time may depend on the particle and container shapes, systems such as those in Fig. 4 usually transition in less than 10 s. Videos of this procedure for a few different types of particles are included in the supplementary materials.^{19}

## Discussion

### Learning outcomes

Analyzing the model illuminates how to define macrostates and microstates. The macrostate is seen through the observed "state of matter" or "phase"—whether fluid-like and disordered or crystal-like and ordered. For microstates, the term "energy state" reinforces the idea of an accounting of how the quanta of energy within a system are distributed among the system's energetic degrees of freedom. While the microstate is not directly observable (when the particles are not in motion), students can reason about which phase has more microstates by considering their degrees of freedom. In the initial state, translationally jammed particles have some possible degrees of freedom essentially "turned off." Therefore, enabling new forms of motion via more ordered packing increases the available degrees of freedom, the number of available microstates, and hence, the entropy. The fact that these systems have more entropy in their final state despite being more visually ordered helps crystallize the idea that entropy is a function of the number of ways energy can be arranged within a system, as expressed in the Boltzmann formula [Eq. (1)].

### Some caveats

The fact that high particle density is exploited to induce the disorder-to-order transition^{12,23} matters for the demo's visual appeal but not for its validity, as none of the equations invoke assumptions about the overall density. The expected transition might not be observed if the density is either too low (the system always appears disordered) or too high (the particles cannot move at all); nonetheless, Eq. (3) guarantees that if a spontaneous process occurred, that process increased the system's entropy.

The container should be rotated in all dimensions while shaking or else the constant force of gravity can create an inhomogeneous density, invalidating Eq. (3) as the governing thermodynamic rule. Tumbling the container while shaking approximately “cancels out” the gravitational force by randomizing its direction. Rotations could be disregarded if the model was constrained to only one layer of particles (perpendicular to the force of gravity);^{12} but the present 3D model provides a controlled environment for testing different particle shapes and observing order and packing effects extended in multiple dimensions.

Within the model’s assumptions, the visual change observed is a legitimate phase change, e.g., from a dense liquid to a crystalline solid or between two dense fluids. Some fascinating examples of “entropic crystallizations” of hard polyhedra are given by the computer simulations of Lee et al.^{22}
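For readers who want to explore such systems numerically, the essential ingredient of a hard-particle simulation is easy to state: because there is no interaction energy to weigh, a trial move is accepted whenever it creates no overlap. The sketch below is a minimal 2D hard-disk Monte Carlo in that spirit (an illustration with assumed parameters and a dilute box, not the method of Lee et al.).

```python
import random

# Assumed parameters for a small, dilute demonstration box.
L, RADIUS, N_STEPS = 10.0, 0.5, 20_000
random.seed(0)

def overlaps(pos, i, trial):
    """True if moving disk i to `trial` would overlap a wall or a disk."""
    x, y = trial
    if not (RADIUS <= x <= L - RADIUS and RADIUS <= y <= L - RADIUS):
        return True  # treat wall penetration as an overlap
    for j, (xj, yj) in enumerate(pos):
        if j != i and (x - xj) ** 2 + (y - yj) ** 2 < (2 * RADIUS) ** 2:
            return True
    return False

# Start from a loose square lattice so no disks overlap initially.
pos = [(1.0 + 2.0 * i, 1.0 + 2.0 * j) for i in range(4) for j in range(4)]

accepted = 0
for _ in range(N_STEPS):
    i = random.randrange(len(pos))
    x, y = pos[i]
    trial = (x + random.uniform(-0.3, 0.3), y + random.uniform(-0.3, 0.3))
    if not overlaps(pos, i, trial):  # no energy: accept iff no overlap
        pos[i] = trial
        accepted += 1

print(f"acceptance ratio: {accepted / N_STEPS:.2f}")
```

At the high densities relevant to the demo, the same acceptance rule drives the entropic ordering discussed above; this sketch only demonstrates the bookkeeping.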

### Common misperceptions about the demo

Students may be tempted to interpret a motionless container as a system at absolute zero temperature, and subsequently attribute the particles' rearrangement to a supposed addition of energy upon beginning to shake the container. To see that this is an incorrect interpretation, it is helpful to think of this demo in terms of the discrete frames one would observe from a video (Fig. 5). Referring to Fig. 5, students may see "initial" and "final" configurations similar to frames *t*_{0} and *t*_{n}, respectively; though one could have considered the change between *t*_{1} and *t*_{n−1}, leading to the same qualitative observation (viz., "there was a transition from disorder to order"). Similarly, *t*_{n−3} through *t*_{n−1} are also configurations of the final macrostate, having the same qualitative orientational order as *t*_{n}. The configuration observed in any single frame is merely one of all the possible configurations within that state of the system. Note that all the particles were in motion when frames *t*_{1} through *t*_{n−1} were captured; if the experiment's timeline were extended by one frame on each end, then one would say that the particles in frames *t*_{0} and *t*_{n} were in motion too. With this explanation, students should become convinced that the moments before and after the container is shaken do not correspond to *T* = 0, but are instead merely two of all the possible *T* > 0 frames that might have been observed when a video of the demo was paused. Furthermore, any energy added to the system by shaking the container is also exactly removed when the shaking stops.

Another possible misjudgment is that the phase change was accompanied by a volume change, citing some "head space" that appears to have evolved [e.g., Fig. 4(a)]. Once again, the "video frames" conception is helpful, reminding us that the configurations of both the initial and final macrostates "explore the full space" of the container, even though not all particles touch a container wall at all times in either macrostate. Configurations of the final macrostate such as frames *t*_{n−3} through *t*_{n−1} will not be observed when shaking ceases due to the presence of gravity in the lab. But one must remember that there is (approximately) no gravity acting on the system during the demo and thus no external directionality, rendering frames like *t*_{n−3} through *t*_{n} all plausible configurations within the final macrostate.

### Why does entropy often correlate with disorder?

One response to the frequent correlation of increasing entropy and disorder is that many systems encountered on a daily basis are held at constant pressure rather than constant volume. In a constant-pressure scenario, the motion of "hot" particles can overcome the external pressure, allowing the system to obtain a larger volume. The increase in entropy permitted by the larger volume can be enough to cause the overall process to be spontaneous.^{6} If the demo is repeated with the lid removed to approximate constant pressure, the particles will be ejected all over the room in an amusing demonstration of a system in which increased entropy does correlate with increased spatial disorder. Answering this section's question requires students to think critically about the possible implications of altering or removing the special constraints and assumptions of this model.

## Summary and conclusion

This demonstration complements instruction about the important topic of entropy. It allows otherwise abstract concepts to be vividly depicted within a model atomic system, not requiring any analogy to other probabilistic phenomena (such as coin flips, dice rolls, etc.). Students already exposed to the disorder paradigm will improve their conceptual understanding by seeing a discrepant event that can be easily explained through the quantitative definition of entropy. Using a handheld model of hard particle phase changes, students view particles’ macrostates and microstates by observing changes in the available degrees of freedom. This provides a memorable example of entropy as “the number of available energetic configurations.” Students are challenged to think critically about the implications of thermodynamic equations and their applications in different physical contexts. The model is easily constructed from common materials or commercially available as a kit. By improving understanding of the rigorous definition of entropy, students develop a foundation on which a more advanced education in thermodynamics can be built.

## Acknowledgments

I would like to acknowledge Sharon Glotzer and Sangmin Lee for first introducing me to this phenomenon at the Gordon Research Conference for the Chemistry and Physics of Liquids in 2019, as well as Michael Grünwald for helping explain the concept to me at the same meeting. I thank Alison Luscomb, Brian Laird, and Azida Walker for critical reviews of manuscript drafts.

**Disclosure:** The author acknowledges a potential conflict of interest via his involvement with Ensemble Interactives, through which materials for implementing this demo and associated laboratory activities are commercially available.

## REFERENCES

*W* to stand for "Wahrscheinlichkeit," which is German for "probability."

*Phys. Teach.*

*Phys. Teach.*

*J. Chem. Educ.*

*J. Chem. Educ.*

*Entropy*

*Phys. Teach.*

*J. Chem. Educ.*

*J. Chem. Educ.*

*J. Chem. Educ.*

*Phys. Teach.*

*Phys. Teach.*

*Phys. Teach.*

*J. Chem. Educ.*


See the supplementary material at *TPT Online*, https://doi.org/10.1119/5.0089761, in the "Supplementary Material" section.

A system at constant temperature, *T*, volume, *V*, and number of particles, *N*, is a member of the canonical ensemble, whose natural free energy is the Helmholtz free energy (Δ*U* − *T*Δ*S*). I use the Gibbs free energy based on an assumption that it is more familiar to a wider range of students. Of course, once the expansion work term is omitted, one obtains the Helmholtz free energy anyway, denoted by the now-arbitrary symbol *G*. The derivation should be tailored to the familiarity of the students, with the goal of helping them understand the conclusion rather than risking confusion over new terms.

*Nat. Mater.*

*Proc. Natl. Acad. Sci. U.S.A.*

*Rep. Prog. Phys.*

**T. Ryan Rogers** *earned his PhD at the University of Arkansas in 2020, and is a theoretical scientist and product developer at TRR Designs, LLC. His interests include computational modeling and prototyping handheld innovations.*