The physics community has sometimes conferred upon a few of its more respected members the unofficial title of the “conscience of physics.” Paul Ehrenfest was considered by his colleagues to have played such a role. So too was Wolfgang Pauli. Both scientists were committed to clarity and rigor. Both scientists were also quick to alert their peers when they found clarity and rigor to be lacking. Throughout his career, Harvey Leff has served a similar role in the community of physics teachers. His recent book Energy and Entropy: A Dynamic Duo offers many insights to many different audiences. But Leff rightly identifies “teachers of physics, chemistry, and engineering” first on his list of prospective readers.
Perhaps no other group of scientists has a greater need for a conscience than those of us who teach thermodynamics. Conventional wisdom holds that entropy is the more elusive member of Leff's dynamic duo. It is the more likely subject of classroom misunderstandings. It is also more susceptible to mischaracterization. The problem, Leff explains, originated in the descriptions offered by some of entropy's founders. Both Boltzmann and Helmholtz referred to entropy as a measure of disorder. Gibbs, likewise, referred to it as a measure of mixed-up-ness. These men had hoped to translate their mathematics into useful metaphors. But their attempts to reduce the concept to one-word descriptions would ultimately prove to be a disservice. Subsequent generations were left with the unenviable task of disentangling themselves from these misused metaphors. What is disorder? And where is it to be found in S = k ln W?
More meaningful descriptions of entropy are available today—thanks in large part to the concerted efforts of Harvey Leff and the many articles that he has contributed to this journal. Energy and Entropy is another significant contribution to that body of work. A consistent theme throughout his publications is that energy spreading, rather than disorder, provides a more useful view of entropy. Leff was joined in this cause by Frank Lambert, who led a similar campaign in the chemistry community based on the idea of energy dispersal. These efforts have reframed the way that entropy is presented in the classroom. References to entropy as a measure of disorder may not rise to the level of educational malpractice. But today such references are rightly regarded as lazy attempts to fulfill one's minimum obligation as an instructor.
If there is a lesson to be learned from this 100-year search for a better metaphor, perhaps it is that words matter, particularly in thermodynamics—a subject that Nobel Laureate P. W. Bridgman described as “more palpably verbal” than the other branches of physics. I found Chapter 7 of Energy and Entropy, therefore, to be especially instructive. This chapter addresses the Language and Philosophy of Thermodynamics. Before reading this book, I was aware of the grammatical challenges posed by the word heat. I have tried (although not always successfully) to avoid using the word as a noun. Heat is not a substance, nor a thing. It is a process. I had not, however, recognized that the term heat capacity is likewise a vestige of the defunct caloric theory. Leff suggests that thermal inertia would be a good substitute for heat capacity. But he wisely admits that some terms are so firmly ingrained in our thinking that they are effectively exempt from revision. Elsewhere in the book, Leff uses the term tempergy, which he defines as τ = kT. Once temperature is defined as an energy, there is no further need for Boltzmann's constant, and entropy becomes a dimensionless quantity. Before we lament the loss of Boltzmann's constant, we should note that k was not introduced by Boltzmann at all; it was introduced by Planck. In the Boltzmann framework, entropy is a function of the number of microstates, which is itself a dimensionless quantity. Tempergy, therefore, is a conceptual simplification that provides a less obstructed view of the underlying physics.
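The simplification can be made explicit in a few lines. What follows is my own sketch of the bookkeeping, in standard notation rather than a passage from the book:

```latex
% Boltzmann's entropy, with Planck's constant k supplying the units:
S = k \ln W
% Define tempergy, an energy-valued temperature:
\tau = kT
% Entropy is then the dimensionless logarithm of the microstate count:
\sigma = \frac{S}{k} = \ln W
% and the thermodynamic relation keeps its familiar form:
dS = \frac{\delta Q}{T}
\quad\Longrightarrow\quad
d\sigma = \frac{\delta Q}{\tau}
```

Nothing physical has changed; the constant k has simply been absorbed into the temperature, where it arguably belonged all along.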
Other sections of the book that I particularly enjoyed were those that presented more recent discoveries. Unlike many other books on the subject, Energy and Entropy does not give its reader the impression that thermodynamics is a fully resolved product of the 19th century. Leff demonstrates that significant discoveries have been made since the contributions of Boltzmann and Gibbs. He provides an accessible introduction to the Jarzynski equality. He also traces the many discoveries that were motivated by Maxwell's demon, illustrating how statistical mechanics led to later developments in information theory. Given the wide scope of the book, I was surprised that this discussion did not reference the work of Claude Shannon or E. T. Jaynes. But to have steered the book in that direction likely would have given entropy too central a role. Leff is careful throughout his book to emphasize that energy and entropy are equal partners. He also refrains from treating these quantities as abstract concepts. The presentation rarely strays from a plausible experiment. Even the discussion of information theory is rooted in measurable physical quantities.
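For readers who have not yet encountered it, the Jarzynski equality relates the work done in repeated nonequilibrium realizations of a process to an equilibrium free-energy difference. In standard notation (not necessarily Leff's, and with W here denoting work rather than Boltzmann's microstate count):

```latex
\left\langle e^{-W/kT} \right\rangle = e^{-\Delta F / kT}
```

The average is taken over many repetitions of the process at bath temperature T; applying Jensen's inequality recovers the familiar second-law statement that the average work satisfies ⟨W⟩ ≥ ΔF.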
My overall impression of this book can be characterized by the title of an article that Leff contributed to The Physics Teacher. The title of the article is Thermodynamics Is Easy—I've Learned It Many Times. When reading a good book on the subject, I agree. Thermodynamics can seem easy, particularly when the book is written by a scientist whose previous work has helped to clarify fundamental issues. But as I continue to grapple with the subject, I know that I will continue to find more subtle points in need of explanation. And when those future moments inevitably arrive, Energy and Entropy will be among the books to which I'll turn in order to find my conscience.
Eric Johnson is Chair of the Department of Chemistry at Mount St. Joseph University. He is the author of Anxiety and the Equation: Understanding Boltzmann's Entropy.