The second law of thermodynamics and the concept of positive entropy generation are analyzed using classical statistical mechanics methods. Previously, using the generalized Boltzmann–Gibbs entropy and its associated general entropy conservation relation, positive entropy generation expressions were obtained in agreement with phenomenological results and with the work of Boltzmann and Gibbs. In this study, using the general approach, we formally and explicitly trace the specific entropy generation expressions to truncations of the full N-body description of the entropy state to a lower-order s-body description. Using higher-order superposition approximations, it is formally shown that the generalized Boltzmann–Gibbs entropy in the s-order state is always less than the corresponding Boltzmann–Gibbs entropy in the lower (s − 1)-order state. Using the general form of the entropy conservation equation, entropy generation is shown to be a required compensatory effect that ensures that all physical variables and physical processes associated with heat, work, temperature, etc., are independent of the particular state in which entropy is defined.
INTRODUCTION
Positive entropy generation, the heart of the second law of thermodynamics, attracts ever-increasing attention as critical to improving energy conversion processes and thereby reducing environmental harm. In this study, following our previous general development of entropy conservation from molecular theory, the general molecular expressions for entropy generation are examined in detail. As shown by Jaynes,1,2 positive entropy generation is associated with willful approximations of the underlying probability density functions. Mathematically, entropy is a formal measure of “uncertainty,” and any approximate description can therefore generate positive entropy. Recently, following Irving and Kirkwood’s approach to the classical statistical mechanics of the transport equations of mass, momentum, and energy, extensions to entropy conservation were formally developed.3,4 It was shown that the entropy conservation equation of statistical mechanics is in full agreement with phenomenological expressions and with the molecular theory results of Boltzmann.3,4 Moreover, the analysis formally links the entropy definition of Gibbs, celebrated in equilibrium systems, with that of Boltzmann, celebrated in nonequilibrium gas dynamics, and provides a fully transparent, rigorous analysis of entropy variables and their behavior. In this study, using the general Gibbs–Boltzmann entropy and following Jaynes’ work with Boltzmann’s entropy,5 the specific molecular expressions and approximations associated with entropy generation are formally and explicitly traced to their origin and their manifestations in the second law. Here, it is formally shown, using classical statistical mechanics, that if entropy is defined in a space smaller than the complete N-particle space, the reduced-space entropy is always greater than the complete-space, or Gibbs, entropy. Similarly, it is formally shown, using the general Boltzmann–Gibbs entropy, that states defined at lower order have, in general, greater entropy than higher-order states. Since all physically measured variables of heat, work, temperature, etc., and all physical laws must not depend on the particular defining state for entropy, positive entropy generation must arise as a compensatory effect in any incompletely defined state. Importantly, the role of the “surroundings” and the different roles that statistical mechanical approximations play in the determination of entropy generation are also delineated.
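A compact, standard information-theoretic statement consistent with the reduced-space claim above is the subadditivity of entropy; the notation here (complete density $f_N$, single-particle marginals $f_1^{(i)}$, phase-space elements $d\Gamma_N$ and $d\gamma_i$) is illustrative and is not the paper’s own:
\[
-k \int f_N \ln f_N \, d\Gamma_N \;\le\; \sum_{i=1}^{N} \left(-k \int f_1^{(i)} \ln f_1^{(i)} \, d\gamma_i\right),
\]
with equality only when the particles are statistically independent. An entropy built on a reduced description can thus only overestimate the complete-space Gibbs entropy, the excess reflecting the discarded correlations.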
NONEQUILIBRIUM ENTROPY CONSERVATION
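For orientation only (the paper’s own equations and numbering are not reproduced here), the phenomenological balance recovered by the molecular-theory development of Refs. 3 and 4 has the standard form
\[
\rho \frac{Ds}{Dt} \;=\; -\nabla \cdot \mathbf{J}_s \;+\; \sigma, \qquad \sigma \;\ge\; 0,
\]
where $s$ is the specific entropy, $\mathbf{J}_s$ the entropy flux, and $\sigma$ the local entropy generation; the second law is the statement $\sigma \ge 0$.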
TRUNCATIONS AND ENTROPY GENERATION
Next, we prove that the general Boltzmann–Gibbs entropy in the incomplete state must always be greater than or equal to the Gibbs entropy of the complete state, and we show that this inequality gives rise to the second law.
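A minimal sketch of the standard argument, under the assumption that the incomplete-state density $\tilde{f}_N$ is constructed as a product of reduced (marginal) densities of the complete density $f_N$, so that $\int f_N \ln \tilde{f}_N \, d\Gamma = \int \tilde{f}_N \ln \tilde{f}_N \, d\Gamma$:
\[
S[\tilde{f}_N] - S[f_N]
\;=\; -k \int f_N \ln \tilde{f}_N \, d\Gamma \;+\; k \int f_N \ln f_N \, d\Gamma
\;=\; k \int f_N \ln\frac{f_N}{\tilde{f}_N}\, d\Gamma \;\ge\; 0,
\]
where the final step is Gibbs’ inequality (the non-negativity of the relative entropy), with equality if and only if $\tilde{f}_N = f_N$ almost everywhere, i.e., when the truncation discards no correlations.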
GENERAL BOLTZMANN–GIBBS INEQUALITY AND THE SECOND LAW
Finally, we note that we have not considered specific “working” expressions for the entropy generation term, which necessarily come from solutions of the truncated Liouville equation in the assumed state or order of entropy.3,4 Expansions about local equilibrium states, for example, lead to the so-called phenomenological laws that dictate, among other things, the direction of property fluxes in such systems. Many systems and density-function solutions exist beyond perturbations about local equilibrium states, however; the entropy generation expressions then become specific to the chosen expansion and can be derived classically from the general expressions given here. In any event, it is clear from Eq. (17) that positive entropy generation remains contingent on defining entropy either in a lower-order space than the complete system space or on excluding the surroundings from the defining state of entropy for a non-isolated system. The second law, in the sense of Jaynes, is the law of conservation of uncertainty.
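As a familiar illustration, not derived here, consider pure heat conduction in the local-equilibrium framework of linear irreversible thermodynamics; with the phenomenological Fourier law $\mathbf{J}_q = -\kappa \nabla T$ (thermal conductivity $\kappa \ge 0$), the entropy generation per unit volume is
\[
\sigma \;=\; \mathbf{J}_q \cdot \nabla\!\left(\frac{1}{T}\right) \;=\; \frac{\kappa\,(\nabla T)^2}{T^2} \;\ge\; 0,
\]
so the positivity of $\sigma$ dictates that heat flows down the temperature gradient, exemplifying how a specific expansion yields a specific, sign-definite working expression.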
DISCUSSION AND CONCLUSION
As shown by Jaynes, classically, entropy generation is associated with approximations inherent in statistical mechanical theories of entropy, which are based on the mathematical definition of uncertainty. Physical systems and variables, however, must not depend arbitrarily on such approximations, since conflicting results could otherwise ensue. Here, we have traced in detail the approximations associated with the general, molecular-based approaches to the entropy conservation equation. Using the generalized Boltzmann–Gibbs entropy, it is demonstrated that the entropy defined in any lower-order state is greater than that defined in the corresponding higher-order state; the Boltzmann and Gibbs entropies represent the two opposite limits of entropy state definitions. It is further shown that entropy generation is a compensatory effect in the entropy conservation equation for an incompletely defined system state, ensuring that physically measured variables of heat, work, temperature, etc., are independent of the uncertainty definition and never in violation of physical laws. Interactions of a defined system with its surroundings will also generate entropy unless the surroundings are included in the defined state of uncertainty.
AUTHOR DECLARATIONS
Conflict of Interest
The author has no conflicts to disclose.
Author Contributions
Michael H. Peters: Conceptualization (equal); Formal analysis (equal); Methodology (equal); Writing – original draft (equal); Writing – review & editing (equal).
DATA AVAILABILITY
Data sharing is not applicable to this article as no new data were created or analyzed in this study.