The second law of thermodynamics and the concept of positive entropy generation are analyzed following classical statistical mechanics methods. Previously, using the generalized Boltzmann–Gibbs entropy and its associated general entropy conservation relation, positive entropy generation expressions were obtained in agreement with phenomenological results and the work of Boltzmann and Gibbs. In this study, using the general approach, we formally and explicitly trace the specific entropy generation expressions to truncations of the full N-body description of the entropy state to a lower s-body description. Using higher-order superposition approximations, it is formally shown that the generalized Boltzmann–Gibbs entropy in the s-order state is always less than the corresponding Boltzmann–Gibbs entropy in the lower (s − 1)-order state. Using the general form of the entropy conservation equation, entropy generation is shown to be a required compensatory effect that ensures that all physical variables and physical processes associated with heat, work, temperature, etc., are independent of the particular entropy definition state.

Positive entropy generation, which is the heart of the second law of thermodynamics, attracts ever-increasing attention as critical to the improvement of energy conversion processes and the concomitant reduction in environmental harm. In this study, following our previous general development of entropy conservation from molecular theory approaches, the general molecular theory expressions of entropy generation are examined in detail. As shown by Jaynes,1,2 positive entropy generation is associated with willful approximations of the underlying probability density functions. Mathematically, entropy is a formal measure of “uncertainty,” and any approximate description can therefore act as a source of positive entropy generation. Recently, following Irving and Kirkwood’s approach to the classical statistical mechanics of the transport equations of mass, momentum, and energy, extensions to entropy conservation were formally developed.3,4 It was shown that the entropy conservation equation of statistical mechanics is in full agreement with phenomenological expressions and the molecular theory results of Boltzmann.3,4 Moreover, the analysis formally links the entropy definitions of Gibbs, celebrated in equilibrium systems, with those of Boltzmann, celebrated in nonequilibrium gas dynamics, and provides a fully transparent, rigorous analysis of entropy variables and their behavior. In this study, using the general Gibbs–Boltzmann entropy and following Jaynes’ work with Boltzmann’s entropy,5 the specific molecular expressions and approximations associated with entropy generation are formally and explicitly traced to their origin and their manifestations in the second law. Here, it is formally shown, using classical statistical mechanics, that if entropy is defined in a space smaller than the complete N-particle space, the reduced-space entropy is always greater than the complete-space, or Gibbs, entropy. Similarly, it is formally shown, using the general Boltzmann–Gibbs entropy, that states defined at lower order have, in general, greater entropy than higher-order states. Since all physically measured variables of heat, work, temperature, etc., and all physical laws must not depend on the particular defining state for entropy, positive entropy generation must result as a compensatory effect in any incompletely defined state. Importantly, the role of the “surroundings” and the different roles that statistical mechanical approximations play in the determination of entropy generation are also delineated.

Previously, a general entropy conservation equation was derived, following Irving and Kirkwood’s paradigm in the development of the transport theory for mass, momentum, and energy, as3,4
\[
\frac{\partial}{\partial t}\left(n\bar{S}_{BG}\right)=-\nabla\cdot\left(n\bar{S}_{BG}\mathbf{v}_{0}\right)-\nabla\cdot\mathbf{s}+n\,\dot{\bar{S}}_{gen},
\tag{1}
\]
where S̄BG is the Boltzmann–Gibbs entropy per molecule,
\[
n(\mathbf{r},t)\,\bar{S}_{BG}(\mathbf{r},t)=-\frac{k_{B}}{s}\sum_{j\in s}\int f_{s}\,z_{s}\,\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}^{s}\,d\mathbf{p}^{s},
\tag{2}
\]
with
\[
f_{s}\!\left(\mathbf{r}^{s},\mathbf{p}^{s},t\right)=\int f_{N}\!\left(\mathbf{r}^{N},\mathbf{p}^{N},t\right)d\mathbf{r}^{(N-s)}\,d\mathbf{p}^{(N-s)}.
\tag{3}
\]
In these expressions, fN is the N-particle density function, ri and pi are the position and momentum coordinates of the ith particle of mass m, fs is the reduced density function for the set of s indistinguishable molecules in the set N, kB is Boltzmann’s constant, n(r, t) is the local number density, the sum is over the set of molecules s, and j is any molecule in the set s. Note that short-hand notation for differentials has been used ($d\mathbf{r}^{N}=d\mathbf{r}_{1}\,d\mathbf{r}_{2}\cdots d\mathbf{r}_{N}$, etc.). Following Green,6 zs is the natural logarithm of an sth-ordered, normalized density function that depends on the multi-particle expansion method.6,7 In particular,
\[
z_{1}(1)=\ln\!\left[h^{3}f_{1}(1)\right],
\tag{4}
\]
\[
z_{2}(1,2)=\ln\!\left[h^{3}f_{1}(1)\right]+\ln\!\left[h^{3}f_{1}(2)\right]+\ln\!\left[\frac{f_{2}(1,2)}{f_{1}(1)\,f_{1}(2)}\right],
\tag{5}
\]
etc., for higher orders.7,8
For the first-order set s = 1, Boltzmann’s definition is recovered,9
\[
n\bar{S}_{B}=-k_{B}\int f_{1}\ln\!\left(h^{3}f_{1}\right)d\mathbf{p}_{1},
\tag{6}
\]
where the scale factor is Planck’s constant, due to Tetrode.10
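As a concrete numerical check of Eq. (6) as reconstructed here, the following minimal Python sketch evaluates the Boltzmann entropy per molecule for a Maxwellian momentum distribution by Monte Carlo and compares it with the closed-form Maxwellian result; the gas parameters (an argon-like gas at 300 K and 1 atm) are assumptions chosen purely for illustration.

```python
# Monte Carlo check of Eq. (6), S_B = -(kB/n) * Int f1 ln(h^3 f1) d^3p,
# for a Maxwellian f1. Gas parameters are illustrative assumptions only.
import numpy as np

kB = 1.380649e-23        # Boltzmann constant (J/K)
h = 6.62607015e-34       # Planck constant (J s)
m = 6.63e-26             # molecular mass (kg), argon-like
T = 300.0                # temperature (K)
n = 101325.0 / (kB * T)  # ideal-gas number density at 1 atm (1/m^3)

def f1(p2):
    """Maxwellian f1 as a function of |p|^2; its d^3p integral equals n."""
    a = 2.0 * np.pi * m * kB * T
    return n * a**-1.5 * np.exp(-p2 / (2.0 * m * kB * T))

# S_B = -kB * <ln(h^3 f1)>, with the average taken over the Maxwellian.
rng = np.random.default_rng(0)
p = rng.normal(scale=np.sqrt(m * kB * T), size=(1_000_000, 3))
S_B = -kB * np.mean(np.log(h**3 * f1((p**2).sum(axis=1))))

# Closed form of the same integral, via the thermal de Broglie wavelength.
lam = h / np.sqrt(2.0 * np.pi * m * kB * T)
S_exact = kB * (1.5 - np.log(n * lam**3))

print(S_B / kB, S_exact / kB)  # both ~17.6 under these assumptions
```

The h³ factor simply renders the argument of the logarithm dimensionless, which is the role attributed to Tetrode above. For the second order, set s = 2,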
\[
n\bar{S}_{BG}^{(2)}=-\frac{k_{B}}{2}\sum_{j=1}^{2}\int f_{2}\left\{\ln\!\left[h^{3}f_{1}(1)\right]+\ln\!\left[h^{3}f_{1}(2)\right]+\ln g(1,2)\right\}\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}_{1}\,d\mathbf{r}_{2}\,d\mathbf{p}_{1}\,d\mathbf{p}_{2},
\tag{7}
\]
where
\[
g(1,2)=\frac{f_{2}(1,2)}{f_{1}(1)\,f_{1}(2)},
\tag{8}
\]
etc. Note that for s = N, $z_{N}=\ln\!\left(h^{3N}f_{N}\right)$, and we have Gibbs’ entropy,3
\[
n\bar{S}_{G}=-\frac{k_{B}}{N}\sum_{j\in N}\int f_{N}\ln\!\left(h^{3N}f_{N}\right)\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}^{N}\,d\mathbf{p}^{N},
\tag{9}
\]
where j is any molecule in the set N. Higher-order multi-particle expansions have been presented and reviewed by Singer.7 By definition, entropy in these expressions represents the formal mathematical definition of “uncertainty” in finding the group of s molecules in the set of N molecules. Typically, the lowest set s = 1 corresponds to dilute gases, the set s = 2 to dense gases, and so forth for higher-order sets.
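As a point of reference for the truncations used below, the classic closure at this level is the Kirkwood superposition approximation, reviewed in Refs. 6 and 7; in the present notation it may be sketched as

\[
f_{3}(1,2,3)\approx\frac{f_{2}(1,2)\,f_{2}(1,3)\,f_{2}(2,3)}{f_{1}(1)\,f_{1}(2)\,f_{1}(3)},
\]

so that, with the pair function of Eq. (8), $\ln f_{3}\approx\ln f_{1}(1)+\ln f_{1}(2)+\ln f_{1}(3)+\ln g(1,2)+\ln g(1,3)+\ln g(2,3)$, consistent with the expansion structure of Eqs. (4) and (5).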
The entropy flux vector s in Eq. (1) follows as4 
\[
\mathbf{s}=-\frac{k_{B}}{s}\sum_{j\in s}\int\frac{\bar{\mathbf{p}}_{j}}{m}\,f_{s}\,z_{s}\,\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}^{s}\,d\mathbf{p}^{s},
\tag{10}
\]
which is the entropy flux relative to the local mass average velocity $\mathbf{v}_{0}$, where $\bar{\mathbf{p}}_{j}/m=\mathbf{p}_{j}/m-\mathbf{v}_{0}$. As shown previously, this term is formally equivalent to the local energy flux vector q divided by the local absolute temperature T, which formally links microscopic to macroscopic thermodynamic descriptions of entropy flux.
The last term on the right-hand side of Eq. (1) is the entropy generation term, defined as
\[
n\,\dot{\bar{S}}_{gen}=\frac{k_{B}}{s}\sum_{j\in s}\int z_{s}\sum_{i=1}^{N}\mathbf{F}_{i}\cdot\frac{\partial f_{N}}{\partial\mathbf{p}_{i}}\,\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}^{N}\,d\mathbf{p}^{N},
\tag{11}
\]
where $\mathbf{F}_{i}$ is the total intermolecular force acting on molecule $i$.
Expanding the set s, the entropy generation term can be written, in general, as
\[
n\,\dot{\bar{S}}_{gen}=\frac{k_{B}}{s}\sum_{j\in s}\int\left[\sum_{k\in s}z_{1}(k)+\sum_{k<l\in s}\ln g(k,l)+\cdots\right]\sum_{i=1}^{N}\mathbf{F}_{i}\cdot\frac{\partial f_{N}}{\partial\mathbf{p}_{i}}\,\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}^{N}\,d\mathbf{p}^{N},
\tag{12}
\]
where, for the lowest set s = 1, the well-known Boltzmann entropy generation term is obtained,9,11
\[
n\,\dot{\bar{S}}_{gen}=k_{B}\int\ln\!\left(h^{3}f_{1}\right)\mathbf{F}_{12}\cdot\frac{\partial f_{2}}{\partial\mathbf{p}_{1}}\,d\mathbf{r}_{2}\,d\mathbf{p}_{1}\,d\mathbf{p}_{2}.
\tag{13}
\]
Below, more general forms of the entropy generation term are examined, including the approximations required to resolve it and its tracing to the second law.
In general, truncation of the N-body density function to a lower-order state, such as through the superposition approximations above, is necessary in practice to further elaborate the flux and generation expressions. However, if we replace the superposition approximation term in Eq. (11) with its complete N-body term $\ln f_{N}^{*}$,
\[
n\,\dot{\bar{S}}_{gen}=\frac{k_{B}}{N}\sum_{j\in N}\int\ln f_{N}^{*}\sum_{i=1}^{N}\mathbf{F}_{i}\cdot\frac{\partial f_{N}}{\partial\mathbf{p}_{i}}\,\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}^{N}\,d\mathbf{p}^{N},
\tag{14}
\]
where $f_{N}^{*}=h^{3N}f_{N}$ is a dimensionless density function, we can combine the log term with the momentum gradient term and expand as
\[
\ln f_{N}^{*}\,\frac{\partial f_{N}}{\partial\mathbf{p}_{i}}=\frac{\partial}{\partial\mathbf{p}_{i}}\left(f_{N}\ln f_{N}^{*}-f_{N}\right).
\tag{15}
\]
The resulting momentum-space integrations must be zero by virtue of the properties of the density function,
\[
\int\frac{\partial}{\partial\mathbf{p}_{i}}\left(f_{N}\ln f_{N}^{*}-f_{N}\right)d\mathbf{p}_{i}=0,
\tag{16}
\]
since $f_{N}\rightarrow 0$ as $\left|\mathbf{p}_{i}\right|\rightarrow\infty$.
Thus, a complete N-body description without approximations cannot generate entropy, which is a known result.12 Using a similar analysis, we can generalize the entropy generation term by first splitting the outer summation of Eq. (11) into contributions from molecules in the set s and molecules outside of it, which after reduction results in
\[
n\,\dot{\bar{S}}_{gen}=\frac{k_{B}}{s}\sum_{j\in s}\int z_{s}\sum_{k\in s}\,\sum_{i\notin s}\mathbf{F}_{ki}\cdot\frac{\partial f_{N}}{\partial\mathbf{p}_{k}}\,\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}^{N}\,d\mathbf{p}^{N},
\tag{17}
\]
where $\mathbf{F}_{ki}$ is the force on molecule $k$ in the set $s$ due to molecule $i$ outside the set.
So, entropy generation is associated only with interactions of the set of molecules {s}, for which entropy (uncertainty) is defined, with molecules outside of the set {s}. The Boltzmann entropy generation expression of Eq. (13) is the lowest-order case. Macroscopically, we could also consider the set {s} as the “system” and the molecules outside of {s} as the “surroundings.” If the system has no interactions outside of the set {s}, then there can be no entropy generation as long as entropy is defined in the complete system space; this is consistent with the known N-body results, {s} = {N}, as well. Indeed, a corollary to this statement is that any system with an empirically observed generation of entropy, where entropy is believed to be defined using the complete or known system and surroundings state information, must either be an incomplete description or be interacting with another, unidentified system.
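The vanishing momentum-space integrations of Eqs. (15) and (16) can also be checked numerically. The following minimal Python sketch uses a one-dimensional Gaussian as a stand-in for the density function, which is purely an illustrative assumption; the integral vanishes because the integrand is a perfect derivative of a function that decays at large momentum.

```python
# Numerical check of Eqs. (15) and (16): the momentum integral of
# d/dp (f ln f* - f) vanishes because f -> 0 as |p| -> infinity.
# A 1D Gaussian stands in for the density function (illustration only).
import numpy as np

p = np.linspace(-30.0, 30.0, 200_001)           # momentum grid
dp = p[1] - p[0]
f = np.exp(-p**2 / 2.0) / np.sqrt(2.0 * np.pi)  # normalized Gaussian
fstar = f                                       # scale factor set to 1 here

g = f * np.log(fstar) - f     # antiderivative appearing in Eq. (15)
dg = np.gradient(g, p)        # the perfect derivative d/dp (f ln f* - f)

print((dg * dp).sum())        # ~ 0: equals g(+30) - g(-30), per Eq. (16)
print((np.log(fstar) * np.gradient(f, p) * dp).sum())  # same integral, ~ 0
```

Both sums vanish to numerical precision, mirroring the exact cancellation that forbids entropy generation in the complete N-body description.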

Next, we prove that the general Boltzmann–Gibbs entropy in the incomplete state must always be greater than or equal to the Gibbs entropy for the complete state. Furthermore, it is shown that this inequality gives rise to the second law.

Following Jaynes, we first subtract the Gibbs entropy, Eq. (9), from the Boltzmann–Gibbs entropy, Eq. (2), which we write in expanded form as
\[
\bar{S}_{BG}-\bar{S}_{G}=-\frac{k_{B}}{n}\int f_{N}\left[\sum_{i=1}^{N}\ln f_{1}^{*}(i)+\sum_{i<k}\ln\frac{f_{2}^{*}(i,k)}{f_{1}^{*}(i)\,f_{1}^{*}(k)}+\cdots-\ln f_{N}^{*}\right]\delta\!\left(\mathbf{r}_{1}-\mathbf{r}\right)d\mathbf{r}^{N}\,d\mathbf{p}^{N},
\tag{18}
\]
where we have used dimensionless densities, denoted by an asterisk, and the short-hand notation,
\[
f_{1}^{*}(i)\equiv h^{3}f_{1}\!\left(\mathbf{r}_{i},\mathbf{p}_{i},t\right),
\tag{19}
\]
\[
f_{2}^{*}(i,k)\equiv h^{6}f_{2}\!\left(\mathbf{r}_{i},\mathbf{r}_{k},\mathbf{p}_{i},\mathbf{p}_{k},t\right),
\tag{20}
\]
etc.
Because the choice of the locator vector is arbitrary, we can symmetrize, following Jaynes,5 
\[
n\bar{S}_{B}=-\frac{k_{B}}{N}\sum_{j=1}^{N}\int f_{N}\left[\sum_{i=1}^{N}\ln f_{1}^{*}(i)\right]\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}^{N}\,d\mathbf{p}^{N}
\tag{21}
\]
and
\[
n\bar{S}_{G}=-\frac{k_{B}}{N}\sum_{j=1}^{N}\int f_{N}\,\ln f_{N}^{*}\,\delta\!\left(\mathbf{r}_{j}-\mathbf{r}\right)d\mathbf{r}^{N}\,d\mathbf{p}^{N}.
\tag{22}
\]
First, we examine the difference between the Boltzmann and Gibbs entropies. Noting that ln x ≤ (x − 1) for all positive values of x, we have
\[
\bar{S}_{G}-\bar{S}_{B}=\frac{k_{B}}{n}\int f_{N}\ln\!\left[\frac{\prod_{i=1}^{N}f_{1}^{*}(i)}{f_{N}^{*}}\right]d\mathbf{r}^{N}\,d\mathbf{p}^{N}\le\frac{k_{B}}{n}\int f_{N}\left[\frac{\prod_{i=1}^{N}f_{1}^{*}(i)}{f_{N}^{*}}-1\right]d\mathbf{r}^{N}\,d\mathbf{p}^{N}=0.
\tag{23}
\]
Thus,
\[
\bar{S}_{B}\ge\bar{S}_{G},
\tag{24}
\]
which has been previously given by Jaynes.5 In order to generalize these results to higher orders, we first look at the difference between adjacent states, e.g.,
\[
\bar{S}^{(2)}-\bar{S}^{(1)}=\frac{k_{B}}{n}\sum_{i<k}\int f_{N}\ln\!\left[\frac{f_{1}^{*}(i)\,f_{1}^{*}(k)}{f_{2}^{*}(i,k)}\right]d\mathbf{r}^{N}\,d\mathbf{p}^{N}.
\tag{25}
\]
Now, following the same methods, it can be shown that each of the N(N − 1)/2 terms in Eq. (25) is less than or equal to zero and, therefore, the sum of terms must be less than or equal to zero, giving
\[
\bar{S}^{(2)}\le\bar{S}^{(1)}=\bar{S}_{B}.
\tag{26}
\]
This result can also be obtained from Eqs. (5)–(8), and the approach can be readily continued to higher orders using higher-order superposition approximations,7,8 which results in the hierarchy of functions,
\[
\bar{S}_{B}=\bar{S}^{(1)}\ge\bar{S}^{(2)}\ge\cdots\ge\bar{S}^{(N)}=\bar{S}_{G}.
\tag{27}
\]
Although the above proof was carried out for the superposition approximation, we note that density function expansions, in general, are typically based on resolving higher-order density functions as combinations of lower-order density functions, and the observation of higher uncertainty in the lower-order space is reasonable in general. Formal proofs beyond the superposition approximation are beyond the scope of the current work.
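The endpoint inequality of the hierarchy, Eq. (24), can be illustrated in a discrete toy setting in which sums replace the phase-space integrals. In the following Python sketch, a randomly generated three-variable probability mass function stands in for f_N (purely an illustrative assumption); the marginal-based “Boltzmann” entropy always dominates the joint “Gibbs” entropy, and their difference is exactly the correlation information discarded by the reduced description.

```python
# Toy discrete illustration of Eq. (24): the "Boltzmann" entropy built
# from single-particle marginals never falls below the "Gibbs" entropy
# of the full joint distribution (subadditivity of Shannon entropy).
import numpy as np

rng = np.random.default_rng(1)
P = rng.random((2, 2, 2))
P /= P.sum()                  # joint pmf, a stand-in for f_N

def H(p):
    """Shannon entropy in nats, ignoring zero cells."""
    p = p[p > 0]
    return -(p * np.log(p)).sum()

S_G = H(P)                    # complete-space ("Gibbs") entropy
marg = [P.sum(axis=tuple(k for k in range(3) if k != i)) for i in range(3)]
S_B = sum(H(m) for m in marg) # reduced-space ("Boltzmann") entropy

print(f"S_B = {S_B:.4f} >= S_G = {S_G:.4f}; excess = {S_B - S_G:.4f}")
# The excess S_B - S_G is the total correlation discarded by the
# marginal description, the discrete analog of Eqs. (23) and (24).
```

Any joint distribution whatsoever satisfies this inequality, which is the discrete counterpart of the argument leading to Eq. (24).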
We can now return to the local entropy conservation equation, Eq. (1), and consider the same physical system, isolated from any surroundings, evolving under differing definitions of entropy or uncertainty. We let the two descriptions begin with the same starting value of uncertainty or entropy, $\bar{S}_{Initial}$; using the total derivative operator, D/Dt, we begin with
\[
n\frac{D\bar{S}^{(s)}}{Dt}=-\nabla\cdot\left(\frac{\mathbf{q}}{T}\right)+n\,\dot{\bar{S}}_{gen},
\tag{28}
\]
\[
n\frac{D\bar{S}^{(N)}}{Dt}=-\nabla\cdot\left(\frac{\mathbf{q}}{T}\right).
\tag{29}
\]
Now, consider any change along the two paths from the same initial entropy value to any state at a future time, t. Subtracting the two alternative paths, utilizing Eq. (27), and noting that q/T is a local, physically measured quantity that must be the same for both descriptions, we must have
\[
\int_{0}^{t}n\,\dot{\bar{S}}_{gen}\,dt^{\prime}=n\left[\bar{S}^{(s)}(t)-\bar{S}^{(N)}(t)\right]\ge 0,
\tag{30}
\]
which is the second law. Similarly, if we consider the same analysis for any two differently defined entropy states and pathways, the generation term will always be larger for the lower-order state. However, this change will also concomitantly change the entropy state values, yielding the same physical result or observation.
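Spelling out the subtraction, the q/T flux terms cancel because they are common to both descriptions, leaving

\[
n\frac{D}{Dt}\left(\bar{S}^{(s)}-\bar{S}^{(N)}\right)=n\,\dot{\bar{S}}_{gen},
\]

which, when integrated from the common initial value $\bar{S}_{Initial}$, recovers Eq. (30), with positivity following directly from the hierarchy of Eq. (27).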
Of course, the interaction of the system with its surroundings must also generate entropy when entropy is defined in a space that does not include any aspect of the state of the surroundings, as shown by Eq. (13), where s can be considered the system space. Classically, then, positive entropy generation is relative to the defined entropy space, which, in principle, may be expanded to include specific information on the surroundings. This observation is independent of the extent of the surroundings, which in macroscopic thermodynamics is often taken to be infinite, constituting an infinite reservoir. System processes that include positive entropy generation fall into the domain of so-called “irreversible processes.” Reversible processes, on the other hand, are ones in which the entropy generation due to a defined system space that excludes the existing surroundings is arbitrarily set to zero. Consider two processes, one reversible (R) and one irreversible (I), with the same local entropy state definition and change. Then, from Eq. (28), we must have
\[
\nabla\cdot\left(\frac{\mathbf{q}}{T}\right)_{I}\ge\nabla\cdot\left(\frac{\mathbf{q}}{T}\right)_{R}.
\tag{31}
\]
So, the local divergence of the energy flux, q, divided by the local temperature, T, is always greater in irreversible processes, which is consistent with the notions of macroscopic thermodynamics.

Finally, we note that we have not considered specific “working” expressions for the entropy generation term, which necessarily come from solutions to the truncated Liouville equation in the assumed state or order of entropy.3,4 Expansions about local equilibrium states, for example, lead to the so-called phenomenological laws that dictate the direction of property fluxes, among other aspects of such systems (see the illustration below). However, it is to be noted that there are many possible systems and solutions for density function expressions beyond perturbations about local equilibrium states. Moreover, the entropy generation expressions become specific to that expansion and can be classically derived from the general expressions given here. In any event, it is clear from Eq. (17) that positive entropy generation is contingent upon defining entropy either in a lower-order space than the complete system space or without inclusion of the surroundings in the defining state of entropy for a non-isolated system. The second law, in the sense of Jaynes, is the law of conservation of uncertainty.
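As one standard illustration of such a working expression (the familiar phenomenological result of linear irreversible thermodynamics, not a result derived here), a local-equilibrium expansion together with Fourier’s law, $\mathbf{q}=-\kappa\nabla T$, gives a local entropy generation density

\[
\sigma=\mathbf{q}\cdot\nabla\!\left(\frac{1}{T}\right)=\frac{\kappa\left(\nabla T\right)^{2}}{T^{2}}\geq 0,
\]

so that the phenomenological law simultaneously fixes the direction of the heat flux and guarantees positive generation.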

As shown by Jaynes, classically, entropy generation is associated with approximations inherent in statistical mechanical theories of entropy, which are based on the mathematical definition of uncertainty. Physical systems and variables, however, must not depend arbitrarily on approximations, since these could lead to conflicting results. Here, we have traced in detail the approximations associated with general, molecular-based approaches to the entropy conservation equation. Using the generalized Boltzmann–Gibbs entropy, it is demonstrated that the entropy defined in any lower-order state is greater than that in the higher-order state. The Boltzmann and Gibbs entropies represent the two opposite limits of entropy state definitions. It is further shown that entropy generation is a compensatory effect in the entropy conservation equation for an incompletely defined system state, ensuring that the physically measured variables of heat, work, temperature, etc., are independent of the uncertainty definition and never in violation of the physical laws. Interactions of a defined system with its surroundings will also generate entropy unless the surroundings are included in the defined state of uncertainty.

The author has no conflicts to disclose.

Michael H. Peters: Conceptualization (equal); Formal analysis (equal); Methodology (equal); Writing – original draft (equal); Writing – review & editing (equal).

Data sharing is not applicable to this article as no new data were created or analyzed in this study.

1. E. T. Jaynes, “Information theory and statistical mechanics,” Phys. Rev. 106(4), 620–630 (1957).
2. E. T. Jaynes, “Information theory and statistical mechanics. II,” Phys. Rev. 108(2), 171–190 (1957).
3. M. Peters, “Generalized entropy generation expressions in gases,” Entropy 21(4), 330 (2019).
4. M. H. Peters, “Nonequilibrium entropy conservation and the transport equations of mass, momentum, and energy,” Energies 14(8), 2196 (2021).
5. E. T. Jaynes, “Gibbs vs Boltzmann entropies,” Am. J. Phys. 33(5), 391–398 (1965).
6. H. S. Green, The Molecular Theory of Fluids (Dover Publications/Constable, New York/London, 1969).
7. A. Singer, “Maximum entropy formulation of the Kirkwood superposition approximation,” J. Chem. Phys. 121(8), 3657–3666 (2004).
8. M. H. Peters, “Proper normalization removes nonlocal behavior of canonical-based thermodynamic entropy expressions,” AIP Adv. 12(12), 125311 (2022).
9. J. O. Hirschfelder, C. F. Curtiss, and R. B. Bird, Molecular Theory of Gases and Liquids, Structure of Matter Series (Wiley, New York, 1964), corrected printing with notes added.
10. R. Williams, “The Sackur-Tetrode equation: How entropy met quantum mechanics,” APS News 18(8), 2 (2009).
11. S. Chapman and T. G. Cowling, The Mathematical Theory of Non-uniform Gases: An Account of the Kinetic Theory of Viscosity, Thermal Conduction, and Diffusion in Gases, Cambridge Mathematical Library, 3rd ed. (Cambridge University Press, Cambridge, 1990).
12. P. Gaspard, “Entropy production in open volume-preserving systems,” J. Stat. Phys. 88(5–6), 1215–1240 (1997).