An analogy between the thermodynamic inequalities presented by Nicholson et al. [Nat. Phys. 16, 1211 (2020)] and by Yoshimura and Ito [Phys. Rev. Res. 3, 013175 (2021)] is discussed. As a result, a time–energy uncertainty relation in chemical thermodynamics in terms of Gibbs free energy and chemical potential is derived. It is numerically demonstrated that the uncertainty relation holds in a model system of oscillatory Brusselator reactions. Our result bridges the thermodynamic time–information uncertainty relation and free energy evolution in chemical reactions.

Stochastic thermodynamics has emerged as a comprehensive framework to understand the energetics and thermodynamics of stochastic processes away from equilibrium. Nonequilibrium systems inevitably have entropy production as one of their most distinguishing characteristics, in contrast to thermal equilibrium. However, it is typically not easy to quantitatively determine the entropy production associated with a nonequilibrium process without detailed knowledge of the system. Recently developed thermodynamic uncertainty relations provide a bound on entropy production in terms of current fluctuations.1–3 Nicholson et al. presented time–information uncertainty relations for the flux of heat, entropy, and work, demonstrating that the timescales of their dynamical fluctuations away from equilibrium are all bounded by the fluctuations in information rates and indicating that natural processes must trade speed for thermodynamic costs.4 Yoshimura and Ito presented information geometric inequalities that give a speed limit for the changing rate of the Gibbs free energy and a general bound on chemical fluctuations, offering a framework to analyze the thermodynamic profile of biological systems.5 The formulations in Refs. 4 and 5 are built on a firm combination of finite-time thermodynamics6–8 and information geometry,9–11 employing the Fisher information.12–14 In this paper, we discuss an analogy between their thermodynamic inequalities and, consequently, derive a time–energy uncertainty relation in chemical thermodynamics in terms of Gibbs free energy and chemical potential.

Nicholson et al. presented the following time–information uncertainty relation:4 

$$\left|\dot{A}\right| \le \Delta A\,\Delta\dot{I}, \tag{1}$$

where A is a general variable, Ȧ denotes the evolution rate or time derivative of the variable, Ȧ := dA/dt, and ΔA is the standard deviation of A. Δİ is the standard deviation of the evolution rate of the surprisal or information content Ii,

$$I_i := -\ln p_i, \tag{2}$$

where pi is the probability of the state i (=1, 2, …, N). An entropy version of the time–information uncertainty relation in thermodynamics is also shown as4 

$$\left|\dot{S}\right| \le k_{\mathrm{B}}\,\Delta\dot{I}\,\Delta I. \tag{3}$$

S is the Shannon entropy,

$$S := -k_{\mathrm{B}}\sum p_i \ln p_i, \tag{4}$$

where kB is the Boltzmann constant, and the simplified summation symbol used in this paper denotes

$$\sum a_i := \sum_{i=1}^{N} a_i, \tag{5}$$

where ai is a general variable. ΔI is the standard deviation of the surprisal. A derivation of Eq. (3) using pi is given in Ref. 15.
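To make the quantities entering Eqs. (1)–(5) concrete, the following minimal Python sketch evaluates S, ΔI, and Δİ along a hypothetical three-state relaxation and compares |Ṡ| with the bound kBΔİΔI of Eq. (3). The relaxation law, the time grid, and the choice kB = 1 are arbitrary illustrative assumptions, not taken from Ref. 4.

```python
import numpy as np

# Hypothetical three-state relaxation, used purely for illustration
# (the functional form and time grid are arbitrary choices, not from Ref. 4).
t = np.linspace(0.0, 5.0, 2001)
u = np.exp(-t)
p = np.stack([0.5 + 0.2 * u,
              0.3 - 0.1 * u**2,
              0.2 - 0.2 * u + 0.1 * u**2])   # columns sum to 1, all entries > 0

kB = 1.0                                     # Boltzmann constant set to unity
I = -np.log(p)                               # surprisal I_i = -ln p_i, Eq. (2)
S = kB * np.sum(p * I, axis=0)               # Shannon entropy, Eq. (4)

pdot = np.gradient(p, t, axis=1)             # dp_i/dt by finite differences
Idot = -pdot / p                             # dI_i/dt = -(dp_i/dt)/p_i

def std_over_p(x, p):
    """Standard deviation of x_i with respect to the distribution p_i."""
    mean = np.sum(p * x, axis=0)
    return np.sqrt(np.sum(p * x**2, axis=0) - mean**2)

Sdot = np.gradient(S, t)
ratio = np.abs(Sdot) / (kB * std_over_p(Idot, p) * std_over_p(I, p))
# The ratio should not exceed 1 (up to finite-difference error), in line
# with the entropy form of the time-information uncertainty relation, Eq. (3).
print("max |S_dot| / (kB * dI_dot * dI) =", ratio.max())
```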

Yoshimura and Ito presented the following inequality:5 

$$\left|\dot{G}\right| \le \sqrt{\phi}\,\sqrt{\Delta\mu^{2}}. \tag{6}$$

G is the Gibbs free energy, and ϕ is the Fisher information,

$$\phi := \sum \frac{1}{[X_i]}\left(\frac{d[X_i]}{dt}\right)^{2}, \tag{7}$$

where [Xi] is the concentration of the chemical species Xi. Δμ² is defined as the chemical variance of the chemical potential μ,

$$\Delta\mu^{2} := \sum [X_i]\left(\mu_i - \mu_i^{\mathrm{eq}}\right)^{2}, \tag{8}$$

where μi is the chemical potential of the chemical species Xi,

$$\mu_i = \mu_i^{0} + RT\ln [X_i] \tag{9}$$

for an ideal solution, where μi0, R, and T are the standard chemical potential, the gas constant, and the temperature, respectively, and μieq is the equilibrium chemical potential.
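For orientation, an inequality of the form of Eq. (6) can be obtained from the Cauchy–Schwarz inequality under the definitions in Eqs. (7)–(9), writing G = ∑[Xi]μi and assuming a closed, detailed-balanced reaction network in which the total concentration is conserved, so that ∑(d[Xi]/dt) = 0 and ∑(d[Xi]/dt)μieq = 0; this is an illustrative sketch, not the general information-geometric derivation of Ref. 5:

$$\left|\frac{dG}{dt}\right| = \left|\sum \frac{d[X_i]}{dt}\,\mu_i + RT\sum \frac{d[X_i]}{dt}\right| = \left|\sum \frac{d[X_i]}{dt}\left(\mu_i - \mu_i^{\mathrm{eq}}\right)\right| \le \sqrt{\sum \frac{1}{[X_i]}\left(\frac{d[X_i]}{dt}\right)^{2}}\,\sqrt{\sum [X_i]\left(\mu_i - \mu_i^{\mathrm{eq}}\right)^{2}} = \sqrt{\phi}\,\sqrt{\Delta\mu^{2}}.$$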

First, we convert the concentration [Xi] of the chemical species used in Ref. 5 into the probability pi, to fit the variable into the context of statistical mechanics,

$$p_i := \frac{[X_i]}{\sum [X_i]}. \tag{10}$$

If [Xi] is defined as the molar fraction instead of the concentration, then pi = [Xi], since Σ[Xi] = 1 in this case. For the evolution rate of the Gibbs free energy, Yoshimura and Ito employed, in the framework of information geometry, the Kullback–Leibler divergence between the time-evolving and equilibrium variables [Eqs. (20), (21), and (83) of Ref. 5]. We instead employ the original, general definition of the Gibbs free energy, as given in Eqs. (8), (9), and (93) of Ref. 5,

$$G = \sum p_i\,\mu_i. \tag{11}$$

Because the chemical potential is a relative quantity, setting the reference standard value μi0 to zero does not sacrifice the generality required in this context of fundamental nonequilibrium statistical mechanics. Then, we have

$$G = RT\sum p_i \ln p_i. \tag{12}$$

Because G := H − TS, where H is the enthalpy,

$$\dot{G} = \dot{H} - \dot{T}S - T\dot{S}. \tag{13}$$

Therefore, for isothermal (Ṫ = 0) and adiabatic or thermally isolated (Ḣ = 0) systems, Ġ = –TṠ, and Eq. (3) can be transformed into

$$\left|\dot{G}\right| = T\left|\dot{S}\right| \le RT\,\Delta\dot{I}\,\Delta I, \tag{14}$$

where we converted kB into R to fit the context of chemical thermodynamics.

Let us now consider the standard deviation of the surprisal, ΔI,

$$\Delta I = \sqrt{\sum p_i I_i^{2} - \left(\sum p_i I_i\right)^{2}} = \sqrt{\sum p_i\left(\ln p_i\right)^{2} - \left(\sum p_i \ln p_i\right)^{2}}. \tag{15}$$

Instead of Δμ² as defined by Yoshimura and Ito in the framework of information geometry [Eq. (78) of Ref. 5], we employ the standard deviation of the chemical potential for generality,

$$\Delta\mu = \sqrt{\sum p_i \mu_i^{2} - \left(\sum p_i \mu_i\right)^{2}}. \tag{16}$$

Setting μi0 to zero again, we have

$$\Delta\mu = RT\sqrt{\sum p_i\left(\ln p_i\right)^{2} - \left(\sum p_i \ln p_i\right)^{2}}. \tag{17}$$

From Eqs. (15) and (17), interestingly,

$$\Delta\mu = RT\,\Delta I. \tag{18}$$
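The identity in Eq. (18) can also be seen at a glance. With μi0 set to zero and pi taken as the molar fraction, the chemical potential is, up to a constant factor, the surprisal itself, so the two standard deviations differ only by RT:

$$\mu_i = RT\ln p_i = -RT\,I_i \quad\Longrightarrow\quad \Delta\mu = RT\,\Delta I .$$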

Next, for the standard deviation of the evolution rate of the surprisal Δİ, since

$$\dot{I}_i = \frac{d}{dt}\left(-\ln p_i\right) = -\frac{\dot{p}_i}{p_i}, \tag{19}$$
$$\sum p_i \dot{I}_i = -\sum \dot{p}_i. \tag{20}$$

Because ∑pi = 1, ∑ṗi = (d/dt)∑pi = 0. Therefore,

$$\Delta\dot{I} = \sqrt{\sum p_i \dot{I}_i^{2} - \left(\sum p_i \dot{I}_i\right)^{2}} = \sqrt{\sum \frac{\dot{p}_i^{2}}{p_i}}. \tag{21}$$

From Eqs. (7) and (21), it is recognized that there is a connection

$$\Delta\dot{I}^{2} = \phi \tag{22}$$

between Refs. 4 and 5. For the standard deviation of the evolution rate of the chemical potential, since

$$\dot{\mu}_i = \frac{d}{dt}\left(RT\ln p_i\right) = RT\,\frac{\dot{p}_i}{p_i}, \tag{23}$$
$$\sum p_i \dot{\mu}_i = RT\sum \dot{p}_i = 0. \tag{24}$$

We thus further obtain

$$\Delta\dot{\mu} = \sqrt{\sum p_i \dot{\mu}_i^{2} - \left(\sum p_i \dot{\mu}_i\right)^{2}} = RT\sqrt{\sum \frac{\dot{p}_i^{2}}{p_i}} = RT\,\Delta\dot{I}. \tag{25}$$

Therefore, Eq. (14) can be rewritten as

$$\left|\dot{G}\right| \le \frac{\Delta\dot{\mu}\,\Delta\mu}{RT}. \tag{26}$$
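Explicitly, inserting Eqs. (18) and (25) into Eq. (14) gives

$$\left|\dot{G}\right| \le RT\,\Delta\dot{I}\,\Delta I = RT\cdot\frac{\Delta\dot{\mu}}{RT}\cdot\frac{\Delta\mu}{RT} = \frac{\Delta\dot{\mu}\,\Delta\mu}{RT},$$

which is Eq. (26).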

This result provides an upper bound on the evolution rate of the Gibbs free energy in terms of the spread of the chemical potential and the spread of its rate of change. As a practical merit of the inequality, the right-hand side of Eq. (26) is relatively easy to determine, for instance, by measuring the electric potential or voltage of the chemical solution and its time derivative or evolution rate, thereby providing an estimate of the upper bound of the Gibbs free energy rate, which is otherwise difficult to obtain directly. As discussed in Ref. 5, this inequality can be extended to subsystems as

$$\left|\dot{G}_S\right| \le \frac{\Delta\dot{\mu}_S\,\Delta\mu_S}{RT}, \tag{27}$$

where the subscript S denotes a subset of the chemical species Xi (i = 1, 2, …, N).

To numerically demonstrate these inequalities, we employ the oscillatory chemical reaction system model presented in Ref. 5. For our calculations, we use exactly the same reaction rate equations and parameter values as in Ref. 5, with R and T set to unity. The Brusselator is a model of oscillatory reactions, such as the Belousov–Zhabotinsky reaction,16–18 comprising the following chemical reactions:

$$\mathrm{A} \rightleftharpoons \mathrm{X}, \tag{28}$$
$$2\mathrm{X} + \mathrm{Y} \rightleftharpoons 3\mathrm{X}, \tag{29}$$
$$\mathrm{X} \rightleftharpoons \mathrm{B}, \tag{30}$$

where A, B, X, and Y are chemical species. The reaction rates [J1, J2, and J3 for Eqs. (28)–(30), respectively] in the forward direction, from the left-hand side to the right-hand side of each equation, are formulated as

$$J_1 = k_1^{+}[\mathrm{A}] - k_1^{-}[\mathrm{X}], \tag{31}$$
$$J_2 = k_2^{+}[\mathrm{X}]^{2}[\mathrm{Y}] - k_2^{-}[\mathrm{X}]^{3}, \tag{32}$$
$$J_3 = k_3^{+}[\mathrm{X}] - k_3^{-}[\mathrm{B}], \tag{33}$$

where [A], [B], [X], and [Y] are the concentrations of the chemical species, and ki+ and ki− (i = 1, 2, 3) are the reaction rate constants in the forward and backward reactions, respectively, of the ith chemical equation. Then, the time derivatives of the chemical species’ concentrations are

$$\frac{d[\mathrm{X}]}{dt} = J_1 + J_2 - J_3, \tag{34}$$
$$\frac{d[\mathrm{Y}]}{dt} = -J_2, \tag{35}$$
$$\frac{d[\mathrm{A}]}{dt} = -J_1, \tag{36}$$
$$\frac{d[\mathrm{B}]}{dt} = J_3. \tag{37}$$

Note that the first equation of Eq. (124) in Ref. 5, corresponding to Eq. (34) of this paper, contains an apparent typo: J2 is supposed to be added to the right-hand side. The calculations in Ref. 5 are nevertheless conducted correctly. Figure 1 presents the calculated time evolution of the concentrations [X] and [Y] and the probabilities pX and pY of the chemical species X and Y involved in the oscillating reaction system. The values of the reaction rate constants in Ref. 5, i.e., k1+ = 1 × 10−3, k1− = 1, k2+ = 1, k2− = 1, k3+ = 1 × 10−2, and k3− = 1 × 10−4, were employed. The initial values of the chemical species’ concentrations, [A]0 = 1 × 103, [B]0 = 1 × 103, [X]0 = 1, and [Y]0 = 6, also followed Ref. 5. R and T were set to unity. The curves of [X] and [Y] are identical to those plotted in Ref. 5. Figure 2 presents the calculated time evolution of |Ġ| and Δμ̇Δμ. In the case of a subsystem consisting of a subset S = {X, Y}, Fig. 3 presents the calculated time evolution of |ĠS| and Δμ̇SΔμS. As shown in Figs. 2 and 3, neither |Ġ| nor |ĠS| exceeds its respective upper bound, Δμ̇Δμ and Δμ̇SΔμS, in accordance with Eqs. (26) and (27).
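As a self-contained sketch of this numerical demonstration, the following Python code integrates the rate equations (34)–(37), using the rate constants and initial concentrations quoted above, and evaluates both sides of Eq. (26) along the trajectory. The specific reaction scheme and flux assignments follow Eqs. (28)–(37) as written here and should be checked against Ref. 5; the integration window, time grid, and the use of SciPy's LSODA solver are illustrative choices not specified in the text.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Rate constants and initial conditions quoted in the text (R = T = 1).
k1p, k1m = 1e-3, 1.0
k2p, k2m = 1.0, 1.0
k3p, k3m = 1e-2, 1e-4
R, T = 1.0, 1.0

def brusselator(t, c):
    """Mass-action kinetics for the reversible scheme of Eqs. (28)-(30):
    A <=> X, 2X + Y <=> 3X, X <=> B."""
    A, B, X, Y = c
    J1 = k1p * A - k1m * X
    J2 = k2p * X**2 * Y - k2m * X**3
    J3 = k3p * X - k3m * B
    return [-J1, J3, J1 + J2 - J3, -J2]      # d[A]/dt, d[B]/dt, d[X]/dt, d[Y]/dt

# The integration window and grid are illustrative choices.
t = np.linspace(0.0, 300.0, 6001)
sol = solve_ivp(brusselator, (t[0], t[-1]), [1e3, 1e3, 1.0, 6.0],
                method="LSODA", rtol=1e-9, atol=1e-12, dense_output=True)
c = sol.sol(t)                               # concentrations, shape (4, len(t))

p = c / c.sum(axis=0)                        # Eq. (10): p_i = [X_i] / sum [X_j]
mu = R * T * np.log(p)                       # mu_i = RT ln p_i  (mu_i0 = 0)
G = np.sum(p * mu, axis=0)                   # G = sum_i p_i mu_i, Eqs. (11), (12)

def std_over_p(x, p):
    """Standard deviation of x_i with respect to the distribution p_i."""
    mean = np.sum(p * x, axis=0)
    return np.sqrt(np.sum(p * x**2, axis=0) - mean**2)

Gdot = np.gradient(G, t)                             # left-hand side of Eq. (26)
bound = (std_over_p(np.gradient(mu, t, axis=1), p)   # Delta mu_dot
         * std_over_p(mu, p) / (R * T))              # times Delta mu / (RT)
# The ratio should stay at or below 1 (up to finite-difference error),
# consistent with Eq. (26); a check of the subsystem bound, Eq. (27),
# proceeds analogously with the sums restricted to S = {X, Y}.
print("max |G_dot| / bound =", np.max(np.abs(Gdot) / bound))
```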

FIG. 1.

Time evolution of [X], [Y], pX, and pY calculated for the Brusselator reaction model.

FIG. 2.

Time evolution of |Ġ| and Δμ̇Δμ calculated for the Brusselator reaction model to demonstrate Eq. (26).

FIG. 3.

Time evolution of |ĠS| and Δμ̇SΔμS for a subset S = {X, Y} calculated for the Brusselator reaction model to demonstrate Eq. (27).


In this paper, we discussed an analogy between the thermodynamic inequalities presented by Nicholson et al.4 and Yoshimura and Ito.5 As a result, we derived a time–energy uncertainty relation, |Ġ| ≤ Δμ̇Δμ/RT, for isothermal and adiabatic or thermally isolated systems. We numerically demonstrated that the uncertainty relation holds in a model system of oscillatory Brusselator reactions. Our result bridges the thermodynamic time–information uncertainty relation and free energy evolution in chemical reactions.

We have no conflicts of interest to disclose.

The data that support the findings of this study are available within the article and from the corresponding author upon reasonable request.

1. J. M. Horowitz and T. R. Gingrich, Nat. Phys. 16, 15 (2020).
2. G. Falasco, M. Esposito, and J.-C. Delvenne, New J. Phys. 22, 053046 (2020).
3. T. Koyuk and U. Seifert, Phys. Rev. Lett. 125, 260604 (2020).
4. S. B. Nicholson, L. P. García-Pintos, A. del Campo, and J. R. Green, Nat. Phys. 16, 1211 (2020).
5. K. Yoshimura and S. Ito, Phys. Rev. Res. 3, 013175 (2021).
6. F. Weinhold, J. Chem. Phys. 63, 2479 (1975).
7. G. Ruppeiner, Phys. Rev. A 20, 1608 (1979).
8. D. A. Sivak and G. E. Crooks, Phys. Rev. Lett. 108, 190602 (2012).
9. H. Hotelling, Bull. Am. Math. Soc. 36, 191 (1930).
10. C. R. Rao, Bull. Calcutta Math. Soc. 37, 81 (1945).
12. R. A. Fisher, Philos. Trans. R. Soc. London, Ser. A 222, 594 (1922).
13. J. J. Rissanen, IEEE Trans. Inf. Theory 42, 40 (1996).
14. R. Hollerbach and E.-j. Kim, Entropy 19, 268 (2017).
15.
16. I. Prigogine and R. Lefever, J. Chem. Phys. 48, 1695 (1968).
17. J. J. Taboada, A. P. Munuzuri, V. Perez-Munuzuri, M. Gomez-Gesteira, and V. Perez-Villar, Chaos 4, 519 (1994).
18. C. T. Hamik, N. Manz, and O. Steinbock, J. Phys. Chem. A 105, 6144 (2001).