The simulation hypothesis is a philosophical theory in which the entire universe, together with our objective reality, is a simulated construct. Despite the lack of evidence, this idea is gaining traction in scientific circles as well as in the entertainment industry. Recent developments in the field of information physics, such as the publication of the mass-energy-information equivalence principle, appear to support this possibility. In particular, the 2022 discovery of the second law of information dynamics (infodynamics) provides new and interesting research tools at the intersection of physics and information. In this article, we re-examine the second law of infodynamics and its applicability to digital information, genetic information, atomic physics, mathematical symmetries, and cosmology, and we present scientific evidence that appears to underpin the simulated universe hypothesis.
I. INTRODUCTION
In 2022, a new fundamental law of physics was proposed and demonstrated, called the second law of information dynamics, or simply the second law of infodynamics.1 Its name is an analogy to the second law of thermodynamics, which describes the time evolution of the physical entropy of an isolated system and requires the entropy to remain constant or to increase over time. In contrast, the second law of infodynamics states that the information entropy of systems containing information states must remain constant or decrease over time, reaching a certain minimum value at equilibrium. This surprising observation has massive implications for all branches of science and technology. With the ever-increasing importance of information systems, such as digital information storage or biological information stored in DNA/RNA genetic sequences, this new physics law offers an additional tool for examining these systems and their time evolution.2
In what follows, we will examine a few diverse applications of the second law of infodynamics and demonstrate the universal nature of this new physics law, including the fact that it points to the characteristics of a computational system, underpinning to some degree the simulated universe hypothesis. Sections II and III have been covered in greater detail in the 2022 article,1 but they are discussed briefly here to reinforce our point and introduce the context of the second law of infodynamics to the reader.
II. SECOND LAW OF INFODYNAMICS AND DIGITAL INFORMATION
(a) Schematics of the word INFORMATION is written on a material in binary code using magnetic recording. Red denotes magnetization pointing out of the plane and blue is magnetization pointing into the plane. (b)–(d) Time evolution of the digital magnetic recording information states simulated using micromagnetic Monte Carlo. (b) Initial random state. (c) INFORMATION is written (t = 0 s). (d) Iteration 930 (t = 1395 s) showing the degradation of information states. Reproduced with permission from M. M. Vopson and S. Lepadatu, AIP Adv. 12, 075310 (2022). Copyright 2022 AIP Publishing.
The average unit cell size (cubic) was V = 10−27 m3, intentionally ∼1.9 times lower than the size required for a thermally stable medium in order to speed up the computation. This resulted in a relaxation time of 1.5 s, which corresponds to a single iteration of the Monte Carlo algorithm. The simulations show that the entropy of the information-bearing states remains constant or decreases over time and that, after a sufficiently long time, all information states become self-erased, leading to zero entropy of the information states. Figure 1(b) shows the simulated specimen before data were recorded on it. Figure 1(c) shows the same sample with the data written on it at time zero. Figure 1(d) shows the state of the data after 930 Monte Carlo cycles, illustrating its degradation. After 1990 cycles, the data were fully self-erased and the information entropy reached zero.
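The self-erasure process can be illustrated with a minimal toy model (our own sketch, not the micromagnetic Monte Carlo simulation of Ref. 1; the bit count, relaxation probability, and number of steps are arbitrary assumptions): each recorded bit independently decays to the energetically favored state, and the Shannon entropy of the bit population falls to zero.

```python
import math
import random

def bit_entropy(bits):
    """Shannon entropy (bits per symbol) of the 0/1 bit population."""
    if not bits:
        return 0.0
    p1 = sum(bits) / len(bits)
    h = 0.0
    for p in (p1, 1.0 - p1):
        if p > 0.0:
            h -= p * math.log2(p)
    return h

def relax(bits, p_flip, steps, rng):
    """Each step, every 1-bit decays to 0 with probability p_flip,
    modeling thermal relaxation toward the erased ground state."""
    for _ in range(steps):
        bits = [b if (b == 0 or rng.random() > p_flip) else 0 for b in bits]
    return bits

rng = random.Random(42)
written = [1, 0] * 32                  # a recorded pattern, H = 1 bit
h0 = bit_entropy(written)
erased = relax(written, p_flip=0.05, steps=500, rng=rng)
h1 = bit_entropy(erased)
print(h0, h1)                          # entropy decays from 1 bit toward 0
```

The qualitative outcome matches the simulation described above: the information entropy of the bit states decreases monotonically in expectation and reaches zero once every bit has relaxed.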
III. SECOND LAW OF INFODYNAMICS AND GENETIC INFORMATION
Similar to the case of digital information, a reduction of N would most likely result in a reduction of the overall entropy of the information bearing states, so “deletion” mutations would automatically fulfill the second law of infodynamics. In our previous study, we examined real data from RNA sequences that underwent only SNP mutations, which maintained the value of the N constant, and the reduction in the information entropy came only from Shannon’s information entropy function.1,2 Our test RNA sequences were variants of the novel SARS-CoV-2 virus, which emerged in December 2019 resulting in the COVID-19 pandemic. The reference RNA sequence of the SARS-CoV-2, collected in Wuhan, China in December 2019 (MN908947),6 has 29 903 nucleotides, so N = 29 903. All analyzed variants had 29 903 nucleotides and have been collected and sequenced at a later time, after undergoing an incremental number of SNP mutations. Shannon information entropies of the reference sequence and of the variants were computed using relation (1) and previously developed software, GENIES.7,8
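Equation (1) is the standard Shannon entropy over the nucleotide frequencies, H(X) = −Σi pi log2 pi. A minimal sketch of this computation, in the spirit of the GENIES software (the 10-nucleotide sequences below are illustrative assumptions, not real SARS-CoV-2 fragments):

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon information entropy H(X) in bits per nucleotide, as in Eq. (1):
    H = -sum_i p_i * log2(p_i), with p_i the frequency of each base."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Illustrative toy sequences, not real genomic data.
reference = "AAAACCGGTT"   # A:4, C:2, G:2, T:2
variant   = "AAAAACGGTT"   # one SNP (C -> A) skews the base composition

h_ref = shannon_entropy(reference)
h_var = shannon_entropy(variant)
print(round(h_ref, 3), round(h_var, 3))
```

An SNP that skews the base composition toward the most frequent nucleotide lowers H(X) while leaving N unchanged, which is the type of entropy-reducing mutation dynamics reported for the SARS-CoV-2 variants.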
Remarkably, the results indicate a unique correlation between the information and the dynamics of the genetic mutations, showing that the Shannon information entropy, H(X), and the overall information entropy of the SARS-CoV-2 variants (SInfo), computed using Eq. (8), decrease linearly with the number of mutations and, since the number of mutations increases over time, also over time (see Fig. 2). The corresponding code names of the genome variants extracted from the NCBI database9–14 and analyzed in this work are shown next to each data point in Fig. 2. This result not only confirms the universal validity of the second law of infodynamics but also points to a possible governing mechanism of genetic mutations,2 currently believed to be purely random events. The observation of an information entropic force governing genetic mutations is very powerful because it challenges the Darwinian view that genetic mutations are completely random events, and it could be used to develop algorithms that predict genetic mutations before they occur.2 We should acknowledge that, while all analyzed SARS-CoV-2 variants showed a decrease in their information entropy as they underwent genetic mutations, the data points presented in Fig. 2 have been carefully selected to emphasize the linear trend.
Shannon information entropy values of variants of the SARS-CoV-2 virus as a function of the number of SNP mutations per variant.
Naturally, we asked whether the same RNA system would display behavior consistent with the second law of infodynamics when the SARS-CoV-2 variants suffered “addition” mutations, so that the number of nucleotides N is no longer constant but becomes larger than 29 903, increasing the information entropy. Using the NCBI database, we searched all the sequenced SARS-CoV-2 variants from January 1, 2020 to January 1, 2022. We searched only complete sequences with no missing/undetermined nucleotides, which yielded a total of 4.48 × 106 sequences. When we restricted the results to sequences with at least 29 903 nucleotides, 48 450 sequences were identified. Of these, only one had undergone a mutation in which the number of nucleotides increased, by one, to 29 904. Hence, 98.92% of all mutations took place via “deletion,” reducing the total number of nucleotides. Since only one genome out of 4.48 × 106 appeared to increase the number of nucleotides, this case is statistically irrelevant. We therefore concluded that, for this test case, genetic mutations appear to take place in a way that reduces the information entropy, mostly via a deletion mechanism or an SNP. This is fully consistent with the second law of infodynamics, as a deletion automatically decreases the total information entropy, and the SNPs have been shown to take place in a way that again reduces the information entropy via a reduction in Shannon’s information entropy.
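A deletion’s effect on the total information entropy can be sketched as follows, assuming, per our reading of Eq. (8), that the total information entropy of a sequence scales with the product N · H(X) (physical prefactors such as kB ln 2 omitted; the sequences below are illustrative):

```python
import math
from collections import Counter

def total_info_entropy(seq):
    """Total information entropy of a sequence, assumed proportional to
    N * H(X): sequence length times Shannon entropy in bits per nucleotide
    (physical prefactors such as k_B ln 2 are omitted)."""
    counts = Counter(seq)
    n = len(seq)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return n * h

before = "AAACCGGTTT"    # N = 10
after  = "AACCGGTTT"     # one "deletion" mutation, N = 9

s_before = total_info_entropy(before)
s_after = total_info_entropy(after)
print(round(s_before, 2), round(s_after, 2))
```

Because the deletion reduces N while leaving H(X) nearly unchanged, the product N · H(X) drops, consistent with the deletion mechanism described above.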
We would also like to mention the famous Spiegelman experiment, which took place in 1972.15 In this experiment, Spiegelman studied the evolution of a virus over 74 generations. The virus was kept isolated in ideal conditions for survival, and with each generation, the virus was sequenced. The initial virus had 4500 bases, and with each generation, the genome consistently decreased in size. After 74 generations, the virus had evolved down to only 218 bases, showing an interesting and unexplained reduction of its genome by over 95%. Like the 2022 study on SARS-CoV-2,1 Spiegelman’s experiment is fully consistent with the second law of infodynamics, which requires the information entropy to remain constant or to decrease over time, reaching a minimum value at equilibrium.
IV. SECOND LAW OF INFODYNAMICS AND HUND’S RULE
Electronic states in atoms are fully described by four quantum numbers:

(a) The principal quantum number, n. This number determines the energy of a particular shell or orbit, and it takes positive integer values n = 1, 2, 3, 4, ….

(b) The orbital angular momentum quantum number, ℓ. This quantum number describes the subshell and gives the total angular momentum of an electron due to its orbital motion. It takes integer values restricted to ℓ = 0, 1, 2, …, n − 1.

(c) The magnetic quantum number, mℓ. This quantum number determines the component (projection) of the orbital angular momentum along a specific direction, usually the direction of an applied magnetic field. It takes integer values, and for a given value of ℓ, it may have (2ℓ + 1) possible values: mℓ = ℓ, ℓ − 1, ℓ − 2, …, 0, …, −(ℓ − 1), −ℓ.

(d) The spin quantum number, s, and the secondary spin quantum number, ms. The spin quantum number s gives the eigenvalues of the spin angular momentum operator and reflects the fact that the electron has an intrinsic angular momentum, called “spin,” often pictured as the rotation of the electron about an internal axis. In general, the spin quantum number takes the values s = k/2, where k is a non-negative integer, so that s = 0, 1/2, 1, 3/2, 2, …. The secondary spin quantum number ms determines the direction (i.e., projection) of the spin angular momentum along the direction of an applied magnetic field. The allowed values of ms are the 2s + 1 values from −s to +s in steps of 1. For example, an electron has s = 1/2, so the allowed values of ms are −1/2 and +1/2.
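The counting rules above can be made concrete with a short sketch that enumerates all allowed (n, ℓ, mℓ, ms) combinations for an electron and recovers the familiar shell capacity of 2n2 states:

```python
# Enumerate the allowed electron quantum-number combinations for a shell,
# following the rules listed above; the shell capacity comes out as 2*n^2.
from fractions import Fraction

def electron_states(n):
    """All allowed (n, l, m_l, m_s) tuples for principal quantum number n."""
    states = []
    for l in range(n):                       # l = 0, 1, ..., n - 1
        for m_l in range(-l, l + 1):         # 2l + 1 projections
            for m_s in (Fraction(-1, 2), Fraction(1, 2)):  # electron: s = 1/2
                states.append((n, l, m_l, m_s))
    return states

for n in (1, 2, 3):
    print(n, len(electron_states(n)))        # 2, 8, 18  (= 2 n^2)
```

Pauli’s exclusion principle, discussed next, then states that each of these tuples can be occupied by at most one electron.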
The electrons occupy atomic shells according to Pauli’s exclusion principle,16 which states that two or more identical fermions cannot simultaneously occupy the same quantum state within a quantum system. In the case of electrons in atoms, this means that it is impossible for two electrons in a multi-electron atom to have the same values of the four quantum numbers described above. For example, if two electrons reside in the same orbital, then their n, ℓ, and mℓ values are the same, so their ms must be different, imposing that the electrons must have opposite half-integer spin projections of 1/2 and −1/2.
However, in the case of multi-electron atoms, multiple electron arrangements are possible while fulfilling Pauli’s exclusion principle.
In order to determine the electron population of an atomic orbital corresponding to the ground state of a multi-electron atom, German physicist Friedrich Hund formulated in 1927 a set of rules17 derived from phenomenological observations. These are called Hund’s rules, and when used in conjunction with Pauli’s exclusion principle, they are useful in atomic physics to determine the electron population of atoms corresponding to the ground state.
To illustrate this, let us assume that an atom has three electrons in its p orbital. Figure 3 shows the three allowed distinct ground-state configurations that fulfill Pauli’s exclusion principle, resulting in total spin quantum values of 1/2, 3/2, and 1/2, respectively.
Three electrons residing on a p orbital and their allowed arrangements according to Pauli’s exclusion principle.
The correct electronic arrangement is given by Hund’s first rule, which is the most important, and it is simply called Hund’s rule. This states that the lowest energy atomic state is the one that maximizes the total spin quantum number, meaning simply that the orbitals of the subshell are each occupied singly with electrons of parallel spin before double occupation occurs. Therefore, the term with the lowest energy is also the term with the maximum number of unpaired electrons, so for our example shown in Fig. 3, Hund’s rule dictates that the correct configuration is the middle one, resulting in a total spin quantum value of 3/2.
Hund’s rule is derived from empirical observations, and there is no clear understanding of why the electrons populate atomic orbitals in this way. So far, two different physical explanations have been given in Ref. 18. Both revolve around the energetic balance of the electrons and their interactions in the atom. The first mechanism implies that electrons in different orbitals are further apart, so that the electron–electron repulsion energy is reduced. The second mechanism claims that the electrons in singly occupied orbitals are less effectively screened from the nucleus, resulting in a contraction of the orbitals, which increases the electron–nucleus attraction energy.19
In this article, we examine the electronic population in atoms within the framework of information theory3 and we demonstrate that Hund’s rule (Hund’s first rule) is a direct consequence of the second law of information dynamics.1 This requires that, at equilibrium in the ground state, electrons occupy the orbitals in such a way that their information entropy is minimum, or equivalently, the bit information content per electron is minimum.
A. Numerical calculations
We treat the two possible values of the secondary quantum spin number ms of the electrons in atoms, ms = −1/2, +1/2, as two possible events, or as a two-letter message within Shannon’s information theory framework. The secondary quantum spin number ms is a very important parameter because it is the only quantity that distinguishes two electrons residing in the same orbital. Since their n, ℓ, and mℓ values are the same, their ms must be different to fulfill Pauli’s exclusion principle.
We will allocate to the two possible projections of the ms the spin up ↑ and spin down ↓ states. In this context, the set of n = 2 independent and distinctive information states is X = {↑, ↓}, with a discrete probability distribution P = {p↑, p↓}.
B. s-orbital
The s orbital can accommodate a maximum of N = 2 electrons. Figure 4 shows the possible electronic configurations of an s orbital. When N = 1, or N = 2, the IE = 1 bit in both cases, while the total spin quantum value is 0.5 and 0, respectively. Since there are no other possible configurations, the case for s-orbital is rather trivial. Figure 5(a) shows a plot of the IE values vs the total spin quantum value, S for the s orbital.
Representation of all possible distinctive electronic populations of an s orbital.
Calculated IE values for (a) s orbital, (b) p orbital, (c) d orbital, and (d) f orbital. Data represent each possible distinct electronic configuration vs the total spin quantum value, S. The data show categorically that IE is minimum when S is maximum in each case.
C. p-orbital
The p orbital can accommodate a maximum of N = 6 electrons. Figure 6 shows the electronic populations on the p orbital for all possible N values. We should mention that only distinct configurations have been represented in the diagram. Any electronic arrangement that results in the same ratio of spin-up and spin-down electrons is not represented, as it would duplicate the results.
Representation of all possible distinctive electronic populations of a p orbital. Configurations highlighted in green are the correct arrangements when multiple states are possible for N = 2, 3, and 4, respectively.
Similarly, configurations obtained by inverting all spins, i.e., mirror images, result in the same IE values and are not considered to avoid duplications.
Figure 5(b) shows the graph of the IE values vs the total spin quantum value for all possible distinct occupancy cases of the p orbital. As shown in Fig. 6, each time multiple arrangements are possible, as is the case for N = 2, 3, and 4, respectively, the maximum spin quantum value corresponds to the minimum IE value estimated using Eq. (10). For N = 2 and 3, the minimum IE is 0 in each case, while for N = 4, the minimum IE value is 0.811. To emphasize this, we highlighted, in Fig. 6, the correct configurations that are required by Hund’s rule.
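These numbers can be reproduced with a short sketch, under our reading of the construction: each distinct configuration is a (spin-up, spin-down) occupation with mirror images excluded, and IE is the Shannon entropy of the resulting spin distribution, per Eq. (10). The same function, with the orbital count changed, also reproduces the d- and f-orbital minima quoted in Secs. IV D and IV E.

```python
import math

def entropy(p_up, p_down):
    """Shannon entropy (bits) of the spin-up/spin-down distribution, Eq. (10)."""
    return -sum(p * math.log2(p) for p in (p_up, p_down) if p > 0)

def distinct_configs(n_electrons, n_orbitals):
    """Distinct (up, down) occupations allowed by Pauli's exclusion principle,
    with mirror images (all spins inverted) excluded as in Fig. 6.
    Returns (total spin S, information entropy IE) pairs."""
    configs = []
    for down in range(n_electrons // 2 + 1):   # down <= up avoids mirror images
        up = n_electrons - down
        if up <= n_orbitals and down <= n_orbitals:
            s_total = (up - down) / 2
            ie = entropy(up / n_electrons, down / n_electrons)
            configs.append((s_total, ie))
    return configs

# p orbital (3 orbitals): for every N, the maximum-S configuration
# (Hund's rule) coincides with the minimum-IE configuration.
for n in range(2, 7):
    configs = distinct_configs(n, 3)
    max_s = max(s for s, _ in configs)
    min_ie = min(ie for _, ie in configs)
    print(n, max_s, round(min_ie, 3))
```

For N = 2 and 3, the minimum IE is 0, and for N = 4, it is 0.811, exactly as stated above.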
D. d-orbital
We now examine the case of the d orbital, which can accommodate a maximum of N = 10 electrons. Figure 7 shows the distinct electronic populations allowed on the d orbital for all possible N values.
Representation of all possible distinctive electronic populations of a d orbital. Configurations highlighted in green are the correct arrangements when multiple states are possible for N = 2–8.
For N = 1, 9, and 10, only a single distinct arrangement is possible, while for all the other N values, multiple electronic arrangements are allowed within Pauli’s exclusion principle.
Figure 5(c) shows the IE values vs the total spin quantum value for all possible distinct occupancy cases of the d orbital. The data indicates that the maximum spin quantum value corresponds to the minimum IE value estimated using Eq. (10). For each N value, we highlighted, in Fig. 7, the correct configurations that are required by Hund’s rule. These all correspond exactly to the lowest IE value, reinforcing the validity of the second law of infodynamics. The minimum IE value of 0 is achieved for N = 2, 3, 4, and 5. The minimum IE values are IE = 0.65 for N = 6, IE = 0.863 for N = 7, and IE = 0.954 for N = 8, respectively.
E. f-orbital
Finally, we examine the f-orbital, which can accommodate a maximum of N = 14 electrons. Therefore, we have 14 possible groups, with N = 1, 13, and 14 having only a single distinct electronic arrangement possible, while for all the other N values, multiple electronic arrangements are allowed by Pauli’s exclusion principle. Figure 8 shows the distinct electronic populations allowed on the f orbital for all possible N values.
Representation of all possible distinctive electronic populations of an f orbital. Configurations highlighted in green are the correct arrangements according to Hund’s rule when multiple states are possible for N = 2–12.
Again, we highlighted the correct arrangements as dictated by Hund’s rule, and we calculated the IE values for all possible configurations. The minimum IE value of 0 is achieved for N = 2, 3, 4, 5, 6, and 7. For the remaining groups with multiple electronic configurations, the minimum IE values are IE = 0.544 for N = 8, IE = 0.764 for N = 9, IE = 0.881 for N = 10, IE = 0.946 for N = 11, and IE = 0.98 for N = 12, respectively. The data show categorically that, in all cases, the minimum IE value corresponds to the maximum spin quantum value, S, so the second law of infodynamics appears to be the real driving force behind Hund’s rule.
V. SECOND LAW OF INFODYNAMICS IN COSMOLOGY
Diagram of a physical system under continuous expansion in time, resulting in entropy increasing.
Just as in our expanding universe, the space expansion in the schematic physical system shown in Fig. 9 facilitates the emergence of more microstates, and the total entropy increases rapidly. If the universe did not expand, the entropy would at some point reach its maximum value, and the universe would achieve equilibrium.
This process allows the physical entropy of the universe to have been at its maximum value in the past, when the universe was much smaller than today and near equilibrium. The evidence for this is the cosmic microwave background (CMB) radiation,20 which is almost isotropic, having a temperature of ∼2.7 K in all directions21 and a very low-level temperature anisotropy (ΔT/T ∼ 10−5).22,23 The origin of this low level of temperature anisotropy can be traced back to ∼370 000 years after the big bang,24 when the universe was close to chemical and thermal equilibrium and the density inhomogeneities were comparable to the temperature anisotropies (Δρ/ρ ∼ ΔT/T ∼ 10−5).
However, in order to comply with the first law of thermodynamics and the adiabatic expansion, we just showed that the total entropy of the universe must be constant. If this is the case, how can the physical entropy of our expanding universe increase continuously? This is called the “Entropic Paradox,” and to solve it, there are only three possibilities:

(a) The laws of thermodynamics are not valid;

(b) The universe is not expanding;

(c) The entropy budget of the universe contains an unaccounted entropy term.
Readers will agree that possibilities (a) and (b) are out of the question, as the laws of thermodynamics and the expansion of the universe are both supported by undisputed empirical evidence. Therefore, we are left with the search for another entropy term responsible for the initial high entropy of the universe. This entropy term must also balance the total entropy budget of the universe in order to ensure that the overall entropy remains constant over time, despite the evident increase in physical entropy that we observe in the expanding universe.
In this paper, we propose that the missing entropy term is the entropy associated with the information content of the universe.
Relation (16) is identical to relation (4), and it is exactly the second law of infodynamics, requiring the entropy of the information states to decrease over time. Hence, the second law of infodynamics appears to be universally applicable and is, in fact, a cosmological necessity. It is important to realize that, for the overall entropy of the universe to remain constant, the absolute values of the physical entropy and the information entropy do not have to be equal; only the magnitudes of their changes over time must match, with the decrease in information entropy compensating the increase in physical entropy.
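The entropy balance argument can be summarized compactly (a sketch in our notation, with Sphys and Sinfo denoting the physical and information entropy terms):

```latex
% Constant total entropy of the adiabatically expanding universe:
S_{\mathrm{tot}}(t) = S_{\mathrm{phys}}(t) + S_{\mathrm{info}}(t) = \mathrm{const.}
% Differentiating with respect to time:
\frac{dS_{\mathrm{phys}}}{dt} + \frac{dS_{\mathrm{info}}}{dt} = 0
% The second law of thermodynamics gives dS_phys/dt >= 0, hence
\frac{dS_{\mathrm{info}}}{dt} = -\frac{dS_{\mathrm{phys}}}{dt} \le 0,
% which is the statement of the second law of infodynamics.
```

The inequality in the last line is the requirement that the information entropy must remain constant or decrease over time.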
VI. SECOND LAW OF INFODYNAMICS AND SYMMETRIES
Symmetry is a mathematical concept in which a certain property, for instance, the geometrical shape of an object, is preserved under certain transformations applied to the object. Such transformations include translations, rotations, reflections, and more complex operations combining these. In each case, the object remains invariant under the transformation. In the context of Euclidean geometry, these transformations are called symmetry operations. A symmetry operation is the movement of an object into an equivalent and indistinguishable orientation, carried out around a symmetry element. A symmetry element is a point, line, or plane about which a symmetry operation is performed. Classical group theory is the mathematical tool for the study of symmetry, describing the structure of transformations that map objects exactly to themselves.
However, symmetry is not merely a mathematical concept. It transcends disciplines, connecting mathematics, chemistry, biology, and physics, and appears to be a fundamental property of the universe.
This is evidenced by everything around us, from the elegant symmetrical patterns of snowflakes to the fundamental symmetries governing subatomic particles. Symmetry occurs at all scales, playing a pivotal role in the structure and behavior of matter in the universe. Figure 10 shows a few examples of amazing symmetries manifesting in nature.
This abundance of symmetry in the natural world raises the question: Why does symmetry dominate all systems in the universe instead of asymmetry? After all, the entropic evolution of the universe tends toward a higher entropy state, yet everything in nature appears to prefer high symmetry and a high degree of order.
Here, we explore the mathematical underpinnings of symmetry and its crucial significance in the context of the second law of infodynamics. We demonstrate a unique observation that a high symmetry corresponds to a low information entropy state, which is exactly what the second law of infodynamics requires. Hence, this remarkable observation appears to explain why symmetry dominates in the universe: it is due to the second law of information dynamics.
Before we proceed to our proof, it is useful to establish a way of measuring the symmetry of an object quantitatively. In other words, how much symmetry does a shape have? One accepted method of measuring the symmetry of an object is by counting the number of symmetry operations that one can carry out on the object. The more symmetry operations a shape has, the more symmetric it is.
Since the symmetry operations are carried out around the symmetry elements, we propose to quantify the symmetry of a shape by counting its number of symmetry elements instead of counting the number of symmetry operations. For example, a perfect square has eight symmetry operations (four rotations and four reflections) and five symmetry elements (one axis of rotation and four axes of reflection).
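The count for the square can be verified with a short sketch (our own illustration; vertices are represented as complex numbers, and a candidate isometry qualifies as a symmetry operation if it maps the vertex set onto itself):

```python
import cmath

# Vertices of a square, represented as complex numbers.
SQUARE = {1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j}

def is_symmetry(transform, shape=SQUARE):
    """True if the transform maps the vertex set onto itself."""
    image = {transform(z) for z in shape}
    # compare with a tolerance to absorb floating-point rotation error
    return all(any(abs(w - v) < 1e-9 for v in shape) for w in image)

ops = []
for k in range(4):
    rot = cmath.exp(1j * k * cmath.pi / 2)          # rotation by k * 90 degrees
    ops.append(lambda z, r=rot: r * z)              # 4 rotations (incl. identity)
    ops.append(lambda z, r=rot: r * z.conjugate())  # 4 reflections

print(sum(is_symmetry(op) for op in ops))           # -> 8 symmetry operations
```

All eight candidate operations (four rotations and four reflections) preserve the square, reproducing the count of symmetry operations quoted above.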
Our main objective is to describe the relationship between the symmetry of an object, determined by the number of its symmetry elements (SE) and its information entropy (IE).
In order to do this, let us consider a range of simple Euclidean 2D geometric shapes. We start with an ordinary (scalene) triangle, defined by three sides of lengths a, b, and c and three corresponding angles α, β, and γ, respectively (see Fig. 11). These parameters are a unique representation of the shape, as no differently shaped triangle can be formed from this set of parameters.
Within Shannon’s information theory framework, we define the set of n = N = 6 distinct characters, X = {a, b, c, α, β, γ}, and a probability distribution on X. Since all six characters are distinct, the probabilities of the set are P = {1/6, 1/6, 1/6, 1/6, 1/6, 1/6}. The average information per character, or the number of bits of information per character for this set, is given by Eq. (1): IE = log2 6 ≈ 2.585 bits.
This ordinary triangle has no symmetry, and accordingly, it has zero symmetry elements, so SE = 0.
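The IE values for the three triangle types can be computed with a short sketch (our own illustration; the SE values for the isosceles and equilateral triangles are our inference from the counting convention used above, i.e., reflection axes plus, for the equilateral triangle, one rotation axis):

```python
import math
from collections import Counter

def info_entropy(characters):
    """IE: Shannon entropy (bits per character) of the side/angle character set."""
    counts = Counter(characters)
    n = len(characters)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Each triangle is described by its 3 sides and 3 angles, as in Fig. 11;
# repeated letters mark equal sides/angles.  The SE values follow the
# counting convention of the text (reflection axes, plus the rotation
# axis for the equilateral triangle).
triangles = {
    "scalene":     (["a", "b", "c", "alpha", "beta", "gamma"], 0),
    "isosceles":   (["a", "a", "b", "alpha", "alpha", "beta"], 1),
    "equilateral": (["a", "a", "a", "alpha", "alpha", "alpha"], 4),
}

for name, (chars, se) in triangles.items():
    print(name, se, round(info_entropy(chars), 3))
```

As the symmetry increases from scalene to equilateral, the computed IE falls from log2 6 ≈ 2.585 bits to 1 bit, illustrating the inverse relationship between SE and IE.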
High symmetry = low information entropy.

This behavior is clearly emphasized in Fig. 14, which shows the IE vs SE for all possible triangle shapes.
Information entropy vs number of symmetry elements of triangular shapes.
Although this was observed in the case of a single 2D geometric shape, we postulate that this is a universal behavior of symmetries. In order to convince the reader, let us examine the case of quadrilaterals. There are seven possible geometries of a quadrilateral figure in terms of its possible symmetries. Table I gives all seven possible geometries and their SE values. For each geometry, we computed the IE value. The data are also summarized in Fig. 15.
Summarized results of the analysis performed on quadrilaterals.
| Shape | SE | Set X of distinct characters | Probabilities | IE |
|---|---|---|---|---|
| ![]() | 0 | | | 3 |
| ![]() | 1 | | | 2.25 |
| ![]() | 1 | | | 2.25 |
| ![]() | 1 | | | 1.905 |
| ![]() | 3 | | | 1.5 |
| ![]() | 3 | | | 1.5 |
| ![]() | 5 | | | 1 |
Again, the shape with the highest symmetry has the lowest information content.
The same analysis can be applied to any geometric figure, including 3D geometries, producing the same results. In each case, the symmetry scales inversely with the information content.
This remarkable result demonstrates that the symmetries manifesting everywhere in nature, and in the entire universe, are a consequence of the second law of information dynamics, which requires the minimization of the information entropy in any system or process in the universe.
VII. CONCLUSIONS
In this study, we revisited the second law of infodynamics, first introduced in 2022.1 The second law of infodynamics states that the information entropy of systems containing information states must remain constant or decrease over time, reaching a certain minimum value at equilibrium. This is very interesting because it is in total opposition to the second law of thermodynamics, which describes the time evolution of the physical entropy that must increase up to a maximum value at equilibrium.
We showed that the second law of infodynamics is universally applicable to any system containing information states, including biological systems and digital data. Remarkably, this indicates that genetic mutations in biological life are not just random events, as per the current Darwinian consensus, but instead take place according to the second law of infodynamics, minimizing the information entropy of the genome. This discovery has massive implications for genetic research, evolutionary biology, genetic therapies, pharmacology, virology, and pandemic monitoring, to name a few.
Here, we also expanded the applicability of the second law of infodynamics to explain phenomenological observations in atomic physics. In particular, we demonstrated that the second law of infodynamics explains the rule followed by the electrons when populating the atomic orbitals in multi-electron atoms, known as Hund’s rule. At equilibrium in the ground state, electrons arrange themselves on the orbitals in such a way that their information entropy is always minimal.
Most interesting is the fact that the second law of infodynamics appears to be a cosmological necessity. Here, we re-derived this new physics law using thermodynamic considerations applied to an adiabatically expanding universe.
Finally, one of the great mysteries of nature, namely, why symmetry dominates in the universe, has also been explained using the second law of infodynamics. Using simple geometric shapes, we demonstrated that high symmetry always corresponds to the lowest information entropy state, or the lowest information content, explaining why everything in nature tends toward symmetry instead of asymmetry.
The key question now is: “What can we learn from the second law of infodynamics, and what is its meaning?”
The second law of infodynamics essentially minimizes the information content associated with any event or process in the universe. This minimization really amounts to an optimization of the information content, or the most effective data compression, as described in Shannon’s information theory. Such behavior is strongly reminiscent of the rules deployed in programming languages and computer coding. Since the second law of infodynamics appears to manifest universally and is, in fact, a cosmological necessity, we may conclude that the entire universe appears to be a simulated construct. A super-complex universe like ours, if it were a simulation, would require a built-in data optimization and compression mechanism in order to reduce the computational power and data storage requirements. This is exactly what we observe via empirical evidence all around us, including in digital data, biological systems, atomistic systems, symmetries, and the entire universe.
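The link to data compression can be shown directly: a general-purpose compressor such as Python’s zlib shrinks ordered, low-entropy data far more than structureless data of the same length. The byte strings below are hypothetical stand-ins for such data:

```python
import random
import zlib

random.seed(42)  # deterministic pseudo-random bytes for reproducibility

# Two byte strings of equal length: one highly ordered, one without
# exploitable structure.
ordered = b"ABCD" * 256                                   # 1024 bytes, periodic
noisy = bytes(random.getrandbits(8) for _ in range(1024))  # 1024 bytes, random

print(len(zlib.compress(ordered, 9)))  # tens of bytes: regularity compresses away
print(len(zlib.compress(noisy, 9)))    # close to (or above) 1024 bytes
```

In Shannon’s terms, the ordered string carries far less information per byte, and the compressor exploits exactly that redundancy.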
Another important aspect of the second law of infodynamics is that it appears to validate the mass-energy-information equivalence principle formulated in 2019.4 According to this principle, information is not merely a mathematical construct: it is physical, as postulated by Landauer25 and experimentally demonstrated recently,26–29 and, beyond that, it has a small mass and can be regarded as the fifth form of matter.4 This principle has not yet been confirmed experimentally, and it has attracted a fair share of skepticism. Whether information is physical or not is irrelevant to this study, because the second law of infodynamics is applicable regardless of whether information has mass. However, if information is physical (equivalent to mass and energy), then the second law of thermodynamics requires systems to evolve in such a way that the energy is minimized at equilibrium. Hence, a reduction in the information content would translate into a reduction of mass-energy, according to the mass-energy-information equivalence principle. Therefore, the second law of infodynamics is not just a cosmological necessity; since it is required in order to fulfill the second law of thermodynamics, we can conclude that this new physics law proves that information is indeed physical. The scientific evidence supporting the simulated universe theory is also discussed in greater detail in a recently published book.30
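For reference, the mass attributed to a single bit under the mass-energy-information equivalence principle follows from m = k_B T ln(2) / c^2. The short sketch below simply evaluates this published formula at room temperature:

```python
from math import log

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact, SI definition)
C = 299_792_458.0    # speed of light in vacuum, m/s (exact)

def bit_mass(temperature_k: float) -> float:
    """Mass (kg) of one bit of information at the given temperature,
    per the mass-energy-information equivalence principle:
    m = k_B * T * ln(2) / c**2."""
    return K_B * temperature_k * log(2) / C**2

print(f"{bit_mass(300):.3e} kg")  # ~3.19e-38 kg per bit at T = 300 K
```

The minuscule value explains why the principle has so far resisted direct experimental confirmation, as noted above.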
ACKNOWLEDGMENTS
M.M.V. acknowledges financial support received for this research from the University of Portsmouth and the Information Physics Institute. The author is also deeply grateful to all his supporters and would like to acknowledge the generous contributions received to his research in the field of information physics, from the following donors and crowdfunding backers, listed in alphabetical order: Alban Frachisse, Alexandra Lifshin, Allyssa Sampson, Ana Leao-Mouquet, Andre Brannvoll, Andrews83, Angela Pacelli, Aric R. Bandy, Ariel Schwartz, Arne Michael Nielsen, Arvin Nealy, Ash Anderson, Barry Anderson, Benjamin Jakubowicz, Beth Steiner, Bruce McAllister, Caleb M. Fletcher, Chris Ballard, Cincero Rischer, Colin Williams, Colyer Dupont, Cruciferous1, Daniel Dawdy, Darya Trapeznikova, David Catuhe, Dirk Peeters, Dominik Cech, Kenneth Power, Eric Rippingale, Ethel Casey, Ezgame Workplace, Frederick H. Sullenberger III, Fuyi Zhou, George Fletcher, Gianluca Carminati, Gordo Tek, Graeme Hewson, Graeme Kirk, Graham Wilf Taylor, Heath McStay, Heyang Han, Ian Wickramasekera, Ichiro Tai, Inspired Designs LLC, Ivaylo Aleksiev, Jamie C. Liscombe, Jan Stehlak, Jason Huddleston, Jason Olmsted, Jennifer Newsom, Jerome Taurines, John Jones, John Vivenzio, John Wyrzykowski, Josh Hansen, Joshua Deaton, Josiah Kuha, Justin Alderman, Kamil Koper, Keith Baton, Keith Track, Kristopher Bagocius, Land Kingdom, Lawrence Zehnder, Lee Fletcher, Lev X, Linchuan Wang, Liviu Zurita, Loraine Haley, Manfred Weltenberg, Mark Matt Harvey-Nawaz, Matthew Champion, Mengjie Ji, Michael Barnstijn, Michael Legary, Michael Stattmann, Michelle A. Neeshan, Michiel van der Bruggen, Molly R. McLaren, Mubarrat Mursalin, Nick Cherbanich, Niki Robinson, Norberto Guerra Pallares, Olivier Climen, Pedro Decock, Piotr Martyka, Ray Rozeman, Raymond O’Neill, Rebecca Marie Fraijo, Robert Montani, Shenghan Chen, Sova Novak, Steve Owen Troxel, Sylvain Laporte, Tamás Takács, Tilo Bohnert, Tomasz Sikora, Tony Koscinski, Turker Turken, Walter Gabrielsen III, Will Strinz, William Beecham, William Corbeil, Xinyi Wang, Yanzhao Wu, Yves Permentier, Zahra Murad, and Ziyan Hu.
AUTHOR DECLARATIONS
Conflict of Interest
The author has no conflicts to disclose.
Author Contributions
Melvin M. Vopson: Conceptualization (lead); Formal analysis (lead); Funding acquisition (lead); Investigation (lead); Methodology (lead); Writing – original draft (lead); Writing – review & editing (lead).