Using the second law of information dynamics and the mass–energy–information equivalence principle, we show that gravitational attraction manifests as a requirement to reduce the information entropy of matter objects in space. This is another example of data compression and computational optimization in our universe, which supports the possibility of a simulated or computational universe. Here, we derive Newton’s gravitational force from information dynamics and show that gravity emerges as an entropic information force governed by the second law of infodynamics. This is fully aligned with Verlinde’s entropic gravity studies published in 2011 but is demonstrated here via a different approach.
I. INTRODUCTION
The second law of infodynamics describes the time evolution of the entropy of information states in an isolated system evolving to equilibrium. This law requires the entropy of information to decrease or remain constant over time, up to a minimum value reached when the system achieves equilibrium. This is in total contrast to the second law of thermodynamics, which requires the entropy of physical states to remain constant or increase over time. The second law of infodynamics was first introduced for digital and biological information states,1 and it was also demonstrated to be a cosmological necessity.2,3 This new law of physics was successfully tested on other systems, explaining, for example, the rules followed by electrons in populating atomic orbitals in their ground states.3 Moreover, the second law of infodynamics successfully explained a long-standing curiosity related to the abundance of symmetries in the universe, by demonstrating that high symmetry is a preferred state in nature because high symmetry corresponds to a low information entropy content.3,4 Since the second law of infodynamics appears in nature at all scales, from subatomic to cosmological scales, and since it originates from information considerations, as defined in the context of Shannon’s classical information theory,5 the possibility that the entire universe is informational in nature and resembles a computational process was speculatively suggested.3,4
A computational or simulated universe would exhibit specific signatures of the computational process, with the second law of infodynamics being one such indicator. This intriguing possibility invites us to revisit the entire physical description of our universe, where the laws of physics are just manifestations of computational rules within a source code. In other words, for an observer outside of the universe, everything follows a set of coding instructions, while for an observer inside the code (i.e., in our universe), these coding instructions manifest as mathematical and physical laws of nature. For example, Pauli’s exclusion principle states that two or more identical fermions cannot simultaneously occupy the same quantum state within a quantum system. In the case of electrons in atoms, it is impossible for two electrons in a multi-electron atom to have the same values of the four quantum numbers. Pauli’s exclusion principle requirement of distinguishable particles bears an uncanny resemblance to the rules of coding and programming, specifically in the context of defining variables that must be distinguishable, ensuring order and predictability in the execution of the code. This is exactly how a set of variables in a computer code would be defined to ensure computability. Looking from the same angle, the abundance of symmetries observed at all scales in the universe is another example of a computational optimization process, or data compression, because symmetry scales inversely with the information content or computational power, i.e., high symmetry means less computational power and a low information content. Assuming the simulated universe hypothesis, in this article, we examine another intriguing piece of possible evidence of this computational process, which manifests in our universe as gravity or the gravitational force.
II. MOTIVATION
To understand the motivation of this approach, we begin by emulating exactly how we perform computational simulations in routine computer programming. For example, when performing computational analysis using finite element analysis (FEA),6 a typical approach is to create a discrete mesh. Meshing breaks down the simulated object into very small cells that accurately define the object and its geometry. A governing equation can be assigned to each cell, allowing the solver to efficiently simulate the physical behavior of the entire macro-object.
Hence, our key assumption in this work is the fact that space-time is not a continuum but discrete, similar to a pixelation or FEA meshing as described above. This discrete view of space is supported by evidence from quantum mechanics, the discovery of Planck quanta, and subsequently the Planck scales. Essentially, space-time gives meaning to physical properties of material objects, such as position and momentum, and for this reason, space could be regarded as an information storage medium for the properties of matter in the universe.
In this approximation, each quantum of space is called an elementary cell, and each elementary cell would have the function of registering properties of matter in space, including position and velocity, acting as storage of information. The role of the information stored in the elementary cells is to provide the properties and the coordinates of matter in the space-time simulated construct. This process is identical to how a digital computer game, a VR application, or advanced simulations would be designed via meshing.
It is natural to assume that the smallest physical length, the Planck length (Lp ≈ 1.6 × 10⁻³⁵ m), defines the size of an elementary space cell, and each Planck elementary cell stores one digital bit state in its surface area, equal to a Planck area. Assigning position coordinates to objects/matter in a 3D computed reality would require stacked 2D space surfaces to provide the detailed coordinates for the computation. This should be seen as a foliation of space into surfaces, similar to an onion structure.
Hence, we assume that information is stored in 2D space surfaces, with a maximum of one digital bit per elementary Planck area. This is fully consistent with the holographic principle and Bekenstein’s black hole entropy studies.7
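For reference, the Bekenstein–Hawking result underlying this assumption assigns to a bounding surface of area A the entropy

$$S_{BH} = \frac{k_B\,A}{4 L_p^{2}},$$

i.e., roughly one bit of information capacity per few Planck areas of surface, which is of the same order as the one-bit-per-Planck-area bookkeeping adopted here.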
Let us assume a simplified single 2D space surface, consisting of 10 × 10 elementary cells and a surface area A ≈ R², where R is the lateral size of the surface or the radius if spherical coordinates are chosen (Fig. 1). The cells in this diagram are not to scale, and for simplicity, we show a flat space surface, while recognizing the curved complex geometry of our space-time. Each cell can register information in the form of binary data. For example, if a cell is empty, it registers a digital “0,” and if matter is present in a cell, it registers a digital “1.” We also assume that the point mass approximation applies here, so a matter object physically larger than a single cell would be represented by a point mass inside the cell that corresponds to the coordinates of its center of mass. In the initial state, we assume that this simplified and hypothetical 2D space surface contains no matter, so each cell would register a digital “0” [see Fig. 1(a)]. The maximum number of bits stored in this simplified thought example is N = 100, where N0 = 100 digital 0s and N1 = 0 digital 1s.
FIG. 1. (a) 2D diagram of discrete and empty space, where each elementary space cell can register information; (b) four static point masses are placed inside this space structure at random locations; (c) the point masses begin to move toward the center of mass; (d) all masses are joined together into a single object inside the cell that corresponds to the center-of-mass location.
Considering each elementary cell of space as an event in Shannon’s information theory framework, the cell states form a set of n = 2 independent and distinct events having a probability distribution on X, with p0 + p1 = 1.
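The quantity computed via Eq. (1) throughout this example is the standard Shannon information entropy of this two-event distribution, quoted here for reference:

$$H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i = -\left(p_0 \log_2 p_0 + p_1 \log_2 p_1\right) \ \text{bits per cell},$$

with the usual convention that $0 \log_2 0 = 0$.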
Since the probability of finding a digital “0” in an empty space is 100%, using (1), we determine the information entropy of the system in Fig. 1(a) to be H(X) = 0 bits, i.e., each cell of space contains zero bits.
We now assume that a few particles of matter are placed in this space fabric at some random locations [see Fig. 1(b)]. Each particle occupies an individual elementary cell, so the occupied cells would each register a digital “1.” Let us populate our hypothetical 2D space structure with an arbitrary number of four particles, having the following random coordinates: (x1 = 2, y1 = 2), (x2 = 3, y2 = 7), (x3 = 7, y3 = 10), and (x4 = 8, y4 = 5). For simplicity, we also assume that each particle has the same mass, m. In this case, N = N0 + N1 = 100, where N0 = 96 and N1 = 4. For this system, the probability distribution is p0 = 0.96 and p1 = 0.04, and using (1), we calculate the information entropy of the system as H(X) = 0.242 bits. We notice that the empty space had the lowest information entropy of zero bits, while placing four random matter particles in it produced an increase in the information entropy from 0 to 0.242 bits per event, or a total information content of NH(X) = 24.2 bits.
The second law of infodynamics requires the information entropy of this system to decrease toward its minimum possible value, which is reached when all of the particles occupy a single cell. Indeed, the particles will move in the direction that brings them to this location, which for the hypothetical example discussed here corresponds to the center of mass (xcm = 5, ycm = 6), as can be seen in Fig. 1(c). The particle movement will obey the least action principle, taking the shortest path in space. Hence, the final system will contain N0 = 99 and N1 = 1, giving a Shannon information entropy of H(X) = 0.081 bits [see Fig. 1(d)]. The total information content of the system is then NH(X) = 8.1 bits, reduced from 24.2 bits. For the example discussed here, consisting of a finite 2D space with four particles randomly placed in it, this is the lowest possible Shannon information entropy and, by extension, the lowest entropy of the information states. The total mass in this system is conserved, but merging the particles together into a single larger particle minimized the entropy of information, and the system will remain perpetually in this state at equilibrium.
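As a quick numerical check of the values quoted above, the short Python sketch below recomputes the per-cell Shannon entropy for the three configurations of the 10 × 10 grid (empty, four scattered particles, all particles merged into one cell). The helper grid_entropy is written for this illustration only and is not code from the paper.

```python
from math import log2

def grid_entropy(n_occupied: int, n_cells: int = 100) -> float:
    """Shannon entropy per cell (in bits) for a two-symbol grid:
    '1' marks an occupied cell, '0' an empty one."""
    p1 = n_occupied / n_cells
    p0 = 1.0 - p1
    h = 0.0
    for p in (p0, p1):
        if p > 0:  # convention: 0 * log2(0) = 0
            h -= p * log2(p)
    return h

# Empty grid, four scattered particles, and all particles merged into one cell
for n1 in (0, 4, 1):
    h = grid_entropy(n1)
    print(f"N1 = {n1}: H(X) = {h:.3f} bits, total N*H(X) = {100 * h:.1f} bits")
```

This reproduces H(X) = 0, 0.242, and 0.081 bits per cell, i.e., total information contents of 0, 24.2, and 8.1 bits, respectively.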
The requirement to reduce the entropy of information, as dictated by the second law of infodynamics, generated an attractive entropic force between the particles. One cannot help noticing that this entropic attractive force has all the hallmarks of a gravitational force. For example, taking a real case where a number of matter particles are placed in an isolated region of space at zero initial velocity, and assuming no other forces are acting on the particles, the particles will start attracting each other under the action of the gravitational force. Newton’s law of gravity beautifully describes this classical force, while Einstein described the relativistic case, making the connection between gravity and the geometry of the space-time manifold and demonstrating how matter curves space-time. However, despite their absolute success in describing classical and relativistic gravitational phenomena, neither of these theories explains why matter objects attract each other gravitationally.
In the example detailed here, our computational construct triggers the attractive force because of the rule set in the computational system, requiring the minimization of the information content and, by extension, a reduction in the computational power. To put it simply, it is far more computationally efficient to track and compute the location and momentum of a single object in space than of n objects.
Hence, it appears that the gravitational attraction is just another optimization mechanism in a computational process that plays a role in reducing the computational power and compressing information. This is nicely illustrated in Fig. 2, where cosmic dust spread over a vast volume of space would have a much higher entropy of information than a single cosmic object, such as a planet whose mass equals the entire mass of the cosmic dust but is concentrated into a densely packed solid object located in the same volume of space.
FIG. 2. Typical evolution of matter in the universe under gravitational attraction. The tendency is to merge smaller matter objects into larger cosmic objects.
The example illustrated diagrammatically in Fig. 1 is perfectly reproduced in our universe when matter objects cluster together gravitationally, forming a larger object, as shown in Fig. 2. In fact, in this example, not only are the information entropy and computational power reduced, but the physical entropy is also lower when gravitational clustering occurs. In Sec. III, we will attempt to develop a mathematical justification of the entropic force.
III. THE ENTROPIC FORCE
Let us reconsider our hypothetical 2D space structure of discrete elementary cells containing matter in the micro-canonical ensemble, having total energy E and temperature T. The information entropy of the system is then a function of the energy, E, S_inf = S_inf(E). As we already explained, invoking the second law of infodynamics requires the system to evolve in a way that minimizes its information entropy. The evolution to minimum information entropy takes place under the action of an entropic force, F_s. This force is responsible for the system’s evolution to the minimum S_inf, and it has a universal character. In other words, in the case discussed here, the entropic force appears to be the origin of the gravitational force, but in other information systems, the entropic force can manifest differently.
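For orientation, a minimal sketch of how such an entropic force is usually written, by analogy with ordinary thermodynamic entropic forces (e.g., polymer elasticity); the exact form and sign convention used in the full derivation are assumptions of this sketch:

$$F_s = T\,\frac{\partial S}{\partial x},$$

so that a spontaneous displacement satisfies $F_s\,\Delta x = T\,\Delta S \geq 0$. In the infodynamic setting, where the driving quantity is the decreasing information entropy S_inf, the force would point down the gradient of S_inf, i.e., along $-\partial S_{\mathrm{inf}}/\partial x$.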
IV. NEWTON’S LAW OF GRAVITY FROM THE SECOND LAW OF INFODYNAMICS
Having established the foundation of the entropic force, which is required to fulfill the second law of infodynamics, and having noted the similarities between this entropic force and the gravitational attraction of matter, we now attempt to establish an analytical connection between the two forces. Returning to our 2D discrete space structure containing matter, we ask the following question:
What actually moves under the entropic force to reduce the information entropy?
FIG. 3. Schematic of the entropic force acting on the object of mass m, which moves toward mass M to reduce the information entropy in space.
FIG. 4. Information entropy content per elementary cell as a function of N. For a large N, H(X) remains constant when a matter object moves over a single cell.
V. RELATION TO VERLINDE’S ENTROPIC GRAVITY STUDIES
Verlinde’s seminal work, published in 2011,8 proposed that gravity arises as an entropic force due to changes in a holographic screen storing information associated with the positions of material bodies in space. His approach relied on the holographic principle7,17,18 and the concept of entropic force, showing that Newton’s law of gravity and Einstein’s field equations emerge naturally from information-theoretic principles. Following Verlinde’s work, other studies demonstrated the entropic nature of gravity,9,11,12 including testing the theory on empirical data.19–21
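For context, a compact recap of Verlinde’s argument as it is commonly presented (a summary of Ref. 8, not the present paper’s derivation): displacing a test mass m by Δx relative to a holographic screen changes the screen entropy by ΔS = 2π k_B (mc/ħ) Δx; a spherical screen of radius R stores N = 4πR²c³/(Għ) bits, and the equipartition relation E = ½ N k_B T, with E = Mc², fixes the screen temperature; inserting these into the entropic force relation F Δx = T ΔS gives

$$F = T\,\frac{\Delta S}{\Delta x} = \frac{2Mc^{2}}{N k_B}\cdot\frac{2\pi k_B m c}{\hbar} = \frac{4\pi M m c^{3}}{N\hbar} = \frac{G M m}{R^{2}},$$

which is Newton’s law of gravitation.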
Our study arrives at a similar conclusion but through a distinct approach, and there are key differences between this study and Verlinde’s work. While Verlinde’s approach derives gravitational attraction from entropic force considerations linked to holographic screens, this study emphasizes the second law of infodynamics as the primary driver, combined with the M/E/I equivalence principle, both of which are absent in Verlinde’s formulation. In this study, the gravitational force emerges as an optimization process in which matter moves in space to reduce information entropy, a perspective deeply rooted in computational physics and information theory. This is in direct contrast to Verlinde’s work, where his entropic force points in the direction of increasing entropy. Verlinde claims that the “dynamics of information stored on each screen is given by some unknown rules, which can be thought of as a way of processing the information that is stored on it.” In this study, we show that the second law of infodynamics governs the dynamics of information, without the need for emergent space-time concepts or the arbitrary introduction of holographic screens. Despite these differences, both approaches lead to the conclusion that gravity is not a fundamental interaction but rather a macroscopic consequence of microscopic information dynamics. The results achieved in this study, therefore, complement Verlinde’s work, making this study a novel extension of the entropic gravity paradigm.
VI. CONCLUSIONS
This study presents a novel perspective on gravity as an entropic force, grounded in the second law of infodynamics and the mass–energy–information equivalence principle. By deriving Newton’s law of gravity from information-theoretic considerations, this work supports the view that gravitational attraction arises due to a fundamental drive to reduce information entropy in the universe. The results obtained here align with Verlinde’s entropic gravity framework but introduce distinct conceptual and methodological differences. This study suggests that gravity serves as a computational optimization process, where matter self-organizes to minimize the complexity of information encoding within space-time. The broader implications of this work extend to fundamental physics, including black hole thermodynamics, dark matter and dark energy considerations, and potential connections between gravity and quantum information theory. Whether the universe is indeed a computational construct remains an open question, but the entropic nature of gravity provides compelling evidence that information is a fundamental component of physical reality and that data compression drives physical processes in the universe. Future research should focus on refining this framework, exploring its applicability in relativistic and quantum gravitational contexts, and investigating possible experimental validations.
ACKNOWLEDGMENTS
M.V. acknowledges that this work was stimulated by discussions with Professor David Bacon, who introduced him to Verlinde’s entropic gravity studies, after M.V. published the M/E/I equivalence principle. M.V. acknowledges the financial support received for this research from the University of Portsmouth and the Information Physics Institute, as well as from the following generous donors and crowd funding backers on www.indiegogo.com, listed in alphabetical order: Alban Frachisse, Alexandra Lifshin, Ali Eslami, Ali Bissat, Allyssa Sampson, Ana Leao-Mouquet, Andre Brannvoll, Andrews83, Angela Pacelli, Aric R Bandy, Ariel Schwartz, Arne Michael Nielsen, Arvin Nealy, Ash Anderson, Barry Anderson, Benjamin Jakubowicz, Beth Steiner, Bruce Chmieleski, Bruce McAllister, Caleb M Fletcher, Charles Hurlocke, Chedlia Lamloumi, Chris Ballard, Cincero Rischer, Colin Williams, Colyer Dupont, Cruciferous1, Daniel Dawdy, Darya Trapeznikova, David Catuhe, Dirk Peeters, Dominik Cech, Eric Rippingale, Ethel Casey, Ezgame Workplace, Frederick H. Sullenberger III, Fuyi Zhou, George Fletcher, Gérald Gingras, Gianluca Carminati, Gordo TEK, Graeme Hewson, Graeme Kirk, Graham Wilf Taylor, Heath McStay, Heyang Han, Ian Wickramasekera, Ichiro Tai, Inspired Designs LLC, Ivaylo Aleksiev, Jamie C Liscombe, Jan Stehlak, Jason Huddleston, Jason Olmsted, Jennifer Newsom, Jerome Taurines, John Jones, John Vivenzio, John Wyrzykowski, Josh Hansen, Joshua Deaton, Josiah Kuha, Justin Alderman, Kamil Koper, Keith Baton, Keith Track, Kenneth Power, Kristopher Bagocius, Land Kingdom, Lawrence Zehnder, Lee Fletcher, Lev X, Linchuan Wang, Liviu Zurita, Loraine Haley, Manfred Weltenberg, Mark Matt Harvey-Nawaz, Matthew Champion, Matthieu Graux, Mengjie Ji, Michael Barnstijn, Michael Legary, Michael Stattmann, Michelle A Neeshan, Michiel van der Bruggen, Mohammed Alassaf, Molly R McLaren, Mubarrat Mursalin, Nick Cherbanich, Niki Robinson, Norberto Guerra Pallares, Olivier Climen, Pedro Decock, Piotr Martyka, Ray Rozeman, Raymond O’Neill, Rebecca Marie Fraijo, Robert Montani, Rocco Morelli, Shenghan Chen, Sova Novak, Steve Owen Troxel, Sylvain Laporte, Tamás Takács, Tilo Bohnert, Tomasz Sikora, Tony Koscinski, Turker Turken, Vincent Auteri, Walter Gabrielsen III, Will Strinz, William Beecham, William Corbeil, Xinyi Wang, Yanzhao Wu, Yves Permentier, Zahra Murad, Ziyan Hu.
AUTHOR DECLARATIONS
Conflict of Interest
The author has no conflicts to disclose.
Author Contributions
Melvin M. Vopson: Conceptualization (lead); Formal analysis (lead); Investigation (lead); Methodology (lead); Writing – original draft (lead); Writing – review & editing (lead).
DATA AVAILABILITY
The data that support the findings of this study are available from the corresponding author upon reasonable request.