Using the second law of information dynamics and the mass–energy–information equivalence principle, we show that gravitational attraction manifests as a requirement to reduce the information entropy of matter objects in space. This is another example of data compression and computational optimization in our universe, which supports the possibility of a simulated or computational universe. Here, we derive Newton’s gravitational force from information dynamics and show that gravity emerges as an entropic information force governed by the second law of infodynamics. This is fully aligned with Verlinde’s entropic gravity studies published in 2011 but is demonstrated here via a different approach.

The second law of infodynamics describes the time evolution of the entropy of information states in an isolated system evolving to equilibrium. This law requires the entropy of information to decrease or remain constant over time, down to a minimum value reached when the system achieves equilibrium. This is in total contrast to the second law of thermodynamics, which requires the entropy of physical states to remain constant or increase over time. The second law of infodynamics was first introduced for digital and biological information states,1 and it was also demonstrated to be a cosmological necessity.2,3 This new law of physics was successfully tested on other systems, explaining, for example, the rules followed by electrons in populating atomic orbitals in their ground states.3 Moreover, the second law of infodynamics successfully explained a long-standing curiosity related to the abundance of symmetries in the universe, by demonstrating that high symmetry is a preferred state in nature because high symmetry corresponds to a low information entropy content.3,4 Since the second law of infodynamics appears in nature at all scales, from subatomic to cosmological, and since it originates from information considerations, as defined in the context of Shannon’s classical information theory,5 the possibility that the entire universe is informational in nature and resembles a computational process has been speculatively suggested.3,4

A computational or simulated universe would exhibit specific signatures of the computational process, with the second law of infodynamics being one such indicator. This intriguing possibility invites us to revisit the entire physical description of our universe, where the laws of physics are just manifestations of computational rules within a source code. In other words, for an observer outside of the universe, everything follows a set of coding instructions, while for an observer inside the code (i.e., in our universe), these coding instructions manifest as the mathematical and physical laws of nature. For example, Pauli’s exclusion principle states that two or more identical fermions cannot simultaneously occupy the same quantum state within a quantum system. In the case of electrons in atoms, it is impossible for two electrons in a multi-electron atom to have the same values of the four quantum numbers. Pauli’s exclusion principle, with its requirement that particles occupy distinguishable quantum states, bears an uncanny resemblance to the rules of coding and programming, specifically in the context of defining variables that must be distinguishable, ensuring order and predictability in the execution of the code. This is exactly how a set of variables in a computer code would be defined to ensure computability. Viewed from the same angle, the abundance of symmetries observed at all scales in the universe is another example of a computational optimization process, or data compression, because symmetry scales inversely with the information content or computational power; i.e., high symmetry means less computational power and a low information content. Assuming the simulated universe hypothesis, in this article, we examine another intriguing piece of possible evidence for this computational process, which manifests in our universe as gravity or the gravitational force.

To understand the motivation for this approach, we begin by emulating exactly how we perform computational simulations in routine computer programming. For example, when performing computational analysis using finite element analysis (FEA),6 a typical approach is to create a discrete mesh. Meshing breaks down the simulated object into very small cells that accurately define the object and its geometry. A governing equation can be assigned to each cell, allowing the solver to efficiently simulate the physical behavior of the entire macro-object.

Hence, our key assumption in this work is that space-time is not a continuum but is discrete, similar to a pixelation or the FEA meshing described above. This discrete view of space is supported by evidence from quantum mechanics, the discovery of Planck quanta, and subsequently the Planck scales. Essentially, space-time gives meaning to physical properties of material objects, such as position and momentum, and for this reason, space could be regarded as an information storage medium for matter properties in the universe.

In this approximation, each quantum of space is called an elementary cell, and each elementary cell would have the function of registering the properties of matter in space, including position and velocity, acting as a store of information. The role of the information stored in the elementary cells is to provide the properties and the coordinates of matter in the space-time simulated construct. This process is directly analogous to how a digital computer game, a VR application, or an advanced simulation would be designed via meshing.

It is natural to assume that the smallest physical length, Planck length (Lp ≈ 1.6 · 10−35 m), defines the size of an elementary space cell, and each Planck elementary cell stores one digital bit state in its surface area, equal to a Planck area. Assigning position coordinates to objects/matter in a 3D computed reality would require stacked 2D space surfaces to provide the detailed coordinates for the computation. This should be seen as a foliation of space in surfaces, similar to an onion structure.

Hence, we assume that information is stored in 2D space surfaces, with a maximum of one digital bit per elementary Planck area. This is fully consistent with the holographic principle and Bekenstein’s black hole entropy studies.7

Let us assume a simplified single 2D space surface, consisting of 10 × 10 elementary cells and a surface area $A \sim R^2$, where R is the lateral size of the surface or the radius if spherical coordinates are chosen (Fig. 1). The cells in this diagram are not to scale, and for simplicity, we show a flat space surface, while recognizing the curved complex geometry of our space-time. Each cell can register information in the form of binary data. For example, if a cell is empty, it registers a digital “0,” and if matter is present in a cell, it registers a digital “1.” We also assume that the point mass approximation applies here, so a matter object physically larger than a single cell would be represented by a point mass inside the cell that corresponds to the coordinates of its center of mass. In the initial state, we assume that this simplified and hypothetical 2D space surface contains no matter, so each cell registers a digital “0” [see Fig. 1(a)]. The maximum number of bits stored in this simplified thought example is N = 100, with N0 = 100 digital 0s and N1 = 0 digital 1s.

FIG. 1.

(a) 2D diagram of discrete and empty space, where each elementary space cell can register information; (b) four static point masses are placed inside this space structure at random locations; (c) the point masses begin to move toward the center of mass; (d) all masses are joined together into a single object inside the cell that corresponds to the center of the mass location.


Considering each elementary cell of space as an event in Shannon’s information theory framework, these cells form a set of n = 2 independent and distinct events, $X = \{0, 1\}$, with a probability distribution $P = \{p_0, p_1\}$ on X, where $p_0 + p_1 = 1$.

We recall Shannon’s information entropy formula for binary bits of information that gives the average information per event or the number of bits of information per event,5 
$H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i = -\left(p_0 \log_2 p_0 + p_1 \log_2 p_1\right)$.  (1)

Since the probability of finding a digital “0” in an empty space is 100%, using (1) we determine the information entropy of the system in Fig. 1(a) to be H(X) = 0 bits; i.e., each cell of space contains zero bits.

We now assume that a few particles of matter are placed in this space fabric at some random locations [see Fig. 1(b)]. Each particle occupies an individual elementary cell, so each occupied cell registers a digital “1.” Let us populate our hypothetical 2D space structure with an arbitrary number of four particles, having the following random coordinates: (x1 = 2, y1 = 2), (x2 = 3, y2 = 7), (x3 = 7, y3 = 10), and (x4 = 8, y4 = 5). For simplicity, we also assume that each particle has the same mass, m. In this case, N = N0 + N1 = 100, where N0 = 96 and N1 = 4. For this system, the probability distribution is $P = \{96/100, 4/100\}$, and using (1), we calculate the information entropy of the system as H(X) = 0.242 bits. We notice that the empty space had the lowest information entropy of zero bits, while placing four random matter particles in it produced an increase in the information entropy from 0 to 0.242 bits per event, or a total information content of NH(X) = 24.2 bits.
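This calculation is easy to reproduce. The short Python sketch below evaluates Eq. (1) for the 10 × 10 grid with four occupied cells; the helper function shannon_entropy is introduced here purely for illustration and is not part of the original analysis.

```python
import math

def shannon_entropy(counts):
    """Shannon information entropy, Eq. (1), in bits per event."""
    total = sum(counts)
    return -sum((n / total) * math.log2(n / total) for n in counts if n > 0)

N = 100       # elementary cells in the 10 x 10 space surface
N1 = 4        # cells occupied by the four point masses
N0 = N - N1   # empty cells

H = shannon_entropy([N0, N1])
print(f"H(X) = {H:.3f} bits per cell")        # ~0.242 bits
print(f"N*H(X) = {N * H:.1f} bits in total")  # ~24.2 bits
```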

We now invoke the second law of infodynamics, which requires that the system must evolve to equilibrium, in a state of lowest possible entropy of information (Sinf). The link between the entropy of the information states and Shannon’s information entropy, H(X), is given by1,3,4
$S_{inf} = N k_b \ln(2) H(X)$,  (2)
where kb = 1.380 649 × 10−23 J/K is the Boltzmann constant and N is the number of bit states in the system, which is also equal to the number of elementary space cells. In this case, the reduction in Sinf can only come from a reduction in H(X), since neither the matter particles nor the space itself can be made to vanish. Assuming again the point mass approximation, which implies that a cell can accommodate more than one particle, the system will evolve by moving the particles in space to join them together into a single larger particle inside a single cell. The final state will therefore consist of a single cell occupied by a particle of mass equal to the sum of the four original particles. This cell will be located in the 2D space at the coordinates corresponding to the center of mass of the n-particle system (four in this case study).

Indeed, the particles will move in the direction that brings them to this location, which for the hypothetical example discussed here corresponds to the center of mass (xcm = 5, ycm = 6), as can be seen in Fig. 1(c). The particle movement will obey the least action principle, taking the shortest path in space. Hence, the final system will contain N0 = 99 and N1 = 1, giving a Shannon information entropy of H(X) = 0.081 bits [see Fig. 1(d)]. The total information content of the system is then NH(X) = 8.1 bits, reduced from 24.2 bits. For the example discussed here, consisting of a finite 2D space with four particles randomly placed in it, this is the lowest possible Shannon information entropy and, by extension, the lowest entropy of the information states. The total mass in this system is conserved, but merging the particles into a single larger particle minimized the entropy of information, and the system will remain perpetually in this state at equilibrium.
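A minimal Python sketch of this before/after comparison, evaluating Eqs. (1) and (2) under the same assumptions (100 elementary cells, point mass approximation), is given below; it reproduces the reduction from 24.2 to 8.1 bits and the corresponding drop in Sinf.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 100              # number of elementary cells (bit states)

def H_binary(n_ones):
    """Shannon entropy per cell, Eq. (1), for n_ones occupied cells out of N."""
    p1, p0 = n_ones / N, 1 - n_ones / N
    return -sum(p * math.log2(p) for p in (p0, p1) if p > 0)

for label, n_ones in (("four separate particles", 4), ("single merged particle", 1)):
    H = H_binary(n_ones)
    S_inf = N * k_B * math.log(2) * H   # entropy of information states, Eq. (2)
    print(f"{label}: H(X) = {H:.3f} bits, N*H(X) = {N * H:.1f} bits, "
          f"S_inf = {S_inf:.2e} J/K")
```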

The requirement to reduce the entropy of information, as dictated by the second law of infodynamics, generated an attractive entropic force between the particles. One cannot help noticing that this entropic attractive force has all the hallmarks of a gravitational force. For example, taking a real case where a number of matter particles are placed in an isolated region of space at zero initial velocity, and assuming no other forces act on the particles, the particles will start attracting each other under the action of the gravitational force. Newton’s law of gravity beautifully describes this classical force, while Einstein described the relativistic case and made the connection between gravity and the geometry of the space-time manifold, demonstrating how matter curves space-time. However, despite their absolute success in describing classical and relativistic gravitational phenomena, neither of these theories explains why matter objects attract each other gravitationally.

In the example detailed here, our computational construct triggers the attractive force because of the rule set in the computational system, requiring the minimization of the information content and, by extension, a reduction in the computational power. To put it simply, it is far more computationally effective to track and compute the location and momentum of a single object in space than n objects.

Hence, it appears that gravitational attraction is just another optimization mechanism in a computational process, playing a role in reducing the computational power and compressing information. This is nicely illustrated in Fig. 2, where cosmic dust spread over a vast volume of space would have a much higher entropy of information than a single cosmic object, such as a planet of mass equal to the entire mass of the cosmic dust but concentrated into a densely packed solid object located in the same volume of space.

FIG. 2.

Typical evolution of matter in the universe under gravitational attraction. The tendency is to merge smaller matter objects into larger cosmic objects.


The example illustrated diagrammatically in Fig. 1 is perfectly reproduced in our universe when matter objects cluster together gravitationally forming a larger object, as shown in Fig. 2. In fact, in this example, not only the information entropy and computational power are reduced, but also the physical entropy is lower when gravitational clustering occurs. In Sec. III, we will attempt to develop a mathematical justification of the entropic force.

Let us reconsider our hypothetical 2D space structure of discrete elementary cells containing matter in the micro-canonical ensemble, having total energy E and temperature T. The information entropy of the system is then a function of the energy, E, Sinf = Sinf(E). As we already explained, invoking the second law of infodynamics requires the system to evolve in a way that minimizes its information entropy. The evolution to minimum information entropy takes place under the action of an entropic force, Fs. This force is responsible for the system’s evolution to the minimum Sinf, and it has a universal character. In other words, in the case discussed here, the entropic force appears to be the origin of the gravitational force, but in other information systems, the entropic force can manifest differently.

Returning to our simplified system under discussion, since it is isolated, the work done by the entropic force within the system on its own components must come from the total energy within the system itself. Hence, the work done by the entropic force on the n particles changes the total energy of the system as
$E \rightarrow E - \sum_{i=1}^{n} F_s \cdot r_i$,  (3)
where $F_s \cdot r_i$ is the work done by the entropic force on particle i over a distance $r_i$. Considering the single-particle case, we can write $E \rightarrow E - F_s \cdot r$, so $S_{inf}(E) \rightarrow S_{inf}(E - F_s \cdot r)$. Applying a Taylor expansion of $S_{inf}$ around E, we obtain
$S_{inf}(E - F_s \cdot r) \approx S_{inf}(E) - F_s \cdot r \, \frac{\partial S_{inf}(E)}{\partial E}$.  (4)
We now impose the minimum information entropy condition, $\frac{dS_{inf}}{dr} = 0$, which after algebraic manipulation results in
$F_s \frac{\partial S_{inf}}{\partial E} = \frac{dS_{inf}}{dr}$.  (5)
We now recall that $\frac{\partial S_{inf}}{\partial E} = \frac{1}{T}$, so relation (5) becomes
$F_s = T \frac{dS_{inf}}{dr}$.  (6)
Relation (6) describes the entropic force acting on a system to reduce its information entropy, as dictated by the second law of infodynamics. This relation is also identical to the relation derived by Verlinde in his entropic gravity article.8 Indeed, other studies proposed the existence of an entropic force in various physical systems,9–13 including magnetic systems.14 

Having established the foundation of the entropic force, which is required to fulfill the second law of infodynamics, and having noted the similarities between this entropic force and the gravitational attraction of matter, we now attempt to establish an analytical connection between the two forces. Returning to our 2D discrete space structure containing matter, we ask the following question:

What actually moves under the entropic force to reduce the information entropy?

The answer is that matter moves in space to minimize its own imprint of information in the fabric of space. In our own example, four particles of mass m each will move to join together into a larger particle of mass 4m. Hence, the entropic force on a single particle will be
$F_s = m a$,  (7)
where m is the mass of the particle and a is its acceleration under the action of the entropic force. We now write for convenience the entropic force in terms of the entropy and position change as
$F_s = T \frac{\Delta S_{inf}}{\Delta r}$.  (8)
We consider that a single particle m moves in space toward a larger mass M, under this entropic force (Fig. 3), in order to reduce the overall information entropy as dictated by the second law of infodynamics.
FIG. 3.

Schematic of the entropic force acting on the object of mass m, which moves toward mass M to reduce the information entropy in space.

Taking the numerator as the entropy change due to the movement of the particle of mass m over an infinitesimal distance Δr (i.e., a Planck length or a single elementary cell), Δr can be taken equal to one reduced Compton wavelength,7 which equals the Planck length when the particle has a mass close to the Planck mass (i.e., on the order of ∼10−8 kg),
$\Delta r = \frac{\hbar}{m c}$,  (9)
where c is the speed of light. Note that the use of the reduced Compton wavelength (based on ħ = h/2π) is a common representation of mass at the quantum scale when it pertains to inertial mass. Again, this relation has been used previously in an identical form by Bekenstein7 and Verlinde.8 Using relation (2), the change in the entropy of information due to the position change of the mass m over a single elementary cell is
$\Delta S_{inf} = N k_b \ln(2) H_N(X) - (N-1) k_b \ln(2) H_{N-1}(X)$,  (10)
The particle’s movement is tracked by the change in the function H. If the particle’s movement produces no change in the information entropy, then $H_N(X) = H_{N-1}(X) = H(X)$, so (10) becomes
$\Delta S_{inf} = k_b \ln(2) H(X)$.  (11)
In order to demonstrate the validity of this approximation, we computed $H_N(X)$ and $H_{N-1}(X)$ for two matter objects in space, for a range of N up to 10 000 elementary cells. Figure 4 shows that, for large N, the two functions can be safely approximated as equal.
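A minimal numerical version of this check is sketched below; it assumes that the two-object configuration corresponds to two occupied cells out of N and compares $H_N(X)$ with $H_{N-1}(X)$ as N grows. The relative difference shrinks rapidly with N, consistent with treating the two functions as equal for large N.

```python
import math

def H_per_cell(n_occupied, n_cells):
    """Shannon entropy per cell, Eq. (1), for n_occupied '1' bits among n_cells."""
    p1 = n_occupied / n_cells
    p0 = 1.0 - p1
    return -sum(p * math.log2(p) for p in (p0, p1) if p > 0)

# Two matter objects; compare H evaluated over N and N - 1 elementary cells.
for N in (100, 1_000, 10_000):
    H_N, H_Nm1 = H_per_cell(2, N), H_per_cell(2, N - 1)
    print(f"N = {N:6d}: H_N = {H_N:.6f}, H_(N-1) = {H_Nm1:.6f}, "
          f"relative difference = {abs(H_N - H_Nm1) / H_N:.1e}")
```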
FIG. 4.

Information entropy content per elementary cell as a function of N. For a large N, H(X) remains constant when a matter object moves over a single cell.

We now use the mass–energy–information (M/E/I) equivalence principle, proposed in 2019,15 and we equate the entire mass of an object M to an information mass. Since the mass of a bit of information is known,15,16 the NH(X) bits of information have an equivalent mass of
$M = N H(X) \frac{k_b T \ln(2)}{c^2}$.  (12)
Extracting the expression of T from relation (12) and combining (8), (9), and (11), we obtain the following expression for the entropic force:
$F_s = T \frac{\Delta S_{inf}}{\Delta r} = \frac{M c^2}{N H(X) k_b \ln(2)} \cdot \frac{k_b \ln(2) H(X)\, m c}{\hbar} = \frac{M m c^3}{N \hbar}$.  (13)
We now recall that the Planck length Lp is given by
$L_p = \sqrt{\frac{\hbar G}{c^3}}$,  (14)
where G is the gravitational constant. Since the number of information states equals the number of elementary cells, $N \approx R^2 / L_p^2$, combining (13) and (14), we recover an expression of the entropic force that is identical to Newton’s law of gravity,
$F_s = \frac{M m c^3}{N \hbar} = \frac{M m c^3}{\hbar} \cdot \frac{L_p^2}{R^2} = \frac{G M m}{R^2}$.  (15)
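As a consistency check on the algebra, the chain from Eqs. (8), (9), (11), (12), and (14) to Eq. (15) can be reproduced symbolically. The sketch below uses the sympy library; the symbol names are ours, and the script is offered only as an illustrative verification of the derivation, not as part of the original analysis. Running it prints G*M*m/R**2, i.e., the Newtonian form of Eq. (15).

```python
import sympy as sp

# Positive symbols: hbar, gravitational constant G, speed of light c, masses M and m,
# lateral size R, Shannon entropy per cell H, Boltzmann constant k_B, temperature T, N cells.
hbar, G, c, M, m, R, H, k_B, T, N = sp.symbols("hbar G c M m R H k_B T N", positive=True)
ln2 = sp.log(2)

dr = hbar / (m * c)                                  # Eq. (9): one reduced Compton wavelength
dS = k_B * ln2 * H                                   # Eq. (11): information-entropy change
T_expr = sp.solve(sp.Eq(M, N * H * k_B * T * ln2 / c**2), T)[0]  # Eq. (12) solved for T

F_s = T_expr * dS / dr                               # Eqs. (8) and (13): M*m*c**3/(N*hbar)
F_newton = sp.simplify(F_s.subs(N, R**2 * c**3 / (hbar * G)))    # Eq. (14): N ~ R^2/L_p^2
print(F_newton)                                      # -> G*M*m/R**2, i.e., Eq. (15)
```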

Verlinde’s seminal work, published in 2011,8 proposed that gravity arises as an entropic force due to changes in a holographic screen storing information associated with the positions of material bodies in space. His approach relied on the holographic principle7,17,18 and the concept of entropic force, showing that Newton’s law of gravity and Einstein’s field equations emerge naturally from information-theoretic principles. Following Verlinde’s work, other studies demonstrated the entropic nature of gravity,9,11,12 including testing the theory on empirical data.19–21

Our study arrives at a similar conclusion but through a distinct approach, and there are key differences between this study and Verlinde’s work. While Verlinde’s approach derives gravitational attraction from entropic force considerations linked to holographic screens, this study emphasizes the second law of infodynamics as the primary driver, combined with the M/E/I equivalence principle, both of which are absent from Verlinde’s formulation. In this study, the gravitational force emerges as an optimization process where matter moves in space to reduce information entropy, a perspective deeply rooted in computational physics and information theory. This is in direct contrast to Verlinde’s work, where the entropic force points in the direction of increasing entropy. Verlinde claims that the “dynamics of information stored on each screen is given by some unknown rules, which can be thought of as a way of processing the information that is stored on it.” In this study, we show that the second law of infodynamics governs the dynamics of information, without the need for emergent space-time concepts or the arbitrary introduction of holographic screens. Despite these differences, both approaches lead to the conclusion that gravity is not a fundamental interaction but rather a macroscopic consequence of microscopic information dynamics. The results achieved in this study, therefore, complement Verlinde’s work, making this study a novel extension of the entropic gravity paradigm.

This study presents a novel perspective on gravity as an entropic force, grounded in the second law of infodynamics and the mass–energy–information equivalence principle. By deriving Newton’s law of gravity from information-theoretic considerations, this work supports the view that gravitational attraction arises from a fundamental drive to reduce information entropy in the universe. The results obtained here align with Verlinde’s entropic gravity framework but introduce distinct conceptual and methodological differences. This study suggests that gravity serves as a computational optimization process, where matter self-organizes to minimize the complexity of information encoding within space-time. The broader implications of this work extend to fundamental physics, including black hole thermodynamics, dark matter and dark energy considerations, and potential connections between gravity and quantum information theory. Whether the universe is indeed a computational construct remains an open question, but the entropic nature of gravity provides compelling evidence that information is a fundamental component of physical reality and that data compression drives physical processes in the universe. Future research should focus on refining this framework, exploring its applicability in relativistic and quantum gravitational contexts, and investigating possible experimental validations.

M.V. acknowledges that this work was stimulated by discussions with Professor David Bacon, who introduced him to Verlinde’s entropic gravity studies, after M.V. published the M/E/I equivalence principle. M.V. acknowledges the financial support received for this research from the University of Portsmouth and the Information Physics Institute, as well as from the following generous donors and crowd funding backers on www.indiegogo.com, listed in alphabetical order: Alban Frachisse, Alexandra Lifshin, Ali Eslami, Ali Bissat, Allyssa Sampson, Ana Leao-Mouquet, Andre Brannvoll, Andrews83, Angela Pacelli, Aric R Bandy, Ariel Schwartz, Arne Michael Nielsen, Arvin Nealy, Ash Anderson, Barry Anderson, Benjamin Jakubowicz, Beth Steiner, Bruce Chmieleski, Bruce McAllister, Caleb M Fletcher, Charles Hurlocke, Chedlia Lamloumi, Chris Ballard, Cincero Rischer, Colin Williams, Colyer Dupont, Cruciferous1, Daniel Dawdy, Darya Trapeznikova, David Catuhe, Dirk Peeters, Dominik Cech, Eric Rippingale, Ethel Casey, Ezgame Workplace, Frederick H. Sullenberger III, Fuyi Zhou, George Fletcher, Gérald Gingras, Gianluca Carminati, Gordo TEK, Graeme Hewson, Graeme Kirk, Graham Wilf Taylor, Heath McStay, Heyang Han, Ian Wickramasekera, Ichiro Tai, Inspired Designs LLC, Ivaylo Aleksiev, Jamie C Liscombe, Jan Stehlak, Jason Huddleston, Jason Olmsted, Jennifer Newsom, Jerome Taurines, John Jones, John Vivenzio, John Wyrzykowski, Josh Hansen, Joshua Deaton, Josiah Kuha, Justin Alderman, Kamil Koper, Keith Baton, Keith Track, Kenneth Power, Kristopher Bagocius, Land Kingdom, Lawrence Zehnder, Lee Fletcher, Lev X, Linchuan Wang, Liviu Zurita, Loraine Haley, Manfred Weltenberg, Mark Matt Harvey-Nawaz, Matthew Champion, Matthieu Graux, Mengjie Ji, Michael Barnstijn, Michael Legary, Michael Stattmann, Michelle A Neeshan, Michiel van der Bruggen, Mohammed Alassaf, Molly R McLaren, Mubarrat Mursalin, Nick Cherbanich, Niki Robinson, Norberto Guerra Pallares, Olivier Climen, Pedro Decock, Piotr Martyka, Ray Rozeman, Raymond O’Neill, Rebecca Marie Fraijo, Robert Montani, Rocco Morelli, Shenghan Chen, Sova Novak, Steve Owen Troxel, Sylvain Laporte, Tamás Takács, Tilo Bohnert, Tomasz Sikora, Tony Koscinski, Turker Turken, Vincent Auteri, Walter Gabrielsen III, Will Strinz, William Beecham, William Corbeil, Xinyi Wang, Yanzhao Wu, Yves Permentier, Zahra Murad, Ziyan Hu.

The author has no conflicts to disclose.

Melvin M. Vopson: Conceptualization (lead); Formal analysis (lead); Investigation (lead); Methodology (lead); Writing – original draft (lead); Writing – review & editing (lead).

The data that support the findings of this study are available from the corresponding author upon reasonable request.

1. M. M. Vopson and S. Lepadatu, “Second law of information dynamics,” AIP Adv. 12, 075310 (2022).
2. M. M. Vopson, “On the second law of infodynamics from cosmological thermodynamics,” IPI Lett. 3(1), N6–N9 (2025).
3. M. M. Vopson, “The second law of infodynamics and its implications for the simulated universe hypothesis,” AIP Adv. 13(10), 105308 (2023).
4. M. M. Vopson, Reality Reloaded: The Scientific Case for a Simulated Universe (IPI Publishing, 2023), ISBN: 978-1-80517-057-0.
5. C. E. Shannon, “A mathematical theory of communication,” Bell Syst. Tech. J. 27, 623–656 (1948).
6. T. Belytschko, R. Gracie, and G. Ventura, “A review of extended/generalized finite element methods for material modeling,” Modell. Simul. Mater. Sci. Eng. 17, 043001 (2009).
7. J. D. Bekenstein, “Black holes and entropy,” Phys. Rev. D 7, 2333 (1973).
8. E. P. Verlinde, “On the origin of gravity and the laws of Newton,” J. High Energy Phys. 04, 29 (2011).
9. M. Visser, “Conservative entropic forces,” J. High Energy Phys. 2011, 140 (2011).
10. N. Roos, “Entropic forces in Brownian motion,” Am. J. Phys. 82, 1161–1166 (2014).
11. J. W. Lee, “On the origin of entropic gravity and inertia,” Found. Phys. 42, 1153–1164 (2012).
12. E. Bormashenko, “Jeans instability, Jeans entropy, and the entropy origin of gravity,” World J. Phys. 1(2), 79–86 (2023).
13. E. Bormashenko, “What is temperature? Modern outlook on the concept of temperature,” Entropy 22(12), 1366 (2020).
14. E. Bormashenko, “Magnetic entropic forces emerging in the system of elementary magnets exposed to the magnetic field,” Entropy 24, 299 (2022).
15. M. M. Vopson, “The mass–energy–information equivalence principle,” AIP Adv. 9(9), 095206 (2019).
16. M. M. Vopson, “Experimental protocol for testing the mass–energy–information equivalence principle,” AIP Adv. 12(3), 035311 (2022).
17. L. Susskind, “The world as a hologram,” J. Math. Phys. 36, 6377 (1995).
18. J. D. Bekenstein, “Universal upper bound on the entropy-to-energy ratio for bounded systems,” Phys. Rev. D 23, 287 (1981).
19. M. M. Brouwer et al., “First test of Verlinde’s theory of emergent gravity using weak gravitational lensing measurements,” Mon. Not. R. Astron. Soc. 466(3), 2547–2559 (2017).
20. A. Tamosiunas, D. Bacon, K. Koyama, and R. C. Nichol, “Testing emergent gravity on galaxy cluster scales,” J. Cosmol. Astropart. Phys. 5, 053 (2019).
21. V. Halenka and C. J. Miller, “Testing emergent gravity with mass densities of galaxy clusters,” Phys. Rev. D 102, 084007 (2020).