Landauer’s principle, formulated in 1961, states that logical irreversibility implies physical irreversibility, and it demonstrated that information is physical. Here we formulate a new principle of mass-energy-information equivalence, proposing that a bit of information is not just physical, as already demonstrated, but that it has a finite and quantifiable mass while it stores information. In this framework, it is shown that the mass of a bit of information at room temperature (300 K) is 3.19 × 10⁻³⁸ kg. To test the hypothesis, we propose an experiment predicting that the mass of a data storage device would increase by a small amount when it is full of digital information, relative to its mass in the erased state. For a 1 TB device, the estimated mass change is 2.5 × 10⁻²⁵ kg.
I. INTRODUCTION AND THEORY
Shannon gave the mathematical formulation of the amount of information extracted from observing the occurrence of an event in his seminal 1948 paper [1]. Ignoring any particular features of the event, the observer or the observation method, Shannon developed his theory using an axiomatic approach in which he defined the information (I) extracted from observing an event as a function of the probability (p) of the event occurring or not, I(p). The second axiomatic property is that the information measure is a continuous positive function of the probability, I(p) ≥ 0. An event that is certain, i.e. p = 1, therefore gives no information from its occurrence, so I(1) = 0. Assuming that for n independent events of individual probabilities pi the joint probability p is the product of their individual probabilities, the information we get from observing the set of n events is the sum of the individual events’ information, I(p) = I(p1·p2·…·pn) = I(p1) + I(p2) + … + I(pn). Shannon identified that the only function satisfying these axiomatic properties is a logarithmic function and, for an event whose probability of occurring is p, the information extracted from observing the event is:

I(p) = -logb(p)    (1)
where b is an arbitrary base, which gives the units of information, i.e. for binary bits of information, b = 2. Let us assume a set of n independent and distinctive events X = {x1, x2,…,xn} having a probability distribution P = {p1, p2,…,pn} on X, so that each event xi has a probability of occurring pi = p(xi), where pi ≥ 0 and Σ pi = 1. According to Shannon [1], the average information per event, or the number of bits of information per event, that one can extract when observing the set of events X once is:

H(X) = Σi pi·I(pi) = - Σi pi·logb(pi)    (2)
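As a quick numerical illustration of relations (1) and (2), the following minimal Python sketch (not part of the original derivation; the function names are our own) evaluates the information content of a single event and the average information per event of a distribution:

import math

def info_content(p, b=2):
    """Shannon information of a single event with probability p, relation (1)."""
    return -math.log(p, b)

def shannon_entropy(probs, b=2):
    """Average information per event H(X) for a distribution, relation (2)."""
    assert abs(sum(probs) - 1.0) < 1e-12, "probabilities must sum to 1"
    return sum(p * info_content(p, b) for p in probs if p > 0)

# Unbiased binary source: 1 bit of information per event
print(shannon_entropy([0.5, 0.5]))   # -> 1.0
# A biased source carries less information per event
print(shannon_entropy([0.9, 0.1]))   # -> ~0.469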
The function H(X) is the Shannon information entropy, and it is maximum when the events xi have equal probabilities of occurring, pi = 1/n, in which case H(X) = logb(n). When observing N sets of events X, or equivalently observing N times the set of events X, the number of bits of information extracted from the observation is N·H(X). The number of possible states, also known as distinct messages in Shannon’s original formalism, is equivalent to the number of information bearing microstates, Ω, compatible with the macro-state:

Ω = b^(N·H(X))    (3)
This allows the introduction of an entropy of the information bearing states, using the Boltzmann thermodynamic entropy:

Sinfo = kb·ln(Ω) = N·H(X)·kb·ln(b)    (4)
where kb = 1.38064 × 10⁻²³ J/K is the Boltzmann constant.
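To make relations (3) and (4) concrete, the short Python sketch below (illustrative only, assuming b = 2 and an unbiased N-bit register so that H(X) = 1) evaluates the number of microstates and the corresponding information entropy for one byte:

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def microstates(N, H=1.0, b=2):
    """Number of information bearing microstates, relation (3): Omega = b**(N*H)."""
    return b ** (N * H)

def info_entropy(N, H=1.0, b=2):
    """Information entropy of the register, relation (4): S = N*H*kb*ln(b)."""
    return N * H * K_B * math.log(b)

# One byte (N = 8 unbiased bits): 256 microstates, S = 8*kb*ln(2)
print(microstates(8))    # -> 256
print(info_entropy(8))   # -> ~7.66e-23 J/K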
Let us examine the specific case of digital information, implying b = 2 and two possible distinctive events/states, so n = 2 and X = {0,1}. If we assume no biasing or external work on the system, the two events/states have equal probabilities of occurring, so that pi = 1/n = 1/2 and P = {p1, p2} = {1/2, 1/2}; using (2) it can then be shown that:

H(X) = -(1/2)·log2(1/2) - (1/2)·log2(1/2) = 1    (5)
The meaning of H(X) = 1 is that 1 bit of information is required to encode a one-letter message or, conversely, that observing the above event generates 1 bit of information. Using this result in (4), we obtain the information entropy of one bit, S = kb·ln(Ω) = kb·ln(2). A computational process creates digital information via some physical process, which obeys physical laws, including thermodynamics. Hence, there must be a direct connection between the process of creating, manipulating or erasing information and thermodynamics.

In 1961, Landauer first proposed a link between thermodynamics and information by postulating that logical irreversibility of a computational process implies physical irreversibility [2]. Since irreversible processes are dissipative, it follows that logical irreversibility is also dissipative and, by extrapolation, that information is physical [3]. An example of a logically irreversible process is the “erase” operation of a memory device. A memory device is a distinct finite array of N binary elements, which can hold information without dissipation. Let us consider an isolated physical system that works as a digital memory device consisting of an array of N bits. Using (3), we can calculate that there are 2^N possible microstates, and the initial information entropy of the system is Sinfo(initial) = N·kb·ln(2). The total entropy of the system consists of the physical entropy, Sphys, related to the non-information bearing states, and the information entropy, characteristic of the information bearing states. Performing an irreversible logical operation like “erase” brings the system into one of three equivalent erased states, as exemplified in Figure 1 for an array of 8 bits, also known as a byte. The initial byte in this example is randomly selected as 01101001, which is the binary code for the letter “i” (Fig. 1a). The erased state defined by Landauer is in fact a reset operation, with all bits in the 1 state (Fig. 1b) or the 0 state (Fig. 1d), but these are equivalent to a true “erased” state in which each bit is neither 0 nor 1, as in Fig. 1c. An example of a true erased state would be an array of bits in a magnetic data storage memory, in which the erase operation does not reset all bits to an identical magnetized state but totally demagnetizes each bit, so that neither 1 nor 0 can be identified in any of the bits. After erasure the system has only one possible information state, n = 1, so using (2) we get H(X) = 0 and Sinfo(erased) = Sinfo(final) = 0. Hence, the “erase” operation decreases the information entropy of the system, ΔSinfo = Sinfo(final) - Sinfo(initial) = -N·kb·ln(2). Since the second law of thermodynamics states that the total entropy cannot decrease over time, ΔStot = ΔSphys + ΔSinfo ≥ 0, the irreversible computation must compensate for the reduction of the information entropy of the information bearing states by increasing the entropy of the non-information bearing states via a thermal dissipation of energy, ΔQ/T = ΔSphys ≥ N·kb·ln(2). For one bit of information lost irreversibly, the entropy of the surroundings must therefore increase by at least kb·ln(2), corresponding to a minimum heat released per bit lost of ΔQ = kb·T·ln(2); this is known as Landauer’s principle [2,3]. Although Landauer’s principle has been the subject of some controversy, it is today widely accepted by the scientific community, and we refer the reader to recent experimental confirmations of the principle [4–7], as well as various theoretical arguments in its support [8].
FIG. 1. a) Byte in a random recorded memory micro-state; b) byte after an erase operation resetting all bits to the 1 state; c) byte after a true erase operation with all bits in neither the 0 nor the 1 state; d) byte after an erase operation resetting all bits to the 0 state.
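As a numerical aside (not part of the original text), the Landauer bound kb·T·ln(2) per erased bit is straightforward to evaluate; a minimal Python sketch at room temperature:

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_heat_per_bit(T):
    """Minimum heat released when one bit is erased irreversibly: kb*T*ln(2)."""
    return K_B * T * math.log(2)

# At room temperature (300 K) this is ~2.87e-21 J (~0.018 eV) per bit;
# erasing a full byte dissipates at least 8 times that amount.
print(landauer_heat_per_bit(300.0))      # -> ~2.87e-21 J
print(8 * landauer_heat_per_bit(300.0))  # -> ~2.30e-20 J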
II. LANDAUER’S EXTENDED PRINCIPLE
We established that the process of creating a bit of information requires externally applied work W ≥ kb·T·ln(2) to modify the physical system, while the process of erasing a bit of information releases at least ΔQ = kb·T·ln(2) of heat to the environment; this has already been determined and confirmed experimentally [6,7]. However, once a bit of information is created, assuming no external perturbations, it can remain stored indefinitely without any energy dissipation. In this paper a radical idea is proposed, in which the ability to hold information indefinitely without energy dissipation is explained by the fact that, once a bit of information is created, it acquires a finite mass, mbit. This is the mass equivalent of the excess energy released in the process of lowering the information entropy when a bit of information is erased. Using the mass-energy equivalence principle, the mass of a bit of information is:

mbit = kb·T·ln(2) / c²    (6)
where c is the speed of light and T is the temperature at which the bit of information is stored. Having the information content stored as a physical mass allows the information to be held indefinitely without energy dissipation. Erasing the information requires external input work, and the mass mbit is converted back into energy/heat. The implication of this rationale is that the mass-energy equivalence principle inferred from special relativity can be extrapolated to a mass-energy-information equivalence principle, as depicted in Figure 2, which essentially represents an extension of the original Landauer principle.
FIG. 2. Diagrammatic representation of the mass-energy-information equivalence.
Furthermore, the mass of a bit of information depends on the temperature at which the bit is stored. From (6), mbit = 0 at T = 0 K, so, as expected, no information can exist at absolute zero. Using relation (6) at room temperature (T = 300 K), the estimated mass of a bit is ∼3.19 × 10⁻³⁸ kg.
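For completeness, a short Python check of relation (6) — an illustrative sketch rather than part of the original text — reproduces the quoted figure:

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def bit_mass(T):
    """Mass of one stored bit from relation (6): m_bit = kb*T*ln(2)/c**2."""
    return K_B * T * math.log(2) / C**2

print(bit_mass(300.0))   # -> ~3.19e-38 kg at room temperature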
It is important to point out that external work or forces applied to any kind of digital bit of information could result in perturbations of the memory states and even self-erasure. In this article, we restrict the approach to digital memory states at equilibrium, at a given temperature. Deviations from these conditions are permitted only insofar as the system remains at equilibrium. If this condition is broken, memory self-erasure could occur and the mass of a bit would be dissipated back into heat energy, proportional to the temperature at which the perturbation occurs.
To understand this concept, let us imagine a balance used as a memory device (see Figure 3). When the balance has no left or right tilt, i.e. it is fully balanced, the device is in the erased memory state and stores no information. By convention, when it tilts to the left the device is in memory state “1”, and when it tilts to the right it is in memory state “0”. The balance will tilt only when some mechanical work is performed against it, and it will always revert to the erased state when the perturbing force is removed. In order to make this device hold a bit of information, a permanent force/work must be present. Digital information, by contrast, requires an initial input of energy to create a bit, which is then stored indefinitely without energy dissipation. The equivalent of this process in our balance memory device thought experiment is external work performed to place an object of finite mass on the left or right side of the balance. This is the “write” process of the memory. Having the mass present then allows a digital “1” or “0” state to be maintained indefinitely without energy dissipation. The memory erase process is equivalent to external work done to remove the mass from the balance. In this process the mass is converted back into heat, as described by the Landauer principle and confirmed experimentally (Figure 3).
FIG. 3. Energy cycle of digital bit creation and erasure, indicating the energies transferred in the process and the equivalent concept in terms of a mechanical balance as a memory device. The bit holds information without energy dissipation because the abstract digital bit has a finite mass.
The balance memory device thought experiment described in Figure 3 also shows the energy cycles corresponding to transitions from the erased state to a “1” or “0” information state and back from an information state to the erased state. A minimum energy input or output is required to move a bit into or out of the erased state. However, once a bit of information is created, transitions from “1” to “0” and vice versa can take place without dissipation associated with this process. This is equivalent to moving the mass from the left side of the balance to the right directly, without going through the erased state.
The mass-energy-information equivalence principle proposed here is strictly applicable only to classical digital memory states at equilibrium. Information carried by relativistic media, moving waves or photons requires a quantum relativistic information theory approach and is outside the applicability framework of this article. Similarly, other forms of information, including analogue information or information embedded in biological living systems such as DNA, are not within the scope of this work.
III. PROPOSED EXPERIMENT
In what follows, we propose a simple experiment capable of testing this theory by physically measuring the mass of digital information.
The experiment consists of an ultra-accurate mass measurement of a digital data storage device when all its memory bits are in the fully erased state. This is followed by recording digital data on all of its memory bits until the device is at full capacity, and then by another accurate mass measurement. If the proposed mass-energy-information equivalence principle is correct, the data storage device should be heavier when information is stored on it than when it is in the fully erased state. One can easily estimate the mass difference, Δm, in this experiment. Let us assume a memory device of 1 TB (10¹² bytes) storage capacity; since 1 byte = 8 bits, the total number of memory bits is 8 × 10¹². Hence the predicted mass change in this experiment is Δm = 8 × 10¹² × mbit ≈ 2.5 × 10⁻²⁵ kg. The proposed experiment is simple in terms of physical complexity, but very challenging overall, as its success depends on one’s ability to measure accurately mass changes of the order of ∼10⁻²⁵ kg. The required measurement sensitivity could be relaxed by a factor of f if the amount of data storage under test is increased from 1 TB to f × 1 TB. Since the measurement targets not the absolute mass but the mass change Δm, one option would be a sensitive interferometer similar to the Laser Interferometer Gravitational-Wave Observatory (LIGO) [9], although smaller sizes and sensitivities would probably be sufficient for the proposed measurement. Another possible option to test the proposed principle could be an ultra-sensitive Kibble balance of the kind used for defining the kilogram, such as the one developed at NPL in the UK [10], although the currently reported relative uncertainties of ∼10⁻⁹ are far from the requirements of the proposed experiment.
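The arithmetic behind this estimate is straightforward; the following Python sketch (illustrative only, with the storage capacity as a free parameter) reproduces the quoted Δm and shows how the figure scales with capacity:

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def bit_mass(T=300.0):
    """Mass of one stored bit, relation (6)."""
    return K_B * T * math.log(2) / C**2

def device_mass_change(capacity_bytes, T=300.0):
    """Predicted mass difference between a full and an erased device."""
    n_bits = 8 * capacity_bytes
    return n_bits * bit_mass(T)

print(device_mass_change(1e12))   # 1 TB -> ~2.56e-25 kg (the ~2.5e-25 kg quoted above)
print(device_mass_change(1e15))   # 1 PB -> ~2.56e-22 kg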
IV. CONCLUSION
In this letter, Landauer’s principle is extrapolated to a mass-energy-information equivalence principle by providing viable arguments that the physical nature of digital information requires a bit of information to have a very small, non-zero mass. This is a very abstract concept with some speculative aspects, but it has the virtue of being verifiable in a laboratory environment, and a possible experiment to validate the proposed idea is described in this letter. The experiment is achievable, and a successful test would offer a direct experimental confirmation of the mass-energy-information equivalence principle, with far-reaching implications in physics, cosmology, big data, computation and technologies. Within the digital Universe concept, all the baryonic matter has an associated information content [11]. The estimated mass of a bit of information at T = 2.73 K is mbit = 2.91 × 10⁻⁴⁰ kg. Assuming that all the missing dark matter is in fact information mass, initial estimates (to be reported in a different article) indicate that ∼10⁹³ bits would be sufficient to account for all the missing dark matter in the visible Universe. Remarkably, this number is reasonably close to another estimate of the information bit content of the Universe, ∼10⁸⁷, given by Gough in 2008 via a different approach [12]. In fact, one could argue that information is a distinct form of matter, or the 5th state, alongside the other four observable states of matter: solid, liquid, gas and plasma. It is expected that this work will stimulate further theoretical and experimental research, bringing the scientific community one step closer to understanding the abstract nature of matter, energy and information in the Universe.
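The conclusion's figures can be checked with the same relation (6); the short sketch below (illustrative arithmetic only, using the numbers quoted in the text) evaluates the bit mass at 2.73 K and the total mass implied by ∼10⁹³ such bits:

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def bit_mass(T):
    """Mass of one stored bit, relation (6)."""
    return K_B * T * math.log(2) / C**2

m_cmb = bit_mass(2.73)
print(m_cmb)          # -> ~2.91e-40 kg per bit at 2.73 K
print(1e93 * m_cmb)   # -> ~2.9e+53 kg, the total mass implied by ~1e93 such bits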
ACKNOWLEDGMENTS
This work was stimulated while performing research under a closely related digital data storage grant funded by EPSRC (EP/R028656/1). The author would like to acknowledge the support received from the School of Mathematics and Physics, University of Portsmouth, to undertake this research.