To study fundamental physics is to try to understand the laws that govern the behavior of mass and energy at the deepest and most foundational level. And over the past century or so, scientists have made tremendous progress, from being able to model the quarks and leptons that are the smallest and most basic building blocks of matter discovered so far to describing the birth, life, and death of the universe itself.

However, despite the spectacular successes of which the physics community can be rightfully proud, there are a few situations where researchers attempt to blend two well-validated theories and the result disagrees with data. Such a situation could represent a disaster for one or both of the theories, or the discrepancy could merely demonstrate that at least one of the two ideas is incomplete. One very striking example is when the idea of antimatter confronts the theory of the Big Bang. When combined, they predict a very different universe than the one we observe. This unsettling tension sets the stage for one of the most pressing research programs in contemporary physics.

The first few decades of the 20th century brought us the familiar triumphs of modern physics: first Einstein’s theory of special relativity,1 followed by quantum mechanics, culminating in the Schrödinger equation.2 Unfortunately, the Schrödinger equation does not incorporate relativity, which is a significant flaw. The situation was resolved in 1928, when British physicist Paul Dirac (Fig. 1) derived what is called the Dirac equation.3 This equation is a relativistic wave equation, meaning a description of quantum mechanics that is fully consistent with special relativity.

Fig. 1.

Paul Dirac (left) and Georges LeMaître (right) used Albert Einstein’s (center) equations to solve different physical problems. However, when the two men’s solutions were combined, the result was a prediction that is wildly at odds with measurement.


Like Einstein’s earlier work, the Dirac equation described the equivalence of mass and energy, but it made an additional surprising prediction. It predicted the existence of what we now call antimatter.4 Antimatter is the opposite of matter.5 When matter and antimatter are combined, they annihilate into a substantial amount of energy. Using Einstein’s famous equation E = mc², we can see that if we combine a gram of matter and a gram of antimatter, the resulting energy is 1.8 × 10¹⁴ J, or roughly equivalent to the combined energy release of the nuclear weapons that devastated Hiroshima and Nagasaki (1.7 × 10¹⁴ J).6

Conversely, when energy is converted into matter, the process occurs with the simultaneous creation of an equal amount of antimatter. For example, a photon (energy) will convert into an electron (matter) and a positron (antimatter). (Energy conservation dictates that this process will only proceed if the photon has at least 1.022 MeV of energy, which is twice the rest energy of the electron.) The positron has all of the properties of an electron, except with the opposite electrical charge. All known subatomic matter particles have an antimatter counterpart. The fact that energy converts into equal amounts of matter and antimatter is the first piece of the puzzle.
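
The two figures quoted above can be checked in a few lines of Python (a sketch using standard values of the constants):

```python
# A quick check of the annihilation figures quoted above, using E = mc^2.
c = 2.998e8   # speed of light, m/s

# One gram of matter annihilating with one gram of antimatter:
m = 0.002                      # total mass converted to energy, kg
E = m * c**2                   # joules
print(f"E = {E:.2e} J")        # ~1.8e14 J

# Threshold for a photon to produce an electron/positron pair:
electron_rest_energy = 0.511   # MeV
print(f"threshold = {2 * electron_rest_energy:.3f} MeV")  # 1.022 MeV
```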

In 1915, Einstein published a paper that presented his general theory of relativity.7 General relativity is a theory of gravity,8 which can be applied to a system of arbitrary size. When the equations are applied to the universe as a whole, they can describe the evolution of the matter and energy of the universe.

In 1927, Belgian astronomer Georges LeMaître applied these ideas to the universe and asked what would happen if he projected the behavior of the universe backward in time. He concluded that the universe was once much smaller and hotter.9 This idea has come to be called the Big Bang10 and it is now accepted by the scientific community as a good representation of the origins of the universe (Fig. 2).

Fig. 2.

When astronomers look into deep space, all of the stars and galaxies they see are made of matter, which is in stark disagreement with the simplest theoretical predictions. Credit: ESA/Webb, NASA and CSA, A. Martel.


While both the existence and behavior of antimatter and the theory of the Big Bang are considered to be scientific fact, when one combines the two ideas, a substantial problem arises. The Big Bang says that the universe was once smaller, hotter, and full of energy. The Dirac equation says that energy converts into matter and antimatter in equal quantities. As the universe expanded and cooled, we’d expect to see energy transitioning to matter and antimatter. Yet when we turn our telescopes to the sky, peering for billions of light-years in every direction, we see only matter. And this leads us to one very vexing question:

So, where is the antimatter?

The fact that the universe seems to be largely devoid of antimatter is one of the biggest unsolved problems of modern physics. Let’s see what the combination of the existence of antimatter and the Big Bang predicts.

When the universe was small and hot, the energy density was very high. Matter and antimatter could be made in wanton profusion. Quark/antiquark pairs could be made, as long as the local energy was at least double the quark’s rest energy. The quarks and antiquarks could then annihilate, returning to energy form. If the volume of the universe were static, this would be a steady-state condition. However, the universe wasn’t static; it was increasing in size.

As the universe expanded, the density of energy dropped, simply because of the increase in volume. In addition, the expanding universe stretched the wavelength of light. Since the energy of a photon is inversely proportional to its wavelength, this stretching further reduced the local energy.
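
The second effect follows directly from E = hc/λ: stretching a photon’s wavelength by some factor divides its energy by the same factor. A small illustrative sketch (the starting wavelength is an arbitrary choice):

```python
# Photon energy vs. wavelength stretch: E = h c / lambda, so stretching
# the wavelength by a factor of N divides the energy by N.
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s

lam0 = 1.0e-6   # arbitrary starting wavelength, m
for stretch in (1, 10, 100):
    E = h * c / (lam0 * stretch)
    print(f"stretch x{stretch:>3}: E = {E:.3e} J")
```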

In the initial high-energy conditions of the universe, all quarks and leptons could be made essentially equally. However, as the universe cooled, the local energy density was no longer sufficient to make the heaviest quarks. The production of top quarks (mass ∼172 GeV) stopped first. Top quark/antiquark pairs could still annihilate, but the stretching of space reduced the energy of the resultant photon or gluon (another energy particle) below the threshold needed to produce new pairs, so top quark/antiquark production ceased. The process continued, with the production of matter/antimatter pairs of each species of quark turning off in order of decreasing mass, followed by the leptons.
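
The order in which species “turn off” can be estimated by converting each rest energy to a temperature via kT ≈ mc². The following order-of-magnitude sketch ignores O(1) factors; the bottom-quark mass (~4.2 GeV) is added here purely for illustration:

```python
# Order-of-magnitude temperatures at which pair production of each
# species shuts off: production roughly stops once kT drops below the
# particle's rest energy. O(1) factors are ignored.
k_eV = 8.617e-5   # Boltzmann constant, eV/K

rest_energy_eV = {
    "top quark": 172e9,     # ~172 GeV
    "bottom quark": 4.2e9,  # ~4.2 GeV (illustrative addition)
    "electron": 0.511e6,    # ~0.511 MeV
}

for name, E in rest_energy_eV.items():
    T = E / k_eV   # temperature at which kT ~ rest energy
    print(f"{name:>12}: T ~ {T:.1e} K")
```

The hierarchy spans six orders of magnitude in temperature, which is why the heaviest quarks drop out first as the universe cools.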

The lightest of the highly interacting (i.e., electrically charged) leptons is the electron (mass ∼0.511 MeV). An electron and a positron can annihilate into photons carrying a total energy of 1.022 MeV.

When the expansion of the universe dropped the photon energy below 1.022 MeV, electron/positron production ceased, nominally locking in the universe as being a bath of photons, electrons, and positrons. However, chance encounters between electrons and positrons allowed ongoing annihilation.

Since the universe was expanding, the electron/positron density continued to drop, making chance encounters less and less likely. Over time, the density of electrons and positrons would drop enough that further interactions would become rare and annihilation would stop. The final universe would be a bath of photons, interspersed with a diffuse gas of electrons and positrons.

The universe would also contain neutrinos, the production of which is beyond the scope of this article, as well as a few protons and antimatter protons created by chance encounters of triplets of quarks or antiquarks.

This simple scenario should be contrasted with the universe in which we live (Fig. 2). Protons and electrons exist in equal quantities, with a smattering of neutrons found in the nuclei of heavier elements. Essentially, the universe that should arise from combining Dirac’s and LeMaître’s theories doesn’t look much like the cosmos we inhabit.

If the “Case of the Missing Antimatter” is a pressing mystery, what is the solution? There is one commonly suggested answer that seems reasonable, but can be easily disproven. Basically, this answer denies that there is any problem at all.

The proposal hinges on the fact that antimatter obeys the same laws of nature as does matter. Indeed, for the most part, if we replaced every instance of matter in our universe with its antimatter equivalent, we’d be hard pressed to know the difference. Hydrogen, with its proton and electron, would be replaced by antihydrogen, with an antiproton and positron. However, the light emission spectra of hydrogen and antihydrogen are the same. Similarly, antimatter fusion proceeds in the same manner as matter fusion. So, the thinking goes, how do we know that there aren’t antimatter stars out there? Or antimatter galaxies? Maybe the answer to the matter/antimatter asymmetry is that matter and antimatter have simply been separated.

Luckily, we can definitively demonstrate that this is not the case. For example, we know that the moon is made of matter, because astronauts have landed on it and didn’t explode in a nuclear catastrophe. Many of the planets in the solar system have been visited by probes from Earth, so we can conclude that the solar system is made of matter.

Other stars are a different thing. However, we know that stars give off a constant “wind” of protons and electrons. This “wind” flies into interstellar space. We also know from radio measurements that interstellar space is full of hydrogen (or antihydrogen) gas.

While the details of interactions between protons and antimatter protons can be complex, with electrons and positrons the situation is simple. At slow speeds (which includes the solar wind), when an electron annihilates with a positron, the result is the emission of two gamma rays, each with an energy of 0.511 MeV.

When we look at where the solar wind from the Sun intersects with the hydrogen gas of interstellar space, we do not detect a bath of 0.511-MeV gamma rays. Accordingly, we can conclude that the interstellar gas is hydrogen, not antihydrogen. Similarly, we can look at other stars, and we also don’t see those telltale gamma rays, so we can conclude that other stars in the Milky Way are made of matter. From this, we can conclude that the Milky Way is a matter galaxy (Fig. 3).

Fig. 3.

When the gas cloud surrounding a matter galaxy (left) comes into contact with the gas cloud surrounding an antimatter galaxy (right), at the point of contact, an enormous amount of gamma rays of a very specific energy will be emitted. No such emission of gamma rays has been observed.


We can repeat this process with the Milky Way and intergalactic space, which is also filled with hydrogen (or antihydrogen) gas. Since the telltale gamma rays have not been observed, using this bootstrap procedure, we can conclude that the galaxies of the visible universe are made of matter.

The visible universe is not the entire universe. Perhaps the antimatter in the universe exists in large pockets beyond the visible universe. While this conjecture is possible, it is difficult to understand how matter and antimatter could be isolated on such cosmic scales. It is the consensus of the scientific community that the universe is made of matter.

If the antimatter thought to have been created when the universe began disappeared, did all of it disappear (Fig. 4)? Or just most of it? If most of it, what was the mechanism that caused the disappearance, and can one estimate the size of the effect?

Fig. 4.

From looking at the remnant light emitted in the Big Bang, scientists have been able to determine that very early in the history of the cosmos, matter outnumbered antimatter by a very tiny excess.


Using the broad paradigm discussed above, whereby in an expanding and cooling universe the matter and antimatter become a photon bath, one can use the number of photons to estimate the number of times matter and antimatter annihilated. To get a sense of the degree to which matter outnumbered antimatter, one can count protons in the universe. (Protons make up approximately 75% of the mass of the ordinary matter in the universe, and electrons add negligible mass.) When one estimates these numbers, done simply in Ref. 11 and more carefully in Ref. 12, one finds that the proton-to-photon ratio is η = ρp/ργ ≈ 6.1 × 10⁻¹⁰. Thus, there are approximately 1.6 billion photons for every proton.
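
The back-of-the-envelope estimate of Ref. 11 can be reproduced numerically; the sketch below uses the same inputs (a 2.7 K photon bath and a baryonic mass density of roughly 5% of the critical density):

```python
# Reproducing the back-of-the-envelope proton-to-photon estimate of
# Ref. 11. Inputs are the CMB temperature and the baryonic mass density.
sigma = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
c = 2.998e8        # speed of light, m/s
k = 1.38e-23       # Boltzmann constant, J/K
m_p = 1.67e-27     # proton mass, kg
T = 2.7            # temperature of the cosmic photon bath, K

rho_E = 4 * sigma / c * T**4   # blackbody energy density, J/m^3
E_gamma = 2.7 * k * T          # mean photon energy, J
n_gamma = rho_E / E_gamma      # photons per m^3 (~400 per cm^3)

rho_m = 4.2e-28                # baryonic mass density, kg/m^3 (~5% critical)
n_p = rho_m / m_p              # protons per m^3

eta = n_p / n_gamma            # proton-to-photon ratio, ~6e-10
print(f"n_gamma ~ {n_gamma * 1e-6:.0f} /cm^3, eta ~ {eta:.1e}")
```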

Given that the proton-to-photon ratio is approximately conserved, this implies that early in the history of the universe a very small imbalance arose between matter and antimatter. Very approximately, for every 1,600,000,000 antimatter particles, there were 1,600,000,001 matter particles. The 1.6 billion pairs annihilated, and the one remaining matter particle went on to form the matter of the visible universe (Fig. 4).

It should be emphasized that the numbers discussed here are ratios. In addition, different calculations, using slightly different modeling and calculational approaches, give somewhat different quantitative estimates. However, irrespective of these minor differences, it is believed that early in the history of the universe, an unknown physical process came into play that favored matter over antimatter by of order one part in a billion.

The mechanism that caused the imbalance remains unknown, but the name for it is baryogenesis, literally the creation of baryons. [The same mechanism also would have created the matter leptons, but since the common baryons (protons and neutrons) outweigh the common leptons (electrons) by a factor of approximately 2000, it is the baryons that dominate the ordinary matter of the universe.]

While the baryogenic process remains a mystery, in 1967 Russian physicist and peace activist Andrei Sakharov (Fig. 5) enumerated the three key features of baryogenesis,13 which any theory must contain. They are as follows:

  1. Baryon number violation

  2. C and CP symmetry violation

  3. An interaction that isn’t in thermal equilibrium

Fig. 5.

Andrei Sakharov determined what properties a theory explaining the matter/antimatter asymmetry must have.


The first condition requires a process that changes the number of baryons. A hypothetical example would be the decay of a proton (a positively charged baryon) into a positron (a positively charged non-baryon). The second requires interactions that treat matter and antimatter differently. The third requires a process and its reverse that would ordinarily occur at equal rates but, because the system is out of thermal equilibrium, do not.

Baryon number violation has not been observed, nor is it a significant process in the standard model of particle physics. However, there have been other proposed theories that allow baryon number violation, e.g., proton decay in grand unified theories or in some supersymmetric theories.14 

Charge (C) symmetry is a property whereby one can interchange matter and antimatter in the equations without any change in the predictions. If C-violation is observed, it implies that matter and antimatter are different. Parity (P) symmetry is the property whereby one can interchange directions (e.g., left ↔ right, up ↔ down, forward ↔ backward) without changing predictions. If P-violation is observed, it is possible to determine an orientation in the universe. CP symmetry is when both operations can be simultaneously done without changing predictions.

While C, P, and CP symmetry all hold in electromagnetic and strong nuclear force interactions, in 1957 researchers demonstrated15 that the weak nuclear force respects neither C nor P symmetry. Indeed, it was found that the weak force interacts only with matter particles with left-handed spin and only with antimatter particles with right-handed spin. And, in 1964, another group of researchers observed CP symmetry-violating decays of neutral K mesons.16 Accessible discussions of these experiments can be found in Ref. 17. While several violations of C and CP symmetry have been observed,18 none of the observations are sufficient to explain the matter/antimatter asymmetry in the universe.

A non-thermal equilibrium process relevant to the matter/antimatter asymmetry was discussed above. If the universe is expanding, it is possible for the annihilation of matter/antimatter pairs to occur at a slower rate than their creation. Essentially, due to the expansion, the pairs have more difficulty “finding each other.” This effect would have predominantly occurred in the early universe, when the expansion was much quicker.19 

Scientists still cannot explain the process of baryogenesis, and several experimental efforts are ongoing to explore the various requirements of the Sakharov conditions. However, there is one effort that is particularly noteworthy. This is the Deep Underground Neutrino Experiment (DUNE) (Fig. 6).20 The DUNE experiment will take a beam of neutrinos (or antineutrinos) generated at Fermilab and pass them through Earth to a detector located a mile underground in South Dakota at the Sanford Underground Research Facility (SURF).

Fig. 6.

The DUNE detector is planned to consist of four large vessels, each containing 17,000 tons of liquid argon. Pictured here is a much smaller prototype detector. Credit: CERN.


The major goal of most modern neutrino experiments is to study neutrino oscillation, which is a behavior of neutrinos that is unique among the fundamental quarks and leptons described by the standard model. There are three types of neutrinos: the electron type, muon type, and tau type. However, over time, these three types of neutrinos can transform into one another in an ongoing and repeating quantum process of subatomic switcheroo called neutrino oscillation.21 Antimatter neutrinos also oscillate in this way.
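
Oscillation is often summarized with the standard two-flavor approximation, P = sin²(2θ)·sin²(1.27 Δm² L/E), with Δm² in eV², L in km, and E in GeV. A sketch with illustrative parameter values, not actual DUNE numbers:

```python
import math

# Two-flavor neutrino oscillation probability in the standard convention:
# dm2 in eV^2, L in km, E in GeV. Parameter values below are illustrative.
def oscillation_probability(theta, dm2, L_km, E_GeV):
    return math.sin(2 * theta)**2 * math.sin(1.27 * dm2 * L_km / E_GeV)**2

# A roughly DUNE-like baseline (~1300 km) and a few-GeV beam energy:
p = oscillation_probability(theta=0.15, dm2=2.5e-3, L_km=1300, E_GeV=2.5)
print(f"P ~ {p:.3f}")
```

Comparing this probability for neutrinos against the corresponding one for antineutrinos is, in essence, the measurement DUNE is designed to make.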

While the DUNE experimental program will investigate many facets of neutrino interactions, its main goal is to investigate the matter/antimatter asymmetry using a different theory called leptogenesis.22 Leptogenesis is a proposed mechanism of baryogenesis using a complex series of assumptions. It is extremely complicated, and a full description is beyond the scope of this article. However, the necessary components can be enumerated. In leptogenesis, the following things are required (beyond the Sakharov conditions):

  1. Known neutrinos are very low mass and interact only via the weak force. For leptogenesis to be true, there must be “cousin” neutrinos that have very high mass.21,23 The masses of the known and hypothetical neutrinos follow the relationship Mknown × Mproposed = C, where C is a constant. Furthermore, while known neutrinos with left-handed spin only interact via the weak nuclear force, these proposed new neutrinos must have right-handed spin.

  2. Matter and antimatter neutrinos must be the same. Such a particle is called a Majorana particle21,24 and is to be contrasted with particles for which the matter and antimatter versions are different. Such particles are called Dirac particles, and the electron is a familiar example.

  3. The newly proposed neutrino must be able to transform into familiar neutrinos more frequently than it can transform into familiar antimatter neutrinos.

  4. Finally, in the standard model, the number of baryons (e.g., protons and neutrons) is conserved, as is the number of leptons (e.g., electrons and neutrinos). However, for leptogenesis to be true, at high energy neither baryon number (B) nor lepton number (L) is conserved. Instead, what is conserved is (B − L). Since for baryons and leptons, positive numbers represent matter and negative numbers represent antimatter, the consequence of (B − L) conservation is that baryons can convert into antimatter leptons and antimatter baryons can convert into matter leptons; for example, a proton could convert into a positron (antimatter electron).
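
The inverse mass relationship in point 1 is the “seesaw” pattern: making the hypothetical partner heavier makes the known neutrino lighter. A numerical sketch, where the constant C and the light-neutrino mass are illustrative assumptions only:

```python
# Seesaw-style sketch of the M_known x M_proposed = C relation from
# point 1. All numbers are illustrative assumptions, not measured values.
C = (100e9)**2     # assumed constant, eV^2 (an electroweak-scale choice)
m_light = 0.1      # assumed light-neutrino mass, eV

M_heavy = C / m_light
print(f"M_heavy ~ {M_heavy:.1e} eV (~{M_heavy / 1e9:.0e} GeV)")
```

With these assumptions the hypothetical partner lands far beyond the reach of any accelerator, which is why the hypothesis must be probed indirectly.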

The DUNE experiment will not be able to test all facets of leptogenesis. However, if the theory accurately represents reality, one of the consequences of the four points mentioned above is that neutrinos and antimatter neutrinos will oscillate at different rates.

While earlier experiments have hinted25 at this oscillatory difference between matter and antimatter neutrinos, the evidence is not yet statistically compelling. It is hoped that the DUNE experiment will be able to make a definitive statement on the subject. It should be noted that a difference between neutrino and antimatter neutrino oscillation does not mean that leptogenesis is correct. The first could still occur without the second. However, since the standard model predicts that no differences should occur, any observation of matter/antimatter oscillation differences will point toward undiscovered physical phenomena. The DUNE experiment is being assembled, with the expectation that first beam will occur toward the end of this decade.

The observed cosmic matter/antimatter asymmetry is extremely puzzling. Two well-tested and validated theories, when combined, make a very wrong prediction. While it might seem rational to discard either one or the other theory, they both work too well to not be retained. Instead, it seems that some additional and undiscovered principle came into play in the first fractions of a second after the universe began. And fame awaits the person or people who figure out what it was.

1. D. Styer, Relativity for the Questioning Mind (Johns Hopkins, Baltimore, 2017).
2. K. Krane, Modern Physics (John Wiley, New York, 2019).
3. P. A. M. Dirac, “The quantum theory of the electron,” Proc. R. Soc. A 117, 610–624 (1928); D. Griffiths, Introduction to Elementary Particles, 2nd ed. (Wiley-VCH, New York, 2008).
4. The derivation is complex, but in essence, the conclusion arose from the fact that the final step in the derivation is structured with approximately the following form: (Equation)² = (Answer)². Taking the square roots, one gets Equation = ±Answer, where (+Answer) represents matter and (−Answer) represents antimatter. While many would discard the negative root as unphysical, Dirac insisted that it represented a real solution.
5. F. Close, Antimatter (Oxford University Press, New York, 2009).
6. R. Rhodes, The Making of the Atomic Bomb (Simon and Schuster, New York, 1986).
7. A. Einstein, The Collected Papers of Albert Einstein, Vol. 6 (English): The Berlin Years: Writings, 1914–1917 (English translation supplement), translated by A. Engel (Princeton University Press, Princeton, 1997).
8. C. Misner, K. Thorne, and J. Wheeler, Gravitation (Princeton University Press, Princeton, 2017).
9. G. LeMaître, “A homogeneous universe of constant mass and increasing radius accounting for the radial velocity of extra-galactic nebulae,” Mon. Not. R. Astron. Soc. 91, 483–490 (1931). (This is a translation of the original article published in French in 1927.)
10. J. Gregory, Fred Hoyle’s Universe (Oxford University Press, Oxford, 2005).
11. The baryon-to-photon ratio can be estimated in a simple way. The energy density ρE associated with blackbody radiation of temperature T is found using the Stefan–Boltzmann law ρE = (4σ/c)T⁴, where σ = 5.67 × 10⁻⁸ W·m⁻²·K⁻⁴ is the Stefan–Boltzmann constant and c = 3 × 10⁸ m/s. In the classical limit, the mean energy per photon is Eγ ≈ 2.7kT, where k = 1.38 × 10⁻²³ J/K is the Boltzmann constant. The average temperature of the universe is 2.7 K, so one can calculate the number density of blackbody photons by nγ = ρE/Eγ ≈ 400 photons/cm³. The number density of protons can be expressed by ρp = ρm/mp, where ρm is the mass density of the universe and mp is the mass of the proton. Measurements of the amount of baryonic matter in the universe show that it is about 5% of the critical density; thus, ρm ≈ 4.2 × 10⁻²⁸ kg/m³. Combining, we get ρp ≈ 2.6 × 10⁻⁷ protons/cm³. Thus, the proton-to-photon ratio is approximately η = ρp/ργ ≈ 6.5 × 10⁻¹⁰ protons/photon. This crude estimate agrees with a more sophisticated analysis using Planck data,12 which gives η = ρp/ργ ≈ 6.1 × 10⁻¹⁰. Since the photon number and the baryon number are approximately conserved, the baryon-to-photon ratio stays constant as the universe expands.
12. R. H. Cyburt et al., “Big Bang nucleosynthesis: 2015,” Rev. Mod. Phys. 88, 015004 (2016).
13. A. Sakharov, “Violation of CP invariance, C asymmetry, and baryon asymmetry of the universe,” J. Exp. Theor. Phys. Lett. 5, 24–27 (1967).
14. P. S. B. Dev et al., “Searches for baryon number violation in neutrino experiments: A white paper,” arXiv:2203.08771v3 [hep-ex] (2021), submitted to the Proceedings of the US Community Study on the Future of Particle Physics (Snowmass 2021), https://arxiv.org/abs/2203.08771.
15. C. S. Wu et al., “Experimental test of parity conservation in beta decay,” Phys. Rev. 105, 1413–1415 (1957).
16. J. H. Christenson et al., “Evidence for the 2π decay of the K₂⁰ meson,” Phys. Rev. Lett. 13, 138 (1964).
17. L. Lederman, The God Particle: If the Universe Is the Answer, What Is the Question? (Houghton Mifflin, New York, 2006); D. Lincoln, Understanding the Universe: From Quarks to the Cosmos, rev. ed. (World Scientific, Singapore, 2012); M. Gardner, The New Ambidextrous Universe: Symmetry and Asymmetry from Mirror Reflections to Superstrings, 3rd rev. ed. (Dover Publications, Mineola, NY, 2005).
18. R. L. Workman et al. (Particle Data Group), “CP violation in the quark sector,” Prog. Theor. Exp. Phys. 2022, 083C01, section 13 (2022).
19. A. Riotto and M. Trodden, “Recent progress in baryogenesis,” Annu. Rev. Nucl. Part. Sci. 49, 46 (1999).
20. R. Acciarri et al., “Long-Baseline Neutrino Facility (LBNF) and Deep Underground Neutrino Experiment (DUNE) conceptual design report volume 1: The LBNF and DUNE projects,” arXiv:1601.05471 (2016).
21. A. Chodos and R. Riordon, The Ghost Particle: In Search of the Elusive and Mysterious Neutrino (MIT Press, Cambridge, 2023); F. Close, Neutrino (Oxford University Press, Oxford, 2013); https://www.youtube.com/watch?v=RGv-pcKRf6Q.
22. S. Davidson, E. Nardi, and Y. Nir, “Leptogenesis,” Phys. Rep. 466, 105–177 (2008); https://www.youtube.com/watch?v=PsqEcGMjEfo.
23. M. Drewes, “The phenomenology of right handed neutrinos,” Int. J. Mod. Phys. E 22, 1330019 (2013).
24. A. Baha Balantekin and B. Kayser, “On the properties of neutrinos,” Annu. Rev. Nucl. Part. Sci. 68, 313–338 (2018).
25. The T2K Collaboration, “Constraint on the matter-antimatter symmetry-violating phase in neutrino oscillations,” Nature 580, 339–344 (2020); M. A. Acero et al. (The NOvA Collaboration), “Improved measurements of neutrino oscillation parameters by the NOvA experiment,” Phys. Rev. D 106, 032004 (2022).

Don Lincoln is a senior scientist at Fermi National Accelerator Laboratory. He uses data collected using high-energy particle accelerators to study the laws of nature and has co-authored over 1500 papers. He is also an avid popularizer of frontier physics and has written several books for the general public, most recently Einstein’s Unfinished Dream: Practical Progress Towards a Theory of Everything. He also writes for online venues like BigThink, CNN, Forbes, and others. He also makes videos on the Fermilab YouTube channel and with Wondrium. www.facebook.com/Dr.Don.Lincoln/