Since its inception in the 1970s, the standard model of particle physics has been remarkably successful at describing the building blocks that make up the universe. Despite its triumphs, the model is known to be incomplete; it fails to explain gravity, dark matter, and matter–antimatter asymmetries, to name just a few phenomena. But it does a good job characterizing its 17 constituent particles, and so far there’s no consensus around anything more complete.
Violations of the standard model can serve as clues about what might be missing, but they’re extremely hard to come by. So it was big news when, in 2001, researchers at Brookhaven National Laboratory announced such a disagreement: Their measurement of the muon’s magnetic moment anomaly aμ (described in the box on page 16) was 2.6 standard deviations above the value predicted by the standard model (see Physics Today, April 2001, page 18). The level of certainty was too low to claim a discovery—particle-physics experiments generally require a discrepancy of 5 standard deviations, which corresponds to a false-positive probability of less than 0.00006%.
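For readers who want to translate standard deviations into probabilities, here is a back-of-the-envelope sketch (my own arithmetic, not part of the collaboration's analysis) that evaluates the Gaussian tail probability for the significance levels quoted in this article:

```python
import math

def false_positive_prob(n_sigma, two_sided=True):
    """Gaussian tail probability for a fluctuation of at least n_sigma."""
    p_one_sided = 0.5 * math.erfc(n_sigma / math.sqrt(2.0))
    return 2.0 * p_one_sided if two_sided else p_one_sided

for n in (2.6, 3.3, 4.2, 5.0):
    p = false_positive_prob(n)
    print(f"{n:.1f} sigma -> p = {p:.1e}  ({100*p:.7f}%)")
# 5.0 sigma gives p of roughly 5.7e-7 two-sided, i.e. well under a millionth.
```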
Now, 20 years after that initial result, the Muon g − 2 collaboration at Fermilab has a stronger claim of a standard-model discrepancy. Its measurement of aμ agrees with the Brookhaven result and differs from the standard-model prediction by 3.3 standard deviations.1 Together, the Brookhaven and Fermilab data give a value for aμ that differs from the standard-model value by 4.2 standard deviations—tantalizingly close to the 5 needed to claim a discovery.
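As a consistency check (again my own arithmetic), an inverse-variance-weighted average of the two measurements reproduces both the combined value quoted in the box and the 4.2-standard-deviation tension. The individual central values and uncertainties below, in units of 10⁻¹¹, are the published Brookhaven and Fermilab Run-1 numbers and the Theory Initiative prediction, used here purely as illustrative inputs:

```python
import math

# Values in units of 1e-11, as commonly quoted (taken from the published papers).
bnl,  bnl_err  = 116_592_089.0, 63.0   # Brookhaven E821
fnal, fnal_err = 116_592_040.0, 54.0   # Fermilab Run 1 (ref. 1)
sm,   sm_err   = 116_591_810.0, 43.0   # standard-model prediction (ref. 3)

# Inverse-variance-weighted average of the two measurements
w1, w2 = 1.0 / bnl_err**2, 1.0 / fnal_err**2
combined = (w1 * bnl + w2 * fnal) / (w1 + w2)
combined_err = math.sqrt(1.0 / (w1 + w2))

tension = (combined - sm) / math.hypot(combined_err, sm_err)
print(f"combined a_mu = {combined:.0f}({combined_err:.0f}) x 1e-11")
print(f"difference from the standard model: {tension:.1f} standard deviations")
# -> about 116 592 061(41) x 1e-11 and roughly 4.2 sigma
```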
Decay detection
In a sense, the Muon g − 2 experiment at Fermilab is an upgrade of the earlier Brookhaven one. Both used the same experimental technique to measure aμ, and they even shared equipment. The 14.2-meter-diameter magnet ring at the heart of the earlier experiment was shipped from New York to Illinois for the new measurements (see figure 1). The move was driven by the need for better statistics,2 and the researchers chose Fermilab because of its higher-intensity proton source.
As was the case at Brookhaven, the experiment begins when protons are smashed into a fixed target. The collisions produce pions that decay into muons. Protons generate more positively charged pions and muons than negatively charged ones, so the apparatus siphons off the positive muons and injects them into the magnet ring. After about 64 µs and a few hundred trips around the ring, each muon decays into two neutrinos that fly away and a positron that gets detected by one of the 24 calorimeters situated around the ring (see figure 2).
During their short lifetimes, the muons are directed in a circular orbit around the ring by a near-perfectly uniform 1.45 T vertical magnetic field. They move in the horizontal plane, so their momentum vectors orbit around the magnetic field direction at the cyclotron frequency ωc.
Muons have spin, which means they also have magnetic dipole moments. The dipole moments rotate in the magnetic field at a frequency ωs. If muons were completely described by relativistic quantum mechanics, their dipole moments and momenta would rotate at the same rate around the magnetic field direction. But real muons are more complicated, and interactions with the electromagnetic vacuum cause their magnetic moments to rotate slightly faster than their momenta. The difference between those frequencies, ωa = ωs − ωc, reflects contributions from quantum electrodynamics and quantum chromodynamics (QCD).
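In symbols, a textbook-level restatement (assuming a uniform vertical field and neglecting the electric-field and pitch corrections that the experiment must also account for):

```latex
\omega_c = \frac{eB}{\gamma m_\mu}, \qquad
\omega_s = \frac{geB}{2m_\mu} + (1-\gamma)\frac{eB}{\gamma m_\mu}, \qquad
\omega_a \equiv \omega_s - \omega_c = \left(\frac{g-2}{2}\right)\frac{eB}{m_\mu} = a_\mu\,\frac{eB}{m_\mu}.
```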
The Muon g − 2 experiment doesn’t directly probe the muons. Instead, ωa can be inferred from the positrons they emit. The level of alignment between the muons’ momenta and magnetic moments affects the energy distribution of the positrons produced when the muons decay—specifically, close alignment produces more high-energy positrons. To find ωa, the researchers track how the shape of that energy distribution changes over time.
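As a toy illustration of that idea (my own sketch, not the collaboration's analysis code): to a first approximation, the rate of positrons above a fixed energy threshold follows the classic "wiggle" form N(t) = N0 e^(−t/τ)[1 + A cos(ωa t + φ)], where τ is the dilated muon lifetime, and fitting that shape to the time spectrum yields ωa. All parameter values below are representative, not measured ones.

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy "wiggle" fit: count rate of high-energy positrons versus time,
# modeled with the standard five-parameter form used as a first approximation.
def wiggle(t, n0, tau, asym, omega_a, phi):
    return n0 * np.exp(-t / tau) * (1.0 + asym * np.cos(omega_a * t + phi))

rng = np.random.default_rng(1)
t = np.arange(0.0, 300.0, 0.149)  # time in microseconds (bin width is illustrative)
truth = dict(n0=5000.0, tau=64.4, asym=0.37, omega_a=2*np.pi*0.2291, phi=2.0)
counts = rng.poisson(wiggle(t, **truth))  # Poisson-fluctuated synthetic data

p0 = [4000.0, 60.0, 0.3, 2*np.pi*0.229, 1.5]  # rough starting guesses
popt, _ = curve_fit(wiggle, t, counts, p0=p0,
                    sigma=np.sqrt(np.maximum(counts, 1.0)))
print(f"fitted omega_a / 2pi = {popt[3] / (2*np.pi):.4f} MHz")  # ~0.229 MHz here
```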
Measuring ωa alone is not enough to confront the standard model. Theoretical calculations predict a value for aμ, a parameter that connects the muon’s spin to its magnetic moment. (See the box for details.) The two quantities are related by ωa = aμeB/m, where e is the muon’s charge, m is its mass, and B is the magnetic field strength. To derive aμ, the researchers also need to measure B with high accuracy and precision.
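Plugging in representative numbers gives a feel for the timescales. In the sketch below, the 1.45 T field and the aμ value come from this article; the muon mass, rest-frame lifetime, and Lorentz factor γ ≈ 29.3 at the storage ring's "magic" momentum are standard values added for illustration. The output also matches the roughly 64 µs lifetime and few hundred turns mentioned earlier.

```python
import math

e     = 1.602176634e-19     # elementary charge, C
m_mu  = 1.883531627e-28     # muon mass, kg
B     = 1.45                # storage-ring field, T (quoted above)
a_mu  = 0.00116592061       # muon anomaly (see the box)
gamma = 29.3                # Lorentz factor at the magic momentum (assumed)
tau   = 2.197e-6            # muon lifetime at rest, s (assumed)

omega_a = a_mu * e * B / m_mu        # anomalous precession frequency, rad/s
omega_c = e * B / (gamma * m_mu)     # relativistic cyclotron frequency, rad/s

print(f"anomalous-precession period: {2*math.pi/omega_a*1e6:.2f} us")      # ~4.4 us
print(f"cyclotron period:            {2*math.pi/omega_c*1e9:.0f} ns")      # ~149 ns
print(f"dilated muon lifetime:       {gamma*tau*1e6:.1f} us")              # ~64 us
print(f"typical number of turns:     {gamma*tau*omega_c/(2*math.pi):.0f}") # a few hundred
```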
The name of the Muon g − 2 collaboration stems from the fact that the experiment measures the muon anomaly aμ = (g − 2)/2, where g is a dimensionless magnetic moment that relates a particle’s spin S to its magnetic moment μ, charge q, and mass m: μ = g(q/2m)S.
The Dirac equation predicts that for a spin-½ particle, g = 2 and aμ = 0. But because of interactions with the electromagnetic vacuum, the muon has a g slightly larger than 2. The most precise experimental value of the anomaly to date is aμ = 0.001 165 920 61(41); the standard-model value is 0.001 165 918 10(43).
Upcycling
Although the magnet ring in the Fermilab experiments came from Brookhaven, its magnetic field is now better characterized and controlled. The researchers used adjustable wedges, shims, and coils to fine-tune the field locally and a series of NMR magnetometers to monitor it. Additionally, they suppressed ambient temperature fluctuations that cause the steel in the magnets to expand and contract. Those adjustments reduced the amplitude of fluctuations in the field strength, averaged around the ring, by a factor of 2.5.
The new measurements also benefited from improved positron detection. Upgraded detectors have increased spatial and temporal resolution for separating individual events, and the researchers employed a new laser calibration system that monitored how each calorimeter’s response to an event varied over time. That calibration technology didn’t exist during the Brookhaven experiment.
Advances in computing underpin the new result. For one, the Fermilab researchers store all their data—nearly 1 petabyte per month—which wasn’t possible at Brookhaven. Now they have much more complete information to search for potentially overlooked systematics, and so far they haven’t uncovered any such issues.
Simulations have also improved significantly. They were used alongside the Brookhaven experiments to look for unanticipated problems, but at the time they were rather crude. Now simulations can accurately capture the details of the muon-beam dynamics in the storage ring, the evolution of the muon spin, and the predicted detector response. Reassuringly, the researchers still haven’t seen any unexpected behavior.
The technical improvements tackled every known source of experimental uncertainty from the Brookhaven experiments and reduced the overall systematic error by about a factor of 2, thereby increasing confidence in both the approach and the result. But the largest source of uncertainty, then and now, is an insufficient number of events.
The amount of data presented in the collaboration’s new papers is comparable to that from Brookhaven, but it comes from only the first of at least five runs and represents just 6% of the data that are expected to be generated at Fermilab. The second and third runs, which incorporated additional improvements informed by the first run, are already complete; their results are expected to be published by next summer. According to Chris Polly, a spokesperson for the collaboration and a physicist at Fermilab, there’s about a 50-50 chance that those results will push the muon anomaly beyond 5 standard deviations.
A deeper dive
But before they can confidently claim evidence of physics beyond the standard model, particle physicists will have to grapple with the following question: Is it possible that either the experimental or the theoretical value is wrong?
When the Muon g − 2 experiment moved to Fermilab, researchers were divided about whether additional data would support or refute the intriguing, but far from definitive, evidence of a muon anomaly. Now that the experiment has been scrutinized and fine-tuned, the researchers are confident in their result and in their control over sources of systematic errors. Proving the experiment wrong, Polly says, would mean uncovering a serious misunderstanding of its underlying physics.
If the discrepancy between experiment and theory does reach discovery-level certainty, that would be a sign of new physics. But it wouldn’t be a map for figuring out what or where that physics is. When the muon anomaly was first discovered, researchers were hopeful that it would point to supersymmetry, the idea that each fundamental particle has a yet-unseen superpartner. Data collected by the ATLAS and CMS experiments during the Large Hadron Collider’s first two runs have since ruled out the simplest supersymmetric models, though, so support for that explanation has weakened.
Ideally, another experiment will provide a second sign to narrow down the theoretical options. Fermilab’s Mu2e experiment, which observes muon-to-electron conversions, and B factories, which study the decay of B mesons, are prime candidates for such a signal.
Calculations and conjectures
Corrections to the theoretical value of are also still being refined. They come from quantum electrodynamics calculations that account for muons interacting with the electromagnetic vacuum through the creation and annihilation of virtual particles. The first and largest correction, uncovered by Julian Schwinger in 1948, accounts for a muon emitting and reabsorbing a virtual photon.
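For reference, Schwinger’s one-loop correction can be written in terms of the fine-structure constant α, a standard result not spelled out in the article:

```latex
a_\mu^{\text{(Schwinger)}} = \frac{\alpha}{2\pi} \approx 0.00116,
```

which already accounts for the bulk of the measured anomaly of roughly 0.00117; all subsequent corrections adjust the remaining digits.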
Since then, the effects of more than 10 000 electromagnetic, electroweak, and strong-interaction corrections have been calculated and are included in the standard-model prediction for . By far the largest source of uncertainty comes from the strong-interaction, or hadronic, corrections, which are notoriously hard to calculate. Indeed, they can’t be calculated directly, and the contributions from processes involving virtual quarks are estimated using a model informed by experimental data.
In 2017 dozens of researchers from around the world united to form the Muon g − 2 Theory Initiative. Their goal was to improve the theoretical value of aμ so it could be compared with experiments—particularly the then-upcoming Fermilab results—and their focus was on lowering uncertainty in the hadronic contributions. The group’s result,3 published in 2020, was used for comparison with the Fermilab measurement. It represents a significant improvement and incorporates multiple independent calculations of the hadronic corrections, complete with quantified uncertainties.
Theoretical and experimental values of aμ currently have comparable uncertainty, but the experimental value’s uncertainty is expected to drop by about a factor of 4 with ongoing and planned experiments, and theory needs to keep up. One attractive approach is to use a first-principles calculation of the hadronic corrections to aμ based on lattice QCD, which approximates the physics of quarks and gluons on a grid of points. In fact, a lattice QCD calculation of the hadronic corrections was published the same day as the Fermilab results and garnered significant attention for agreeing with experiments, thereby challenging the notion that new physics might lie behind the muon’s magnetic moment.4
Although the lattice result is intriguing, Polly cautions against putting it on equal footing with the Theory Initiative’s value. Multiple groups are still working to figure out the right way to use lattice QCD to calculate hadronic corrections for aμ and cross-check their results, whereas decades of work by hundreds of physicists underpins the data-driven approach. If mounting evidence continues to support a difference between the two techniques, understanding that difference will be essential to determining whether the muon is actually exhibiting physics beyond the standard model.