
Column: The collider question

14 March 2019

There are strong physics motivations, along with economic and cultural ones, for building a next-generation particle collider.

Which high-energy colliders should be built next? Should any? To answer those questions, we need to understand the history, physics, and economic and cultural benefits of colliders.

In the 1960s particle physics was in a chaotic state, with little theoretical understanding of the observations. The first-generation high-energy colliders then under construction were exploratory; their purpose was to search for whatever new physics might arise. An electron–positron collider was built at SLAC and a proton–antiproton one at Fermilab. That strategy worked.

Gordon Kane. Credit: David Giroux

Parsing Progress

A monthly column by Gordon Kane

Some recent talks, blogs, and pieces in the New York Times have argued that today we have little theoretical guidance to help us decide which new colliders, if any, are well motivated. To some, the situation seems like an even bleaker version of the 1960s. But those perspectives mostly ignore the hierarchy problem, probably the central issue of particle physics today, and doing that yields misleading conclusions about the strength of the physics motivations for building a future collider. Some people, including me, find reasons to move to higher energies in the properties of the Higgs boson, the apparent unification of the forces at short distances, string theory, and more.

Past success

The construction of the earliest colliders proved especially prudent once the standard model (SM) emerged in the early 1970s. After the tau lepton was found at SLAC in 1975, implying a third family of leptons, the search for a third family of quarks intensified. In 1977 the bottom quark was found at Fermilab.

Going forward from that point, the priority was finding the SM’s two still-missing particles, the top quark and the Higgs boson. Several facilities were upgraded or built in the US, Japan, and Europe to search for the expected top quark, which the SM predicted would be in a doublet with the bottom. Finally, after upgrades, the Tevatron collider at Fermilab discovered the top quark in 1995, at a mass about 41 times that of the bottom quark.

Physicists discovered the tau lepton using SLAC National Accelerator Laboratory’s SPEAR collider. Credit: SLAC, CC BY-NC-SA 2.0

Initially there was no theoretical guidance about what mass the Higgs boson should have. Each new collider with higher energy or intensity could search a new region. Then in the mid-1990s at CERN, the Large Electron–Positron Collider took precision data on many Z boson decays and provided indirect evidence for a Higgs boson with a mass of about 120 GeV/c². The Higgs physics was too important to let indirect evidence count as a discovery, but it set a goal. There were also increasingly strong theoretical arguments for a Higgs boson mass of a similar value. We now know that if Fermilab and the US Department of Energy had taken the Higgs physics more seriously, the Tevatron would have discovered the Higgs boson years before the Large Hadron Collider did. As it happened, finding the Higgs boson became the major target of the LHC when it turned on in 2010.

Current conundrum

In 2012 the LHC did, as expected, find the Higgs boson. It hasn’t yet found other new physics. For our next colliders the goal is to provide data for a more comprehensive theory, hopefully one that incorporates dark matter, quantum gravity, and neutrino masses and solves the hierarchy problem. But what does that mean in practice?

The hierarchy problem is hard to explain. I don’t know of any good nontechnical explanation, nor any analogies to describe it. Yet it is the central problem of particle physics today. Basically, the problem is that there are two main energy (or mass) scales in nature, but that situation shouldn’t be stable. One is the Planck scale, which is defined via fundamental constants: the speed of light, c; Planck’s constant, h; and Newton’s gravitational constant, G. The associated energy scale is about 10¹⁹ GeV. The other is the electroweak scale, which is set by the masses of the Higgs and the W and Z bosons, at about 10² GeV. (Protons and atoms have smaller scales, but we understand how to derive those.) It is a conceptual problem, not a conflict with observations.
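For readers who want to see where those numbers come from, the Planck energy is the standard textbook combination of the three constants (conventionally written with the reduced constant ħ = h/2π); this is a sketch of that definition, not a formula from the column:

\[
E_{\text{Planck}} = \sqrt{\frac{\hbar c^{5}}{G}} \approx 1.2 \times 10^{19}\ \text{GeV},
\qquad
E_{\text{electroweak}} \sim m_{H}c^{2} \approx 125\ \text{GeV} \approx 10^{2}\ \text{GeV}.
\]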

The Planck and electroweak scales differ by a factor of about 10¹⁷, a huge amount. Quantum fluctuations into virtual particles occur up to mass values near the Planck mass. When some particle masses, notably the Higgs boson’s, are calculated, the heavy virtual particles contribute corrections that should raise those masses up to near the Planck scale.

Theory does not allow a description that maintains the large separation of the electroweak scale from the Planck scale. That is the hierarchy problem.
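A schematic way to state the problem quantitatively (a standard textbook estimate, not the column’s own formulation): loops of virtual particles shift the squared Higgs mass by an amount that grows with the square of the heaviest scale Λ the fluctuations can reach,

\[
\delta m_{H}^{2} \sim \frac{g^{2}}{16\pi^{2}}\,\Lambda^{2},
\]

where g stands for a generic dimensionless coupling of order one. With Λ near the Planck scale, the shift exceeds the observed (125 GeV)² by more than 30 orders of magnitude, so the separation of scales survives only through an extraordinarily precise cancellation or through new physics entering at much lower energies.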

If we don’t solve the hierarchy problem, we cannot hope to formulate theories at the Planck scale and then calculate their predictions at the electroweak scale, or to extrapolate implications of data and successful models from the electroweak scale to the Planck scale. We don’t want to give up that goal. It can open the path to comprehending the universe. If new colliders are built, finding experimental clues to the hierarchy problem, such as the proposed superpartner particles in supersymmetry or higher-mass Z′ gauge bosons, is very likely, even if no target as clear as the Higgs boson exists.

More than 200 employees at Fermilab watch the Higgs boson discovery announcement in the wee hours of 4 July 2012. Credit: US Department of Energy

The argument is actually stronger. It’s been known since the 1980s that a mathematically consistent quantum theory of gravity has to be formulated in 9 or 10 spatial dimensions. Since experiments are done in three spatial dimensions, the theory has to be projected (this is called compactification) to make predictions and do tests. The theory is written at the Planck scale and then calculated at the electroweak scale. That leads to models that give a good description of our world, in a de Sitter vacuum, consistent with known data and theory constraints.

In recent years there has been progress in understanding those models. They predict or describe the Higgs boson mass. We can now study the masses that new particles have in such models to get guidance for what colliders to build. The models generically have some observable superpartners with masses between about 1500 GeV and 5000 GeV. The lower third or so of this range will be observable at the upgraded LHC. The full range and beyond can be covered at proposed colliders. The full range might be covered at a proton–proton collider with only two to three times the energy of the LHC. One important lesson from studying such models is that we should not have expected to find superpartners at the LHC with masses below about 1500 GeV. Such theoretical work provides quantitative predictions to help set goals for collider construction, similar to how theorists helped zero in on the mass of the Higgs boson.

Economic considerations

If the needed new colliders cost only as much as typical scientific and engineering facilities, probably several would be built, but the estimates are much larger than that. Although the economic benefits from colliders are not widely understood, there is evidence that the colliders built so far have more than paid for themselves when their benefit to the economy is considered along with their up-front costs. For the Tevatron, the resulting magnet industries and superconducting wire technologies alone expanded the economy more than the cost of the facility.

The LHC’s impact on the economy had an outlier benefit, namely the World Wide Web, which was developed at CERN by then fellow Tim Berners-Lee to solve the problem of having many universities and labs remotely share and analyze CERN data and results. In addition, there were major contributions in grid computing, magnets, superconducting wire, and more. Even without the Web, the LHC has paid for itself.

That’s no accident. All scientific facilities lead to some spin-offs and startups, but frontier facilities have larger effects because the technologies they need do not already exist and must be invented. In addition, the collider provides a market for the initial development. Lots of startups fail, but they are much more likely to succeed when they have a guaranteed customer for the first stage.

Detailed designs exist for several future colliders, including the Chinese Circular Electron Positron Collider (CEPC) and the European Future Circular Collider (FCC). Just as the LHC has about seven times the Tevatron energy, both proposed colliders plan eventually to be about seven times the LHC energy, close to 100 TeV. Aiming for such increases should be realistic. They are circular colliders, with circumferences close to 100 km.
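A rough consistency check on those numbers (my own back-of-the-envelope sketch, using commonly quoted parameters rather than anything in the column: 14 TeV, a 26.7 km ring, and 8.3 T dipole magnets for the LHC, and roughly 16 T dipoles discussed for the proposed 100 km machines): for a circular proton collider the beam energy scales approximately with the product of the dipole field and the ring circumference, so

\[
\frac{E_{\text{new}}}{E_{\text{LHC}}} \approx \frac{B_{\text{new}}}{B_{\text{LHC}}} \times \frac{C_{\text{new}}}{C_{\text{LHC}}}
\approx \frac{16\ \text{T}}{8.3\ \text{T}} \times \frac{100\ \text{km}}{26.7\ \text{km}} \approx 7,
\]

which is how a 100 km ring with roughly twice-as-strong magnets reaches about 7 × 14 TeV ≈ 100 TeV in its proton–proton stage.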

The site for the CEPC most often mentioned is about 280 km east of Beijing, near the port city of Qinhuangdao (population about 3 million), which is located at the beginning of the Great Wall. The city has a magnificent new art museum that is built into a large sand dune. The collider could be approved soon, with construction to begin in 2022.

The FCC would be at CERN, with construction beginning after a decade or more of planned LHC running. For historical reasons, member countries’ budget contributions to CERN are locked in by treaty, so CERN can plan far ahead, something not possible in countries where budgets are determined year by year.

Stephen Hawking and I wrote an essay about future colliders that is relevant to both the CEPC and the FCC. We were encouraged by others to chime in because of discussions that had arisen in China about the physics motivations and the economic and cultural issues surrounding building a future collider.

The theoretical arguments for building a larger collider with several times the energy of the LHC are thus very strong, particularly in regard to solving the hierarchy problem. What would happen without data? Someone may get or even already have the solution, but no one will be convinced. With data pointing to the solution, we may be able to move on and obtain consensus about a comprehensive theory that incorporates the standard models of particle physics and cosmology and a quantum theory of general relativity, giving us a profound understanding of our universe.

Gordon Kane is the Victor Weisskopf Distinguished University Professor of Physics at the University of Michigan and director emeritus at the Leinweber (formerly Michigan) Center for Theoretical Physics. He has received the Julius Edgar Lilienfeld Prize and the J. J. Sakurai Prize from the American Physical Society. Kane has written two books for any curious reader, The Particle Garden and Supersymmetry and Beyond, as well as a short book at a Physics Today technical level, String Theory and the Real World.
