Particle physics is at the brink of a new era. CERN’s Large Hadron Collider, by many measures the largest scientific instrument ever built, is scheduled to be commissioned in 2008; figure 1 gives a sense of its scale. In terms of resolving power, it will be humankind’s most impressive microscope. The LHC will probe energies of some trillion electron volts—an order of magnitude larger than energies previously studied—and it will likely address longstanding questions about the nature of interactions among the elementary particles. (For a brief overview of the LHC and what it might find, see the Quick Study by Fabiola Gianotti and Chris Quigg in Physics Today, September 2007, page 90).

Figure 1. The Large Hadron Collider at CERN will smash together protons that have been accelerated to energies of 7 TeV. The illustration shows a string of superconducting magnets in the tunnel; the outer red curve indicates the 27-km-circumference ring that will house the LHC.

Much of the activity of high-energy theorists since the 1980s has been geared toward the new TeV energy frontier. Extensive studies have considered the standard model, which describes our current understanding of the laws of nature, and that work will be tested at the LHC. But much of the effort has focused on speculations about new physics that might be discovered, including exotic phenomena with names like technicolor, supersymmetry, large extra dimensions, and warped extra dimensions. At the same time, many particle theorists have devoted their energies to questions of extremely high-energy physics—string theory and, more generally, quantum gravity. The various efforts of high-energy physicists have often appeared to be totally divorced from one another, and the seeming schism between phenomenologists and string theorists—and the rifts among the subcultures of each—has sometimes become polemical. Witness, for example, popular books on the one hand promoting string theory as representing a dramatic scientific revolution and on the other disparaging it as totally removed from experiment and the traditional realm of science.

The reality is more complicated. Many who pride themselves on their focus on phenomenology have been led to speculations that cannot be properly framed without an underlying theory such as string theory. On the other hand, many theorists are interested in string theory precisely because of its ability to address the questions that are left unresolved by the standard model. Those questions have been sharpened by various recent astrophysical and cosmological discoveries that require extensions of the laws of nature beyond the standard model. Those discoveries include dark matter, a new form of matter with zero pressure that makes up about 23% of the energy of the universe; dark energy, which is quite possibly Einstein’s cosmological constant and responsible for about 73% of the universe’s energy; and the inflationary paradigm, the idea that the universe went through a period of very rapid expansion almost immediately after the Big Bang.

The standard model has been well established for nearly three decades. It explains a host of experiments that have been conducted at energies up to a few hundred GeV to investigate the strong, weak, and electromagnetic interactions. In the early days, the agreement between theory and experiment, while persuasive, was often crude. But as illustrated in figure 2, experimental programs at CERN, Fermilab, SLAC, the German Electron Synchrotron (DESY), and Cornell University had turned the study of the weak and strong interactions into precision science by the end of the 20th century, with numerous tests of the theory at the parts-per-thousand level.

Figure 2. Precision experiments have subjected the standard model to parts-per-thousand tests, and the theory has passed splendidly. In one such experiment, CERN’s OPAL collaboration measured the line shape of the Z⁰ resonance in electron–positron collisions. The solid red circles show the experimental data, which lie right on the theoretical curve.

At the beginning of this century, there were no serious discrepancies between the standard model and experiment, but two aspects of the theory remained untested. The first was the origin of CP violation. The second was the standard model’s prediction of a particle called the Higgs boson.

The CP symmetry, closely related to time reversal (T), combines parity and charge conjugation—the exchange of particles and antiparticles. It is a very good symmetry of nature, conserved to a high degree by the strong and electromagnetic interactions. But violation of CP is essential to understanding why we find ourselves in a universe that is highly asymmetric in its abundances of matter and antimatter (see the article by Helen Quinn, Physics Today, February 2003, page 30).

Until relatively recently, violation of CP had been observed only in special situations involving the weak interactions of the neutral K mesons. The standard model contains a parameter that violates CP, but without additional experimental input, it was not possible to say whether that parameter accounted for the CP violation observed with K mesons. To provide a test, one would need a large sample of B mesons, which contain b quarks. Two electron–positron machines optimized for the purpose, known as B factories, were proposed and built: one (BaBar) at SLAC and one (Belle) at KEK, the high-energy accelerator research organization in Japan. During the past seven years, the B factories have performed beyond expectations, and the standard-model explanation for the violation of CP symmetry has received striking confirmation. Some additional contribution could yet be possible, but it would have to be rather small.

The still-missing piece of the standard model is the Higgs boson. This particle is responsible for the masses of the W and Z bosons and of the quarks and leptons. The standard model does not predict its mass. Figure 3 illustrates how a combination of theoretical and experimental input suggests that the Higgs mass is in the range of 114–182 GeV. Because of its modest mass, the Higgs is likely to be discovered at the LHC, or possibly at the Fermilab Tevatron before that. It is predicted to couple rather weakly to ordinary matter, and its detection will be challenging.

Figure 3. Three theoretical calculations with somewhat different approaches all give a preferred value (minimum chi-squared) of about 80 GeV for the mass of the Higgs boson. The blue band gives the theoretical uncertainty of one of those calculations; uncertainties for the others are comparable. On the experimental front, direct searches for the Higgs boson at the Large Electron–Positron Collider (LEP) at CERN have established that the Higgs mass is greater than 114 GeV; the yellow shading shows the experimentally excluded mass region. The theoretical and experimental results can be summarized with the statement that within the standard model, the Higgs mass is in the range of 114–182 GeV, at the 95% confidence level. Some experimental analysis remains to be completed, so this preliminary plot may need modification in the near future.

Despite its many triumphs, the standard model must eventually give way to some more complete structure. For starters, at least two classes of phenomena show that it cannot be a complete theory. The first is gravitation. That is, Albert Einstein’s general theory of relativity cannot be grafted onto the standard model without leading to serious difficulties. The second class of phenomena has to do with the physics of neutrinos. One of the great experimental discoveries of recent years is that neutrinos have tiny masses. Within particle physicists’ current understanding, those masses result from some sort of new physics at a very high energy scale, perhaps 10¹⁴–10¹⁶ GeV.

Those limitations aside, the standard model possesses several troubling features. For example, it has many parameters: 18 or 19, depending on how one counts. Many of the standard model’s parameters are dimensionless numbers. One might expect that they would be numbers like 1 or π, but they actually form a much more bizarre pattern. That is clear from the particle masses; the ratio of the top quark mass to the electron mass is 3 × 10⁵. Among the various numbers, one of the most puzzling is a parameter of the strong interactions known as the θ parameter. This quantity multiplies a CP-violating term that leads to an electric dipole moment for the neutron. Experimental searches for such a moment limit the dimensionless θ to less than 10⁻⁹. Since CP is not a symmetry of the standard model, it is hard to see what principle might explain the parameter’s enormous suppression.
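
As a quick check of that mass ratio, the short Python computation below uses approximate quoted masses (about 171 GeV for the top quark and 0.511 MeV for the electron; the precise inputs are inessential to the point):

```python
# Ratio of the top quark mass to the electron mass, both in GeV.
# Approximate values; precision is not the point of this check.
m_top = 171.0          # top quark mass, GeV
m_electron = 0.000511  # electron mass, GeV

print(f"m_top / m_electron = {m_top / m_electron:.1e}")  # about 3.3e5
```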

The mass of the Higgs particle poses an even greater puzzle. Although a mass greater than 114 GeV is in a practical sense very large, from the point of view of simple dimensional analysis it is surprisingly small. Absent any grand principle, one would expect that the Higgs mass should be something like the largest mass scale that appears in the laws of physics. Among the known laws, that is the Planck mass MP, which is built from Newton’s constant, Planck’s constant, and the speed of light; its value is 10¹⁹ GeV. But even if some principle segregates gravity from the Higgs mass, other very large mass scales exist, such as that associated with neutrino physics. Quantum corrections to the mass are expected to be at least the size of that neutrino-physics scale. So the relative lightness of the Higgs would seem to arise from a bizarre conspiracy of different effects, what theorists refer to as fine tuning. The puzzle of the Higgs mass is known as the hierarchy problem.
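
The dimensional-analysis estimate quoted above is easy to reproduce. A minimal Python sketch, using rounded values of the fundamental constants, computes the Planck scale from Newton’s constant, Planck’s constant, and the speed of light and compares it with the roughly 100-GeV electroweak scale:

```python
import math

# Rounded SI values of the fundamental constants.
hbar = 1.0546e-34  # reduced Planck constant, J*s
c = 2.9979e8       # speed of light, m/s
G = 6.674e-11      # Newton's constant, m^3 kg^-1 s^-2
GeV = 1.6022e-10   # one GeV expressed in joules

# Planck energy scale: E_P = sqrt(hbar * c^5 / G), converted to GeV.
E_planck = math.sqrt(hbar * c**5 / G) / GeV
print(f"Planck scale: {E_planck:.2e} GeV")   # about 1.2e19 GeV

# The hierarchy: ratio of the Planck scale to the ~100-GeV Higgs scale.
print(f"hierarchy: {E_planck / 100.0:.1e}")  # about 1e17
```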

The combination of quantum mechanics with the principles of special relativity is called quantum field theory (QFT). The successes of the standard model represent the triumph of that synthesis. But the model’s failures, particularly in accounting for general relativity, also suggest that some new framework may be necessary for physics involving very short distances or, equivalently, very high energies. When QFT is combined with gravity, the resulting theory does not behave sensibly at short distances. Even on larger scales, Stephen Hawking has formulated a sharp “information paradox” suggesting that quantum mechanics and black holes are incompatible. String theory seems to resolve those puzzles: Short-distance behavior presents no problem, and black holes obey the rules of quantum mechanics. But beyond that, string theory seems to address all of the open questions of the standard model.

What is string theory? In QFT, particles are simply points, with no intrinsic properties apart from their masses, spins, and charges. Objects of one-dimensional extent, strings, are the simplest structures beyond points. Strings would seem to be comparatively straightforward systems, but the rules of special relativity and quantum mechanics subject them to tight consistency conditions. When those constraints are satisfied, the resulting structures describe theories like general relativity and interactions like those of the standard model. Those features emerge automatically; they are not imposed from the outset.

Only a few such theories with flat spacetime may be formulated, and they exist only in 10 dimensions. The extra dimensions are not, by themselves, troubling. Since the early days of general relativity, theorists have entertained the possibility that spacetime might have more than four dimensions, with some of them “compactified” to a small size; figure 4 illustrates the concept. The string-theory equations allow a vast array of spacetimes of this type, many of which have features that closely resemble those of our world: photons, gluons, W and Z bosons, Higgs particles, and multiple generations of quarks and leptons. In principle, it is possible to start with those solutions and compute the parameters of the standard model. The problem of understanding the features of the standard model would thus seem to be a problem of dynamics: One just needs to understand how some particular solution—what is loosely called a vacuum state—is selected from among the myriad possibilities.

Figure 4. Spacetimes admissible in string theory typically have more than four dimensions. The extra dimensions beyond the usual space and time are imperceptibly small. In this illustration, the plane represents the familiar four dimensions of space and time. Associated with every point of the 4D spacetime are internal dimensions, here represented as spheres.

Even before the string theory revolution of the mid-1980s, theorists had put forth an array of conjectures to resolve many of the puzzles of the standard model. All of those find a home in string theory.

The large number of parameters in the standard model is mitigated in theories with so-called grand unification. In grand unified models, the strong, weak, and electromagnetic interactions of the standard model become part of a single interaction at a very high energy scale. That hypothesis enables one to predict the strength of the strong force in terms of the weak and electromagnetic interaction strengths and allows for a prediction of the tau lepton mass in terms of the bottom quark mass. Grand unified theories made two additional predictions: The proton has a finite lifetime, and magnetic monopoles exist.

The simplest grand unified theories, however, have failed experimental tests. They predict a proton lifetime of less than 10²⁸ years and, in light of the precision measurements of the past two decades, don’t get the strong coupling right. But the proton-lifetime and monopole predictions have stimulated important science. Underground experiments have set a lower limit on the proton lifetime of 10³³ years, discovered neutrino masses, and studied astrophysical phenomena. Issues surrounding magnetic monopoles were a principal motivation in the development of inflationary theories in cosmology. Simple grand unified theories also tackled other longstanding questions, such as the origin of electric-charge quantization, and they provided the first concrete realization of Andrei Sakharov’s proposal to understand the matter–antimatter asymmetry of the universe.

The strong CP problem—that is, the smallness of the CP-violating parameter θ—has attracted much attention from theorists. The most promising explanation implies the existence of a new, very light particle known as the axion. Although extremely weakly interacting, axions, if they exist, were copiously produced in the early universe and can readily account for the universe’s dark matter. Experimental searches are challenging but are now beginning to set interesting limits on the axion mass and couplings (see the article by Karl van Bibber and Leslie Rosenberg in Physics Today, August 2006, page 30). On the theoretical side, however, the axion idea is troubling. It seems to require an extraordinary set of accidents, arguably more remarkable than the very small θ that it is meant to explain.

The hierarchy problem, connected as it is with physics at scales of hundreds of GeV, points most directly to phenomena one could expect to observe at foreseeable accelerator experiments. Among the proposed solutions are technicolor, large extra dimensions, warped extra dimensions, and supersymmetry.

In technicolor models, the Higgs particle is a composite of a fermionic particle and a fermionic antiparticle that participate in a new set of strong interactions. The idea is attractive but difficult to reconcile with precision studies of the Z boson. If technicolor is the explanation for the hierarchy, accelerators like the LHC should see resonances with masses on the order of hundreds of GeV, similar to the resonances of the strong interactions.

The discrepancy between the Planck mass and the scale of the Higgs mass could be explained by positing that spacetime has more than four dimensions and that some of the extra dimensions are large (see the Quick Study by Lisa Randall, Physics Today, July 2007, page 80). In such models, forces other than gravity are essentially confined to the four spacetime dimensions we experience. The consequences for accelerator experiments include dramatically rising cross sections for processes that appear to have large missing energy. Some models predict modifications of gravity on millimeter distance scales. Like technicolor, the idea of large extra dimensions and variants such as warped extra dimensions faces challenges accommodating precision studies of elementary particles.

Supersymmetry is a hypothetical symmetry between fermions and bosons. Clearly, the symmetry cannot be exact; if it were, then all fermions would be accompanied by bosons of the same mass and electric charge. But if supersymmetry is present at high energies and broken at scales on the order of a few hundred GeV, then the superpartners of all ordinary particles would have masses something like the breaking scale and would not yet have been observed. In that scenario, dimensional analysis predicts that the Higgs boson should have a mass of a few hundred GeV, although more sophisticated analysis suggests that in supersymmetric theories, the Higgs mass cannot be much greater than that of the Z boson, about 91 GeV. At CERN’s Large Electron–Positron Collider (LEP) and at the Tevatron, strong limits have been set on the as-yet unobserved superpartners, and supersymmetry aficionados expect superpartners to be discovered at the LHC. For more on supersymmetry, see the accompanying box.

Even without a detailed picture of how string theory is connected to nature, theorists have used the theory to address a number of qualitative questions about physics beyond the standard model. Some conjectured properties are typical of string theory vacua; others are not.

For example, axions, which many think to be unnatural, emerge readily from string theory. Variation of the fundamental constants on cosmic time scales, first suggested by Paul Dirac, seems highly unlikely, as do explanations for the dark energy in which the energy varies slowly with time. Theorists have long speculated that a theory of quantum gravity should have no conserved quantum numbers except for those that, like electric charge, are sources for massless vector fields. That speculation is a theorem in string theory. Some ideas for inflation find a natural home in string theory; others look implausible. The CPT theorem, a triumph of field theory, almost certainly holds in string theory as well.

At first sight, string theory presents an exciting picture. It has pretensions to being an ultimate theory, jokingly called a theory of everything. (Theorist John Ellis relates that he invented the term in response to critics who had called string theory a theory of nothing.)

As noted earlier, the biggest obstacle to connecting the theory to nature is the theory’s many solutions. It admits discrete sets of solutions in which, for example, the number of dimensions of spacetime varies, as does the number of particles of a given type. Continuous sets also exist, in which the couplings and masses of the particles change. While some of those closely resemble the world we observe, most do not. And nothing jumps out as a principle that might select one from among all of those solutions, never mind one with the peculiar features of our world.

The continuous sets of solutions are particularly problematic. They lead to massless or very light particles, called moduli, that give rise to long-range forces competing with gravity. One could hope that quantum effects would give those particles large masses, but until recently no one had constructed even unrealistic examples in which such a phenomenon could be studied.

One particular number, it would seem, is almost impossible for the theory to get right: the magnitude of the cosmological constant or dark-energy density. The cosmological constant is the energy density of the possibly metastable ground state of the universe. In units for which Planck’s constant and the speed of light are set equal to unity, dimensional analysis might lead one to expect that the cosmological constant is of order MP⁴ and certainly not smaller than, say, MZ⁴, where MZ is the Z boson mass. The observed value is 55 orders of magnitude smaller than even the lower estimate. In conventional QFTs one can’t actually calculate the cosmological constant, so one doesn’t worry about that discrepancy. But for many string theory solutions, the computation can be done, and the result is consistent with expectations from dimensional analysis.
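
The 55 orders of magnitude can be checked directly. In the Python sketch below, the observed dark-energy density is taken to be roughly (2.3 meV)⁴; that input is an approximation based on standard cosmological fits, so the result lands near, rather than exactly on, 55:

```python
import math

# Naive dimensional-analysis lower estimate: the fourth power of the
# Z boson mass, in natural units where energies are measured in GeV.
m_Z = 91.2                    # Z boson mass, GeV
lam_naive = m_Z**4            # about 6.9e7 GeV^4

# Observed dark-energy density, approximated as (2.3 meV)^4.
lam_observed = (2.3e-12)**4   # about 2.8e-47 GeV^4

orders = math.log10(lam_naive / lam_observed)
print(f"discrepancy: {orders:.1f} orders of magnitude")  # about 54-55
```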

String theorists have tended to hope that the problem would find a solution as the theory is better understood. Some principle would require the cosmological constant to be very small in some privileged solution that doesn’t contain unwanted massless particles. As of yet, there is no inkling of such a principle or mechanism. But in 1987 Steven Weinberg, following a suggestion by Thomas Banks, offered a solution of a very different sort.

Supersymmetry, experiment, and string theory

Supersymmetry has been the most widely studied of the conjectured solutions to the hierarchy problem. There are a number of reasons for that. All of the other ideas for understanding the hierarchy run afoul of precision studies of the W and Z gauge bosons. Supersymmetry makes dramatic, detailed predictions for phenomena that will be studied in accelerators. In addition, adopting the supersymmetry hypothesis, surprisingly, makes for significant progress on some of the outstanding questions discussed in the main text.

In combination with grand unification, supersymmetry accurately predicts the strength of the strong force, and it did so before the era of precision measurements. The modification arises precisely because of the additional particles required by supersymmetry. That success points to the possibility of both supersymmetry and the unification of forces. The proton lifetime is much longer than predicted in theories without supersymmetry, but perhaps only barely large enough to be compatible with current experiments.

The supersymmetry hypothesis almost automatically implies a light, stable particle produced in the early universe in an abundance roughly consistent with the observed dark-matter density. The density of that “neutralino,” the lightest of the supersymmetry partners, can be calculated precisely if the full spectrum of superpartners is known.

Despite those successes, good reasons exist for skepticism. Some are experimental: Apart from coupling unification, no direct evidence yet argues for supersymmetry. The Large Hadron Collider will either make a spectacular discovery or rule out supersymmetry as the solution to the hierarchy problem.

But the whole idea, when first encountered, seems rather contrived: a large new symmetry and an array of new particles, put forward to answer a small number of questions. In string theory, however, supersymmetry emerges readily, and many of the features that seem most troubling when one simply builds supersymmetric models become natural. Indeed, the possibility of supersymmetry was first discovered in string theory. So supersymmetry might well be the arena in which string theory is subjected to experimental test. For that to happen, one needs to establish that supersymmetry not only can arise, but inevitably must arise, from string theory. Ideally, one should understand in more detail what the theory predicts for the spectrum of the superpartners. If string theory does not predict that supersymmetry will be observed at LHC energies, then it will be critical to establish what kind of phenomena the theory favors.

Suppose, as illustrated in figure 5, some underlying theory has many, many possible ground states, with a discretuum—a nearly continuous distribution of values of the energy. Suppose further that all of the ground states were probed by the universe in its early history. In most cases, Weinberg realized, the expansion of the universe would accelerate too rapidly for galaxies to form. Only in those regions of the universe with a sufficiently small cosmological constant would stars—and later, observers of those stars—form. The so-called anthropic principle is the idea that of the many possible environments, only those in which observers can exist are of interest. Weinberg found that applying the principle requires that the cosmological constant we observe be extremely small. On the other hand, the dimensional analysis argument shows that small is unlikely, so the cosmological constant should be more or less as large as it can be, consistent with producing a reasonable number of galaxies. In this way, one finds ln(Λ/GeV⁴) ≈ −103, where Λ denotes the cosmological constant. The measured number for the logarithm is about −108. Weinberg’s is the only argument at present that predicts a density of dark energy consistent with what is observed. His prediction was made more than a decade before the observations.
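
To see where the measured value comes from, a short Python sketch converts the observed dark-energy density into Weinberg’s logarithmic variable. The cosmological inputs here (a critical density of about 4 × 10⁻⁴⁷ GeV⁴ for a Hubble constant near 70 km/s/Mpc, and a dark-energy fraction of 0.73) are rough assumptions, which is why the result comes out near −107 rather than exactly −108:

```python
import math

# Approximate cosmological inputs (assumptions of this sketch).
rho_crit = 4.0e-47   # critical density of the universe, GeV^4
omega_lambda = 0.73  # fraction of the energy density in dark energy

Lambda = omega_lambda * rho_crit  # observed cosmological constant, GeV^4
print(f"ln(Lambda/GeV^4) = {math.log(Lambda):.0f}")  # about -107
```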

Figure 5. The many solutions of string theories allow for a great number of possible environments, each with its own cosmological constant. (a) A string-theory potential energy (PE) with many local minima, each of which represents a possible, perhaps metastable string solution. (b) The spectrum of the local PE minima makes up a quasi-continuum called the discretuum.

When Weinberg made his proposal, perhaps tens of thousands of string solutions were known; all had moduli or other difficulties. Making sense of Weinberg’s idea required something very different: 10⁶⁰ to 10²⁰⁰ isolated, metastable states. Those outrageously large numbers are required by the large discrepancy between the observed value of the cosmological constant and the naive estimate obtained with dimensional analysis. It might seem bizarre, even impossible, to posit such a large number of states.

More than 10 years after Weinberg’s article, Raphael Bousso and Joseph Polchinski made the first plausible proposal for how such a vast set of states might arise. They noted that string theory includes many types of fluxes—the familiar electric and magnetic fluxes, but others as well. Like magnetic flux in ordinary quantum electrodynamics, the additional fluxes are quantized and take discrete values. Often hundreds of types of flux exist, each of which can take something like 10–100 different integer values. So it is easy to imagine that there are 10⁵⁰⁰ or more states. Bousso and Polchinski conjectured, without any real evidence, that those states would be free of moduli. Three years later, in 2003, Shamit Kachru, Renata Kallosh, Andrei Linde, and Sandip Trivedi (KKLT) built upon work of Steven Giddings, Eva Silverstein, and others to provide a concrete realization of the Bousso–Polchinski idea. Leonard Susskind, making an analogy with phenomena in condensed-matter physics and biology, coined the term “landscape” to describe the KKLT collection of possible vacua. Shortly after the KKLT work was published, Michael Douglas and others provided a statistical analysis of the landscape states and demonstrated that a tiny fraction would indeed have a small cosmological constant.
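
The combinatorics behind such numbers is elementary: with many independent quantized fluxes, each taking a modest number of integer values, the count of configurations is a product that grows exponentially. The numbers in the sketch below are placeholders chosen only to illustrate the scaling:

```python
import math

# Hypothetical landscape counting: n_flux independent fluxes, each of
# which can take n_values distinct integer values.
n_flux = 500    # illustrative number of flux types
n_values = 10   # illustrative number of allowed values per flux

log10_states = n_flux * math.log10(n_values)
print(f"roughly 10^{log10_states:.0f} states")  # 10^500 for these inputs
```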

With the emergence of the landscape, those who dream of making detailed connections between string theory and nature have a real program. Since the work of KKLT, theorists have devoted much effort to understanding how the standard model might emerge. Many ground states have been enumerated with photons, gluons, three generations of quarks and leptons, and W and Z bosons—key features of the standard model. Those provide a proof of principle: The standard model, with all its detailed features, is almost surely found among the states of the landscape. A second area of activity has focused on cosmology. It addresses questions such as, Just how does the universe transition between the various ground states? What does the universe look like on extremely large scales? Are there mechanisms that select for states with, say, low cosmological constant, without requiring anthropic considerations? Finally, do the answers to those questions imprint any signals on the sky?

But with the LHC startup rapidly approaching, the most pressing question would seem to be, Does the landscape provide a solution to the hierarchy problem, with actual predictions for accelerators? In my opinion, the most accessible questions in the landscape are precisely those for which dimensional analysis fails. The cosmological constant is the most extreme example; the next most severe is the question of the Higgs mass. The very notion of fine-tuning parameters suggests the existence of an ensemble of possible universes, from which ours is somehow selected; the landscape provides a realization of that ensemble.

To make predictions in the context of the landscape requires deciding how states are selected, a process that is likely to be a combination of statistical, cosmological, and anthropic considerations. Theorists have identified states in the landscape that imply that supersymmetry will be observed at the LHC. Other states exhibit warped extra dimensions or technicolor. Still others have none of those features, but those states contain relatively light particles that could play the role of the Higgs. Most research has focused on the supersymmetric states, which are the easiest to study both individually and statistically. Among those, statistical methods allow investigators to look at questions such as the scale of supersymmetry breaking. As expected, it is tied to the scale of weak interactions. Further, there has even been some progress in predicting the superparticle spectrum.

It is not yet possible to say whether the supersymmetric class of states is favored, or the warped class, or states that have a light Higgs simply by accident. We don’t yet know the relative numbers of such states, nor do we yet understand the cosmology well enough to determine whether selection effects favor one type or another. Those are questions under active investigation.

The landscape and its explorations are exciting developments; still, theorists have expressed skepticism. Many string theorists are unhappy that instead of leading inevitably to a unique or nearly unique picture of nature, the landscape perspective appears to engender many possibilities. They argue that theorists are missing something important and that we should just wait until we understand quantum gravity at a more fundamental level.

Apart from what might be called philosophical concerns, a number of genuine scientific issues attend the application of the landscape idea. Banks has stressed that analyses such as that of KKLT rest on shaky theoretical foundations; the landscape might turn out to be simply wrong. With Elie Gorbatov, Banks and I have noted that among the states that have been studied, not only the cosmological constant but most or all of the other parameters of ordinary physics are random variables. Some physical constants such as the cosmological constant or the electromagnetic coupling might be selected by environmental effects. But many, including heavy quark masses and the θ parameter, seem to have little consequence for the existence of galaxies and stars, or observers. In nature, however, those quantities exhibit intricate patterns that seem unlikely to result from random distributions. One can imagine resolutions, but the problem is a serious one. In his recent popular critique of string theory, The Trouble with Physics (Houghton Mifflin, 2006), Lee Smolin repeats our argument, using it as the basis for his claim that string theory must fail. Clearly I differ with him on this point.

A few years ago, there seemed little hope that string theory could make definitive statements about the physics of the LHC. The development of the landscape has radically altered that situation. An optimist can hope that theorists will soon understand enough about the landscape and its statistics to say that supersymmetry or large extra dimensions or technicolor will emerge as a prediction and to specify some detailed features. But even a pessimist can expect that the experimental program at the LHC will bring new insights into the laws of nature at scales never probed before.

Michael Dine is a professor of physics at the University of California, Santa Cruz, and a faculty member at the university’s Santa Cruz Institute for Particle Physics.