The article by Carleton DeTar and Steven Gottlieb (Physics Today, February 2004, page 45, https://doi.org/10.1063/1.1688069) misleads readers from outside lattice field theory (LFT) about its past, present, and future. At best, the article may present a consensus in the collaboration, known as MILC, of which the authors are a part.

In their historical overview, the authors miss too many of the field’s truly remarkable achievements. The renormalization group, a conceptual organizing principle of all field theories, was first concretely formulated in LFT. A non-gauge result relevant to particle physics was that the Higgs mass must be less than 700 GeV. Duality and the role monopoles play in it also originated in LFT, and so did confinement and finite-temperature deconfinement. The first decade of LFT was extremely productive and has had a long-lasting impact on theoretical particle physics and field theory. One could call this period the bosonic era of LFT, and it is an illustrious one.

The inclusion of fermions, a much-needed step beyond the bosonic era, has preoccupied a large fraction of the community. The original lattice formulation of fermions, given by Kenneth Wilson 30 years ago, had a conceptual defect, and only quite recently has that problem finally been solved. The solution constituted important theoretical progress, validating continuum ideas in a fully nonperturbative setting and restoring precise chiral symmetry.

A significant physical step was the formulation of the valence approximation and the discovery of its surprising numerical agreement with experiment. Technically, a most important development was the discovery of an algorithm that could take us beyond the valence approximation to truly ab initio numerical quantum chromodynamics (QCD). Neither of these two steps originated from MILC, although the collaboration made contributions at later stages. Both the surprises surrounding the valence approximation and the algorithms making it feasible to go beyond it are important advances.

After the initial formulation of LFT and the associated renormalization group ideas, it became clear that the approach to continuum could be sped up by fine-tuning the lattice action. This “improvement,” while important in practice, lacks the theoretical novelty of the above-mentioned achievements.

The recent calculations reviewed in the article go beyond the valence approximation and attempt to improve the approach to the continuum by a logarithmic factor relative to previous simulations. The authors did not mention that these calculations required an unfounded artificial suppression that purportedly reduces the number of sea quarks by a factor of four. Without the artificial suppression, the violation of taste equivalence would indeed have been logarithmically improved, but there would have been fourfold too many sea quarks. No effective field theory representation of the actual simulation exists that also includes the artificial suppression of sea quark contributions.
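For readers outside the field, the suppression in question is, as I understand the simulations under discussion, the so-called fourth-root procedure for staggered (Kogut–Susskind) fermions. Schematically, and with notation that is mine rather than the article's:

```latex
% Each staggered fermion field describes four degenerate "tastes",
% so its determinant counts four sea quarks per flavor:
%   Z_{\mathrm{naive}} = \int [dU]\, \det D_{\mathrm{st}}(U)\, e^{-S_g(U)}.
% The simulations instead take the fourth root of the determinant,
% hoping to suppress three of the four tastes:
Z = \int [dU]\, \bigl[\det D_{\mathrm{st}}(U)\bigr]^{1/4} e^{-S_g(U)}.
% The objection: [det D_st]^{1/4} is not known to be the determinant
% of any local lattice Dirac operator, which is why no effective field
% theory description of the simulated measure is known.
```

Here $D_{\mathrm{st}}$ is the staggered Dirac operator and $S_g$ the gauge action; the formula is a sketch of the standard rooting prescription, not a quotation from the article.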

A correct and direct way to full QCD is known, based on exact lattice chiral symmetry. Today’s computational cost holds us back, but a few more years will almost certainly bring us the power we need to do the calculations right and to present to the rest of the particle physics community accurate numbers that were obtained directly from the QCD Hamiltonian, with no additional assumptions. It is wrong to present the two methods of including fermions, one based on Kogut–Susskind fermions and the other based on lattice fermions with exact chirality, as being on equal theoretical footing. The true objective of numerical QCD—to assist theoretical analysis to produce numbers that, if they disagree with a correct experiment, imply the discovery of new physics—will eventually be attainable, but only with truly chiral fermions.

I disagree with the authors that lattice QCD has matured; rather, its practitioners have, and their relentless pursuit of computer resources seems to have drained some of them of the self-discipline required when presenting results to the rest of the particle physics community.

Unlike experiments in nuclear or particle physics, lattice projects do not have to be big in terms of personnel. We should rethink the policy that concentrates almost all of the computing power in the hands of a few large collaborations like MILC. That policy has tended to stifle individual thinking, imaginative risk-taking, and self-criticism. Now is a good time to do that rethinking, because an alternative exists: Small but reasonably effective commodity clusters (groups of standard personal computers networked to act as one computational resource) have reached prices affordable for small groups and even individual researchers. More money should go to small research groups, or even single researchers, and should be earmarked for purchasing computer clusters. No science would be lost if the funds of large collaborations were restricted to make this possible.