It’s sometimes said that the field of quantum information and computing ought to be called applied quantum foundations. That’s because so many of the ideas that first arose when scientists began thinking deeply about the mysteries of quantum theory—entanglement, Bell inequality violations, parallel worlds, interference of probabilities, and quantum contextuality—are now seen to be resources for attaining feats in information processing unimaginable in a classical world. Not only has Reviews of Modern Physics (RMP) nurtured that vibrant young field, it deserves credit for laying its very foundation.

John Bell in his office at CERN, 1982. Bell’s work on quantum entanglement inspired theorists to explore the foundations of quantum theory for decades. (Photo courtesy of CERN.)

Arguably the most far-reaching article on quantum foundations to come through the pages of RMP was also its first: Richard Feynman’s 1948 “Space-time approach to non-relativistic quantum mechanics.”1 Well known for introducing the technique of path integrals, the paper goes deeper in presenting what Feynman considered the distinguishing mark between classical and quantum physics. At issue was how probabilities for the outcomes of an actual measurement are calculated in terms of the probabilities for the outcomes of unperformed measurements. Feynman’s resolution was to introduce the amplitude calculus, but the foundational statement on which it was based was quite clear: “We are led to say that the statement, ‘B had some value,’ may be meaningless whenever we make no attempt to measure B.”
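The amplitude calculus can be illustrated with a two-path interference toy calculation (our own sketch for concreteness, not drawn from Feynman’s paper): each path contributes a complex amplitude, and the probability is the squared magnitude of the summed amplitudes, not the sum of the individual path probabilities.

```python
import cmath
import math

# Two paths, each contributing a complex amplitude of equal weight.
phase = math.pi  # relative phase between the paths (illustrative value)
A1 = cmath.exp(1j * 0.0) / math.sqrt(2)
A2 = cmath.exp(1j * phase) / math.sqrt(2)

# Quantum rule: add amplitudes first, then take the squared magnitude.
P_quantum = abs(A1 + A2) ** 2

# Classical rule: add the probabilities of the individual paths.
P_classical = abs(A1) ** 2 + abs(A2) ** 2

print(P_quantum, P_classical)  # approximately 0 versus exactly 1 at relative phase pi
```

At a relative phase of π the two amplitudes cancel completely, so the quantum probability vanishes where the classical sum of probabilities would give certainty of arrival by one path or the other.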

But maybe there is a way to preserve the notion that unperformed measurements have unrevealed values after all, perhaps at the cost of giving up some less-cherished classical intuition. That was the subject of three groundbreaking papers in RMP’s 1966 volume.2–4 In a 1952 non-RMP paper, David Bohm proposed the first hidden-variable extension of nonrelativistic quantum theory:5 A spinless particle actually could be modeled as having a preexistent position and momentum despite Niels Bohr’s edict of “complementarity.” Indeed, researchers have shown in the years since that there are many ways to supplement quantum theory with hidden variables; Bohm and Jeffrey Bub wrote in RMP about one such way.3 The only deciding factors seemed to be the inventors’ intuitions and their hopes that the new hidden-variable models might lead to new physics.

Yet in 1952 a young John Bell was already thinking “How could this be?” For John von Neumann had “proved” years earlier the impossibility of hidden-variable extensions of quantum theory. Bell’s paper2 (which had accidentally languished for two years in the RMP editorial office!) and Bohm and Bub’s papers3,4 tackled the question head-on. The common conclusion was that von Neumann’s theorem and later refinements of it rested on overly restrictive assumptions that the various hidden-variable models simply shrugged off. Most portentous was a line at the end of Bell’s paper: “It would … be interesting … to pursue some further ‘impossibility proofs,’ replacing the arbitrary axioms [I] objected to … by some condition of locality, or of separability of distant systems.” In fact, Bell had already settled his question in the intervening years: No local hidden-variable model could ever be up to the job of reproducing quantum theory’s statistics, he concluded. In other words, all successful hidden-variable models must have what Albert Einstein dubbed “spooky action at a distance.”
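Bell’s conclusion can be made concrete with the later Clauser-Horne-Shimony-Holt (CHSH) form of his inequality (our illustrative sketch, not drawn from Bell’s RMP paper): any local hidden-variable model constrains a particular combination S of correlations to |S| ≤ 2, while the quantum singlet-state prediction E(a, b) = −cos(a − b) pushes |S| to 2√2.

```python
import math

def E(a, b):
    # Quantum correlation for spin measurements along angles a and b
    # on a two-particle singlet state: E(a, b) = -cos(a - b).
    return -math.cos(a - b)

# Measurement angles (radians) that maximize the CHSH violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination; local hidden variables require |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # 2*sqrt(2), about 2.83, exceeding the local bound of 2
```

Because |S| exceeds 2, no assignment of preexistent local values to all four measurements can reproduce the quantum correlations, which is the content of Bell’s theorem.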

Is locality a less cherished principle to give up than the idea that unperformed measurements have preexistent (but yet to be seen) outcomes? Einstein in 1948 had already expressed the conundrum with admirable clarity: “Without … an assumption of the mutually independent existence … of spatially distant things … physical thought in the sense familiar to us would not be possible. Nor does one see how physical laws could be formulated and tested without such a clean separation.”6 In the years after Bell, the points were made increasingly sharp, culminating 45 years after Einstein’s statement with one of the most powerful and thorough presentations of what is at stake with those considerations: David Mermin’s RMP analysis of the then newly discovered three-particle Greenberger-Horne-Zeilinger paradox.7

Unperformed measurements either have no outcomes or they have some but with spooky action at a distance. Is there any other option besides those? Might unperformed measurements have all possible outcomes? As strange as it might seem, that question too was first explored in the pages of RMP—through Hugh Everett III’s seminal paper on the many-worlds interpretation of quantum theory.8 His idea was that the universe obeys a giant Schrödinger equation, and there is no such thing as “measurement” in any preferred or fundamental sense. There is only physical interaction as specified by the Hamiltonian of the universe, and that interaction leads the universe to continuously branch into parallel worlds.

As John Wheeler argued in a companion piece to the paper, a key attraction of the many-worlds view is that it seems to offer a way forward for quantizing general relativity.9 Yet the Everett interpretation has not been without its problems. Most prominent among them is how one can justify the particular probability calculus of quantum theory from its completely deterministic picture. Since 1957 a surprising number of distinct potential solutions have been proposed for that fundamental problem, with still no consensus at hand. But RMP has been there too, with Wojciech Zurek’s comprehensive analysis of what the notion of decoherence brings to the table.10

Wheeler eventually had his own problems with Everett’s interpretation,11 but the influence he had on all the interpretations discussed here is interesting in its own way. Wheeler was the PhD adviser of both Feynman and Everett when they were doing their foundational work, and Zurek was his postdoc. In the last 25 years of his life, Wheeler landed on a peculiar thought. He desperately wanted to know “Why the quantum?” and it was his conjecture that whatever the answer, it should be of an “information-theoretic color.”

In fact, Wheeler’s perspective was in no small part responsible for the field of quantum information. One of us (Fuchs) was lucky enough to be under Wheeler’s tutelage at the time, and it led to a quest for how to think about quantum states consistently as (subjective) information. The end point was a view of quantum theory called quantum Bayesianism, or QBism, which also made its debut in RMP.12 One of the things that sets QBism apart from the other interpretations is its reliance on the technical details of quantum information to amplify Feynman’s point—that the modification of the probability calculus in quantum theory indicates that something new is created in the universe with each quantum measurement. Only it takes the formalism of quantum information to see it with the greatest clarity. (See the Commentary by N. David Mermin, Physics Today, July 2012, page 8.)

Indeed, by the example of QBism, one might wonder whether quantum foundations is “applied quantum information” instead. So the subject comes full circle. Whatever the future directions in quantum foundations research, history bears out that RMP will be there publishing the deepest and most far-reaching articles on the subject.

The rest of the Reviews of Modern Physics 90th anniversary articles can be found in the special feature’s introduction by Randall Kamien.


David DiVincenzo directs the Institute for Theoretical Nanoelectronics at the Peter Grünberg Institute in Jülich, Germany. He is also a professor at the Institute for Quantum Information at RWTH Aachen University, also in Germany. Christopher Fuchs is a professor of physics at the University of Massachusetts Boston and a fellow of the Stellenbosch Institute for Advanced Study in South Africa.