Information, Physics, and Computation, Marc Mézard and Andrea Montanari, Oxford U. Press, New York, 2009. $99.00 (569 pp.). ISBN 978-0-19-857083-7

One great physics achievement has been the statistical approach to determining the behavior of interacting-particle systems; surprisingly, certain macroscopic behaviors turn out not to be strongly related to the deterministic laws of interaction. Particles can be assumed to behave randomly, and macroscopic transitions are revealed as the parameters of the random model are changed. Those transitions, observed at the large scale where fluctuations due to microscopic interactions average out, reflect the system's concentration around its most probable configurations.

In their book Information, Physics, and Computation, statistical physicists Marc Mézard and Andrea Montanari masterfully show that the same concept is also pivotal to computation and information theory. The authors argue that the information-theoretic view of communication put forth by Claude Shannon in 1948 rests on a similar strategy of looking at the large scale. Shannon considered the limit of long code words, which, when picked at random, reveal a “concentration” around the most typical sets and identify the region of low-error-probability decoding. Similarly, if a computer scientist considers a large ensemble of random inputs whose parameters are restricted to a certain region, then certain combinatorial optimization problems, such as the random satisfiability problem, can be solved efficiently with high probability.
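
To give a flavor of the ensembles in question, the short Python sketch below (an illustration of this reviewer's, not material from the book) draws random 3-SAT formulas at a given clause-to-variable ratio and checks them by brute force; as the ratio grows past a threshold (near 4.27 for 3-SAT), the fraction of satisfiable instances drops sharply, which is the kind of transition the authors analyze.

# Illustrative sketch (not from the book): empirical satisfiability of
# random 3-SAT as the clause-to-variable ratio alpha is varied.
import random
from itertools import product

def random_3sat(n_vars, alpha, rng):
    """Draw a random 3-SAT instance with round(alpha * n_vars) clauses."""
    clauses = []
    for _ in range(round(alpha * n_vars)):
        vars_ = rng.sample(range(n_vars), 3)                      # three distinct variables
        clauses.append([(v, rng.random() < 0.5) for v in vars_])  # (variable, negated?)
    return clauses

def is_satisfiable(n_vars, clauses):
    """Brute-force check; fine only for small n_vars (an illustration, not an algorithm)."""
    for assignment in product([False, True], repeat=n_vars):
        if all(any(assignment[v] != neg for v, neg in clause) for clause in clauses):
            return True
    return False

rng = random.Random(0)
n, trials = 12, 100
for alpha in (2.0, 3.0, 4.0, 4.3, 5.0):
    sat = sum(is_satisfiable(n, random_3sat(n, alpha, rng)) for _ in range(trials))
    print(f"alpha = {alpha:.1f}: {sat / trials:.2f} of instances satisfiable")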

Many approaches are available to study the set of problems that are defined by concentration behaviors, such as those outlined above. The information theorist might take a purely statistical approach. The probability theorist might consider a combinatorial approach or study the problem from the point of view of ergodic theory, which analyzes the behavior of a dynamical system when it is allowed to run for a long time. The applied scientist might be more interested in the experimental performance of different algorithms and codes. Mézard and Montanari choose to follow the statistical physicist’s approach, which is to reveal the macroscopic fluctuations of computational or communication systems by finding their minimum “energy” configuration in the thermodynamic limit.

Following that path, the authors describe many statistical physics tools developed over the years: replica theory, the cavity method, density evolution, mean-field theory, and simulated annealing, among others. They also discuss methods that are familiar to non-physicists, such as large deviations, Monte Carlo simulations, Markov chains, and belief propagation. And they always manage to draw interesting parallels with the underlying physics of the problems to which those methods are applied.
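
As a concrete taste of one entry on that list, the following sketch (again an illustration, not the authors' treatment) uses simulated annealing, driven by the Metropolis rule, to find a low-energy configuration of a small Ising chain with random couplings.

# Illustrative sketch (not from the book): simulated annealing on a small
# random-coupling Ising chain, i.e. minimizing E(s) = -sum_i J_i s_i s_{i+1}.
import math
import random

rng = random.Random(1)
N = 50
J = [rng.choice([-1.0, 1.0]) for _ in range(N - 1)]   # random couplings
s = [rng.choice([-1, 1]) for _ in range(N)]           # random initial spins

def energy(spins):
    return -sum(J[i] * spins[i] * spins[i + 1] for i in range(N - 1))

T = 2.0
while T > 0.01:
    for _ in range(1000):
        i = rng.randrange(N)
        # Energy change from flipping spin i (only its two neighbors matter).
        dE = 2 * s[i] * (J[i - 1] * s[i - 1] if i > 0 else 0.0)
        dE += 2 * s[i] * (J[i] * s[i + 1] if i < N - 1 else 0.0)
        if dE <= 0 or rng.random() < math.exp(-dE / T):   # Metropolis acceptance rule
            s[i] = -s[i]
    T *= 0.9                                              # cool slowly

print("final energy:", energy(s), "vs. ground-state energy", -sum(abs(j) for j in J))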

Information, Physics, and Computation is self-contained and should be accessible to any graduate student with a good background in probability theory and analysis. It is not an easy book, though. It begins mildly but rapidly develops into a tornado, pulling in theory and tools from the furthest reaches of mathematics and physics. The presentation naturally becomes more heuristic as the book reaches the most recent research advances.

By the authors’ own admission, their choice of presenting in detail a selection of problems, primarily from computing and modern coding theory, resulted in a number of interesting topics being left aside. The reader will not find a treatment of networking systems, communication systems with multiple inputs and outputs, or learning theory. Other topics, such as source coding, are relegated to a few introductory notes.

As those omissions and abbreviations show, writing a cross-disciplinary book is a somewhat dangerous endeavor. Nonetheless, Information, Physics, and Computation stimulates that cross-disciplinary dialog, which is always desirable because new perspectives emerge from it. Carrying out such a dialog, however, comes at a cost: Like Odysseus, Mézard and Montanari chose to sacrifice part of the crew to Scylla rather than lose the whole ship to the whirlpool of Charybdis. In any case, their choice must be considered honorable.