Quantum computing is exciting and important—really!

12 October 2012

Research into quantum computing has advanced by leaps and bounds, on both the experimental and theoretical sides.

Quantum computing, say its champions, promises prodigious power. Its basic currency, the qubit, exists in an on/off limbo until it's read out, so if you could operate on k qubits, a potentially vast space of 2^k values opens up for computation. The fundamental operation on qubits is a rotation. Combine the rotations, and you have logic gates. Combine the logic gates, and you have algorithms. In principle, these algorithms can perform calculations far beyond classical computing's conceivable reach.
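
To see where the 2^k comes from, here is a minimal NumPy sketch of the picture above: a register of k qubits is a vector of 2^k complex amplitudes, and a single-qubit rotation, padded with identities on the other qubits, acts as a logic gate on that vector. The helper functions, the three-qubit register, and the rotation angle are illustrative choices, not taken from any particular quantum-computing library.

```python
import numpy as np

# A register of k qubits is described by 2**k complex amplitudes.
k = 3
state = np.zeros(2**k, dtype=complex)
state[0] = 1.0  # start in the all-zeros basis state |000>

def ry(theta):
    """Single-qubit rotation about the y-axis by angle theta."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s],
                     [s,  c]], dtype=complex)

def apply_gate(state, gate, target, k):
    """Apply a one-qubit gate to qubit `target` (qubit 0 taken as the
    most significant) by padding it with identities on the other qubits."""
    op = np.array([[1.0]], dtype=complex)
    for q in range(k):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

# Rotate qubit 0 halfway between "off" and "on".
state = apply_gate(state, ry(np.pi / 2), target=0, k=k)
print(np.round(state, 3))  # two of the 2**k amplitudes are now nonzero
```

Even this toy register shows the scaling problem and the promise at once: the vector doubles in length with every added qubit, yet each gate is still just a small rotation stitched into that large space.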

But to wield that power, you need an actual quantum computer, and building one has proved impossible. Qubits live in small, cold enclaves within the classical macroworld. When heat and other environmental disturbances inevitably intrude, they rob a quantum system of its coherence, its entanglement, and its ability to compute.

So beguiling is the potential of quantum computers that rather than putting people off, the difficulty of building one has assumed the qualities of a mythical quest. Like Jason's for the Golden Fleece, the quest for a quantum computer is hard and long. To sustain it, the champions of quantum computing appeal not to Olympian gods but to terrestrial funding agencies. Not surprisingly, quantum computing has acquired an aura of hope—and hype. Researchers have made steady progress, though. Physicists have fashioned qubits from superconducting Josephson junctions, trapped ions, semiconducting quantum dots, and other systems. They've even built working logic gates.

Still, scaling up a handful of logic gates, whose physical embodiments could require a roomful of lasers, cryopumps, and other finicky equipment, to an actual computer remains out of reach. Rolf Landauer, the IBM physicist who pioneered the notion that information is intrinsically physical, was famously skeptical of quantum computing. All papers on the topic, he said, should come with a disclaimer, and if the authors didn't supply one, he was happy to offer his own:

This proposal, like all proposals for quantum computation, relies on speculative technology, does not in its current form take into account all possible sources of noise, unreliability and manufacturing error, and probably will not work.

Landauer's skepticism could prove justified in the end, but it would be a pity if research in quantum computing stopped now. Much of it continues to be worthwhile. At NIST's lab in Boulder, Colorado, for example, David Wineland and his collaborators have applied the techniques they developed for atomic clocks to build logic gates based on trapped ions. Their work on the gates, in turn, led to new, entanglement-based clocks of unprecedented precision.

In making qubits out of gallium arsenide quantum dots, Jason Petta, who is now at Princeton University, and his collaborators at Harvard measured the tiny fluctuating magnetic field of the 10^6 gallium and arsenic nuclei inside a quantum dot, a remarkable feat.

Results have been just as impressive on the theoretical front. The work of Microsoft's Alexei Kitaev and others on topological quantum computation has spawned rich and fruitful explorations of the mathematical connections among field theory, knots, and the fractional quantum Hall effect. Princeton's Robert Calderbank has applied the theory of quantum error correction to understand radar polarimetry, and Ignacio Cirac and Frank Verstraete of the Max Planck Institute for Quantum Optics outside Munich have used the entangled states that crop up in quantum information theory to analyze networks of coupled spins.

Do all these advances, and others, represent milestones on a longer, ultimately successful journey or the ends of truncated trips? I don't know. But they're exciting and important—really.

This essay by Charles Day first appeared on page 104 of the March/April 2007 issue of Computing in Science & Engineering, a bimonthly magazine published jointly by the American Institute of Physics and IEEE Computer Society.
