The number 15 has been factored by a quantum computer. A program for a quantum computer uses quantum bits, called qubits, and the property of superposition to examine all possible combinations of "0" and "1" simultaneously. Researchers from IBM's Almaden Research Center and Stanford University needed seven qubits for this first experimental demonstration of Shor's algorithm, a quantum-computer method that can factor large numbers much more quickly than traditional binary computers can. The group designed a "computer molecule" incorporating five fluorine-19 atoms and two carbon-13 atoms to provide seven nuclear-spin qubits that could be manipulated by a sequence of spin-selective radio-frequency pulses. The pulses were applied to a liquid containing those molecules at room temperature, and the first three qubits were then "read" with nuclear magnetic resonance spectroscopy, yielding the factors three and five. Although the implementation cannot easily be scaled up, it nevertheless provides a demonstration of real-life quantum computing together with...
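For readers unfamiliar with how order finding leads to factors, here is a minimal classical sketch of the number-theoretic core of Shor's algorithm for N = 15 (not the authors' quantum implementation; the quantum computer's role is to find the order r exponentially faster, whereas this toy code finds it by brute force). The function name `classical_shor` and the choice of base a = 7 are illustrative assumptions.

```python
from math import gcd

def classical_shor(N: int, a: int):
    """Illustrative classical version: find the order r of a mod N,
    then derive factors of N from gcd(a**(r/2) +/- 1, N).
    A quantum computer would find r via the quantum Fourier transform."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # the guess already shares a factor with N
    # Brute-force the smallest r > 0 with a**r = 1 (mod N)
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 == 1:
        return None               # odd order: try a different base a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root: try a different base a
    return gcd(y - 1, N), gcd(y + 1, N)

print(classical_shor(15, 7))      # (3, 5): 7 has order 4 modulo 15
```

For 15 and base 7, the order is r = 4, so 7^2 = 4 (mod 15) and gcd(4 - 1, 15) = 3, gcd(4 + 1, 15) = 5, recovering the factors reported in the experiment.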
