
Quantum computer lends a hand at writing numbers

22 August 2022

A classical neural network paired with a quantum circuit produces digits that look convincingly handwritten.

Before quantum computers become practical, they’ll need more qubits, lower error rates, and more effective error-mitigation strategies. In the meantime, industrial and academic researchers are investigating how current quantum computing capabilities can augment the problem-solving of conventional computers (see, for example, “Quantum computer models a chemical reaction,” Physics Today online, 8 September 2020). What sorts of tasks would benefit from the addition of a quantum computing step? A study by Alejandro Perdomo-Ortiz of Zapata Computing, a quantum software company, and his colleagues provides new evidence that tasks in machine learning may be promising candidates.

Neural networks are most often assigned the job of recognizing patterns, but a growing body of work aims to use them to produce text, images, music, videos, and more. The current best architecture for those generative neural networks is what's known as a generative adversarial network, which employs generator and discriminator subnetworks. The generator tries to imitate the training data, and the discriminator tries to pick out which images are network-generated fakes and which are real training images. As the two subnetworks try to outmaneuver one another, they get better at producing and spotting fakes.
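The adversarial game can be sketched in miniature. In this illustrative toy (not the paper's architecture), the "generator" is just an affine map on noise and the "discriminator" a logistic classifier on 1D data; the two take alternating gradient steps against each other:

```python
import numpy as np

# Toy 1D adversarial training loop. All parameter names and sizes here are
# illustrative assumptions, not taken from the paper.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_batch(n):
    # Stand-in "training data": samples from N(3, 1).
    return rng.normal(3.0, 1.0, n)

a, b = 1.0, 0.0   # generator g(z) = a*z + b
w, c = 0.1, 0.0   # discriminator d(x) = sigmoid(w*x + c)
lr, n = 0.05, 64

for step in range(2000):
    z = rng.normal(0.0, 1.0, n)   # latent noise: the generator's prior
    x_fake = a * z + b
    x_real = real_batch(n)

    # Discriminator step: push d(real) toward 1 and d(fake) toward 0.
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    w -= lr * (np.mean((d_real - 1) * x_real) + np.mean(d_fake * x_fake))
    c -= lr * (np.mean(d_real - 1) + np.mean(d_fake))

    # Generator step: push d(fake) toward 1 (non-saturating loss).
    d_fake = sigmoid(w * (a * z + b) + c)
    a -= lr * np.mean((d_fake - 1) * w * z)
    b -= lr * np.mean((d_fake - 1) * w)

samples = a * rng.normal(0.0, 1.0, 1000) + b
print(np.mean(samples))  # the generated mean should drift toward the real data
```

Real implementations replace both maps with deep networks, but the alternating minimax structure is the same.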

A collection of numbers that look handwritten but were produced by a neural network
Credit: M. S. Rudolph et al., Phys. Rev. X 12, 031010 (2022)

Adversarial networks are not without potential pitfalls. One is a phenomenon called mode collapse. Say, for example, the network is trying to produce digits that look convincingly handwritten, a common benchmark task in machine learning. If the discriminator has a slight preference early in training for deeming the digit “0” real, the generator can seize on that weakness and churn out only 0s rather than the full range of digits from 0 to 9. The quality and resolution of those 0s would improve with time, but the network wouldn’t be doing what was truly intended. Alternatively, rather than steadily improving, the network could fail to converge, wildly varying what it produces over the course of training.

Those training problems stem, in part, from the choice of initial probability distribution fed into the generator. Quantum computers could offer a way to stabilize the training process and avoid such issues. To test the idea, Perdomo-Ortiz and his colleagues compared how a purely classical adversarial network and a classical–quantum adversarial network fare at generating digits that look handwritten. Both systems trained on a standard pool of 60 000 handwritten numbers.

In the classical–quantum case, in addition to the usual generator and discriminator subnetworks, an eight-qubit quantum circuit measures and represents which nodes are firing in part of the discriminator (in effect, how the neural network is representing the data). That information is then fed into the generator as its initial probability distribution before each round of training. Why use a quantum circuit? Primarily because it can represent probability distributions that are not accessible when the classical network is trained on its own. To cope with the limited number of qubits available in current quantum computers, the researchers measured the qubits’ states in multiple bases, a feat not possible in conventional circuits, and thereby increased the information that could be stored in each qubit.
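Why measuring in more than one basis helps can be seen already with a single qubit. The schematic numpy illustration below (not the paper's circuit) prepares the state |+⟩: measured only in the computational (Z) basis it looks maximally random, but a change to the X basis, here a Hadamard rotation before measurement, reveals the state completely:

```python
import numpy as np

# The |+> state: equal superposition of |0> and |1>.
ket_plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Z-basis measurement probabilities: squared amplitudes of the state vector.
p_z = np.abs(ket_plus) ** 2   # close to [0.5, 0.5]: looks like a fair coin

# X-basis measurement: apply a Hadamard, then take squared amplitudes.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
p_x = np.abs(H @ ket_plus) ** 2   # close to [1, 0]: the state is pinned down

print(p_z, p_x)
```

Z-basis statistics alone cannot distinguish |+⟩ from a classical random bit; combining bases extracts information that no single set of classical measurements would.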

The quantum-aided networks produce convincing handwritten numbers, shown in the image, and although by eye they look about the same as those from the classical network, they do beat them slightly. The inception score, which quantifies the quality and diversity of the digits, comes closer to a perfect 10 for the quantum-aided implementation, at about 9.36 compared with around 9.20 for classical adversarial networks.
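The inception score is commonly computed as exp(E[KL(p(y|x) ‖ p(y))]), where p(y|x) is a classifier's class-probability vector for each generated sample and p(y) the marginal over all samples. A minimal numpy sketch (the function name is illustrative):

```python
import numpy as np

def inception_score(p_yx):
    """p_yx: (n_samples, n_classes) array of classifier probabilities."""
    p_y = p_yx.mean(axis=0)   # marginal class distribution over all samples
    # Per-sample KL divergence KL(p(y|x) || p(y)); clipping keeps log finite,
    # and zero entries of p_yx contribute 0, as they should.
    kl = (p_yx * np.log(np.clip(p_yx, 1e-12, 1.0) / p_y)).sum(axis=1)
    return float(np.exp(kl.mean()))

# A generator whose samples are classified confidently and cover all 10
# digits equally scores close to the maximum of 10 ...
score_diverse = inception_score(np.eye(10))
# ... while indistinguishable mush (uniform probabilities) scores 1.
score_collapsed = inception_score(np.full((10, 10), 0.1))
print(score_diverse, score_collapsed)
```

The score thus rewards exactly the two things mode collapse destroys: confident, high-quality samples and coverage of all the classes.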

The addition of the quantum circuit also makes the training more robust; the inception scores didn’t decrease as much over time when the training parameters were unfavorable. But it comes at the cost of time and money. Quantum circuits must be accessed and trained in their own additional step. Still, those downsides would be worth it or even necessary for data sets too complicated for classical computers to reasonably handle. (M. S. Rudolph et al., Phys. Rev. X 12, 031010, 2022.)



Ashley Piccone
