The Google AI Quantum team made a big media splash a year ago when it announced quantum supremacy, the point at which a quantum device can solve a problem that a classical computer can’t in a reasonable time frame. But quantum computing still faces many challenges before it becomes practical. As academic and industrial researchers work to increase the number of qubits, reduce error rates, and find more effective error-mitigation strategies, they’ve also become interested in near-term quantum devices, which do useful work within current hardware capabilities.

In that spirit, the Google team has now applied its 54-qubit Sycamore superconducting processor, shown in the photo, to chemistry simulations. The researchers are the first to include a quantum computer in the modeling of a chemical reaction, and their Hartree–Fock calculations are a performance yardstick for a combined quantum and classical computation.

The Hartree–Fock method assumes that the wavefunction for a system of electrons can be written in terms of single-electron functions: explicit electron–electron interactions are dropped, and each electron instead feels the average electric field of the other electrons. The wavefunction is then adjusted to minimize its energy.
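The mean-field idea can be illustrated with a minimal self-consistency loop. The toy system below, a two-site Hubbard model with hopping t, on-site repulsion U, and two electrons, is entirely hypothetical and not from the paper; it just shows how the average field is rebuilt from the electron density until the energy-minimizing state is self-consistent:

```python
import numpy as np

# Toy restricted Hartree-Fock loop for a two-site Hubbard model
# (hopping t, on-site repulsion U, two spin-paired electrons).
# All parameters are illustrative.
t, U = 1.0, 2.0
h = np.array([[0.0, -t], [-t, 0.0]])   # one-electron (hopping) part

n = np.array([1.5, 0.5])               # deliberately lopsided initial density
for _ in range(200):
    # Each electron feels the average field of the others:
    # the mean-field matrix adds U * n_i / 2 on each site.
    F = h + U * np.diag(n) / 2
    eps, C = np.linalg.eigh(F)
    # Put both electrons in the lowest orbital and recompute the density.
    n_new = 2 * C[:, 0] ** 2
    if np.allclose(n_new, n, atol=1e-10):
        break
    n = 0.5 * n + 0.5 * n_new          # damped update for stable convergence

# Mean-field energy with the standard double-counting correction;
# for this symmetric dimer it reduces to -2t + U/2.
E = 2 * eps[0] - U * np.sum((n / 2) ** 2)
```

Each pass rebuilds the average field from the current density and rediagonalizes, which is precisely the adjust-and-minimize cycle described above.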

In the Google team’s calculations, each qubit represents a single-electron wavefunction, or orbital. The researchers apply a series of rotation logic gates to effectively rewrite the system’s wavefunction as a sum of those orbitals. The qubits’ degrees of excitation—between 0 and 1—indicate the probability that each orbital is filled. The wavefunction’s energy is measured and fed into a classical computer, which sets new rotation parameters for the gates. The parameters are repeatedly tweaked to find the minimum energy.
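That quantum–classical feedback loop can be caricatured entirely classically. In the sketch below, a single rotation angle plays the role of the gate parameters, an expectation value stands in for the quantum energy measurement, and SciPy's Nelder–Mead optimizer plays the classical computer; the 2×2 Hamiltonian is invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Invented 2x2 Hamiltonian standing in for the molecular problem.
H = np.array([[1.0, 0.3],
              [0.3, -0.5]])

def energy(theta):
    # Trial wavefunction produced by a single rotation "gate".
    psi = np.array([np.cos(theta[0]), np.sin(theta[0])])
    # Simulated energy measurement: the expectation value <psi|H|psi>.
    return psi @ H @ psi

# The classical optimizer repeatedly tweaks the rotation parameter
# to drive the measured energy toward its minimum.
result = minimize(energy, x0=[0.1], method="Nelder-Mead")
E_min = result.fun
```

At convergence E_min approaches the lowest eigenvalue of H, the variational ground-state energy of this toy model.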

The researchers used that method for two common electronic-structure benchmarks: distinguishing the pathways for a diazene molecule, HNNH, to transform between cis and trans isomers and finding the binding energy of stretched linear hydrogen chains of 6, 8, 10, and 12 atoms. Previous electronic-structure calculations on quantum computers had used no more than 6 qubits, but here the researchers used as many as 12 qubits interacting through 72 two-qubit logic gates.

With all those qubits and gates, error mitigation was essential. The team kept only measurements in which the number of particles stayed the same; a change in that number is a clear sign of an error. The researchers also looked at the one-particle densities, and if the wavefunction didn’t yield the expected 0 and 1 eigenvalues, they projected it onto the closest state that did. With 99% fidelity for the logic gates and 97% fidelity for readout, the calculations were accurate enough to make chemical predictions.
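The particle-number postselection step is simple to sketch on synthetic data: keep only those measured bitstrings whose number of occupied orbitals matches the expected electron count, since anything else signals an error. The qubit and electron counts below are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n_qubits, n_electrons = 6, 3

# Fake raw measurements: random bitstrings standing in for noisy shots,
# one row per shot, one column per qubit (orbital).
shots = rng.integers(0, 2, size=(1000, n_qubits))

# Postselect on the conserved particle number: a shot whose number of
# 1s differs from the electron count is discarded as erroneous.
kept = shots[shots.sum(axis=1) == n_electrons]
discard_rate = 1 - len(kept) / len(shots)
```

On real hardware the discard rate grows with circuit depth, which is part of why deeper circuits need additional mitigation strategies like the density-matrix projection described above.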

Classical computers can model all the systems in the current study, and classically intractable problems will require an additional one or two orders of magnitude more qubits. But the strategies that the researchers developed and implemented should scale up. (Google AI Quantum et al., *Science* **369**, 1084, 2020; highlight image credit: Google AI Quantum, illustration by Ella Maru Studio.)