Quantum computers are a little like librarians: both abhor noise.
Compared with their classical counterparts, quantum computers are finicky and need a serene environment to perform their calculations in peace. But even the quietest space in the universe reverberates with quantum noise—the inevitable movement of electrons and other atomic effects. If physicists could quell quantum errors caused by noise on a large enough quantum computer, they could perform some computations, such as exact simulations of molecules, that are intractable for classical computers.
While improvements to hardware help, an essential ingredient is quantum error correction (QEC), a set of techniques to protect the information from this quantum din. “We need our qubits to be almost perfect, and we can’t get there with engineering alone,” says Michael Newman, a quantum computing researcher at Google.
On Monday Google published its latest research on error correction in the journal Nature and showed, for the first time, that errors can be suppressed exponentially as a quantum computer increases in size. “As you make a bigger and bigger system, you get better at correcting errors, but you’re also causing more errors,” says Daniel Gottesman, a quantum information theorist at the University of Maryland, who was not involved with the study. “When you pass this transition, where you can correct errors faster than they’re caused, is when making bigger and bigger systems makes it better.”
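To see Gottesman’s transition in numbers, here is a minimal Python sketch (not drawn from the Nature paper) based on the textbook surface-code approximation, in which the logical error rate scales roughly as (p/p_th)^((d+1)/2) for a physical error rate p, threshold p_th and code distance d. The threshold and error rates below are illustrative assumptions.

```python
# Illustrative sketch of the error-correction threshold: below the threshold,
# bigger codes help; above it, they hurt. Values are assumptions, not measurements.
def relative_logical_error(p: float, p_th: float = 0.01, d: int = 3) -> float:
    """Textbook surface-code scaling, proportional to (p / p_th)^((d + 1) / 2)."""
    return (p / p_th) ** ((d + 1) / 2)

for p in (0.02, 0.005):  # a physical error rate above, then below, an assumed 1% threshold
    trend = [relative_logical_error(p, d=d) for d in (3, 5, 7)]
    verdict = "worse" if trend[-1] > trend[0] else "better"
    print(f"physical error {p:.1%}: growing the code makes things {verdict} {trend}")
```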
Researchers at Google created a superconducting chip with 105 qubits, the quantum counterparts of classical bits. Then they linked multiple physical qubits to form a conglomerate called a logical qubit. The logical qubit lasted more than twice as long as any of the individual qubits it was composed of, and it had a one-in-1,000 chance of error per cycle of computation. (For comparison, the error rate of a typical classical computer is about one in 1,000,000,000,000,000,000—essentially zero.)
The results were first posted on the preprint server arXiv.org in August, but today Google shared additional details about the technology that enabled the advance: a new quantum processor called Willow (an upgrade to its arboreally named predecessor, Sycamore). “Really good qubits are the thing that enables quantum error correction,” says Julian Kelly, director of quantum hardware at Google and a co-author on the new paper.
Google is not the only company to have made strides in error correction. In September a joint team of researchers at Microsoft and Quantinuum, a quantum computing firm based in Broomfield, Colo., posted results to arXiv.org showing that, using trapped-ion qubits, they could encode 12 logical qubits with a two-in-1,000 error rate.
Even with advances in error correction, practical applications for quantum computers are unlikely in the near term. Estimates vary, but many researchers agree that to run useful algorithms or perform robust simulations of chemistry, a quantum computer would need hundreds of logical qubits with error rates below about one in a million.
All That Noise
Two main types of error plague quantum computers: bit flips and dephasing. A bit flip, which also occurs in classical computers, switches a qubit from 0 to 1, or vice versa. Dephasing yanks qubits out of their delicate quantum state, like taking a pie out of the oven before it’s ready. Either error can ruin a computation.
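For the mathematically inclined, the two error types can be written down in a few lines of Python with NumPy. The states chosen here are purely illustrative, and dephasing appears in its discretized form as a phase flip, the Pauli Z operation.

```python
import numpy as np

# A qubit state is a two-component complex vector: alpha|0> + beta|1>.
ket0 = np.array([1, 0], dtype=complex)               # |0>
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # (|0> + |1>)/sqrt(2), a superposition

# Bit flip: the Pauli X matrix swaps |0> and |1>.
X = np.array([[0, 1],
              [1, 0]], dtype=complex)

# Phase flip: the Pauli Z matrix, the discretized form of dephasing,
# flips the sign of the |1> component and scrambles a superposition's phase.
Z = np.array([[1,  0],
              [0, -1]], dtype=complex)

print(X @ ket0)   # [0, 1] -> the qubit has flipped to |1>
print(Z @ plus)   # [0.707, -0.707] -> same populations, but the relative phase is ruined
```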
Classical error correction often preserves information via redundancy. If Alice wants to send Bob the message “1,” she could send it in triplicate, copying the 1 twice to transmit “111.” In this way, even if a bit flips—leading to “101”—Bob can still surmise Alice meant to send “1.” But copying information in this manner is forbidden by the laws of quantum mechanics. So in the 1990s researchers had to devise error-correction schemes built specifically for quantum computers. “We have to spread the information out in such a way that there is redundancy but there’s not copies,” Gottesman says. With the information spread out as a logical qubit, it can be preserved even if one physical qubit is lost to error.
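The classical redundancy trick is simple enough to sketch in a few lines of Python (the function names here are illustrative). The quantum version instead spreads the information across entangled qubits, because the no-cloning theorem forbids the literal copying shown below.

```python
def encode(bit: int) -> list[int]:
    """Classical repetition code: send the bit in triplicate."""
    return [bit] * 3

def decode(received: list[int]) -> int:
    """Majority vote recovers the message even if one bit flips in transit."""
    return int(sum(received) >= 2)

message = encode(1)     # Alice transmits [1, 1, 1]
message[1] ^= 1         # noise flips the middle bit -> [1, 0, 1]
print(decode(message))  # Bob still reads 1
```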
Researchers have been implementing codes that can detect and correct errors for decades, but until recently, there simply weren’t enough high-quality qubits. Now the hardware has finally reached the point where it merits the impressive software. In 2022 Google used error correction on its Sycamore processor to lower the overall error rate. But the error rate had not yet dropped comfortably below a key threshold, so adding more physical qubits to a logical qubit produced diminishing returns. “As the logical qubits are getting larger, there’s more opportunities for error,” says Newman, who was a co-author of the new study as well as a preprint paper about the 2022 results.
The latest advance is largely thanks to Willow, which improves on Sycamore in three key ways. First, Willow simply has more physical qubits—105, compared with Sycamore’s 72. More physical qubits mean larger logical qubits. “It’s not just the number of qubits,” Kelly says. “Everything has to be working at the same time.” By refining their fabrication processes, Kelly and his colleagues were also able to improve individual qubit quality: Willow’s qubits are more robust than Sycamore’s, maintaining their delicate quantum state five times as long and suffering lower error rates.
To test error correction, Google researchers encoded larger and larger logical qubits: first in a 3×3 grid of physical qubits, then in a 5×5 grid and finally in a 7×7 grid. As the logical qubits grew, the error rate dropped precipitously. “I saw these numbers, and I thought, ‘Oh, my god, this is really going to work,’” Newman says.
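A back-of-the-envelope sketch conveys what that drop looks like if, as the exponential-suppression claim implies, each step up in grid size divides the logical error rate by a roughly constant factor. The factor of 2 and the starting error rate below are assumptions for illustration, not Google’s measured figures, though they land near the one-in-1,000 rate reported for the largest grid.

```python
# Illustrative picture of exponential suppression: each step up in grid size
# (3x3 -> 5x5 -> 7x7) divides the logical error rate per cycle by a constant
# factor. Both numbers below are assumptions, not values from the paper.
SUPPRESSION_FACTOR = 2.0   # assumed factor per step up in code distance
error_per_cycle = 4e-3     # assumed logical error rate for the 3x3 grid

for size in (3, 5, 7):
    print(f"{size}x{size} grid: ~{error_per_cycle:.2%} logical error per cycle")
    error_per_cycle /= SUPPRESSION_FACTOR
```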
Sense of Scale
Experts were broadly impressed by the Google results. Scientific American examined the peer review reports from four anonymous referees. “I think this is a fantastic achievement that has excited the community,” one concluded. Another concurred, writing that “this is one [of] the most important results of the year (if not of the decade) in experimental quantum information.”
Graeme Smith, a quantum information researcher at the University of Waterloo in Ontario, is impressed that the result doesn’t cut corners: many previous error correction demonstrations relied on postselection, the practice of throwing away error-ridden runs to create an artificially low error rate. “Focusing on the error correction is the right thing to do,” he says. “It is a real improvement.”
There are still caveats to be made, even with Google’s result. Krysta Svore, a quantum computing researcher at Microsoft, points out that by another metric, the error was not one in 1,000 but one in 100. Responding to the critique, a spokesperson from Google said that “the exact number ... is not as important as the increase in performance with increasing size. That’s the key thing that makes this scalable.”
What everyone seems to agree on is that recent advances in error correction are a sea change. “What’s absolutely thrilling right now is the progress in quantum error correction,” Svore says. For Gottesman and others who helped develop the theory behind error correction decades ago, a long wait is over. “It’s about time we’re finally seeing these demonstrations of fault tolerance,” he says.
The hype around quantum computers has been enormous. In its most extreme form, it includes claims that the devices will cure cancer or solve climate change—or even that they have created a wormhole. Responsible researchers frequently warn that such hype creates unreasonably high expectations and could even trigger a “quantum winter,” in which funding dries up. The latest error correction results reveal another potential casualty: genuinely impressive advances—like this one—could be dismissed out of hand.