AlphaQubit tackles one of quantum computing's biggest challenges.
By accurately identifying errors inside quantum computers, this new AI system helps make the emerging technology more reliable.
If quantum computers can be made to work reliably, they have the potential to transform fundamental physics, drug discovery, and materials design.
A quantum computer could answer some questions in hours that would take a conventional computer billions of years. But these machines are far more susceptible to noise than classical processors. Making quantum computers more reliable, especially at scale, depends on accurately identifying and correcting these errors.
In a paper published today in Nature, the teams introduce AlphaQubit, an AI-based decoder that identifies quantum computing errors with state-of-the-art accuracy. This joint effort combined Google DeepMind's machine learning expertise with Google Quantum AI's error correction know-how to accelerate progress on building a reliable quantum computer.
Accurately identifying errors is a critical step toward quantum computers that can perform long computations at scale, opening the door to scientific breakthroughs and many new areas of discovery.
Correcting quantum computing errors
Quantum computers exploit the unique properties of matter at the smallest scales, such as superposition and entanglement, to solve certain complex problems in far fewer steps than classical computers. The technology relies on qubits, or quantum bits, which can sift through vast sets of possibilities using quantum interference to find an answer.
But the natural quantum state of a qubit is fragile and can be disrupted by many things: heat, vibration, electromagnetic interference, tiny hardware flaws, and even cosmic rays, which are present everywhere.
Quantum error correction offers a path forward by using redundancy: grouping multiple physical qubits into a single logical qubit and performing regular consistency checks on it. By using these consistency checks to identify and correct errors in the logical qubit, the decoder preserves the quantum information.
How nine physical qubits (small gray circles), arranged in a square grid with side length 3 (the code distance), form a single logical qubit. Eight additional qubits perform consistency checks at each time step (square and semicircular regions: blue and magenta when failing, gray otherwise), and their outcomes inform the neural-network decoder, AlphaQubit. At the end of the experiment, AlphaQubit determines which errors occurred.
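As a loose classical analogy, here is a minimal Python sketch of a three-bit repetition code: a toy stand-in for the surface code pictured above, not the code Google Quantum AI actually runs. It shows the core idea that consistency checks can locate an error without ever reading the protected information directly.

```python
# Classical analogy: protect one logical bit with three physical bits.
def encode(bit):
    return [bit, bit, bit]  # redundancy: one logical bit, three copies

def consistency_checks(bits):
    # Each check fires (1) when neighbouring bits disagree; neither
    # check reads the logical value itself.
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(bits):
    # The decoder maps check outcomes to the most likely single flip.
    flipped = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}
    pos = flipped[consistency_checks(bits)]
    if pos is not None:
        bits[pos] ^= 1  # undo the error
    return bits[0]

codeword = encode(1)
codeword[2] ^= 1              # noise flips one physical bit
assert decode(codeword) == 1  # the checks locate and undo the flip
```

In the quantum setting, the checks are parity measurements on neighbouring qubits, and the decoder's job, the part AlphaQubit learns, is mapping a long, noisy history of check outcomes to the most likely error.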
Creating a neural-network contender for decoding
AlphaQubit is a neural-network decoder built on Transformers, the deep learning architecture developed at Google that underpins many of today's large language models. Taking the consistency checks as input, its task is to correctly predict whether the logical qubit, when measured at the end of the experiment, has flipped from its intended state.
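To make that setup concrete, here is a minimal sketch in PyTorch of a transformer that reads a sequence of consistency-check rounds and outputs a logit for "the logical qubit flipped". All names and sizes here are illustrative assumptions, not AlphaQubit's published architecture.

```python
import torch
import torch.nn as nn

class SyndromeTransformer(nn.Module):
    """Toy decoder: reads T rounds of consistency-check outcomes and
    predicts whether the logical qubit flipped (hypothetical design)."""
    def __init__(self, num_checks=8, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(num_checks, d_model)  # embed each round
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)  # logit for "logical flip"
        # (Positional encodings omitted for brevity.)

    def forward(self, syndromes):
        # syndromes: (batch, T, num_checks), entries in {0, 1}
        h = self.encoder(self.embed(syndromes.float()))
        return self.head(h[:, -1])  # read out from the final round

model = SyndromeTransformer()
logits = model(torch.randint(0, 2, (32, 25, 8)))  # 32 runs, 25 rounds
p_flip = torch.sigmoid(logits)  # per-run probability of a logical flip
```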
The team began by training a model to decode the data from a set of 49 qubits inside a Sycamore quantum processor, the central computational unit of the quantum computer. To teach AlphaQubit the general decoding problem, they used a quantum simulator to generate hundreds of millions of examples across a variety of settings and error levels. They then finetuned AlphaQubit for a specific decoding task using thousands of experimental samples from a particular Sycamore processor.
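A rough sketch of that two-stage recipe, reusing the toy model above; the data sources, step counts, and learning rates are placeholders, not the published training setup.

```python
import torch
import torch.nn as nn

def train(model, make_batch, steps, lr):
    # Shared loop: each batch pairs syndrome histories with the true
    # logical-flip label (known in simulation or from the experiment).
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(steps):
        syndromes, flipped = make_batch()
        loss = loss_fn(model(syndromes).squeeze(-1), flipped.float())
        opt.zero_grad()
        loss.backward()
        opt.step()

# Placeholder data sources: stand-ins for the quantum simulator and
# for samples measured on one physical Sycamore processor.
def simulator_batch():
    return torch.randint(0, 2, (64, 25, 8)), torch.randint(0, 2, (64,))

def device_batch():
    return torch.randint(0, 2, (64, 25, 8)), torch.randint(0, 2, (64,))

model = SyndromeTransformer()  # class defined in the sketch above

# Stage 1: pretrain on a huge volume of varied simulated samples;
# Stage 2: finetune on far fewer samples from the real device.
train(model, simulator_batch, steps=5_000, lr=1e-4)
train(model, device_batch, steps=500, lr=1e-5)
```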
When tested on new Sycamore data, AlphaQubit set a new standard for accuracy compared with the previous leading decoders. In the largest Sycamore experiments, AlphaQubit makes 6% fewer errors than tensor network methods, which are highly accurate but impractically slow. AlphaQubit also makes 30% fewer errors than correlated matching, an accurate decoder that is fast enough to scale.
Scaling AlphaQubit for future systems
Quantum computing is also expected to grow well beyond today's capabilities. To assess how AlphaQubit would adapt to larger devices with lower error levels, the team trained it on data from simulated quantum systems of up to 241 qubits, more than was available on the Sycamore platform.
Once again, AlphaQubit outperformed the leading algorithmic decoders, suggesting that it will also work with mid-sized quantum devices in the future.
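For a sense of the scale involved: in the grid layout pictured earlier, a code of distance d uses d² data qubits plus d² − 1 check qubits. Assuming the same layout, the largest simulated system corresponds to a distance-11 code, a quick back-of-envelope check:

```python
def surface_code_qubits(distance):
    # d*d data qubits plus (d*d - 1) consistency-check qubits,
    # matching the distance-3 example above (9 + 8 = 17 qubits).
    return distance**2 + (distance**2 - 1)

print(surface_code_qubits(3))   # 17
print(surface_code_qubits(11))  # 241, the largest simulated system
```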
The system also demonstrated advanced capabilities, such as accepting and reporting confidence levels on its inputs and outputs. These information-rich interfaces could help further improve the performance of the quantum processor.
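Hypothetically, and building on the toy sketch above, such an interface might look like feeding the decoder analog readout probabilities instead of hard 0/1 outcomes, with its sigmoid output serving as the reported confidence:

```python
# Soft inputs: each check outcome as a probability in [0, 1], keeping
# information from the analog readout that hard bits would discard.
soft_syndromes = torch.rand(1, 25, 8)
confidence = torch.sigmoid(model(soft_syndromes))
print(f"P(logical flip) = {confidence.item():.3f}")
```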
AlphaQubit also showed it can generalize to situations beyond its training data: after being trained on samples that included up to 25 rounds of error correction, it maintained strong performance on simulated experiments of up to 100,000 rounds.
Moving towards practical quantum computing
AlphaQubit represents a significant advance in applying machine learning to quantum error correction. But substantial challenges around speed and scalability remain.
For example, each consistency check in a fast superconducting quantum processor is measured a million times every second, which gives a real-time decoder roughly a microsecond to process each round. While AlphaQubit is excellent at accurately identifying errors, it is still too slow to correct them in a superconducting processor in real time. And as quantum computing grows toward the potentially millions of qubits needed for commercially relevant applications, more data-efficient ways of training AI-based decoders will also be required.
To overcome these obstacles and pave the way for reliable quantum computers that can tackle some of the world's most challenging problems, the teams are continuing to combine cutting-edge advances in machine learning with quantum error correction.