A prototype quantum processor repeatedly beat a traditional, classical processor in a race to solve a puzzle, figuring out a secret combination up to 100 times faster by using exotic physics to sort through data that was deliberately packed with errors.
The experiment, detailed in a new academic paper by scientists from Raytheon BBN and IBM, marks a small but significant step forward for quantum computing – an emerging area of technology that could one day lead to things like advanced encryption technologies, complex new medicines and more accurate radars. Specifically, the team's work challenges the notion that quantum computers will never have more than a negligible advantage over classical machines.
“Despite all the excitement about quantum computing, there are still lingering doubts that what we’re after is on loose foundations,” said Blake Johnson, a co-author of the paper, "Demonstration of quantum advantage in machine learning," published in the April 2017 issue of the journal npj Quantum Information. “We’re really on the verge of putting the nail in the coffin of those doubts.”
Researchers say the findings are encouraging, but they caution that practical quantum computers are still years away. The processor chip used in the experiment has only five quantum bits, used to store data, compared to the millions of bits found on a typical computer chip. The combination it cracked had only four digits.
Still, Johnson and his colleagues believe they're on the path to solving problems that are too big or too messy for even the world's most powerful classical processors. Quantum computers specialize in those types of problems, and it all has to do with the unusual way they process information.
Through a phenomenon called superposition, quantum computers can see data as a combination of yes and no. While a classical computer might assign one piece of data a value like "black," and then separately call another "white," a quantum computer would process both at the same time. Each quantum bit added to a processor doubles the number of states it can work with simultaneously.
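For readers curious about the arithmetic behind that doubling, here is a minimal Python sketch (an illustration only, not the researchers' code) that represents qubit states as vectors of amplitudes:

```python
import numpy as np

# A classical bit holds one value at a time; a qubit's state is a
# pair of amplitudes over the outcomes 0 and 1.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# An equal superposition: "black" and "white" held at once.
plus = (zero + one) / np.sqrt(2)

def n_qubit_superposition(n):
    """Combine n qubits; the state vector doubles with each qubit."""
    state = plus
    for _ in range(n - 1):
        state = np.kron(state, plus)  # tensor product of states
    return state

state = n_qubit_superposition(5)
print(len(state))                      # 2**5 = 32 amplitudes
print(round(float(sum(abs(state)**2)), 6))  # probabilities sum to 1.0
```

Five qubits, like those on the chip in the experiment, span 32 simultaneous states; each additional qubit doubles that count.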
Through another phenomenon called entanglement, quantum computers see two pieces of data as identical twins. As long as the computer can see one of those pieces, it knows everything about the other one as well. And as the researchers discovered, entanglement turns out to be a useful technique for reading data that is hidden or obscured by digital noise.
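The "identical twins" behavior can be seen in a two-qubit Bell state, the simplest entangled state. This Python sketch (again an illustration, not the experiment's code) samples joint measurements and shows that reading one bit always reveals the other:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2): amplitudes over outcomes
# 00, 01, 10, 11. Only the "matching" outcomes are possible.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2  # [0.5, 0.0, 0.0, 0.5]

# Simulate 1,000 joint measurements of the entangled pair.
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)

# The two bits always agree: seeing one tells you the other.
assert all(o[0] == o[1] for o in outcomes)
```

Each individual bit looks random on its own, but the correlation between the pair is perfect, and that correlation is what survives the noise.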
Quantifying 'the quantum advantage'
The nine-member research team conducted its experiment in a windowless lab at Raytheon BBN in Cambridge, Massachusetts.
Working amid a web of wires, dials and digital displays, the team programmed an ordinary desktop computer to generate a four-digit combination. They sent that data to a small quantum circuit that can also function as a classical processor.
The quantum circuit wasn't sitting on a desktop; it ran inside a cylindrical cooler known as a dilution refrigerator, which reaches temperatures colder than any found naturally in the universe. The extreme cold is necessary for control; even a slight shift in temperature can scuttle a quantum computer's calculations. The setup included electronics custom-built to control quantum bits, and at the heart of the processor was a five-qubit quantum chip furnished by IBM.
The point of the experiment: to see how much faster the circuit could decipher the digits using quantum processing versus traditional processing.
At first, with plain, clean data, the race was a tie – a result the IBM researchers predicted in a previous paper. So the team made the problem harder. With the turn of a dial, they told the desktop computer to confuse the processor by mixing a few errors into the information. As the data became dirtier, the quantum processor's margin of victory increased.
Researchers found the classical computer was far less tolerant of the distorted data; just one glitch would throw off the rest of its computations. By contrast, the quantum program used entanglement to distinguish the good data from the bad, meaning it had to read the code far fewer times than the classical program did. While it might take the classical algorithm 10,000 tries to learn that the first number was a 7, for example, the quantum algorithm might need only 100.
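Why does dirtier data punish the classical side so badly? A toy model makes it concrete: reading one secret digit through a noisy channel and majority-voting until the answer is statistically clear. This is a hypothetical illustration of the scaling, not the algorithm from the paper:

```python
import random

random.seed(42)

def noisy_read(true_bit, error_rate):
    """Return the secret bit, flipped with probability error_rate."""
    flipped = random.random() < error_rate
    return true_bit ^ flipped

def reads_to_decide(true_bit, error_rate, margin=20):
    """Count reads until the majority vote leads by a fixed margin
    (a stand-in for reaching a set statistical confidence)."""
    lead, reads = 0, 0
    while abs(lead) < margin:
        reads += 1
        lead += 1 if noisy_read(true_bit, error_rate) == true_bit else -1
    return reads

clean = reads_to_decide(1, 0.05)  # mostly clean data
dirty = reads_to_decide(1, 0.45)  # nearly half the reads are wrong
print(clean, dirty)  # dirty data takes dramatically more reads
```

As the error rate creeps toward 50 percent, the number of reads a vote-counting approach needs explodes; exploiting entanglement, the quantum processor sidestepped much of that cost.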
The results validated the researchers' belief that quantum computers can do certain things better. But it's one thing to believe something and another to watch it happen, said Diego Riste, the Raytheon BBN scientist who designed the experiment.
"When you know what you know in theory, you're never 100 percent sure. There's some parameter that's not accounted for," he said. "Demonstrating it in the lab means having experimental proof that either the model is right, or good enough to predict the results. It's the difference between the idea and putting it into practice."
The findings hint at a day when scaled-up quantum computers could start solving seriously complicated problems involving many variables or incomplete data, researchers said. A pharmaceutical chemist, for example, might use quantum processing to simulate the construction of complex molecules – an important step in speeding up the design of new medications. An aircraft radar with a quantum processor might be able to see through electromagnetic noise to tell if another plane is a friend or a foe.
“It’s the first very clear demonstration of a quantum processor outperforming a classical processor by a very large factor,” said Zac Dutton, who leads the quantum information processing group at Raytheon BBN. “It shows that quantum computing can actually work.”