The possibility that quantum computing could turbocharge machine learning is one of the most tantalizing prospects for the emerging technology. But an 18-year-old student just put that vision in doubt by finding a classical algorithm that matches one of its most promising real-world applications.
One of the poster children for the incipient field of quantum machine learning (QML) was a 2016 solution to the “recommendation problem”—essentially, how Netflix determines which movie you might like—that ran exponentially faster than any known classical algorithm. But as Quanta reported, while trying to prove that no classical algorithm could match it, Ewin Tang came up with a classical version just as fast.
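The recommendation problem is usually framed as low-rank matrix completion: user preferences are assumed to be explained by a few hidden factors, so a sparse ratings matrix can be approximated by a low-rank one whose entries predict the missing ratings. Here is a minimal classical sketch using a truncated SVD. This is purely illustrative of the problem setup, not Tang's algorithm or the quantum one (both avoid computing a full SVD); the toy ratings matrix is invented for the example.

```python
import numpy as np

# Toy user-by-movie ratings matrix; 0 means "unrated".
# Assumption: preferences are approximately low-rank, the same
# structural assumption the quantum recommendation algorithm exploits.
R = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

# A rank-2 truncated SVD gives a low-rank approximation whose
# entries serve as predicted ratings for the unrated slots.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

print(np.round(R_hat, 1))  # predicted ratings, including the blanks
```

The expensive step here is the full SVD; the point of both the quantum algorithm and Tang's classical one is to produce good recommendations by sampling, without ever paying that cost.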
Tang, a prodigy who enrolled at the University of Texas at age 14, was set the task by quantum computing expert Scott Aaronson, as an independent research project, of proving there was no fast classical alternative to the quantum solution. Instead, he ended up taking inspiration from the quantum algorithm to design a classical one exponentially faster than any of its predecessors.
His research is now undergoing peer review before publication, but has already been presented informally at a meeting of quantum computing experts who broadly agreed that the algorithm was correct, according to Quanta.
In one sense it’s good news: Tang has found a way to bring speedups once thought possible only on an (as yet unbuilt) quantum computer to conventional machines. But that’s probably little comfort if you’re one of the companies investing millions trying to build quantum hardware.
Machine learning has long been considered one of the potential early applications for quantum computers, because there are fundamental synergies between the two fields. Both excel at handling huge amounts of data, both tolerate uncertainty, and both can tease out subtle patterns that traditional computing approaches may miss. The hope is that large-scale quantum machines will ultimately be able to perform these calculations exponentially faster than machine learning running on conventional hardware.
The field of QML was really born in 2008 with the development of an algorithm now referred to as HHL, after its creators Harrow, Hassidim, and Lloyd, which was capable of solving the systems of linear equations at the heart of many machine learning problems.
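Concretely, HHL targets the problem of solving Ax = b for x. Here is a sketch of that problem on classical hardware (the problem HHL addresses, not the quantum algorithm itself; the small matrix is an invented example):

```python
import numpy as np

# The problem HHL tackles: given a matrix A and vector b, find x
# such that A @ x = b. Classically a dense n x n system costs
# roughly O(n^3); HHL promises an exponential speedup, but only
# under strong conditions (sparse, well-conditioned A, quantum
# access to b) and it outputs a quantum state encoding x, not the
# vector x itself.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

x = np.linalg.solve(A, b)
print(x)
```

That last caveat matters: reading all n entries of x out of HHL's quantum state would erase the speedup, which is part of why translating quantum advantages to real machine learning workloads is subtle.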
Since then, a QML startup incubator has popped up at the University of Toronto; the companies building large-scale quantum computers have publicly pinned their hopes on the field; and there have been a number of early-stage demos of the approach.
Quantum computing startup Rigetti showed at the end of last year that it could run a clustering algorithm on one of its prototype quantum chips, while IBM demonstrated a simple classification task on a quantum computer in March.
Earlier in 2017, researchers used a quantum computer built by D-Wave to crunch data from the Large Hadron Collider to look for photon signatures that could indicate the Higgs boson. And just last month, one of the creators of the HHL algorithm developed a quantum equivalent of generative adversarial networks (GANs), which pit two neural networks against each other, with one trying to trick the other with ever more convincing fakes of things like photos or audio.
But while these demonstrations have shown it is entirely possible to do machine learning on quantum computers, none of them have proved that it would be any faster, at least when dealing with classical data.
And that points to the main limitation of QML. Quantum computers operate on quantum states, not the 0s and 1s that make up the datasets of real-world machine learning problems. Converting that data into a form these machines can run on and then translating the answers back into something human-readable is a major outstanding challenge.
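To get a feel for the data-loading problem: one common proposal, amplitude encoding, packs a classical vector into the amplitudes of a quantum state, so n qubits hold 2**n numbers that must be normalized to unit length. A sketch of just the classical bookkeeping (the hard part, actually preparing that state on hardware, is not shown, and the helper name is invented for this example):

```python
import numpy as np

def amplitude_encode(data):
    """Normalize a classical vector so it could serve as the
    amplitudes of a quantum state: n qubits carry 2**n amplitudes,
    and the squared amplitudes must sum to 1."""
    data = np.asarray(data, dtype=float)
    n_qubits = int(np.ceil(np.log2(len(data))))
    padded = np.zeros(2 ** n_qubits)     # pad up to a power of two
    padded[: len(data)] = data
    return padded / np.linalg.norm(padded), n_qubits

state, n_qubits = amplitude_encode([3.0, 4.0, 0.0])
print(state, n_qubits)  # 4 amplitudes on 2 qubits
```

The normalization itself is trivial; the challenge the paragraph above describes is doing this loading (and the reverse readout via measurement) on a real quantum device without the cost of data transfer swamping any quantum speedup.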
One area where that limitation doesn’t apply is carrying out analysis on quantum data, and so this may end up being one of the early applications of QML. Aside from that, there may be more creative—and as yet undiscovered—approaches to machine learning that exploit the underlying physics of these machines rather than simply trying to copy approaches developed for conventional computers.
And at the end of the day, even if a quantum speedup remains elusive for the time being, Tang’s new algorithm shows that research into quantum algorithms can inspire breakthroughs in classical approaches that can push machine learning forward.