Quantum Researchers Eye AI Advances
Researchers developing new quantum computing architectures are increasingly looking at the nascent processing technology as a way to advance machine-learning algorithms for new AI applications.
As quantum computing efforts scale up at corporate, university and government laboratories, the promising yet largely unproven technology could help unlock facets of artificial intelligence, leading to more powerful cognitive computing technologies like IBM’s (NYSE: IBM) Watson platform. Applications range from developing new materials to faster searches of big data, researchers said.
Among the early leaders along with IBM is D-Wave Systems Inc., which recently doubled the capacity of its D-Wave Two system to 1,098 quantum bits, or qubits. The D-Wave 2X processor installed at the USC-Lockheed Martin Quantum Computing Center at the University of Southern California’s Viterbi School of Engineering is being used for optimization and sampling problems as well as machine-learning development.
Greg Tallant, a Lockheed Martin Corp. (NYSE: LMT) fellow at the USC computing center, noted that the upgraded processor could be used to speed the debugging of millions of lines of code or to help solve the aerospace industry’s toughest computational problems.
The USC-Lockheed Martin Quantum Computing Center hosts one of two D-Wave systems currently operating outside the company’s headquarters in Burnaby, British Columbia. The other, owned by Google, is hosted at NASA’s Ames Research Center. A third is being installed at the Energy Department’s Los Alamos National Laboratory.
The upgraded system represents the third generation of D-Wave’s “quantum annealing” processor approach that solves problems by “tuning” qubits from their superposition state to a classical state. Center researchers have been testing the quantum annealing approach since 2011. They next hope to demonstrate “quantum enhancement” over classical high-performance computers, a goal that may “perhaps, finally be within reach,” they assert.
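Quantum annealing has a well-known classical cousin, simulated annealing, which captures the same intuition: slowly “cool” a randomized system so it settles into a low-energy answer. A minimal classical sketch for readers unfamiliar with the idea (the function names and toy cost function below are illustrative, not D-Wave’s API):

```python
import math
import random

def simulated_annealing(cost, neighbor, start, steps=10_000, t0=1.0):
    """Classical analogue of annealing: accept worse moves with a
    probability that shrinks as the 'temperature' cools toward zero."""
    state, best = start, start
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9   # linear cooling schedule
        candidate = neighbor(state)
        delta = cost(candidate) - cost(state)
        if delta < 0 or random.random() < math.exp(-delta / t):
            state = candidate                # downhill moves always accepted
        if cost(state) < cost(best):
            best = state                     # track the best state seen
    return best

# Toy problem: minimize f(x) = (x - 3)^2 over the integers, starting far away.
result = simulated_annealing(
    cost=lambda x: (x - 3) ** 2,
    neighbor=lambda x: x + random.choice([-1, 1]),
    start=50,
)
```

A quantum annealer follows the same “tuning” trajectory in hardware, relying on quantum effects rather than random thermal hops to escape poor local solutions.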
Meanwhile IBM’s quantum processor consists of five superconducting qubits. A cloud-enabled version was unveiled in May, allowing users to run algorithms and experiments or work with individual qubits, IBM said.
The goal of these early efforts is to build a “universal quantum computer” with processors containing up to 100 qubits. IBM estimates that a 50-qubit machine could outperform the world’s fastest supercomputers.
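The 50-qubit estimate reflects how quickly the quantum state space outgrows classical memory: simulating an n-qubit register classically requires tracking 2^n complex amplitudes. A quick back-of-the-envelope check:

```python
n = 50
amplitudes = 2 ** n                 # 2^50: over a quadrillion complex amplitudes
bytes_needed = amplitudes * 16      # one double-precision complex value is 16 bytes
pib = bytes_needed / 2 ** 50        # → 16.0 pebibytes of memory
print(f"{amplitudes:,} amplitudes, {pib} PiB of memory")
```

At 16 pebibytes just to store the state vector, a faithful classical simulation of 50 qubits already strains the world’s largest machines, which is the basis for comparisons of this kind.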
IBM and the USC-Lockheed Martin center are seeking to leverage quantum processors to tackle difficult optimization problems that involve finding, for example, the best combination of options at the lowest cost. Examples range from mission planning to financial analysis. Optimization is expected to be among the first application areas to benefit from quantum speed-up.
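These “best combination at the lowest cost” problems are combinatorial: the number of candidate combinations doubles with each added option, which is what eventually defeats classical brute force. A toy sketch with hypothetical mission-planning data (the task names, values, and costs are invented for illustration):

```python
from itertools import combinations

# Hypothetical mission tasks: (name, value, cost).
tasks = [("imaging", 7, 4), ("relay", 5, 3), ("survey", 6, 5), ("repair", 3, 2)]
budget = 8

# Brute force: try every subset of tasks -- 2^n candidates in all.
best_value, best_plan = 0, ()
for r in range(len(tasks) + 1):
    for plan in combinations(tasks, r):
        cost = sum(c for _, _, c in plan)
        value = sum(v for _, v, _ in plan)
        if cost <= budget and value > best_value:
            best_value, best_plan = value, plan

print(best_value, [name for name, _, _ in best_plan])
```

Four tasks mean only 16 subsets, but at 50 tasks the same loop faces over a quadrillion, which is the scale where quantum speed-ups become attractive.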
With Moore’s Law running out of steam, IBM researchers argue that quantum computing could speed new drug discoveries or help safeguard cloud computing platforms along with boosting AI technologies that would advance its Watson cognitive computing platform.
IBM’s approach uses qubits made with superconducting metals on a silicon chip that can be designed and manufactured using standard silicon fabrication techniques. Last year, IBM scientists demonstrated a way to detect quantum errors by combining superconducting qubits in lattice arrangements, a quantum circuit design the company says is the only physical architecture that can scale to larger dimensions.
Meanwhile, computer scientists like Thomas Conte, the Georgia Tech engineering professor co-chairing the IEEE’s “Rebooting Computing” initiative, agree that quantum computing has a role to play in advancing technologies like machine learning. “It’s going to have its own niche, its own node in the cloud,” Conte said. “But it’s not low power.”