Achieving the Quantum Advantage in Software
Daniel Justice coauthored this blog post.
The Department of Defense (DoD) faces a number of computationally challenging software engineering problems, including machine learning and artificial intelligence (AI) along with validating and verifying increasingly complex software systems. These challenges are combinatorial optimization problems, and finding optimal solutions to them is NP-hard (non-deterministic polynomial-time hard); with classical computing paradigms, the largest instances could take billions of years to solve. In the SEI's Emerging Technology Center (ETC), we are working to apply quantum computing to solve these mission-critical problems for the DoD. As described in this post, our latest effort focuses on near-term quantum computing for software verification and validation within the DoD.
With Dennard scaling over and Moore's Law ending, the integrated-circuit computing paradigm is approaching its fundamental limits. At the same time, the DoD needs a new computing paradigm to help it gain a mission advantage in machine learning and artificial intelligence (ML/AI). Michael Hayduk, chief of the computing and communications division at the Air Force Research Laboratory, told the Defense Innovation Board in July 2018 that "the Pentagon is especially intrigued by the potential of quantum computing to develop secure communications and inertial navigation in GPS denied and contested environments."
The federal government has also made quantum computing a priority. In December 2018, Congress passed the National Quantum Initiative Act, which spells out a 10-year plan to accelerate the advancement of quantum computing initiatives and directs $1.2 billion to the effort.
Quantum computers work fundamentally differently from classical computers: they leverage the quantum phenomena of superposition and entanglement to construct their fundamental computing elements, qubits. The recent surge in quantum computing research is due in large part to improvements in universal gate quantum processing units (UG QPUs), which today are noisy intermediate-scale QPUs (NISQ). See page 20 of John Preskill's paper Quantum Computing in the NISQ era and beyond.
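To make superposition and entanglement concrete, here is a minimal statevector sketch in plain Python (an illustration only, not tied to any quantum SDK or real hardware): a Hadamard gate puts one qubit into superposition, and a CNOT then entangles it with a second qubit, producing a Bell state.

```python
import math

# Two-qubit statevector: amplitudes indexed by basis states |00>,|01>,|10>,|11>.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def apply_h_on_qubit0(s):
    """Hadamard on qubit 0 (the first bit): creates a superposition."""
    r = 1 / math.sqrt(2)
    return [r * (s[0] + s[2]), r * (s[1] + s[3]),
            r * (s[0] - s[2]), r * (s[1] - s[3])]

def apply_cnot(s):
    """CNOT with qubit 0 as control: swaps the |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

state = apply_cnot(apply_h_on_qubit0(state))
# Entangled Bell state: equal amplitudes on |00> and |11>, none on |01> or |10>,
# so measuring one qubit fixes the outcome of the other.
print([round(a, 3) for a in state])  # [0.707, 0.0, 0.0, 0.707]
```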
One challenging problem faced by the DoD to which quantum computing might be applied is satisfiability (SAT), a decision problem at the core of many software verification and validation tools. Examples include the Lockheed Martin challenge problems, which involve complicated software components of aircraft flight controls that automate evasive maneuvers if the aircraft is in danger. Because of the complexity of the code and the maneuvers it controls, classical computers cannot verify and validate that the code is safe in a practical amount of time. Quantum computing enables a new paradigm of optimization algorithms and computational capabilities to solve these problems in mission-critical time for the sake of safety and mission success.
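The following toy sketch shows what a SAT instance looks like and why classical approaches struggle: a brute-force solver must, in the worst case, try all 2^n variable assignments. The small formula here is hypothetical, chosen for illustration; it is not one of the Lockheed Martin challenge problems.

```python
from itertools import product

# A formula in conjunctive normal form: each clause is a list of literals,
# where a positive int i means variable i and -i means its negation.
clauses = [[1, 2], [-1, 3], [-2, -3], [1, -3]]

def is_satisfied(assignment, clauses):
    """True if every clause has at least one literal made true."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

def brute_force_sat(clauses, n_vars):
    """Try all 2^n assignments; this exponential blow-up is exactly why
    large verification problems overwhelm classical machines."""
    for bits in product([False, True], repeat=n_vars):
        assignment = {i + 1: b for i, b in enumerate(bits)}
        if is_satisfied(assignment, clauses):
            return assignment
    return None  # unsatisfiable

solution = brute_force_sat(clauses, 3)
print(solution)
```

Real verification tools use vastly better heuristics than this exhaustive loop, but the worst-case exponential search space is what motivates exploring quantum optimization approaches.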
Achieving a Quantum Advantage in the NISQ Era
As described in a published fact sheet about this work, quantum algorithms and computers promise polynomial and superpolynomial speedup for algorithms like Grover's and Shor's, but these algorithms require in excess of O(10^5-10^6) qubits with quantum error correction. While actual quantum computers are available from several different companies, we are currently in the Noisy Intermediate-Scale Quantum (NISQ) era. The number of qubits in current and near-term quantum computers is small (O(10^1-10^2)), and their operations are noisy (i.e., qubits have a high likelihood of flipping as they interact with each other and their environments). Today's quantum computers don't have enough qubits to run quantum error correction, so algorithms that mitigate noise some other way are necessary.
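A back-of-the-envelope calculation shows why noise limits circuit depth without error correction: if each gate succeeds independently with probability (1 - p), a circuit of g gates runs error-free with probability (1 - p)^g, which decays exponentially. The 1% per-gate error rate below is an illustrative assumption, not a measured figure for any particular device.

```python
# Sketch of exponential fidelity decay under an assumed per-gate error rate.
p = 0.01  # assumed per-gate error rate (1%), for illustration only

def success_probability(num_gates, error_rate=p):
    """Probability that every gate in the circuit executes without error."""
    return (1 - error_rate) ** num_gates

# Deeper circuits quickly become dominated by noise.
for gates in (10, 100, 1000):
    print(gates, round(success_probability(gates), 4))
```

At 1,000 gates the success probability is effectively zero, which is why NISQ-era algorithms must keep circuits shallow or mitigate noise by other means.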
Specifically, we are investigating near-term optimization algorithms that can run effectively on NISQ QPUs, like the Variational Quantum Eigensolver (VQE) and the Quantum Approximate Optimization Algorithm (QAOA). As Preskill wrote in Quantum Computing in the NISQ era and beyond,
Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. NISQ devices will be useful tools for exploring many-body quantum physics, and may have other useful applications.
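Variational algorithms like VQE and QAOA wrap a short, noisy quantum circuit inside a classical optimization loop: the circuit prepares a parameterized state, and the classical optimizer adjusts the parameters to minimize a measured "energy." The sketch below simulates that loop classically for the smallest possible case (one qubit, Hamiltonian Z, ansatz Ry(theta)|0>), using a grid search as a stand-in for the classical optimizer; it involves no quantum hardware or SDK.

```python
import math

def energy(theta):
    """Expectation <psi(theta)|Z|psi(theta)> for the ansatz Ry(theta)|0>.

    The amplitudes of Ry(theta)|0> are [cos(theta/2), sin(theta/2)],
    so <Z> = |a0|^2 - |a1|^2 = cos(theta)."""
    a0, a1 = math.cos(theta / 2), math.sin(theta / 2)
    return a0 * a0 - a1 * a1

# Coarse grid search stands in for the classical outer-loop optimizer.
thetas = [2 * math.pi * i / 200 for i in range(200)]
best = min(thetas, key=energy)
print(round(best, 3), round(energy(best), 3))  # minimum near theta = pi, energy -1
```

On real NISQ hardware the energy is estimated from repeated noisy measurements rather than computed exactly, which is why benchmarking how well these loops tolerate noise (as described below) matters.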
Working in the NISQ era presents a number of challenges for us, including the following:
- The number of qubits is small, so only small problem instances can be encoded.
- The solutions that noisy quantum computers return are approximate and not always good.
Our challenge is figuring out how to use NISQ devices not only to solve specific mission applications for the DoD, but also to help determine when they will demonstrate so-called quantum advantage: a quantum computer solving a problem of practical interest faster than a classical computer. To help the DoD achieve the quantum advantage, our current work focuses on the following:
- benchmarking variational quantum optimization techniques, such as QAOA, and their ability to tolerate the noise of NISQ-era QPUs
- improving circuit generation for NISQ-era QPUs
- analyzing the hierarchy of the problems of interest and identifying which parts can be mapped effectively to QPUs
- addressing the challenges of scaling up to O(10^2-10^3) qubits and predicting/projecting quantum advantage
- developing software tools to help data scientists and engineers use quantum computers
- creating material to facilitate a quantum computing-literate workforce in the DoD
As we continue to explore these problems, we will extend our work to new applications, including quantum machine learning, which uses quantum algorithms to perform machine learning and artificial intelligence tasks. We are also working on quantum interactive proof systems (using QPUs to form interactive proof systems) and on verifying and validating quantum computation.
We hope this work will give us better ways to verify the solution to complex problems more quickly, as well as confidently confirm that problems that are outsourced to quantum computing services were actually solved using quantum computers. Our work will also make it possible to use a quantum computer as a training kernel or classification model for machine learning so that we can train algorithms and classifications much more quickly than previously possible.
As this is one of the first major forays into quantum computing for the SEI, we continue to pay attention to opportunities moving forward in this new ecosystem.
In 2020, look for blog posts covering quantum computing-related topics including quantum materials science, the quantum internet, communications, and sensing.
View the fact sheet, Quantum Computing for Mission: Performant Quantum Computing at the SEI, which served as one of the sources for this post.
Read Quantum Computing in the NISQ era and beyond by John Preskill.
Visit our JupyterHub site.