
Quantum Acceleration in 2020

Key Takeaways

  • With use cases where quantum can outperform classical computing being identified across diverse industries, investment in quantum computing is increasing.
  • A growing number of quantum devices, algorithms and software development tools are now available and are already in use by innovative global enterprises.
  • Quantum hardware devices are becoming increasingly powerful, with innovation led by Google, IBM, Rigetti, Honeywell, D-wave and IonQ.
  • Quantum software is also advancing, with hardware-agnostic platforms that abstract software development across devices, and algorithms seeking to add business value by leveraging the noisy and low quantum volume devices available today.
  • Quantum programming languages, software development toolkits and workflow management software are enabling a new generation of software developers to create their own quantum applications.

In Gartner’s latest Hype Cycle for Compute Infrastructure, quantum computing is at the peak of "inflated expectations" (though not yet in the "trough of disillusionment"). Gartner expects it to remain there for the next decade. While many emerging opportunities point to how enterprise end users will be able to use quantum computers to outperform classical computers in solving useful business problems, the question of when they will be able to do so remains open. Nevertheless, a lot of exciting work is happening in the quantum computing space and significant milestones are rapidly advancing its development.

In this article, we will provide an overview of these advancements on both the hardware and software fronts. Along the way we’ll share the results of our own research and development in this field. We will also sketch out some of the steps that organizations can take now to be "quantum ready."

The level of investment – both public and private – in quantum computing is one indicator that interest in this technology is picking up steam. Just recently, for example, the White House Office of Science and Technology Policy (OSTP) and the National Science Foundation (NSF) announced the establishment of three Quantum Leap Challenge Institutes that will receive a total of $75M to accelerate quantum information science (QIS) R&D. What’s more, private investment in quantum technology, coming primarily from VCs, quadrupled between 2012 and 2018. Indeed, experts project that the quantum computing market will hit $770 million by 2025.

IBM, Honeywell, Google, Microsoft, and Amazon have all placed significant bets on quantum with an eye to improving a range of business outcomes through hardware and software. Enterprise end users, from the petrochemical and pharmaceutical industries to financial services and logistics, are increasingly committed to research and experimentation with quantum technology.

Another indicator of quantum computing’s current acceleration is the sheer number of tools – algorithms, software, hardware – currently available or in development in academia, software startups, and open source software communities. This growth in the quantum toolkit is partially driven by the collective efforts of academic research and research partnerships across the board from universities to the private sector. The following are just a few recent examples:

  • MIT researchers recently introduced a quantum computing architecture that can perform low-error quantum computations while also sharing quantum information between processors.
  • ETH Zürich, in partnership with Microsoft, created a more efficient quantum algorithm (requiring one tenth the resources) for simulating the catalytic process, an application that can help combat climate change.
  • The Department of Energy/Los Alamos National Laboratory unveiled an algorithm that "fast-forwards" quantum calculations to overcome the challenges posed by the time constraints arising from decoherence.

Given the growth of the quantum computing ecosystem, not to mention the technology’s enormous potential even in the near term, companies in the "quantum contemplation" stage must decide whether to invest now in building quantum capabilities or to adopt a wait-and-see approach. While choosing the right path with emerging technologies is always challenging, the advances made thus far will only intensify the urgency of this choice. At Zapata, the company we started several years ago to enable enterprise quantum applications, we firmly believe that quantum devices will soon cross the threshold and begin to outperform classical computers in particular industries. When that happens, it may be hard for those who are not quantum-ready to catch up.

The Quantum Computing Landscape: Hardware

We’ll revisit how companies can best frame their decisions around quantum readiness at the end of this article. Before we get there, however, it’s worth taking a closer look at the full landscape of quantum and quantum-adjacent technologies available today.  

At the lowest level of the quantum stack, we have quantum hardware. While there are numerous models for harnessing the computing potential of quantum states, there are only a handful of firms building or investing in the construction of quantum computing devices.

IBM has been a pioneer in the field, having built 28 quantum computers and made them accessible through its IBM Quantum Experience over the last four years. Google Quantum AI Lab has already created several quantum processors, including the Sycamore chip it used to demonstrate quantum supremacy. Rigetti has been building quantum chips for several years and is now also offering a quantum cloud computing platform.

Quantum volume is a measure of computing power that combines the number of qubits – the quantum version of the bits used by classical computers – with other performance indicators, such as error rate and time to decoherence, to give a more holistic picture of a machine’s capabilities. More recently, Honeywell announced it had built a quantum computer with a quantum volume of 64, the largest ever measured. IBM also announced that it had doubled its systems’ quantum volume from 32 to 64 using one of its newest 27-qubit Falcon processors. The claims by both IBM and Honeywell were soon dwarfed by IonQ’s claim to have created a machine with "an expected" quantum volume exceeding 4,000,000. Partners like JP Morgan are already running algorithms on the Honeywell machines for proof-of-concept efforts related to fraud detection, trading strategy optimization, and security.
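
The core of the quantum-volume idea can be sketched in a few lines. The following toy function is our own simplified illustration, not the full benchmark: it assumes we are simply given, for a hypothetical device, the maximum circuit depth each qubit count can run reliably, and it omits the heavy-output statistical test that real quantum-volume experiments require.

```python
def quantum_volume(max_reliable_depth):
    """Toy version of the quantum-volume idea: QV = 2**n for the largest
    width n at which the device still runs a "square" circuit (depth >= n).
    Real benchmarks also require passing a heavy-output statistical test,
    which this sketch omits."""
    best = 0
    for n, depth in max_reliable_depth.items():
        if depth >= n:  # a square circuit of width n succeeds
            best = max(best, n)
    return 2 ** best

# Hypothetical device profile: reliable depth shrinks as width grows
device = {2: 10, 3: 8, 4: 6, 5: 5, 6: 3}
print(quantum_volume(device))  # -> 32
```

By this measure, a quantum volume of 64 corresponds to successfully running square circuits on 6 qubits, which is why quantum volume grows much faster than qubit count alone.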

Finally, D-Wave Systems has had machines relying on a process called quantum annealing in the market since 2012. NEC, Volkswagen, DENSO, and Lockheed Martin all currently make use of this technology.

When discussing hardware, we should also mention quantum simulators. These are purpose-built software tools that simulate quantum devices on classical hardware. The work being done on simulators provides valuable insight into how to get performance from quantum computing algorithms and architectures. Popular simulators include Intel’s qHiPSTER, IBM Q’s Cloud Simulator, Rigetti’s Forest, and Qulacs.
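
To see what a statevector simulator does under the hood, here is a minimal pure-Python sketch (our own illustration, not any of the products named above) that prepares a Bell state on two qubits:

```python
import math

def apply_1q(state, gate, target):
    """Apply a 2x2 gate to the `target` qubit of a statevector."""
    new = [0j] * len(state)
    for i, amp in enumerate(state):
        bit = (i >> target) & 1
        for out in (0, 1):
            j = i ^ ((bit ^ out) << target)
            new[j] += gate[out][bit] * amp
    return new

def apply_cnot(state, control, target):
    """Flip `target` wherever `control` is 1, by swapping amplitudes."""
    new = list(state)
    for i in range(len(state)):
        if (i >> control) & 1:
            new[i ^ (1 << target)] = state[i]
    return new

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]            # Hadamard gate

state = [1 + 0j, 0j, 0j, 0j]     # two qubits in |00>
state = apply_1q(state, H, target=0)
state = apply_cnot(state, control=0, target=1)
print([round(abs(a) ** 2, 3) for a in state])  # -> [0.5, 0.0, 0.0, 0.5]
```

Production simulators use heavily optimized linear algebra to reach dozens of qubits, but the exponential growth of the statevector (2^n amplitudes for n qubits) is what ultimately limits classical simulation.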

At this stage of the technology’s evolution, different approaches to quantum hardware tend to be suited to different use cases. For this reason, quantum computing software generally must account for the specific quantum hardware it runs on. To address this challenge, a number of platforms aim to be hardware-agnostic so users can run programs on a range of quantum (and quantum-inspired) devices. Given the diversity of the devices and how rapidly they are evolving, it’s also important to allow users to tune parameters and make specifications to get the most performance out of their current devices.

Solving the problems associated with the idiosyncrasies of particular quantum computing devices has been one focus of Zapata. Specifically, our team has been working with IBM to execute the variational quantum factoring (VQF) algorithm on one of their premium backends. The goal of this work is, in part, to develop an alternative to Shor’s algorithm. Shor’s quantum factoring algorithm, first proposed by Peter Shor in 1994, has attracted a lot of attention due to its potential for breaking today’s most advanced encryption protocols. The challenge in working with this algorithm, however, is that running it on problems of practical interest is estimated to require tens of millions of qubits.

This is far too many for noisy intermediate-scale quantum devices that are available in the near-term, rendering the potential of quantum computers to compromise modern cryptosystems with Shor’s algorithm a distant reality. Hybrid approximate classical/quantum methods that utilize classical pre- and post-processing techniques, like the proposed VQF approach, may be more amenable to factoring on a quantum computer in the next decade.
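
To give a flavor of how hybrid approaches recast factoring as optimization, here is a purely classical toy (our illustration, not the published VQF construction): encode candidate factors p and q as bitstrings and minimize the cost (N - p·q)². A variational quantum algorithm would search the same space with a parameterized circuit rather than the brute-force scan used here.

```python
from itertools import product

def factor_by_cost_minimization(N, p_bits, q_bits):
    """Brute-force scan of the bitstring search space, minimizing the
    cost (N - p*q)**2; a cost of zero means p and q are factors of N."""
    best = None
    for bits in product((0, 1), repeat=p_bits + q_bits):
        p = sum(b << i for i, b in enumerate(bits[:p_bits]))
        q = sum(b << i for i, b in enumerate(bits[p_bits:]))
        cost = (N - p * q) ** 2
        if best is None or cost < best[0]:
            best = (cost, p, q)
    return best

cost, p, q = factor_by_cost_minimization(15, 3, 3)
print(cost, p * q)  # -> 0 15
```

The search space doubles with every added bit, which is exactly why a quantum (or hybrid quantum-classical) optimizer is interesting here and a classical scan is not.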

One key takeaway from this research is that we can use a hybrid approach to simulate the impact of noise on quantum optimization. By carefully inspecting hardware characteristics and sources of noise, we can account for these details on the algorithmic level and improve the performance of quantum algorithms.

Furthermore, our broader research has uncovered the relative strengths of different hardware platforms for specific use cases. For example, we have developed an application benchmark for fermionic quantum simulations. This gives researchers a way to "test-drive" quantum devices to assess how well they perform in real world applications such as quantum chemistry. In this case, our benchmark showed that Google’s Sycamore chip has real value beyond the theoretical demonstration of quantum supremacy, particularly for simulating strongly correlated electronic systems, which can be very challenging to simulate on classical computers.

Similarly, in the field of machine learning, some of our team members have investigated how connectivity between the qubits in the quantum hardware can affect the performance of a quantum device focused on generative modeling tasks. One example of such generative modeling would be the generative adversarial networks (GANs) used to produce photorealistic images based on a training set of sample images. In practice, our approach can be used to apply near-term quantum devices to enhance machine learning tasks.

The Quantum Computing Landscape: Algorithms, Toolkits and Workflows

2020 has also seen the steady emergence of new quantum algorithms for noisy quantum devices that offer a concrete reduction in computational cost compared with classical computing. An example is a set of quantum techniques recently developed by Zapata, leveraging Bayesian inference with engineered likelihood functions, that can extract the maximum quantum speedup from any given noisy quantum computer (quantum devices today are considered "noisy" because their sensitivity to the environment can quickly degrade the accuracy of the calculations they perform). These techniques are especially useful for quantum amplitude estimation, a ubiquitous element of many quantum algorithms employed in chemistry, machine learning, and finance.

Given a noisy quantum device, our research team has proposed a novel approach to optimizing the rate of information gain in estimation algorithms. This is done by: incorporating the effect of errors in the algorithm design, maximizing the rate of information gain with the available hardware capabilities, and designing quantum circuits with fewer qubits and shorter depth.
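
The flavor of the Bayesian machinery can be shown with a classical-shot toy (our illustration; it omits the engineered likelihood functions and amplitude amplification that produce the actual quantum speedup): each measurement is a coin flip whose bias encodes the unknown parameter, and a gridded posterior is updated shot by shot.

```python
import math
import random

def estimate_theta(true_theta, shots=2000, grid=400, seed=7):
    """Grid-based Bayesian estimation of theta from coin flips with
    P(heads) = sin(theta)**2 - the classical-shot baseline that
    amplitude-estimation techniques improve upon."""
    rng = random.Random(seed)
    thetas = [math.pi / 2 * (i + 0.5) / grid for i in range(grid)]
    posterior = [1.0 / grid] * grid            # uniform prior
    p_true = math.sin(true_theta) ** 2
    for _ in range(shots):
        heads = rng.random() < p_true
        likelihood = [math.sin(t) ** 2 if heads else 1 - math.sin(t) ** 2
                      for t in thetas]
        posterior = [w * l for w, l in zip(posterior, likelihood)]
        norm = sum(posterior)
        posterior = [w / norm for w in posterior]
    return sum(t * w for t, w in zip(thetas, posterior))  # posterior mean

print(round(estimate_theta(0.6), 1))  # close to the true value 0.6
```

With plain shots the estimation error shrinks like 1/√N; the point of engineered likelihood functions is to sharpen the likelihood per shot so a noisy device approaches the quadratically better quantum scaling.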

Many frameworks and tools have emerged for developing quantum applications based on these algorithms. Microsoft’s Quantum Development Kit (QDK), for example, provides a tool set integrated with leading development environments, open-source resources, and the company’s high-level programming language, Q#. It also offers access to quantum inspired optimization (QIO) solvers for running optimization problems in the cloud. For building quantum circuits and algorithms that take advantage of quantum processors, IBM offers Qiskit, an open-source quantum computing library for Python. Cirq is yet another quantum programming library created by the team of scientists and engineers at Google. It contains a growing set of functionalities allowing users to manipulate and simulate quantum circuits. Finally, Quil is a quantum programming toolkit from Rigetti that also provides a diverse array of functionalities and data structures for supporting quantum computation.

There are also packages, such as Xanadu’s Strawberry Fields and D-Wave's Leap, aimed at quantum backends that are not based on the gate model paradigm. In addition, we see the ongoing creation of domain-specific tools, such as OpenFermion and Xanadu’s PennyLane, purpose-built for running quantum chemistry and quantum machine learning applications, respectively.

These quantum development tools can be brought together by workflow management software, which facilitates the coordination and automation of processes across a complex ecosystem. Tools like this have long played a role in accounting, engineering, supply chain management and other areas. Workflow tools are a crucial step towards advancing quantum applications because, put simply, quantum computing is extremely complex. This complexity stems from the vast expansion of use cases, devices and approaches to application development. Accessing quantum hardware is an endeavor in itself, and at this stage, there are no quantum "operating systems" to simplify, for example, the processes of executing code, aggregating and analyzing data, or managing processes across platforms. This makes reproducing experiments on different devices difficult. To coordinate and manage quantum experiments across different devices, workflow tools are invaluable in the quantum stack.

Zapata Computing’s Orquestra is one such workflow tool, allowing users to compose and run quantum workflows across a range of devices, both quantum and classical. Think of Orquestra as the equivalent to the assembly line in the Industrial Revolution, standardizing processes and bringing interchangeable quantum parts into workflows. Orquestra enables teams to swap out and interchange quantum and classical subroutines to build applications. By modularizing the components of the quantum software pipeline, quantum algorithms can be easily upgraded as quantum devices mature, without needing to build entirely new algorithms for every new iteration.
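
The value of this modularity can be sketched in a few lines of plain Python (hypothetical component names, not Orquestra’s actual interface): a workflow is an ordered list of named steps resolved against a registry, so a backend or optimizer can be swapped without touching the rest of the pipeline.

```python
def run_workflow(steps, registry):
    """Run named steps in order, piping each result to the next."""
    data = None
    for step in steps:
        data = registry[step](data)
    return data

# Hypothetical interchangeable components (illustrative names only)
registry = {
    "prepare": lambda _: {"problem": [3, 1, 2]},
    "solve_classical": lambda d: {**d, "solution": sorted(d["problem"])},
    "solve_simulator": lambda d: {**d, "solution": sorted(d["problem"], reverse=True)},
    "postprocess": lambda d: d["solution"],
}

print(run_workflow(["prepare", "solve_classical", "postprocess"], registry))  # -> [1, 2, 3]
# Swapping the solver requires no change to the surrounding pipeline:
print(run_workflow(["prepare", "solve_simulator", "postprocess"], registry))  # -> [3, 2, 1]
```

The same pattern is what lets an algorithm keep pace with maturing hardware: only the solver entry changes as better devices or simulators become available.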

To illustrate the power of this orchestration, Zapata collaborated with IBM on an optimization problem leveraging a hybrid quantum-classical system. By using Orquestra to orchestrate both the pre-processing on classical devices and the execution on IBM’s quantum backend for solving the resulting optimization problem, the joint team was able to factor an integer much larger than the record at the time the experiment was carried out. Similarly, we have shown how Orquestra can be used to swap out simulators and optimizers when conducting a Variational Quantum Eigensolver (VQE) calculation using a combination of tools – Google’s Cirq, Rigetti’s PyQuil, and IBM’s Qiskit.

Quantum readiness in the age of quantum acceleration

The advances in quantum hardware, algorithms, software and workflows we have been discussing have led organizations in a range of industries to seek out effective ways to adopt and use this evolving technology.

It is no exaggeration to say that, as quantum computing advances, it will have a broad and disruptive impact. While there are still technical hurdles to overcome before its full impact is felt, with each advancement we discover new, near-term applications for quantum. As its potential is gradually realized, we return to the question of quantum readiness and how organizations can approach their own journey to quantum.

To decide whether or not, and how much, to invest in quantum technologies, organizations have to answer these questions:

  • How will quantum disrupt my industry? For some industries like security, finance, pharmaceuticals and logistics, the disruption will be game changing. With security, quantum will permanently change our approach to encryption. Financial services organizations and any businesses that rely heavily on statistical modeling of future outcomes will benefit from quantum’s ability to perform calculations in ways that go beyond classical capabilities. For example, quantum mechanics can accelerate certain statistical sampling tasks by improving the convergence on statistical error. With the ability to simulate quantum interactions, which are intractable to simulate efficiently and accurately on classical computers, pharmaceutical companies and material sciences companies will be able to explore chemical space much more effectively. Another way to accelerate the discovery of new molecules and compounds is by performing quantum-enhanced machine learning to extract correlations in the data that are hard to extract classically. Organizations that benefit from optimizing supply chains and distribution routes will see significant operational transformation when they unlock novel ways in which quantum or quantum-inspired techniques explore the solution space of some of the most complex optimization problems. The bottom line is this: If quantum computing will disrupt your industry, then the pace of acceleration means an investment in quantum is warranted now.

  • What are my specific use cases? Quantum computers are uniquely suited for tasks that involve simulating complex evolutions, such as financial models or climate change forecasting. Quantum computing can also be a powerful tool for modeling quantum interactions, which is useful for drug development and material science applications, such as creating more sustainable batteries. Quantum devices can also enhance the ability to augment and sample from probability distributions with limited data, as in the case of modeling pandemics and rare diseases. The effective use of quantum computing depends on creativity and a deep understanding of both the specific problem to be tackled and the capabilities of available quantum computing resources. The disruptive potential of quantum mechanics as a computational resource has led to an increase in research on quantum use cases across the board from academia to government institutions to enterprise organizations. The land grab of quantum IP has begun and the time to get in the game is now.

  • Do I want to build, buy, or partner? If quantum will have a disruptive impact on your industry and investment seems attractive, the final question is whether you want to build your own quantum capabilities (through upskilling and hiring), buy capabilities through acquisition (or "acqui-hiring"), or partner with a consultancy. On the build side, there is an acute shortage of quantum talent. The quantum computing field is still relatively new and demands problem-solving skills that are novel for software engineers accustomed to classical computing. The narrow talent pool also means that buying quantum capabilities, through acquisition or hiring, can get expensive. Partnering with consultancies gives you immediate access to talent and will likely be the most cost-effective and efficient option in the near term. At the same time, you will need to pay special attention to IP issues: will you or your partner own the resulting IP? In other words, there are pros and cons to each option, and you will need to weigh them against the strategic importance of quantum in your industry.

2020 continues to prove itself to be a year of major acceleration for quantum computing, with research continuously producing exciting results and a growing set of quantum tools and technologies available. Investment is ramping up, and powerful global enterprises are already exploring quantum use cases to gain an edge over their competitors. The hardware is becoming more powerful, and a growing number of software tools are lowering the barriers to entry for developers. With advances in quantum algorithms, it can be expected that enterprises will unlock significant value with the quantum devices available in the near term and keep pace with the technology as it matures.

In the future, companies that have already taken the quantum plunge will use these tools to test specific applications, prove concepts, and build workflows that can take advantage of quantum devices as they mature over time. As quantum technologies continue to evolve, and as more companies get in the game, progress in this field will accelerate even more dramatically. Eventually, we believe that those focused on quantum will achieve a business advantage that simply can’t be matched.

About the Authors

Yudong Cao obtained his Ph.D. in Computer Science from Purdue University in 2016. After graduation, he joined the Aspuru-Guzik group at Harvard University where he focused on developing and deploying algorithms for noisy intermediate-scale quantum devices. This work has served as the foundation for the applications and solutions Zapata offers enterprise clients today.

Tim Hirzel has a BA in Computer Science from Harvard University and an MS from MIT’s Media Lab. Since 2005, Tim has been a software engineer and architect in science-based technology startups. Today he is focused on delivering a best-in-class quantum computing platform for Zapata and its customers.

