Cats, Qubits, and Teleportation: The Spooky World of Quantum Computation Applications (Part 3)


Key Takeaways

  • The same factors which make quantum theory so startling also make quantum computers very difficult to implement in practice: quantum phenomena don't manifest themselves in everyday life.
  • When thinking about how powerful a quantum computer is, it’s important to consider the error rates, as well as the number of qubits. Increasing the number of qubits does not improve a quantum computer if the error rate also goes up.  
  • Given the cost, size, and physical delicacy of quantum computers, they're a perfect fit for the 'pay per use' cloud consumption model.
  • Tools such as the QISkit quantum SDK allow quantum programs to be written, and then compiled for execution on hardware. QISkit includes a python-based quantum DSL and qasm, a quantum assembly language.
  • Quantum advantage will have been achieved when quantum computers are large enough and robust enough to be useful. We’re not there yet, and we don’t even know exactly where the boundary is.

Problem domains

The earlier articles in this series introduced quantum theory, and explained how these counter-intuitive quantum behaviours enabled an extraordinary set of computational algorithms. In particular, Peter Shor discovered how quantum computers could do efficient factoring of integers, and Lov Grover showed they gave a significant speedup when searching unstructured data. There hasn’t been an algorithm breakthrough to rival Shor’s or Grover’s in the last fifteen years, but as quantum computers have started to look more viable, there's been a lot of thinking about the use cases those algorithms could be applied to.

So far, no one's thought up a very useful application for quantum teleportation. Teleporting a quantum state sounds incredibly cool, but the requirement to send half an entangled pair to the recipient first, and then follow up with classical communication (which means no information actually travels faster than light), is an onerous one. Light's also pretty fast, so for almost all communication on earth, the travel time of the photons is less than the processing time of the computer sending the message, and much, much less than the reaction time of the person receiving the message.

Factoring, on the other hand, has a very obvious application. Paradoxically, factoring is both the most useful and least useful quantum algorithm. Breaking public-key cryptography would be an extraordinary achievement, with shattering implications for retailers, the finance industry, and intelligence organisations. However, once it's known to be insecure, factoring-based public-key encryption will become obsolete, and the industry will shift to "quantum-safe" cryptography protocols. Once factoring-based encryption isn't being used, there's little value in being able to read transmissions encrypted that way.

Unlike factoring, whose only purpose is to undo something someone else did, a quantum solution to the travelling salesman problem could advance human capabilities. The problem has variations in transport (obviously!), logistics, microchip manufacture, and DNA sequencing. One caution is that, while precise solutions to the travelling salesman problem become intractable once the problem gets big, there are some very good, and very cheap, approximations. Exact solutions can take many years (or even millions of years) to compute, even for problems with only tens of cities, which is clearly impractical for all but the most trivial cases. However, if the requirement for perfection is relaxed, a million-city problem can be solved to within 2 or 3% of the optimum solution. This is good enough for most purposes.
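To give a sense of how cheap a 'good enough' answer can be, here is a minimal nearest-neighbour heuristic for the travelling salesman problem in plain Python. It's a sketch rather than a serious solver (the city coordinates are invented for the example, and real approximation schemes such as Lin-Kernighan get much closer to the optimum), but it runs in seconds where exact search would be hopeless.

    import math
    import random

    def nearest_neighbour_tour(cities):
        """Greedy approximation: always visit the closest unvisited city next."""
        unvisited = set(range(1, len(cities)))
        tour = [0]  # start, arbitrarily, at city 0
        while unvisited:
            current = cities[tour[-1]]
            nxt = min(unvisited, key=lambda i: math.dist(current, cities[i]))
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    def tour_length(cities, tour):
        """Total length of the round trip, returning to the starting city."""
        return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    # Invented example data: 1,000 random cities on a unit square.
    random.seed(42)
    cities = [(random.random(), random.random()) for _ in range(1000)]
    tour = nearest_neighbour_tour(cities)
    print(f"Approximate tour length: {tour_length(cities, tour):.2f}")

Greedy tours like this typically land some tens of percent above the optimum; more sophisticated classical heuristics do far better, and that is the bar any quantum approach to the problem will have to beat.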

At first glance, unstructured search may seem to have few applications. Grover described his use case as searching through a huge phone book to find the name of a person with a given number, and it’s hard to imagine doing that often. In the context of data search, Grover’s algorithm could be useful in situations where creating a reverse-lookup table isn't possible, or where a large body of data cannot be indexed. However, Grover’s algorithm is actually much more general. What’s being searched for, mathematically, is something which satisfies a given function (“function inversion”). That means we can search for the solution to any problem, as long as the answer can be easily verified. This makes Grover’s algorithm the slower, but more general, cousin of Shor’s algorithm; it could be used for brute-force attacks on symmetric-key or post-quantum cryptography schemes (as long as the key isn’t too long).
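To make the 'function inversion' framing concrete, here is a purely classical sketch (the verifier and the toy key size are invented for illustration). We are looking for the one input that satisfies a checking function; classically the only option is to try candidates one at a time, which takes on the order of N evaluations, whereas Grover's algorithm would need only on the order of √N quantum queries to the same function.

    import hashlib

    KEY_BITS = 24         # toy key size, small enough for a classical brute force
    SECRET_KEY = 0xABCDE  # the unknown value the search has to recover

    def digest(candidate: int) -> str:
        return hashlib.sha256(candidate.to_bytes(4, "big")).hexdigest()

    TARGET = digest(SECRET_KEY)  # all the attacker gets to see is this digest

    def verifier(candidate: int) -> bool:
        """The function being 'inverted': easy to check, hard to guess."""
        return digest(candidate) == TARGET

    def classical_search(num_bits: int) -> int:
        """Brute force needs ~2^num_bits verifier calls in the worst case;
        Grover's algorithm would need roughly 2^(num_bits / 2) queries."""
        for candidate in range(2 ** num_bits):
            if verifier(candidate):
                return candidate
        raise ValueError("no key found")

    print(hex(classical_search(KEY_BITS)))  # finds 0xabcde after ~700,000 tries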

There are also intriguing possibilities for things other than cracking codes. For example, Grover’s algorithm could be used for some machine learning problems. Although it can’t be used to find the optimum solution to problems like the travelling salesman problem, it can find solutions which meet some minimum criteria (“travel time under three days”, for example). In many cases this is good enough.

Finally, because they can consider a range of possibilities, quantum computers have a natural application for complex modelling problems. Modelling financial data or risk is one potential application, and airline logistics is another one.

Natural sciences

Quantum computers were first proposed when physicists realised that their most powerful computers could not simulate even small quantum systems. (At the moment, the best known simulation, by an IBM team, can model 56 particles, but this required some very clever mathematics, several days of computation, and 3 terabytes of memory.)
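The difficulty is easy to quantify: a naive 'statevector' simulation of n interacting qubits has to track 2^n complex amplitudes, so the memory needed doubles with every particle added. The snippet below is just that back-of-the-envelope scaling (assuming 16 bytes per amplitude); the IBM result mentioned above needed much cleverer, tensor-based techniques precisely because the naive approach runs out of memory long before 56 particles.

    # Memory for a naive statevector simulation: 2^n amplitudes,
    # each stored as a complex double (16 bytes).
    def statevector_bytes(num_qubits: int) -> int:
        return (2 ** num_qubits) * 16

    for n in (30, 40, 50, 56):
        print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")

    # 30 qubits:            16 GiB  (a laptop)
    # 40 qubits:        16,384 GiB  (a large cluster)
    # 50 qubits:    16,777,216 GiB  (~16 pebibytes of RAM)
    # 56 qubits: 1,073,741,824 GiB  (an exbibyte - beyond any machine on earth)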

Because more complex systems of particles cannot be simulated, it can be very difficult to understand them or make predictions about their behaviour. For example, some molecules have a property known as ‘strong correlation’, which means conventional chemistry techniques just don’t work. Even though we know the right equations, doing a precise calculation would take far too long, and doing an approximate calculation would give an answer which was ‘too wrong’. This hampers progress in a number of domains, such as low-temperature physics, material science, and drug design.

In general, there are two main plays for quantum: cost reduction for existing computations which are currently expensive, and doing calculations which are currently so expensive or slow that they're effectively impossible.

Barriers to practical implementation

The same factors which make quantum theory so startling also make quantum computers very difficult to implement in practice: quantum phenomena don't manifest themselves in everyday life. Although quantum phenomena can be observed in the lab, doing so generally requires extreme conditions - individual isolated photons or particles, a vacuum, and temperatures a few thousandths of a degree above absolute zero. That’s colder than the space between the stars.

These conditions are challenging to achieve, but with enough resources and the benefit of modern science, they are technically possible. However, there's also a more philosophical challenge. The reason everyday things don't seem to be quantum is that they're macroscopic; quantum effects only manifest themselves at a tiny scale. When particles interact with other particles, they start to become part of a macroscopic - that is, classical - system. (Quite why macroscopic systems seem classical is still a matter of debate, but it’s broadly accepted that ‘decoherence’ of quantum states from contact with external influences is responsible.)

For a quantum computer to stay 'quantum', therefore, it has to be totally isolated from any interaction with the outside world. This is a bit of a paradox, because any 'wall' or 'barrier' is itself part of the outside world, and 'being contained by the wall' is itself an interaction. And it gets worse! Even if perfect isolation were technically possible, it wouldn't be what we wanted: we need to be able to interact with the system at the beginning of a computation to set the initial state of the bits (write to the quantum memory), and then interact with it again at the end of the calculation to measure their state and get the answer (read the quantum memory). And those operations need to be classical (we can’t ‘maybe set the memory to 0’ or ‘set the memory to 0 in one universe’). In other words, we need to be able to turn the connection to the outside world on and off, and with it the very quantum-ness of the computer.

How current implementations work

Current quantum computer implementations do seem to be overcoming these barriers. However, they can do so only for a very short amount of time.  As of last year, a quantum computer could do calculations for about 0.0001 seconds before 'decoherence'; that is, before unwanted interactions with the outside world destroyed the quantum state.

All proposed quantum computers work by isolating either individual photons or individual atomic particles. Quantum effects (such as double-slit interference) can be observed in molecules as big as buckyballs, but it’s a lot harder. Qubit implementations therefore tend to be as small as possible to minimise decoherence.

Prospects for the future

Post-quantum cryptography

Once we have a fault-tolerant universal quantum computer with thousands of qubits (which is years away), public-key factoring-based cryptography algorithms, such as RSA, will no longer be secure. Other algorithms, such as those based on the discrete logarithm problem or the elliptic-curve discrete logarithm problem, are equally vulnerable, since they can also be broken by Shor’s algorithm.

That doesn’t mean the end of public-key cryptography, though. (This is just as well, given how important public-key cryptography is to most of the internet, particularly online commerce.) A number of possible schemes exist, including "lattice-based" cryptography protocols.

Hardware Developments

Realising quantum advantage will require further developments in materials science. Businesses, academic institutions, and governments are continuing to push up both the qubit count and qubit quality.

As quantum computers get bigger, the preferred architecture will probably shift from a homogeneous one to a modular one. Rather than every qubit being connected to every other one, scalability will be achieved by grouping qubits into isolated sub-units. It’s a bit like microservices, except for qubits instead of software. Just as with microservices, communication between the modules will be necessary, and will probably require special communication centres in the architecture. Either physical qubits will need to be moved around the network, or photons will need to be mixed in with the physical qubits to act as quantum-capable communication gateways.

All of the current quantum computers need to be run at operating temperatures of around -270 degrees C to protect the delicate quantum states from noise. As well as being expensive, this kind of extreme refrigeration is difficult and bulky. Finding a way to do quantum computation at room temperature is an area of ongoing research. In 2013, a Canadian research team showed that quantum states could survive at room temperature for 39 minutes, which would be enough for many computations. Unfortunately, that particular system still needed to be cooled to 4 Kelvin in order to set the initial state or read out the state, so it still needed a huge fridge, and was only room temperature for part of the time.  

Fault tolerance

Even when running colder than deep space, current quantum systems suffer from significant levels of noise and errors. How to make quantum computers fault-tolerant is an area of active research. Classical computers are also subject to physical errors, but fixing those errors is fairly straightforward. The only kind of error a single bit can have is a bit-flip. To guard against bit-flips, it’s enough to keep multiple copies of the same bit (say, three), and then take a majority vote when reading the value.
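As a reminder of how simple the classical version is, here is the three-copy scheme from the paragraph above in a few lines of Python; any single flipped bit is simply outvoted by the two surviving copies.

    from collections import Counter

    def encode(bit: int) -> list[int]:
        """Store three redundant copies of one logical bit."""
        return [bit, bit, bit]

    def decode(copies: list[int]) -> int:
        """Majority vote: a single corrupted copy is outvoted."""
        return Counter(copies).most_common(1)[0][0]

    stored = encode(1)
    stored[0] ^= 1              # a stray cosmic ray flips one physical bit
    assert decode(stored) == 1  # the logical bit survives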

This algorithm doesn’t directly translate to quantum computers, because measuring the qubits to take a majority vote destroys the quantum-ness of the state. However, it is possible to do a ‘parity-check’ instead, to see if two qubits are in the same state. This can then be used to catch cases where a qubit is in the wrong state, and correct it by flipping it from up to down, or adjusting the phase. Error-correction protocols replace physical qubits with logical ones, where each logical qubit is made up of several physical ones. Even if the physical qubits are fragile, the logical qubit can stay robust.
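A minimal sketch of that idea, using the textbook three-qubit bit-flip code, is below. It is written with QISkit's circuit API and only builds the circuit; it protects against a single bit-flip (not phase errors), and in a real run the correction would be conditioned on the measured syndrome, using whatever conditional-operation and execution APIs your QISkit version provides.

    from qiskit import QuantumCircuit

    qc = QuantumCircuit(5, 2)  # qubits 0-2 hold the data, qubits 3-4 are ancillas

    # Encode: spread the state of qubit 0 across three physical qubits.
    qc.cx(0, 1)
    qc.cx(0, 2)

    # Simulate a fault: an unwanted bit-flip on one physical qubit.
    qc.x(1)

    # Parity checks: compare qubits 0 & 1, and qubits 1 & 2,
    # without directly measuring (and so destroying) the data qubits.
    qc.cx(0, 3)
    qc.cx(1, 3)
    qc.cx(1, 4)
    qc.cx(2, 4)
    qc.measure(3, 0)
    qc.measure(4, 1)

    # A syndrome of '11' (both parity checks fail) pinpoints qubit 1,
    # so the correction is an X gate on qubit 1. Here the fault is known
    # in advance, so the correction can simply be appended.
    qc.x(1)

    print(qc.draw())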

The downside is that implementing this kind of fault tolerance involves a lot of information redundancy, and that means a lot of overhead. A quantum error-correction protocol needs at least five physical qubits per logical qubit. The current best protocol, the surface code, can tolerate a higher operation inaccuracy (around 1%), but it requires at least nine physical qubits per logical qubit. Some quantum operations can’t be corrected just by spreading state across physical qubits. Fault tolerance for those operations is possible, but it needs a steady injection of extra qubits (known as ‘magic states’) to keep the system error-free. Current quantum computers just don’t have enough qubits to spare for that kind of fault tolerance, and it seems likely that practical fault tolerance is at least a decade away.

One of the interesting questions at the moment is how to do useful quantum computation without fault tolerance. For some computations an approximate answer is good enough, so quantum computation is most likely to be useful for this category of problems in the near term. For example, some quantum chemistry problems are so hard that even an approximate quantum computer could significantly improve the accuracy of calculations.

When thinking about how powerful a quantum computer is, it’s important to consider the error rates, as well as the number of qubits. Increasing the number of qubits does not improve a quantum computer if the error rate also goes up. 

Quantum Cloud

Given the cost, size, and physical delicacy of quantum computers, they're a perfect fit for the 'pay per use' cloud consumption model. Since the computers need to be kept at temperatures below a Kelvin, the quantum cloud is definitely the coldest cloud, and makes the cooling requirements of most data centres look trivial. It's likely that more cloud providers will start offering quantum capabilities built into their current clouds.

Conceptual Compression

Conceptual compression is a shrinking of the conceptual overhead of some programming tasks, so that developers need to understand far fewer concepts to take advantage of a technology. Another way of thinking of it is as a shift from low-level abstractions to higher-level abstractions - and, critically, making these abstractions non-leaky. It’s not a dumbing-down; it’s a freeing-up of developer brainpower to focus on more interesting things. Conceptual compression has been a steady trend in our industry from its earliest days. We have seen the shift from assembly languages to higher-level languages, the introduction of garbage collection to reduce the development cost (and functional impact) of memory management, the replacement of raw SQL calls with ORMs, the introduction of highly accessible machine learning libraries, the replacement of hardware with IaaS, and the replacement of individual systems with PaaS. The move of quantum computers to the cloud is another example of conceptual compression; it means people who want to use them don’t have to worry about managing their own superconducting transmon qubits or the health of their dilution refrigerator.

There is a similar trend in quantum programming. Fifteen years ago, anyone wanting to implement a quantum algorithm would have needed to implement the gate sequences directly at the hardware level. Now, tools such as the QISkit quantum SDK allow quantum programs to be written, and then compiled for execution on hardware. QISkit includes a Python-based quantum DSL and qasm, a quantum assembly language. However, even with the Python version, someone wanting to take advantage of quantum capabilities needs to understand the fundamentals of quantum computation. QISkit programs are written in terms of quantum gates and registers. To extend the compression analogy, compression reduces file size, but it doesn’t make files vanish. At the moment, the mental file-size required to write an effective quantum-based algorithm is still pretty big. It seems clear that, over time, quantum developers will be able to take advantage of more high-level abstractions.
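To give a flavour of the level of abstraction involved, here is roughly what a minimal QISkit program looks like: a two-qubit circuit that creates an entangled Bell pair, expressed directly in terms of registers, gates, and measurements. (The construction API shown here is long-standing, but the way circuits are executed has changed across QISkit releases, so the run step is deliberately left out.)

    from qiskit import QuantumCircuit

    qc = QuantumCircuit(2, 2)   # two qubits, two classical bits for the results
    qc.h(0)                     # put qubit 0 into an equal superposition
    qc.cx(0, 1)                 # entangle qubit 1 with qubit 0 (a Bell pair)
    qc.measure([0, 1], [0, 1])  # read both qubits out into the classical bits

    print(qc.draw())            # ASCII picture of the circuit

    # Running the circuit on a simulator or on real hardware goes through a
    # 'backend'; the exact API for that step varies between QISkit versions.

The developer still has to think in terms of superposition, entanglement, and measurement; the SDK removes the hardware-control plumbing, not the quantum mechanics.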

In the future, we will almost certainly see the development of quantum libraries. We may even see the elimination of quantum libraries; that is, if quantum hardware becomes ubiquitous enough, quantum libraries may be replaced by general-purpose optimisation libraries which automatically choose which parts of a given calculation should be done in a quantum way, and which in a classical way. (This is similar to how modern machine learning libraries will interrogate system hardware and run GPU-optimised versions of calculations where GPUs are available, so that machine-learning developers do not need to think at the GPU-level.)
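Nothing like this exists today, but a library of the kind being imagined might look something like the sketch below. Every function and name in it is invented purely for illustration; the point is only the dispatch pattern, in which the caller asks for a solution and the library quietly decides whether a quantum backend is available and worth using, much as machine learning libraries fall back from GPU to CPU.

    # Hypothetical sketch only: none of these functions come from a real library.

    def quantum_backend_available() -> bool:
        # A real library would probe for configured cloud QPU credentials here.
        return False

    def looks_quantum_friendly(problem: dict) -> bool:
        # A real library would inspect the problem's structure and size.
        return problem.get("size", 0) > 10_000

    def run_on_quantum_backend(problem: dict) -> str:
        return f"quantum result for {problem['name']}"

    def run_classical_heuristic(problem: dict) -> str:
        return f"classical result for {problem['name']}"

    def solve(problem: dict) -> str:
        """Dispatch transparently, the way ML libraries fall back from GPU to CPU."""
        if quantum_backend_available() and looks_quantum_friendly(problem):
            return run_on_quantum_backend(problem)
        return run_classical_heuristic(problem)

    print(solve({"name": "vehicle routing", "size": 50_000}))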

Quantum advantage and quantum readiness

Quantum advantage is defined as the ability of quantum computers to solve problems that classical computers cannot, for a practical purpose. Quantum advantage will have been achieved when quantum computers are large enough and robust enough to be useful. We’re not there yet, and we don’t even know exactly where the boundary is. However, we are in a period of “quantum readiness” where momentum is gathering and the pieces are slotting into place. 

Although no one can predict exactly when we will see quantum advantage, recent progress has been impressive enough that the milestone seems inevitable. At the moment, quantum computers are terrifically expensive, available only to a select few, limited in capacity, and, truth be told, fairly unreliable. This sounds depressing, but it's actually pretty similar to where classical computers were seventy or eighty years ago. In 1943, IBM's Thomas J. Watson told his investors, "I think there is a world market for maybe five computers." By 1949, Popular Mechanics breathlessly predicted that “Computers in the future may weigh no more than 1.5 tons.” It's tempting to laugh at how unambitious those predictions were, with the benefit of seventy years' hindsight. It may be that in another seventy years our descendants will also be laughing about how we got excited over 50 qubits and had to use huge refrigerated cylinders to achieve 0.0001-second coherence times.

This is part three of a series on quantum computing. Part One, which provides an introduction to the topic and focuses on quantum computation, can be found at “Cats, Qubits, and Teleportation: The Spooky World of Quantum Computation (Part One)”. Part Two, which focuses on quantum algorithms, can be found at “Cats, Qubits, and Teleportation: The Spooky World of Quantum Algorithms (Part 2)”.

About the Author

Holly Cummins is a full-stack developer, and cloud computing technical lead. She is also a frequent speaker, Java Champion, and author of Enterprise OSGi in Action. Holly holds a DPhil (PhD) in Quantum Computation from the University of Oxford.


Community comments

  • Request for more

    by Ilya Komarov,

    Thanks for the great introduction to quantum computing. I'm excited and I want to know more. What would you recommend for deeper learning of the subject? (Some textbooks on quantum algorithms, some information on existing quantum computers.)

  • Computing and physics

    by Teresa Schofield,

    Thank you for writing these nice intros to Quantum computing and how it may find real application in future. I shall certainly share them with young professionals I know.
    I'm an electronic engineer so have always known that Silicon physics affects technology progress. Hence, I watch companies like Rigetti or research at leading universities in the UK with some hope that they are creating the foundation for solving problems decades from now.
    FYI I was fortunate to attend the lecture that Sir Michael Pepper gave at the IET and recommend that to show how difficult this work is.
    I look forward to reading more from you as I see other good writing on AI topics by people like Rachel Thomas. Thank you.

  • Very informative article

    by Wang Guibao,

    Very informative article
