IBM May Have Found a Path to Dealing with Decoherence in Current Quantum Computers

In a recent Nature paper, researchers from IBM and other institutions devised two quantum algorithms to train a support vector machine (SVM) classifier: one based on a variational quantum circuit, the other on a quantum kernel estimator. The key idea behind both algorithms is to use the quantum state space as the feature space, efficiently building a map that identifies significant features in raw data. Identifying the right features is essential to building a classifier that works correctly, but it tends to become computationally expensive on large datasets.
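To make the second approach more concrete, the following is a minimal NumPy sketch of the quantum kernel estimator idea: each sample is encoded into a small quantum state by a toy data-dependent circuit, the kernel is the fidelity between those states, and a classical SVM is trained on the precomputed kernel. The feature map, the synthetic dataset, and the use of scikit-learn are illustrative assumptions, not the circuits or data used in the paper.

    import numpy as np
    from sklearn.svm import SVC

    # Building blocks for a 2-qubit toy feature map.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Z = np.diag([1.0, -1.0])
    I2 = np.eye(2)

    def feature_map_state(x):
        """Encode a 2-feature sample into a 2-qubit state vector.

        Toy ZZ-style feature map: Hadamards followed by single- and
        two-qubit phase rotations driven by the data. Illustrative only,
        not the exact circuit from the paper.
        """
        # Start in |00> and apply a Hadamard on both qubits.
        state = np.kron(H, H) @ np.array([1.0, 0, 0, 0])
        # Data-dependent diagonal phases: x0*Z(x)I + x1*I(x)Z + x0*x1*Z(x)Z.
        phase = (x[0] * np.kron(Z, I2)
                 + x[1] * np.kron(I2, Z)
                 + x[0] * x[1] * np.kron(Z, Z))
        # All terms are diagonal, so the unitary acts elementwise.
        return np.exp(1j * np.diag(phase)) * state

    def quantum_kernel(X1, X2):
        """Kernel matrix K[i, j] = |<phi(x_i)|phi(z_j)>|^2 (state fidelity)."""
        S1 = np.array([feature_map_state(x) for x in X1])
        S2 = np.array([feature_map_state(x) for x in X2])
        return np.abs(S1.conj() @ S2.T) ** 2

    # Tiny synthetic dataset: two features, two classes.
    rng = np.random.default_rng(0)
    X = rng.uniform(-np.pi, np.pi, size=(40, 2))
    y = (np.sin(X[:, 0]) * np.sin(X[:, 1]) > 0).astype(int)

    K_train = quantum_kernel(X, X)
    clf = SVC(kernel="precomputed").fit(K_train, y)
    print("training accuracy:", clf.score(K_train, y))

On real hardware the fidelities would be estimated from measurement statistics over repeated circuit runs rather than computed from explicit state vectors, which is exactly where the quantum device does the work that becomes intractable classically.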

We’ve shown that as quantum computers become more powerful in the years to come, and their Quantum Volume increases, they will be able to perform feature mapping, a key component of machine learning, on highly complex data structures at a scale far beyond the reach of even the most powerful classical computers.

This is where the exponentially large quantum state space, and the ability of quantum computers to explore it in parallel, comes into play. According to the paper's authors, since such an exponentially growing space is only efficiently accessible on a quantum computer, their research could point the way toward quantum advantage.
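A back-of-the-envelope calculation shows why that state space quickly outgrows classical simulation: an n-qubit state is described by 2^n complex amplitudes, so merely storing it becomes infeasible well before n reaches the size of interesting feature spaces.

    # Storage needed to hold an n-qubit state vector classically:
    # 2**n complex amplitudes at 16 bytes each (complex128).
    for n in (10, 30, 50):
        dim = 2 ** n
        tib = dim * 16 / 2 ** 40
        print(f"{n} qubits: dimension {dim:.2e}, ~{tib:.2e} TiB to store")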

Even more significantly, IBM researchers seem to have found a novel way to deal with decoherence, a critical factor in the process of bringing a quantum system into a quasi-classical state and carrying out any measurements. The main issue with decoherence is the rapid decay of the wave function, which has the undesirable effect of generating noise and errors after a very short time. To keep decoherence under control, the paper proposes two approaches, one called probabilistic error correction and the other zero noise extrapolation.
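The second technique, zero noise extrapolation, lends itself to a brief illustration: run the same circuit several times with the noise deliberately amplified by known factors, then extrapolate the measured expectation values back to the zero-noise limit. The sketch below assumes the noise-amplified expectation values have already been measured; the numbers are placeholders, not results from the paper.

    import numpy as np

    # Expectation values of some observable measured at amplified noise levels.
    # (Placeholder numbers; on hardware these would come from repeated runs
    # with the noise stretched by the given factors.)
    noise_scale = np.array([1.0, 2.0, 3.0])
    measured = np.array([0.81, 0.66, 0.53])

    # Richardson-style extrapolation: fit a low-order polynomial in the
    # noise scale and evaluate it at zero, the estimated noise-free value.
    coeffs = np.polyfit(noise_scale, measured, deg=2)
    zero_noise_estimate = np.polyval(coeffs, 0.0)
    print(f"extrapolated zero-noise expectation value: {zero_noise_estimate:.3f}")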

Just as significantly, our feature-mapping worked as predicted: no classification errors with our engineered data, even as the IBM Q systems’ processors experienced decoherence.

Noise, and the error correction required to cope with it, is one of the most significant hurdles to overcome in making quantum hardware useful, so this result could open the way to more practical uses of quantum computing today.

Along with quantum supremacy, the conjecture that quantum computers can solve problems that classical computers cannot, quantum machine learning is an important area of research. Earlier this year, InfoQ covered a different result in quantum machine learning, from researchers at Google, who proposed a model of neural networks that fits within the current limitations of quantum hardware.
