# Noisy Intermediate-Scale Quantum Technology

This blog gives an overview of the emerging quantum computing era, following Google's demonstration of quantum supremacy.

Recently, Google released a white paper for a quantum computing library it developed in collaboration with the University of Waterloo and Volkswagen. This library, TensorFlow Quantum (TFQ), can be used to build prototype quantum machine learning models.

Under the hood, TFQ uses Cirq and TensorFlow. Cirq is a Python library for writing code that corresponds to quantum circuits running on quantum chips or simulators. Cirq was developed to provide software support for NISQ chips, i.e. Noisy Intermediate-Scale Quantum devices.

In this blog, we will focus on NISQ; in the next blog, we will summarize the white paper on TFQ by TensorFlow and then build our own model using TFQ in simulation. For starters, you can go through this amazing **series on quantum computing by Jonathan Hui**. The series will help you understand the quantum basics: notation, quantum vector math, single-qubit gates, multi-qubit gates, and real-life algorithms that leverage quantum computing to overcome the curse of computational complexity, such as Shor's algorithm, which can be used to break RSA encryption.

Therefore, if you go through that series first and then read these blogs, they will make more sense to you.

So the very first question that comes to mind is: why quantum computing? It turns out that there are many physical processes that cannot be modelled by classical computers because of limits on their expressiveness. Superposition and entanglement are the key concepts the quantum mechanical model brings to bear on these problems.
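Superposition can be made concrete with a tiny hand-rolled NumPy sketch (this is illustrative plain linear algebra, not TFQ or Cirq code): applying a Hadamard gate to the state |0⟩ produces a state that measures 0 or 1 with equal probability.

```python
import numpy as np

# A single-qubit state is a length-2 complex vector; |0> = [1, 0].
zero = np.array([1.0, 0.0])

# The Hadamard gate puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = H @ zero
probs = np.abs(state) ** 2  # measurement probabilities (Born rule)
print(probs)  # [0.5 0.5]
```

A classical bit is always definitely 0 or 1; here the single state carries both outcomes at once until it is measured.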

With quantum computers we should be able to probe more deeply into the properties of complex molecules and exotic materials, and also to explore fundamental physics in new ways, for example by simulating the properties of elementary particles, or the quantum behaviour of a black hole, or the evolution of the universe right after the big bang.

There are two principles underpinning the bet that quantum computing will push the frontier of problem solving. The first is quantum complexity and the second is quantum error correction. Underlying both of these principles is the idea of quantum entanglement.

Long story short, quantum entanglement is like reading a book in which each page does not follow in sequence from the previous one; rather, all the pages are interlinked with each other in some seemingly random fashion. To understand the book as a whole, you need to decode the underlying interlinking of all the pages.

There are three good reasons for thinking that quantum computers have capabilities surpassing what classical computers can do.

- **Quantum algorithms for classically intractable problems.** First, we know of problems that are believed to be hard for classical computers, but for which quantum algorithms have been discovered that could solve them easily. The best-known example is the problem of finding the prime factors of a large composite integer. We believe factoring is hard because many smart people have tried for many decades to find better factoring algorithms and haven't succeeded. Perhaps a fast classical factoring algorithm will be discovered in the future, but that would be a big surprise.
- **Complexity theory arguments.** Theoretical computer scientists have provided arguments, based on complexity theory, showing (under reasonable assumptions) that quantum states which are easy to prepare with a quantum computer have superclassical properties; specifically, if we measure all the qubits in such a state, we are sampling from a correlated probability distribution that can't be sampled by any efficient classical means.
- **No known classical algorithm can simulate a quantum computer.** But perhaps the most persuasive argument that quantum computing is powerful is simply that we don't know how to simulate a quantum computer using a digital computer; that remains true even after many decades of effort by physicists to find better ways to simulate quantum systems.
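The "correlated probability distribution" point can be seen in miniature with a two-qubit Bell state (again a NumPy sketch with illustrative names, not library code): the two measured bits are random individually, yet always agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# The Bell state (|00> + |11>) / sqrt(2) as a 4-component statevector
# over the basis states 00, 01, 10, 11.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Sample measurement outcomes from the Born-rule distribution.
probs = np.abs(bell) ** 2
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)

# The bits are perfectly correlated: we only ever observe 00 or 11.
assert set(outcomes) <= {"00", "11"}
```

For two qubits this distribution is trivial to sample classically, of course; the complexity-theoretic claim is that for states on many qubits, no efficient classical sampler exists.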

# NISQ era

- NISQ stands for Noisy Intermediate-Scale Quantum. Here "intermediate scale" refers to the size of the quantum computers that will be available in the next few years, with qubit counts ranging from 50 to a few hundred.
- “Noisy” emphasizes that we’ll have imperfect control over those qubits; the noise will place serious limitations on what quantum devices can achieve in the near term.

## Qubit quality

- The quality of the qubits determines how accurately we can actually perform quantum operations.
- We don't yet know whether low error rates can be maintained in larger devices and larger quantum circuits. Roughly speaking, we don't expect to be able to execute a circuit containing many more than 1,000 gates, i.e. 1,000 fundamental two-qubit operations, because the noise will overwhelm the signal in any circuit much bigger than that.
- You cannot simply add quantum gates for error correction, because doing so compromises the computational capability of the circuit.
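A back-of-the-envelope calculation shows why a ceiling of roughly 1,000 gates is plausible. The 0.1% per-gate error rate below is an assumption for illustration, not a figure from the paper:

```python
# If each two-qubit gate succeeds independently with probability (1 - p),
# the whole circuit succeeds with probability roughly (1 - p) ** n_gates.
p = 0.001  # assumed 0.1% error per gate (illustrative)

for n_gates in (100, 1000, 10000):
    fidelity = (1 - p) ** n_gates
    print(f"{n_gates:>5} gates -> success probability ~ {fidelity:.2f}")
```

At 1,000 gates the success probability has already dropped to about 0.37, and at 10,000 gates the output is essentially pure noise.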

## Quantum Optimizer

- For some optimization problems, even finding an approximate solution is NP-hard if the approximation ratio is sufficiently close to one, and in those cases we don't expect a quantum computer to be able to find approximate solutions efficiently for hard instances of the problem. But for many problems there is a big gap between the approximation achieved by the best classical algorithm we currently know and the barrier of NP-hardness. So it would not be shocking to discover that quantum computers have an advantage over classical ones for the task of finding approximate solutions, an advantage that some users might find quite valuable.
- **Hybrid quantum-classical algorithm.** We use a quantum processor to prepare an n-qubit state, then measure all the qubits and process the measurement outcomes using a classical optimizer; this classical optimizer instructs the quantum processor to alter slightly how the n-qubit state is prepared. This cycle is repeated many times until it converges to a quantum state from which the approximate solution can be extracted. When applied to classical combinatorial optimization problems, this procedure goes by the name **Quantum Approximate Optimization Algorithm (QAOA)**. When applied to quantum problems, this hybrid quantum-classical procedure goes by the name **Variational Quantum Eigensolver (VQE)**.
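The hybrid loop can be sketched in plain NumPy. Here the "quantum processor" is a noiseless single-qubit simulator with one circuit parameter, and the "classical optimizer" is finite-difference gradient descent; a real VQE would use a noisy device, many qubits, and a richer ansatz, so treat this only as a minimal sketch of the feedback cycle.

```python
import numpy as np

def energy(theta):
    """Noiseless 'quantum processor': prepare cos(t/2)|0> + sin(t/2)|1>
    and return the expectation value <Z> = cos(theta)."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    Z = np.diag([1.0, -1.0])
    return float(state @ Z @ state)

# Classical outer loop: gradient descent on the single circuit parameter.
theta, lr, eps = 0.3, 0.4, 1e-4
for _ in range(200):
    grad = (energy(theta + eps) - energy(theta - eps)) / (2 * eps)
    theta -= lr * grad

# Minimizing <Z> drives the state toward |1>, i.e. theta -> pi.
print(round(energy(theta), 3))  # -1.0
```

Each iteration is one turn of the QAOA/VQE cycle described above: run the circuit, extract a cost from the measurements, and let the classical optimizer nudge the circuit parameters.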

# Quantum deep learning

- Machine learning is transforming technology and having a big impact on science as well, so it is natural to wonder about the potential of combining machine learning with quantum technology. There are a variety of different notions of “quantum machine learning.” Much of the literature on the subject builds on quantum algorithms that speed up linear algebra related tasks.
- Deep learning itself has become a broad subject, but to be concrete let us contemplate a restricted Boltzmann machine. This may be regarded as a spin system in thermal equilibrium at a low but nonzero temperature, with many hidden layers of spins separating the input and output. (The word “restricted” means that there are no couplings among spins within a given layer; instead only spins in successive layers are coupled.) The system may have millions of coupling parameters, which are optimized during a training phase to achieve a desired joint probability distribution for input and output.
- The quantum analog of such a machine could have a similar structure and function, but where the spins are qubits rather than classical bits. It may be that quantum deep learning networks have advantages over classical ones; for example, they might be easier to train for some purposes. But we don’t really know — it’s another opportunity to try it and see how well it works.
- One possible reason for being hopeful about the potential of quantum machine learning rests on the concept known as QRAM — quantum random access memory. By QRAM we mean representing a large amount of classical data very succinctly, by encoding a vector with N components in just log N qubits. But typical proposals for quantum machine learning applications are confounded by severe input/output bottlenecks.
- More broadly, it seems plausible that quantum deep learning would be more powerful than its classical counterpart when trying to learn probability distributions for which quantum entanglement has a significant role. That is, quantum deep learning networks might be very well suited for quantum tasks, but for applications of deep learning that are widely pursued today it is unclear why quantum networks would have an advantage.
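The QRAM claim above can be illustrated with amplitude encoding (a NumPy sketch; the data values are arbitrary, and efficiently loading such amplitudes into a real device is exactly the input bottleneck mentioned in the bullet on QRAM):

```python
import numpy as np

# Amplitude encoding: a classical vector with N components can be stored
# in the amplitudes of just log2(N) qubits.
data = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])  # N = 8

amplitudes = data / np.linalg.norm(data)  # quantum states must have unit norm
n_qubits = int(np.log2(len(data)))

print(n_qubits)  # 3 qubits suffice for 8 amplitudes
print(np.isclose(amplitudes @ amplitudes, 1.0))  # True: a valid statevector
```

The exponential compression is real, but reading the vector back out requires many measurements, which is why input/output costs can erase the advantage.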

So this was just an overview of NISQ. It was important to touch on the basics of NISQ so that we have the context for TFQ. In the next piece, we will talk about the TFQ white paper.

Source : https://arxiv.org/pdf/1801.00862.pdf