Pablo Bermejo (DIPC-Multiverse Computing)
We are currently in the era of NISQ (Noisy Intermediate-Scale Quantum) devices. This means that the near-term future remains uncertain for fields such as quantum computing, which relies entirely on quantum hardware not yet capable of delivering the performance required for game-changing applications. Despite this obstacle, we are navigating an exciting landscape of interesting contributions to this promising area.
In this talk we will explore one of the most promising topics within quantum computing: quantum machine learning. This intersection of quantum computing and machine learning is widely believed to offer genuine advantages over classical paradigms in the coming years. Quantum machine learning has inherent features that make it an attractive area of research for both communities, which have lately focused on its implementation on NISQ devices. Here, I will present a few algorithms that we have developed in this context. This builds upon two previous seminars in this series, which introduced the VQE algorithm (Cristian Tabares from QUINFOG) and the concept of barren plateaus and their caveats for optimization processes (Nadir Samos from INMA). First, we will see how to perform unsupervised learning (clustering) using as little as a single qubit, and how to solve continuous-variable optimization problems on quantum circuits at the expense of tomography. In addition, we will explain in detail a novel approach, based on coordinate transformations, to tackle common barriers in gradient-based optimization (mostly in the form of barren plateaus and local minima). We will also present several benchmarks performed on well-known quantum machine learning algorithms.