In this work, we aim to address a crucial shortcoming of important near-term quantum algorithms. Running powerful, far-term quantum algorithms requires a large, nearly flawless quantum computer, with demands well beyond the capabilities of today’s noisy intermediate-scale quantum (NISQ) devices. Over the past decade, however, “near-term” quantum algorithms such as the variational quantum eigensolver (VQE) have shown promise for making use of NISQ devices. Unfortunately, mounting evidence suggests that they are unlikely to outperform supercomputers on problems of commercial value.
The bottleneck is the staggering number of measurements required of the quantum computer. Each measurement returns a statistical sample, and for valuable problems the number of samples, and hence the runtime, needed for sufficiently accurate estimates is prohibitively large. This has been referred to as the “measurement problem”.
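To see why the sample count is the bottleneck, consider estimating the expectation value of a ±1-valued observable by direct sampling: the statistical error shrinks only as one over the square root of the number of shots, so each additional digit of precision costs a hundredfold more measurements. The sketch below illustrates this with a plain Bernoulli simulation standing in for circuit runs (the probability `p` and shot counts are arbitrary illustrative values, not from the original).

```python
import random

def estimate_expectation(p, shots, rng):
    """Estimate a +/-1-valued observable whose true mean is 2p - 1
    by drawing `shots` Bernoulli samples (a stand-in for repeated
    preparation and measurement of a quantum circuit)."""
    hits = sum(rng.random() < p for _ in range(shots))
    return 2 * hits / shots - 1

rng = random.Random(0)
true_mean = 2 * 0.3 - 1  # p = 0.3, so the true mean is -0.4

# Standard sampling: error falls only as 1/sqrt(shots), so halving
# the error quadruples the number of measurements required.
for shots in (100, 10_000, 1_000_000):
    est = estimate_expectation(0.3, shots, rng)
    print(shots, abs(est - true_mean))
```

For chemistry-scale problems, the precision targets push this cost into impractically long runtimes, which is the heart of the measurement problem.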
Our proposed solution to the measurement problem maximizes the rate of information gain during the sampling process. We optimize this rate using our framework of engineered likelihood functions (ELFs), which leads to quantum circuit designs that balance the accrual of error against the increase in statistical power afforded by greater circuit depth. We hope this work provides an avenue for achieving quantum advantage as soon as possible.
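The depth-versus-error tradeoff can be made concrete with a stylized model (an illustration only, not the actual ELF construction): suppose an L-layer circuit yields outcomes with likelihood P(0) = (1 + f·cos((2L+1)θ))/2, where the fidelity f = exp(−λL) decays with depth. Deeper circuits sharpen the likelihood and so carry more Fisher information per shot, until accumulated error erodes the signal. The decay rate `λ`, the angle `θ`, and the cosine form are all assumed here for illustration.

```python
import math

def fisher_info(theta, layers, decay):
    """Fisher information per measurement for a stylized depth-dependent
    likelihood P(0) = (1 + f * cos((2L+1)*theta)) / 2, with fidelity
    f = exp(-decay * L) modeling error accrued with circuit depth.
    (Illustrative model; not the exact ELF circuit family.)"""
    m = 2 * layers + 1
    f = math.exp(-decay * layers)
    p = (1 + f * math.cos(m * theta)) / 2
    dp = -f * m * math.sin(m * theta) / 2  # derivative of p w.r.t. theta
    return dp * dp / (p * (1 - p))

theta, decay = 0.7, 0.05  # assumed values for the sketch
# Sweep depth: information per shot first grows with L, then decays as
# circuit error dominates; the best depth sits at that balance point.
best = max(range(50), key=lambda L: fisher_info(theta, L, decay))
print(best, fisher_info(theta, best, decay))
```

Choosing circuit parameters to maximize this information rate, rather than fixing the depth in advance, is the spirit of the ELF approach.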