When compared to fault-tolerant quantum computational strategies, variational quantum algorithms stand out as leading candidates for achieving quantum advantage in real-world applications in the near term. However, optimizing the circuit parameters remains arduous, impeded by obstacles such as barren plateaus, numerous local minima in the optimization landscape, and limited quantum resources. A non-random initialization of the parameters appears to be key to the successful training of parametrized quantum circuits (PQCs). Drawing on and extending ideas from the field of meta-learning, we address this parameter initialization task with machine learning and propose FLIP: a FLexible Initializer for arbitrarily-sized Parametrized quantum circuits.
FLIP can be applied to any family of PQCs and, instead of relying on a generic set of initial parameters, is tailored to learn the structure of successful parameters from a family of related problems used as the training set. The flexibility of FLIP lies in its ability to predict parameter initializations for circuits with more parameters than those used during the training phase, a critical feature lacking in other meta-learning parameter-initialization strategies proposed to date. We illustrate the advantage of using FLIP in three scenarios: a family of problems with proven barren plateaus, PQC training to solve max-cut problem instances, and PQC training for finding the ground-state energies of 1D Fermi-Hubbard models.