Minimising statistical errors in calibration of quantum-gate sets
Abstract: Calibration of quantum gates is a necessary hurdle to overcome on the way to a reliable quantum computer. In a previous paper, a protocol called Gate Set Calibration (GSC) was introduced and used to learn coherent errors of multi-qubit quantum gates. Here, we extend that study in several ways: first, we perform a statistical analysis of the measurement uncertainties; second, we find explicit measurement settings that minimize this uncertainty while requiring that the protocol involve only a small number of distinct gates, which aids physical realizability. We numerically demonstrate that adding just two single-qubit gates to GSC reduces the statistical error in the calibration of a CNOT gate by more than a factor of two.
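The abstract's central claim is that adding measurement settings (here, extra single-qubit gates) shrinks the statistical error of the calibrated gate parameters. A minimal toy sketch of that effect is given below; it is not the authors' GSC protocol, but a Cramér-Rao illustration of how combining more settings increases the total Fisher information about a single coherent-error parameter. The error model, rotation angles, and shot count are illustrative assumptions only.

```python
# Toy illustration (not the GSC protocol): estimate a small coherent over-rotation
# epsilon on one qubit from projective measurements, and compare the shot-noise
# limit when one vs. three measurement settings are combined.
import numpy as np

def outcome_probability(epsilon, theta):
    """P(measure |1>) after rotating |0> by theta + epsilon about the X axis."""
    return np.sin((theta + epsilon) / 2.0) ** 2

def fisher_information(epsilon, theta, shots):
    """Fisher information about epsilon carried by `shots` binomial outcomes."""
    p = outcome_probability(epsilon, theta)
    dp = 0.5 * np.sin(theta + epsilon)      # derivative of p with respect to epsilon
    return shots * dp ** 2 / (p * (1.0 - p))

epsilon_true = 0.02     # hypothetical small coherent error to be calibrated (radians)
shots = 10_000          # measurement repetitions per setting (assumed)

# One setting vs. three settings; the angles are chosen purely for illustration.
setting_sets = {
    "1 setting":  [np.pi / 2],
    "3 settings": [np.pi / 3, np.pi / 2, 2 * np.pi / 3],
}

for label, settings in setting_sets.items():
    # Cramér-Rao bound: Var(estimator) >= 1 / (total Fisher information).
    total_info = sum(fisher_information(epsilon_true, th, shots) for th in settings)
    print(f"{label}: statistical error >= {1 / np.sqrt(total_info):.2e}")
```

Running the sketch shows the bound on the statistical error dropping as settings are added, which is the same qualitative mechanism the paper exploits when it augments GSC with two extra single-qubit gates.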