
Unifying (Quantum) Statistical and Parametrized (Quantum) Algorithms (2310.17716v1)

Published 26 Oct 2023 in quant-ph, cs.LG, and stat.ML

Abstract: Kearns' statistical query (SQ) oracle (STOC'93) lends a unifying perspective for most classical machine learning algorithms. This ceases to be true in quantum learning, where many settings admit neither an SQ analog nor a quantum statistical query (QSQ) analog. In this work, we take inspiration from Kearns' SQ oracle and Valiant's weak evaluation oracle (TOCT'14) and establish a unified perspective bridging the statistical and parametrized learning paradigms in a novel way. We explore the problem of learning from an evaluation oracle, which provides an estimate of function values, and introduce an extensive yet intuitive framework that yields unconditional lower bounds for learning from evaluation queries and characterizes the query complexity of learning linear function classes. The framework is directly applicable to the QSQ setting and to virtually all algorithms based on loss function optimization. Our first application is to extend prior results on the learnability of output distributions of quantum circuits and Clifford unitaries from the SQ to the (multi-copy) QSQ setting, implying exponential separations between learning stabilizer states from (multi-copy) QSQs versus from quantum samples. Our second application is to analyze some popular quantum machine learning (QML) settings. We gain an intuitive picture of the hardness of many QML tasks that goes beyond existing methods such as barren plateaus and the statistical dimension, and contains crucial setting-dependent implications. Our framework not only unifies the perspective of cost concentration with that of the statistical dimension in a common language but also exposes their connectedness and similarity.
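To make the two oracle models in the abstract concrete, the following is a minimal illustrative sketch (not taken from the paper): a Kearns-style statistical query (SQ) oracle that returns an estimate of an expectation over the data distribution, and a Valiant-style evaluation oracle that returns an estimate of a function value at a queried point, each only up to an additive tolerance. The function names, interfaces, and toy parity target are hypothetical choices for illustration.

import numpy as np

# Illustrative sketch only (not from the paper): a Kearns-style SQ oracle and a
# Valiant-style evaluation oracle, both answering up to an additive tolerance tau.
# All names and interfaces here are hypothetical.

rng = np.random.default_rng(0)

def sq_oracle(phi, examples, tau):
    # Estimate of E[phi(x, f(x))] over labeled examples, within additive error tau.
    true_value = np.mean([phi(x, y) for x, y in examples])
    return true_value + rng.uniform(-tau, tau)  # adversarial slack modeled as bounded noise

def evaluation_oracle(target, x, tau):
    # Estimate of the target function's value at the query point x, within tau.
    return target(x) + rng.uniform(-tau, tau)

# Toy usage: query the correlation of a hidden parity with the first input bit,
# then evaluate the hidden function at a single point.
f = lambda x: x[0] ^ x[1]                                   # hidden target on {0,1}^2
data = [((a, b), f((a, b))) for a in (0, 1) for b in (0, 1)]
corr = sq_oracle(lambda x, y: 1.0 if y == x[0] else 0.0, data, tau=0.05)
print(f"SQ estimate of Pr[f(x) = x_1]: {corr:.2f}")
print(f"evaluation of f at (1,1): {evaluation_oracle(f, (1, 1), tau=0.05):.2f}")

The lower bounds discussed in the paper are stated against oracles of this general shape, where the learner only ever sees tolerance-bounded answers rather than raw samples.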

Citations (5)
