
Deep Quantum Neural Networks are Gaussian Process (2305.12664v1)

Published 22 May 2023 in quant-ph

Abstract: The overparameterization of variational quantum circuits, as a model of Quantum Neural Networks (QNN), not only improves their trainability but also serves as a method for evaluating the properties of a given ansatz by investigating its kernel behavior in this regime. In this study, we shift our perspective from the traditional viewpoint of training in parameter space to function space by employing Bayesian inference in the Reproducing Kernel Hilbert Space (RKHS). We observe that initializing parameters from a Haar-random distribution results in the QNN behaving like a Gaussian Process (QNN-GP) at large width or, empirically, at large depth. This outcome aligns with the behavior observed in classical neural networks under Gaussian initialization in similar circumstances. Moreover, we present a framework for examining the impact of finite width in closed form using a $1/d$ expansion, where $d$ is the dimension of the circuit's Hilbert space. The deviation from Gaussian output can be monitored by introducing new quantum meta-kernels. Furthermore, we elucidate the relationship between the GP and its parameter-space equivalent, characterized by the Quantum Neural Tangent Kernel (QNTK). This study offers a systematic, perturbation-based way to study QNN behavior in over- and under-parameterized scenarios, and addresses the limitations of tracking gradient descent for higher-order corrections such as the dQNTK and ddQNTK. Additionally, this probabilistic viewpoint naturally accommodates noise within our model.
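As a rough numerical illustration of the Haar-initialization behavior the abstract describes, the sketch below samples Haar-random pure states (a stand-in for the output of a deep, randomly initialized circuit) and checks the first two moments of a Pauli-$Z$ expectation value against the Haar predictions. All names and parameter choices here are our own illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_state(d, rng):
    """Haar-random pure state: a normalized complex Gaussian vector."""
    v = rng.normal(size=d) + 1j * rng.normal(size=d)
    return v / np.linalg.norm(v)

n_qubits = 4
d = 2 ** n_qubits  # Hilbert-space dimension, the `d` of the 1/d expansion

# Diagonal of Z on the first qubit: +1 if that qubit is |0>, -1 if |1>
z_first = np.array([1.0 if ((i >> (n_qubits - 1)) & 1) == 0 else -1.0
                    for i in range(d)])

# Circuit output f = <psi|Z_1|psi> over many Haar-random initializations
samples = np.array([np.real(np.vdot(psi, z_first * psi))
                    for psi in (haar_state(d, rng) for _ in range(2000))])

# Haar moments: E[f] = Tr(Z_1)/d = 0, E[f^2] = 1/(d+1)
print(f"mean     = {samples.mean():+.4f}  (Haar prediction: 0)")
print(f"variance = {samples.var():.4f}  (Haar prediction: 1/(d+1) = {1/(d+1):.4f})")
```

At traceless observables the mean vanishes and the variance scales as $1/(d+1)$, so the output distribution concentrates as $d$ grows; the paper's meta-kernels track the finite-$d$ deviations from exact Gaussianity that this toy check does not resolve.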
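The function-space viewpoint mentioned in the abstract amounts to standard GP regression with a kernel in the RKHS. The following minimal sketch uses an RBF kernel purely as a placeholder; in the paper's setting the kernel would be the quantum kernel induced by the circuit, which is not reproduced here.

```python
import numpy as np

def rbf(x1, x2, ell=1.0):
    """Placeholder RBF kernel; a quantum kernel would take its place."""
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / ell) ** 2)

# Toy training data and query points
X = np.array([-1.0, 0.0, 1.0])
y = np.sin(X)
Xs = np.linspace(-2.0, 2.0, 5)

noise = 1e-6  # small jitter; the probabilistic view absorbs real noise here
K = rbf(X, X) + noise * np.eye(len(X))
Ks = rbf(Xs, X)

# GP posterior mean and covariance (Bayesian inference in the RKHS)
mean = Ks @ np.linalg.solve(K, y)
cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)

print("posterior mean:", np.round(mean, 3))
```

With near-zero noise the posterior mean interpolates the training targets, and the posterior covariance quantifies uncertainty away from the data; this is the parameter-free counterpart of gradient-descent training that the QNTK connects back to parameter space.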


Authors (1)