
LDP for the covariance process in fully connected neural networks

Published 12 May 2025 in math.PR (arXiv:2505.08062v1)

Abstract: In this work, we study large deviation properties of the covariance process in fully connected Gaussian deep neural networks. More precisely, we establish a large deviation principle (LDP) for the covariance process in a functional framework, viewing it as a process in the space of continuous functions. As key applications of our main results, we obtain posterior LDPs under Gaussian likelihood in both the infinite-width and mean-field regimes. The proof is based on an LDP for the covariance process as a Markov process valued in the space of non-negative, symmetric trace-class operators equipped with the trace norm.
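The covariance process the abstract refers to can be illustrated numerically: in a fully connected network with Gaussian weights, the empirical covariance (Gram) matrix of the pre-activations evolves layer by layer, and at finite width it fluctuates randomly around its infinite-width limit. The sketch below simulates this process; the ReLU activation, the variance parameters, and the function name are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def covariance_process(X, depth=3, width=1000, sigma_w=1.0, sigma_b=0.0, seed=0):
    """Empirical covariance (Gram) matrices K^1, ..., K^depth of the
    pre-activations of a fully connected Gaussian network on inputs X.

    Illustrative sketch only: the activation (ReLU) and the variance
    parameters sigma_w, sigma_b are assumed choices.
    """
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    h = X
    covs = []
    for _ in range(depth):
        fan_in = h.shape[1]
        # i.i.d. Gaussian weights with the usual 1/fan_in variance scaling
        W = rng.normal(0.0, sigma_w / np.sqrt(fan_in), size=(fan_in, width))
        b = rng.normal(0.0, sigma_b, size=width)
        z = h @ W + b                   # pre-activations, shape (n, width)
        covs.append(z @ z.T / width)    # empirical covariance across neurons
        h = np.maximum(z, 0.0)          # ReLU nonlinearity (assumed)
    return covs
```

As the width grows, each empirical covariance matrix concentrates around a deterministic limit kernel; the large deviation principle established in the paper quantifies the exponentially small probability of observing atypical covariance trajectories across layers.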
