
An Optimal Weighted Least-Squares Method for Operator Learning

Published 11 Dec 2025 in math.NA (arXiv:2512.11168v1)

Abstract: We consider the problem of learning an unknown, possibly nonlinear operator between separable Hilbert spaces from supervised data. Inputs are drawn from a prescribed probability measure on the input space, and outputs are (possibly noisy) evaluations of the target operator. We regard admissible operators as square-integrable maps with respect to a fixed approximation measure, and we measure reconstruction error in the corresponding Bochner norm. For a finite-dimensional approximation space $V$ of dimension $N$, we study weighted least-squares estimators in $V$ and establish probabilistic stability and accuracy bounds in the Bochner norm. We show that there exist sampling measures and weights, defined via an operator-level Christoffel function, that yield uniformly well-conditioned Gram matrices and near-optimal sample complexity, with a number of training samples $M$ on the order of $N \log N$. We complement the analysis by constructing explicit operator approximation spaces in cases of interest: rank-one linear operators that are dense in the class of bounded linear operators, and rank-one polynomial operators that are dense in the Bochner space under mild assumptions on the approximation measure. For both families we describe implementable procedures for sampling from the associated optimal measures. Finally, we demonstrate the effectiveness of this framework on several benchmark problems, including learning solution operators for the Poisson equation, viscous Burgers' equation, and the incompressible Navier-Stokes equations.
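The Christoffel-weighted sampling strategy described in the abstract can be illustrated in the simpler scalar-function setting (the operator-level construction in the paper is analogous). The sketch below is an assumption-laden illustration, not the paper's implementation: it uses an orthonormal Legendre basis on $[-1,1]$ as the approximation space $V$, samples inputs from the optimal measure whose density is proportional to the Christoffel function $k(x) = \sum_j |\phi_j(x)|^2$, and solves the weighted least-squares system with weights $w(x) = N/k(x)$, which is what produces the well-conditioned Gram matrix at $M \sim N \log N$ samples.

```python
import numpy as np

# Scalar-function sketch of Christoffel-weighted least squares.
# Assumptions: uniform base measure on [-1, 1], orthonormal Legendre
# basis as the approximation space V, noiseless target evaluations.

rng = np.random.default_rng(0)
N = 6  # dimension of the approximation space V

def basis(x):
    # Legendre basis, orthonormal w.r.t. the uniform measure dx/2 on [-1, 1]
    P = np.polynomial.legendre.legvander(x, N - 1)  # shape (len(x), N)
    return P * np.sqrt(2 * np.arange(N) + 1)

def christoffel(x):
    # k(x) = sum_j |phi_j(x)|^2 ; the optimal sampling density is k(x)/N
    return np.sum(basis(x) ** 2, axis=1)

# Rejection-sample M ~ N log N points from the optimal measure (k/N) dmu.
M = int(4 * N * np.log(N))
kmax = christoffel(np.array([1.0]))[0]  # k peaks at the endpoints (= N^2)
samples = []
while len(samples) < M:
    x = rng.uniform(-1.0, 1.0)
    if rng.uniform(0.0, kmax) < christoffel(np.array([x]))[0]:
        samples.append(x)
x = np.array(samples)

# Weighted least squares with weights w(x) = N / k(x).
w = N / christoffel(x)
y = np.sin(np.pi * x)  # target function to learn
Phi = basis(x)
G = (Phi * w[:, None]).T @ Phi / M  # weighted Gram matrix, close to identity
c = np.linalg.solve(G, (Phi * w[:, None]).T @ y / M)  # coefficients in V
```

With the Christoffel-based weights, `np.linalg.cond(G)` stays small even at this modest oversampling, whereas uniform (unweighted) sampling at the same `M` typically yields a noticeably worse-conditioned Gram matrix for bases that concentrate near the boundary.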
