
On discrete least square projection in unbounded domain with random evaluations and its application to parametric uncertainty quantification (1403.6579v1)

Published 26 Mar 2014 in math.NA

Abstract: This work is concerned with approximating multivariate functions on unbounded domains by discrete least-squares projection with random point evaluations. Particular attention is given to functions with Gaussian or Gamma random parameters. We first demonstrate that the traditional Hermite (Laguerre) polynomial chaos expansion suffers from \textit{instability}, in the sense that an \textit{infeasible} number of points, relative to the dimension of the approximation space, is needed to guarantee stability in the least-squares framework. We then propose to use Hermite/Laguerre \textit{functions} (rather than polynomials) as bases in the expansion. The corresponding design points are obtained by mapping uniformly distributed random points from bounded intervals to the unbounded domain, which involves a mapping parameter $L$. With the Hermite/Laguerre functions and a properly chosen mapping parameter, stability is significantly improved even when the number of design points scales only \textit{linearly} (up to a logarithmic factor) with the dimension of the approximation space. Apart from stability, another important issue is the rate of convergence. To speed up convergence, an effective scaling factor is introduced, and a principle for choosing a quasi-optimal scaling factor is discussed. Applications to parametric uncertainty quantification are illustrated with a random ODE model and an elliptic problem with lognormal random input.
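The core idea in the abstract can be sketched in a few lines of NumPy: evaluate a target function at random points mapped from a bounded interval to the real line, build a design matrix of normalized Hermite functions (not polynomials), and solve the resulting least-squares problem. This is a minimal one-dimensional sketch under assumptions not fixed by the abstract: the specific algebraic map `x = L*t/sqrt(1 - t^2)` and the 5x linear oversampling ratio are illustrative choices, not necessarily those of the paper.

```python
import numpy as np
from numpy.polynomial.hermite import hermval
from math import factorial, pi, sqrt

def hermite_function(n, x):
    # Normalized Hermite *function* psi_n(x) = H_n(x) exp(-x^2/2) / sqrt(2^n n! sqrt(pi)),
    # which stays bounded on the whole real line (unlike the polynomial H_n).
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    norm = sqrt(2.0**n * factorial(n) * sqrt(pi))
    return hermval(x, coeffs) * np.exp(-x**2 / 2.0) / norm

def least_squares_fit(f, dim, num_pts, L=1.0, seed=0):
    # Uniform random points in (-1, 1), mapped to the real line with parameter L.
    rng = np.random.default_rng(seed)
    t = np.clip(rng.uniform(-1.0, 1.0, num_pts), -1 + 1e-9, 1 - 1e-9)
    x = L * t / np.sqrt(1.0 - t**2)  # illustrative algebraic map (assumption)
    # Design matrix of Hermite functions and discrete least-squares solve.
    A = np.column_stack([hermite_function(n, x) for n in range(dim)])
    c, *_ = np.linalg.lstsq(A, f(x), rcond=None)
    return c

# Target: a smooth, decaying function on the real line.
f = lambda x: np.exp(-x**2)
dim = 16
c = least_squares_fit(f, dim, num_pts=5 * dim, L=2.0)  # linear oversampling

xs = np.linspace(-3.0, 3.0, 200)
approx = sum(c[n] * hermite_function(n, xs) for n in range(dim))
err = np.max(np.abs(approx - f(xs)))
```

Note that the number of design points here grows linearly with `dim`, mirroring the linear (up to a log factor) scaling that the paper claims suffices for stability once Hermite functions replace Hermite polynomials.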


Authors (2)

