A multidimensional Tauberian theorem for Laplace transforms of ultradistributions
Published 3 Feb 2019 in math.FA | (1902.00902v2)
Abstract: We obtain a multidimensional Tauberian theorem for Laplace transforms of Gelfand-Shilov ultradistributions. The result is derived from a Laplace transform characterization of bounded sets in spaces of ultradistributions with supports in a convex acute cone of $\mathbb{R}^{n}$, also established here.
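For orientation, the Laplace transform over a cone that the abstract refers to can be sketched as follows; the precise Gelfand-Shilov setting in the paper differs, so this is only the classical model case, with notation (the cone $\Gamma$, its dual cone $\Gamma^{*}$, and the tube domain $T^{C}$) assumed rather than taken from the paper.

```latex
% Classical model: for a distribution f supported in a closed convex
% acute cone \Gamma \subset \mathbb{R}^{n}, the Laplace transform is
%
%   \mathcal{L}[f](z) = \langle f(t),\, e^{i z \cdot t} \rangle,
%
% holomorphic on the tube domain
%
%   T^{C} = \mathbb{R}^{n} + i C, \qquad C = \operatorname{int} \Gamma^{*},
%
% where \Gamma^{*} = \{ \xi \in \mathbb{R}^{n} : \xi \cdot t \ge 0
% \text{ for all } t \in \Gamma \} is the dual cone. A Tauberian
% theorem in this setting recovers asymptotic behavior of f from the
% boundary behavior of \mathcal{L}[f] on T^{C}.
```

The paper's contribution, per the abstract, is to carry out this kind of analysis for Gelfand-Shilov ultradistributions, via a Laplace transform characterization of bounded sets in the corresponding spaces.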