A multidimensional Tauberian theorem for Laplace transforms of ultradistributions (1902.00902v2)
Published 3 Feb 2019 in math.FA
Abstract: We obtain a multidimensional Tauberian theorem for Laplace transforms of Gelfand-Shilov ultradistributions. The result is derived from a Laplace transform characterization of bounded sets in spaces of ultradistributions with supports in a convex acute cone of $\mathbb{R}^{n}$, also established here.
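For context, the following is a sketch of the standard setup behind such results, not necessarily the paper's exact notation: for a (ultra)distribution $f$ supported in a closed convex acute cone $C \subset \mathbb{R}^{n}$, the Laplace transform is usually defined on the tube domain over the interior of the dual cone,
$$
\mathcal{L}f(z) \;=\; \big\langle f(t),\, e^{\,i z\cdot t}\big\rangle,
\qquad z \in T^{C^{*}} = \mathbb{R}^{n} + i\,\operatorname{int} C^{*},
\qquad C^{*} = \{\xi \in \mathbb{R}^{n} : \xi\cdot t \ge 0 \ \text{for all } t \in C\}.
$$
A Tauberian theorem in this setting infers asymptotic or boundedness properties of $f$ from the behaviour of $\mathcal{L}f(z)$ as $z$ approaches the boundary of the tube domain; the paper carries this out for Gelfand-Shilov ultradistributions.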