Fourier Uncertainty Principles, Scale Space Theory and the Smoothest Average (2005.01665v1)
Abstract: Let $f \in L^2(\mathbb{R}^n)$ and suppose we are interested in computing its average at a fixed scale. This is easy: we pick the density $u$ of a probability distribution with mean $0$ and some moment at the desired scale and compute the convolution $u * f$. Is there a particularly natural choice for $u$? This question is studied in scale space theory and the Gaussian is a popular answer. We were interested in whether a canonical choice for $u$ can arise from a new axiom: having fixed a scale, the average should oscillate as little as possible, i.e. $$ u = \arg\min_{u} \sup_{f \in L^2(\mathbb{R}^n)} \frac{\| \nabla (u * f) \|_{L^2(\mathbb{R}^n)}}{\| f \|_{L^2(\mathbb{R}^n)}}.$$ This optimal function turns out to be a minimizer of an uncertainty principle: for $\alpha > 0$ and $\beta > n/2$, there exists $c_{\alpha, \beta, n} > 0$ such that for all $u \in L^1(\mathbb{R}^n)$ $$ \| |\xi|^{\beta} \cdot \widehat{u} \|^{\alpha}_{L^{\infty}(\mathbb{R}^n)} \cdot \| |x|^{\alpha} \cdot u \|^{\beta}_{L^1(\mathbb{R}^n)} \geq c_{\alpha, \beta, n} \| u \|_{L^1(\mathbb{R}^n)}^{\alpha + \beta}.$$ For $\beta = 1$, any nonnegative extremizer of the inequality serves as the best averaging function in the sense above; $\beta \neq 1$ corresponds to other derivatives. For $(n, \beta) = (1,1)$ we use the Shannon-Whittaker formula to prove that the characteristic function $u(x) = \chi_{[-1/2,1/2]}$ is a local minimizer among functions defined on $[-1/2,1/2]$ for $\alpha \in \{2,3,4,5,6\}$. We provide a sufficient condition for general $\alpha$ in terms of a sign pattern for the hypergeometric function ${}_1F_2$.
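By Plancherel's theorem, the minimax quantity in the abstract reduces (in one dimension, with the convention $\widehat{u}(\xi) = \int u(x) e^{-2\pi i x \xi}\,dx$) to $\sup_{\xi} 2\pi|\xi|\,|\widehat{u}(\xi)|$, since $\widehat{(u*f)'} = 2\pi i \xi\, \widehat{u}\, \widehat{f}$. The following sketch evaluates this supremum numerically for the box filter $\chi_{[-1/2,1/2]}$ and for a Gaussian; matching the two densities by variance is an assumption made here for a fair scale comparison, not a normalization taken from the paper.

```python
import numpy as np

def sharpness(u_hat, xis):
    """sup over xi of 2*pi*|xi| * |u_hat(xi)|, which by Plancherel equals
    sup_f ||d/dx (u*f)||_{L^2} / ||f||_{L^2} (Fourier convention e^{-2 pi i x xi})."""
    return np.max(2 * np.pi * np.abs(xis) * np.abs(u_hat(xis)))

# Dense frequency grid (start slightly above 0 to avoid 0/0 in sinc).
xis = np.linspace(1e-9, 50.0, 2_000_000)

# Box filter chi_{[-1/2,1/2]}: its Fourier transform is sin(pi xi)/(pi xi),
# so 2*pi*|xi|*|u_hat| = 2|sin(pi xi)| and the supremum is exactly 2.
box_hat = lambda xi: np.sin(np.pi * xi) / (np.pi * xi)

# Gaussian density with the same variance 1/12 as the box (our assumption);
# its Fourier transform is exp(-2 pi^2 sigma^2 xi^2).
sigma = 1.0 / np.sqrt(12.0)
gauss_hat = lambda xi: np.exp(-2.0 * np.pi**2 * sigma**2 * xi**2)

print(sharpness(box_hat, xis))    # -> 2.0
print(sharpness(gauss_hat, xis))  # -> ~2.10 (analytically e^{-1/2}/sigma)
```

Under this variance-matched normalization the box filter yields the smaller worst-case oscillation (2.0 versus about 2.10), which is consistent with the paper's claim that the characteristic function is a local minimizer.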