Random Growth via Gradient Flow Aggregation (2309.14313v1)
Abstract: We introduce Gradient Flow Aggregation (GFA), a random growth model. Given a set of existing particles $\{x_1, \dots, x_n\} \subset \mathbb{R}^2$, a new particle arrives from a random direction at $\infty$ and flows in the direction of $\nabla E$, where $$E(x) = \sum_{i=1}^{n} \frac{1}{|x-x_i|^{\alpha}} \qquad \mbox{for } 0 < \alpha < \infty.$$ The case $\alpha = 0$ refers to the logarithmic energy $-\sum_{i=1}^{n} \log |x-x_i|$. Particles stop once they are at distance 1 from one of the existing particles, at which point they are added to the set and remain fixed for all time. We prove, under a non-degeneracy assumption, a Beurling-type estimate which, via Kesten's method, can be used to deduce sub-ballistic growth for $0 \leq \alpha < 1$: $$\mbox{diam}(\{x_1, \dots, x_n\}) \leq c_{\alpha} \cdot n^{\frac{3\alpha + 1}{2\alpha + 2}}.$$ This is optimal when $\alpha = 0$. The case $\alpha = 0$ leads to a 'round', full-dimensional tree; the larger the value of $\alpha$, the sparser the tree. Some instances of the higher-dimensional setting are also discussed.
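The growth rule described in the abstract can be sketched numerically. The following is a minimal simulation sketch, not the authors' implementation: it replaces arrival from $\infty$ by a finite launch radius, discretizes the gradient flow with normalized explicit-Euler steps, and freezes a particle once it comes within distance 1 of the cluster. The function name `simulate_gfa`, the step size, and the launch-radius factor are illustrative choices, not part of the paper.

```python
import numpy as np

def simulate_gfa(n_particles, alpha=0.0, step=0.05, launch_factor=10.0, seed=0):
    """Sketch of Gradient Flow Aggregation (GFA) in the plane.

    New particles start far away in a uniformly random direction and follow
    the gradient of E(x) = sum_i |x - x_i|^(-alpha) (or, for alpha == 0, of
    the logarithmic energy -sum_i log|x - x_i|) until they are within
    distance 1 of an existing particle, at which point they freeze.
    The discretization and launch radius are assumptions of this sketch.
    """
    rng = np.random.default_rng(seed)
    cluster = np.zeros((1, 2))  # start from a single particle at the origin

    def grad_E(x, pts):
        d = x - pts                      # vectors from existing particles to x
        r2 = np.sum(d * d, axis=1)       # squared distances |x - x_i|^2
        if alpha == 0.0:
            # grad of -sum log|x - x_i| is -sum (x - x_i) / |x - x_i|^2
            w = -1.0 / r2
        else:
            # grad of sum |x - x_i|^(-alpha) is -alpha sum (x - x_i) / |x - x_i|^(alpha + 2)
            w = -alpha / r2 ** (alpha / 2 + 1)
        return (w[:, None] * d).sum(axis=0)

    for _ in range(n_particles - 1):
        radius = launch_factor * (np.max(np.linalg.norm(cluster, axis=1)) + 2.0)
        theta = rng.uniform(0.0, 2.0 * np.pi)
        x = radius * np.array([np.cos(theta), np.sin(theta)])
        # follow the gradient flow until within distance 1 of the cluster
        while np.min(np.linalg.norm(x - cluster, axis=1)) > 1.0:
            g = grad_E(x, cluster)
            x = x + step * g / np.linalg.norm(g)  # normalized Euler step
        cluster = np.vstack([cluster, x])
    return cluster

if __name__ == "__main__":
    pts = simulate_gfa(200, alpha=0.5)
    diam = np.max(np.linalg.norm(pts[:, None] - pts[None, :], axis=-1))
    print("number of particles:", len(pts), "diameter:", diam)
```

Varying `alpha` in this sketch is one way to observe the behavior stated in the abstract: small values produce dense, roughly round clusters, while larger values yield sparser, more dendritic trees.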