Scaled-Attachment Random Recursive Trees
- Scaled-Attachment Random Recursive Trees (SARRTs) are random tree models in which each new node attaches to the vertex whose index is its own index scaled by an i.i.d. random factor, generalizing uniform recursive trees.
- The construction interprets node depths through a renewal process with increments distributed as $-\log X$, leading to precise asymptotic laws for typical depth and extremal branch lengths.
- SARRTs extend naturally to biased, power-of-choice, and greedy models, linking recursive network growth with continuous-time branching, martingale methods, and large deviation techniques.
Scaled-Attachment Random Recursive Trees (SARRTs) are random tree models that generalize classical recursive tree dynamics by coupling attachment decisions to scaled random processes. Formally, in a SARRT on vertices labelled $0, 1, \ldots, n$, each node $i \ge 1$ connects to the parent $\lfloor i X_i \rfloor$, where $X_1, \ldots, X_n$ are i.i.d. random variables sampled from a distribution on $[0, 1)$ (Devroye et al., 2012). This framework encompasses uniform random recursive trees (URRTs) as the special case when $X$ is uniform on $[0, 1)$, and can be extended to various "power-of-choice" and greedy structures. The model facilitates an elementary yet powerful analysis of distances and depths in recursively growing networks, revealing rich connections to renewal theory, large deviations, and probabilistic combinatorics.
1. Formal Construction and Recurrence
In a SARRT, the growth mechanism is defined by a scaled random attachment kernel. Node $i$ attaches to $\lfloor i X_i \rfloor$, interpreting this as a renewal step with random step-size $-\log X_i$. Iterating the rule yields the ancestor sequence $\mathrm{parent}^{(k)}(i) \approx \lfloor i\, X^{(1)} X^{(2)} \cdots X^{(k)} \rfloor$, where $\mathrm{parent}^{(k)}(i)$ denotes the ancestor $k$ steps above $i$ and the $X^{(j)}$ are the kernel variables met along the path to the root.
This rule interpolates between uniform recursive trees ($X$ uniform: no bias, all previous nodes equally likely as parent) and heavily biased trees (when $X$ is deterministic or strongly concentrated near $0$ or $1$), and can emulate attachment probability dictated by node degree, index, weight, or fitness. A minimal construction sketch is given below.
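The construction is short enough to state as code. The following minimal sketch (function and variable names are illustrative, not from the source) builds the parent array of a SARRT for an arbitrary kernel and recovers the URRT as the uniform special case:

```python
import math
import random

def sarrt_parents(n, sample_x):
    """Build a SARRT on vertices 0..n: node i >= 1 attaches to floor(i * X_i),
    where the X_i are i.i.d. draws from a distribution on [0, 1)."""
    parent = [0] * (n + 1)              # node 0 is the root; its entry is a placeholder
    for i in range(1, n + 1):
        parent[i] = math.floor(i * sample_x())
    return parent

# URRT special case: X uniform on [0, 1) makes every earlier node equally likely.
urrt = sarrt_parents(1000, random.random)

# A biased kernel, e.g. X = U^2, pulls parents toward the root.
biased = sarrt_parents(1000, lambda: random.random() ** 2)
```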
2. Depth Distribution, Renewal Theory, and Central Limit Behavior
The depth $D_n$ of node $n$ (distance to the root) behaves as a renewal process with increments distributed as $-\log X$. Define $\mu = \mathbb{E}[\log(1/X)]$ and $\sigma^2 = \mathrm{Var}(\log(1/X))$ (both finite when $X$ is sufficiently regular). Applying renewal theory yields
$$\frac{D_n}{\log n} \xrightarrow{\;p\;} \frac{1}{\mu},$$
and, if $\sigma^2 < \infty$,
$$\frac{D_n - \mu^{-1}\log n}{\sqrt{(\sigma^2/\mu^3)\log n}} \xrightarrow{\;d\;} \mathcal{N}(0,1).$$
This law of large numbers and central limit theorem explicitly quantifies the typical depth scaling and its concentration for the entire class; thus $\mu$ calibrates the rate at which the tree deepens as it grows.
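As a quick empirical check of the renewal law of large numbers, here is a simulation sketch under the URRT kernel, where $\mu = \mathbb{E}[\log(1/X)] = 1$ (the helper names are illustrative):

```python
import math
import random
from statistics import mean

def sarrt_parents(n, sample_x):
    # Parent array: node i >= 1 attaches to floor(i * X_i); index 0 is the root.
    return [0] + [math.floor(i * sample_x()) for i in range(1, n + 1)]

def depth(parent, i):
    # Distance from node i to the root by following parent pointers.
    d = 0
    while i != 0:
        i, d = parent[i], d + 1
    return d

n, trials = 100_000, 30
ratios = [depth(sarrt_parents(n, random.random), n) / math.log(n) for _ in range(trials)]
print("mean D_n / log n =", round(mean(ratios), 3))   # should be close to 1/mu = 1
```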
3. Maximum and Minimum Depth: Large Deviations and Tail Scaling
The extremal depth properties (the height $H_n$ and the minimum late-node depth $M_n$) are governed by large deviation principles involving the Legendre–Fenchel transform of the log-moment generating function $\Lambda(\lambda) = \log \mathbb{E}\!\left[e^{-\lambda \log X}\right] = \log \mathbb{E}\!\left[X^{-\lambda}\right]$ of the increment $-\log X$: define the rate function $\Lambda^*(x) = \sup_{\lambda}\{\lambda x - \Lambda(\lambda)\}$.
The height admits the asymptotic
$$\frac{H_n}{\log n} \xrightarrow{\;p\;} \alpha_{\max}, \qquad \alpha_{\max} = \sup\{\alpha \ge 1/\mu : \alpha\,\Lambda^*(1/\alpha) \le 1\}.$$
The minimum depth among late nodes (those with index of order $n$) satisfies
$$\frac{M_n}{\log n} \xrightarrow{\;p\;} \alpha_{\min}, \qquad \alpha_{\min} = \inf\{0 < \alpha \le 1/\mu : \alpha\,\Lambda^*(1/\alpha) \le 1\}.$$
These formulas capture the rare-event behavior for "long" and "short" branches, respectively.
For the URRT case ($X$ uniform on $[0,1)$), $-\log X \sim \mathrm{Exp}(1)$ and $\Lambda(\lambda) = -\log(1-\lambda)$ for $\lambda < 1$, and computation yields $\Lambda^*(x) = x - 1 - \log x$, establishing $H_n / \log n \to e$ independently of branching random walk theory.
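The URRT constant $e$ can also be recovered numerically from the height criterion $\alpha\,\Lambda^*(1/\alpha) = 1$; the following sketch (assuming the $\mathrm{Exp}(1)$ rate function $\Lambda^*(x) = x - 1 - \log x$ derived above) solves for the largest root by bisection:

```python
import math

def rate_exp1(x):
    # Legendre-Fenchel transform of the Exp(1) log-MGF: Lambda*(x) = x - 1 - log(x).
    return x - 1 - math.log(x)

def height_criterion(alpha):
    # alpha * Lambda*(1/alpha) - 1; its largest root is the height constant alpha_max.
    return alpha * rate_exp1(1.0 / alpha) - 1.0

lo, hi = 1.0, 10.0                      # criterion is negative at 1 and positive at 10
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if height_criterion(mid) < 0 else (lo, mid)
print(round(lo, 6), "vs e =", round(math.e, 6))   # recovers alpha_max = e for the URRT
```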
4. Generalizations to Power-of-Choice and Greedy Models
SARRT analysis extends to biased attachment kernels, e.g., choosing $X = \min(U_1, \ldots, U_k)$ or $X = \max(U_1, \ldots, U_k)$ for i.i.d. uniform $U_1, \ldots, U_k$. Then $-\log X$ is the maximum (respectively minimum) of $k$ independent $\mathrm{Exp}(1)$ variables, and the scaling of the typical, maximum, and minimum depths proceeds via identical renewal and large deviation arguments; see the sketch below. The typical and maximum depths of greedy DAG variants and $k$-dags are thus captured in the same formalism.
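A small Monte Carlo sketch (the kernels and helper names below are illustrative choices, not taken from the source) shows how the mean increment $\mu$, and hence the typical depth $\log n / \mu$, responds to these choice kernels:

```python
import math
import random

def mu_hat(sample_x, samples=200_000):
    # Monte Carlo estimate of mu = E[log(1/X)] for a given scaling kernel X.
    return sum(-math.log(max(sample_x(), 1e-12)) for _ in range(samples)) / samples

for k in (1, 2, 5):
    x_min = lambda k=k: min(random.random() for _ in range(k))   # pulls parents toward the root
    x_max = lambda k=k: max(random.random() for _ in range(k))   # pulls parents toward recent nodes
    h_k = sum(1.0 / j for j in range(1, k + 1))                  # harmonic number H_k
    print(f"k={k}  min-kernel mu~{mu_hat(x_min):.3f} (H_k={h_k:.3f})  "
          f"max-kernel mu~{mu_hat(x_max):.3f} (1/k={1.0 / k:.3f})")
# Typical depth is log(n)/mu: the min-of-k kernel shortens paths, the max-of-k kernel lengthens them.
```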
5. Asymptotic Expressions and Universal Scaling Laws
Summarizing:

| Statistic      | Formula                          | Scaling Constant                                                              |
|----------------|----------------------------------|-------------------------------------------------------------------------------|
| Typical depth  | $D_n \sim \mu^{-1} \log n$       | $\mu = \mathbb{E}[\log(1/X)]$                                                 |
| Height         | $H_n \sim \alpha_{\max} \log n$  | $\alpha_{\max} = \sup\{\alpha : \alpha\,\Lambda^*(1/\alpha) \le 1\}$          |
| Min late depth | $M_n \sim \alpha_{\min} \log n$  | $\alpha_{\min} = \inf\{\alpha : \alpha\,\Lambda^*(1/\alpha) \le 1\}$, or $0$  |

All constants depend only on the law of $X$ and are computable via integrals and rate-function minimizations.
6. Connections to Continuous-Time Branching Processes and Exploration Algorithms
Recent work positions SARRTs at the interface of continuous-time branching process theory. For instance, in network evolution models with limited memory (Angel et al., 21 Oct 2025), SARRTs whose kernel is concentrated near $1$ correspond to recursive trees in which each new vertex attaches only to recently added vertices, with the kernel's concentration interpreted as the scaling parameter. The limiting local structure is expressed as a sin-tree generated by a continuous-time branching process stopped at an exponential time.
Exploration algorithms developed for tracking the ancestral paths of youngest vertices reveal the relation between global height and local fringe distributions, and describe phase transitions in the geometry of the tree (e.g., polynomial versus logarithmic height asymptotics).
7. Tree Limits and Macroscopic Geometry
The limit theory for random trees ("long dendron" convergence) applies to SARRTs whenever the typical distance between random vertices, rescaled by $\log n$, converges in probability to a constant $2a > 0$ (Janson, 2020). The global metric structure of large SARRTs thus reduces to a degenerate metric space in which typical distances concentrate sharply.
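A simulation sketch of this distance concentration under the URRT kernel, where the natural candidate for the constant is $2a = 2/\mu = 2$ given the typical-depth law (helper names are illustrative):

```python
import math
import random
from statistics import mean

def sarrt_parents(n, sample_x):
    return [0] + [math.floor(i * sample_x()) for i in range(1, n + 1)]

def distance(parent, u, v):
    # Graph distance between u and v: record u's ancestors, then climb from v to the meeting point.
    anc_u, d = {}, 0
    while True:
        anc_u[u] = d
        if u == 0:
            break
        u, d = parent[u], d + 1
    steps = 0
    while v not in anc_u:
        v, steps = parent[v], steps + 1
    return anc_u[v] + steps

n = 100_000
parent = sarrt_parents(n, random.random)                       # URRT kernel, mu = 1
pairs = [(random.randrange(1, n + 1), random.randrange(1, n + 1)) for _ in range(500)]
print(mean(distance(parent, u, v) for u, v in pairs) / math.log(n))
# Approaches 2/mu = 2 as n grows; finite-n values fall somewhat below the limit.
```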
8. Statistical Mechanics Connections: Broadcasting, Percolation, and Coalescents
SARRTs serve as a substrate for stochastic processes such as information broadcasting, percolation, and coalescence. For instance, depth-dependent broadcasting or two-colouring dynamics can be analyzed directly through SARRT scaling, with limiting distributions for monochromatic cluster sizes available via Pólya urn methods or analytic combinatorics (Desmarais et al., 2021). Coalescent processes induced by operations such as tree "lifting" can also, in principle, be studied in SARRTs, predicting genealogical partition dynamics via multiple-merger coalescents with attachment-parameter-dependent rate measures (Pitters, 2016).
9. Martingale Methods and Random Recursive Metric Spaces
Generalizations to random recursive metric spaces reveal that SARRTs are instances where each "block" is an edge with an attachment probability determined by a scaling kernel (Desmarais, 2022). The insertion depth (the distance from the root to the newly inserted vertex) admits a martingale central limit theorem with explicit logarithmic scaling when the attachment kernel is parametrized by weight variables.
10. Summary and Broader Implications
SARRTs constitute a unifying probabilistic model for recursive network growth, interpolating between uniform and preferential attachment via a scaling kernel. The renewal-theoretic analysis yields explicit asymptotic laws for the typical, maximum, and minimum depths, governed by the mean increment $\mu = \mathbb{E}[\log(1/X)]$ and large deviation constants tied to the rate function $\Lambda^*$. These results not only provide elementary proofs for classical recursive tree statistics but also extend to diverse greedy and power-of-choice models, network exploration algorithms, and coalescent processes.
The deep connections to renewal theory, large deviation techniques, continuum tree limits, and martingale methods position SARRTs as a flexible framework for analyzing random recursive structures in combinatorics, probability, and statistical network science. The universality of the logarithmic depth law, the possibility of controlling tail behavior via kernel adjustments, and the extension to random measure trees and higher-dimensional metric spaces all point to the enduring significance of the scaled-attachment paradigm.