Minimal prior knowledge needed for few augmentation steps
Determine the minimum amount of pre-training parametric knowledge, modeled as a partial subgraph G of a ground-truth knowledge graph G*, that suffices for a system to answer queries using only a small number of test-time augmentation steps (e.g., retrieval or verification queries). Formulate this requirement precisely for multi-step reasoning tasks such as s–t connectivity, and characterize the thresholds or conditions under which a constant expected number of augmentation steps is achievable across relevant graph families and observation models.
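To make the setup concrete, the following is a minimal sketch in which the prior is an edge subset of G*, and each augmentation step is modeled as one edge-existence query to an oracle for G*. The function names (`answer_with_augmentation`, `oracle`) and the exhaustive fallback strategy are illustrative assumptions, not the paper's method; the point is only to show how "number of augmentation steps" can be counted relative to the prior subgraph.

```python
from collections import deque

def connected(edges, nodes, s, t):
    """BFS s-t connectivity over an undirected edge set."""
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return True
        for w in adj[u] - seen:
            seen.add(w)
            queue.append(w)
    return s == t

def answer_with_augmentation(prior_edges, nodes, s, t, oracle):
    """Answer s-t connectivity in G*, using the prior subgraph first and
    falling back to oracle edge queries; returns (answer, #queries)."""
    if connected(prior_edges, nodes, s, t):
        return True, 0  # prior knowledge alone suffices: zero augmentation
    known = set(prior_edges)
    queries = 0
    # Naive fallback for illustration: verify unknown pairs one at a time.
    for u in nodes:
        for v in nodes:
            if u < v and (u, v) not in known and (v, u) not in known:
                queries += 1  # one augmentation (verification) step
                if oracle(u, v):
                    known.add((u, v))
                    if connected(known, nodes, s, t):
                        return True, queries
    return False, queries

# Ground truth G*: path 0-1-2-3; the prior G knows only edges (0,1) and (2,3).
star = {(0, 1), (1, 2), (2, 3)}
oracle = lambda u, v: (u, v) in star or (v, u) in star
ans, k = answer_with_augmentation({(0, 1), (2, 3)}, range(4), 0, 3, oracle)
```

With this prior, the missing bridge (1, 2) is found after a few verification queries; a richer prior (e.g., one already containing a spanning structure) would drive the query count to zero, which is the trade-off the problem asks to quantify.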
References
Specifically, it is not clear how much pre-training knowledge is required to answer queries with a small number of augmentation steps, which is a desirable property in practice.
— Prior Makes It Possible: From Sublinear Graph Algorithms to LLM Test-Time Methods
(Blum et al., 2510.16609, 18 Oct 2025), Abstract