Efficient Sketches for Training Data Attribution and Studying the Loss Landscape (2402.03994v2)
Published 6 Feb 2024 in cs.LG and stat.ML
Abstract: The study of modern machine learning models often necessitates storing vast quantities of gradients or Hessian vector products (HVPs). Traditional sketching methods struggle to scale under these memory constraints. We present a novel framework for scalable gradient and HVP sketching, tailored for modern hardware. We provide theoretical guarantees and demonstrate the power of our methods in applications like training data attribution, Hessian spectrum analysis, and intrinsic dimension computation for pre-trained LLMs. Our work sheds new light on the behavior of pre-trained LLMs, challenging assumptions about their intrinsic dimensionality and Hessian properties.
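To make the core idea concrete, below is a minimal, hedged sketch (not the paper's actual algorithm) of what "sketching" gradients and HVPs can look like: flatten the vector and compress it with a shared random projection so many such vectors can be stored cheaply. All function names, shapes, and the dense Gaussian projection are illustrative assumptions; the paper's methods are designed to be far more memory- and hardware-efficient than this.

```python
# Illustrative only: compress a gradient and a Hessian-vector product (HVP)
# with a shared random Gaussian projection. Names/shapes are assumptions.
import jax
import jax.numpy as jnp

def loss(params, batch):
    # Toy least-squares loss; stands in for a model's training loss.
    x, y = batch
    pred = x @ params
    return jnp.mean((pred - y) ** 2)

def flat_grad(params, batch):
    # Gradient of the loss w.r.t. params, flattened to a single vector.
    return jnp.ravel(jax.grad(loss)(params, batch))

def flat_hvp(params, batch, v):
    # HVP via forward-over-reverse differentiation, flattened.
    hvp = jax.jvp(lambda p: jax.grad(loss)(p, batch), (params,), (v,))[1]
    return jnp.ravel(hvp)

def make_sketcher(key, dim, sketch_dim=256):
    # Dense Gaussian sketch: maps a dim-vector to sketch_dim numbers while
    # approximately preserving inner products (Johnson-Lindenstrauss style).
    proj = jax.random.normal(key, (sketch_dim, dim)) / jnp.sqrt(sketch_dim)
    return lambda vec: proj @ vec

# Usage: store 256-dimensional sketches instead of full 1000-dim vectors.
key = jax.random.PRNGKey(0)
params = jnp.zeros(1000)
x = jax.random.normal(key, (32, 1000))
y = jax.random.normal(key, (32,))
batch = (x, y)

sketch = make_sketcher(key, params.size)
g_sketch = sketch(flat_grad(params, batch))                   # shape (256,)
h_sketch = sketch(flat_hvp(params, batch, jnp.ones(1000)))    # shape (256,)
```

A dense projection like this is exactly what breaks down at LLM scale (the projection matrix alone would not fit in memory), which is the gap the paper's hardware-tailored sketches are meant to close.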