Diversity Is All You Need for Contrastive Learning: Spectral Bounds on Gradient Magnitudes
Abstract: We derive non-asymptotic spectral bounds on the squared InfoNCE gradient norm in terms of alignment, temperature, and the batch spectrum, recovering the $1/\tau^{2}$ law and closely tracking batch-mean gradients on synthetic data and ImageNet. Using effective rank $R_{\mathrm{eff}}$ as an anisotropy proxy, we design spectrum-aware batch selection, including a fast greedy batch builder. On ImageNet-100, Greedy-64 cuts time-to-67.5% top-1 by 15% vs. random batching (24% vs. Pool-P3) at equal accuracy; CIFAR-10 shows similar gains. In-batch whitening promotes isotropy and reduces 50-step gradient variance by $1.37\times$, matching our theoretical upper bound.
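The abstract uses effective rank $R_{\mathrm{eff}}$ as a scalar proxy for how anisotropic a batch's embedding spectrum is. The paper's exact definition is not given here; the sketch below assumes the common entropy-based convention, $R_{\mathrm{eff}} = \exp(-\sum_i p_i \log p_i)$ where $p_i$ are the normalized singular values of the centered batch. The function name `effective_rank` is illustrative, not from the paper.

```python
import numpy as np

def effective_rank(X: np.ndarray) -> float:
    """Effective rank of a batch of embeddings X (n x d).

    Assumed convention: exponential of the Shannon entropy of the
    normalized singular-value spectrum of the centered batch. Ranges
    from 1 (all variance on one direction) to d (isotropic batch).
    """
    # Center the batch so the spectrum reflects variance directions.
    Xc = X - X.mean(axis=0, keepdims=True)
    # Singular values of the centered batch matrix.
    s = np.linalg.svd(Xc, compute_uv=False)
    # Normalize to a probability distribution over spectral mass.
    p = s / s.sum()
    p = p[p > 0]  # drop exact zeros before taking logs
    return float(np.exp(-(p * np.log(p)).sum()))

# Near-isotropic Gaussian embeddings: R_eff approaches the dimension d.
rng = np.random.default_rng(0)
iso = effective_rank(rng.normal(size=(1000, 8)))

# Rank-1 embeddings: R_eff collapses toward 1.
low = effective_rank(np.outer(rng.normal(size=1000), rng.normal(size=8)))
```

A spectrum-aware batch builder could then score candidate batches by this quantity, preferring those with higher $R_{\mathrm{eff}}$; how the paper's greedy builder trades this off against cost is not recoverable from the abstract alone.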