VaultGemma: A Differentially Private Gemma Model (2510.15001v1)
Published 15 Oct 2025 in cs.CR and cs.AI
Abstract: We introduce VaultGemma 1B, a 1 billion parameter model within the Gemma family, fully trained with differential privacy. Pretrained on the identical data mixture used for the Gemma 2 series, VaultGemma 1B represents a significant step forward in privacy-preserving LLMs. We openly release this model to the community.
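For context, the standard mechanism for training models with differential privacy is DP-SGD: each example's gradient is clipped to a fixed l2 norm, and calibrated Gaussian noise is added to the summed gradients before the update. The sketch below, in JAX on a toy linear model, illustrates that core step. All names, hyperparameters, and the model itself are illustrative assumptions, not VaultGemma's actual training setup.

```python
# Minimal DP-SGD sketch (hypothetical; not the paper's training code).
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    # Toy linear model: squared error on a single example.
    pred = jnp.dot(x, params)
    return (pred - y) ** 2

def dp_sgd_step(params, xs, ys, key, lr=0.1, clip_norm=1.0, noise_mult=1.0):
    # Per-example gradients via vmap over grad.
    per_ex_grads = jax.vmap(jax.grad(loss_fn), in_axes=(None, 0, 0))(params, xs, ys)
    # Clip each example's gradient to l2 norm <= clip_norm.
    norms = jnp.linalg.norm(per_ex_grads, axis=1, keepdims=True)
    clipped = per_ex_grads * jnp.minimum(1.0, clip_norm / (norms + 1e-12))
    # Sum, add Gaussian noise scaled by noise_mult * clip_norm, then average.
    noise = noise_mult * clip_norm * jax.random.normal(key, params.shape)
    noisy_mean = (clipped.sum(axis=0) + noise) / xs.shape[0]
    return params - lr * noisy_mean

# Usage on toy data:
key = jax.random.PRNGKey(0)
xs = jax.random.normal(key, (32, 4))
ys = xs @ jnp.ones(4)
params = jnp.zeros(4)
params = dp_sgd_step(params, xs, ys, jax.random.PRNGKey(1))
```

The privacy guarantee comes from the combination of clipping (bounding any single example's influence) and the noise multiplier, with the overall (epsilon, delta) budget tracked across training steps by a privacy accountant.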