
Artemis: Efficient Commit-and-Prove SNARKs for zkML (2409.12055v2)

Published 18 Sep 2024 in cs.CR

Abstract: Ensuring that AI models are both verifiable and privacy-preserving is important for trust, accountability, and compliance. To address these concerns, recent research has focused on developing zero-knowledge machine learning (zkML) techniques that enable the verification of various aspects of ML models without revealing sensitive information. However, while recent zkML advances have made significant improvements to the efficiency of proving ML computations, they have largely overlooked the costly consistency checks on committed model parameters and input data, which have become a dominant performance bottleneck. To address this gap, this paper introduces a new Commit-and-Prove SNARK (CP-SNARK) construction, Artemis, that effectively addresses the emerging challenge of commitment verification in zkML pipelines. In contrast to existing approaches, Artemis is compatible with any homomorphic polynomial commitment, including those without trusted setup. We present the first implementation of this CP-SNARK, evaluate its performance on a diverse set of ML models, and show substantial improvements over existing methods, achieving significant reductions in prover costs and maintaining efficiency even for large-scale models. For example, for the VGG model, we reduce the overhead associated with commitment checks from 11.5x to 1.1x. Our results indicate that Artemis provides a concrete step toward practical deployment of zkML, particularly in settings involving large-scale or complex models.
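To situate the contribution, a minimal sketch of the commit-and-prove setting may help; the notation below is generic CP-SNARK notation, not taken verbatim from the paper. A CP-SNARK proves, for a public statement $x$ and a public commitment $c$, knowledge of a witness $w$ (here, the model parameters and/or input data) and commitment randomness $r$ such that

\[
c = \mathsf{Com}(w;\, r) \;\wedge\; R(x, w) = 1,
\]

where $R$ is the relation "the ML computation on $x$ with parameters $w$ yields the claimed output" and $\mathsf{Com}$ is a binding commitment scheme. The consistency checks named in the abstract correspond to the $c = \mathsf{Com}(w;\, r)$ conjunct, which in a straightforward instantiation must be re-proven inside the SNARK circuit at substantial cost. The compatibility claim hinges on $\mathsf{Com}$ being homomorphic, i.e.

\[
\mathsf{Com}(p_1;\, r_1) + \mathsf{Com}(p_2;\, r_2) = \mathsf{Com}(p_1 + p_2;\, r_1 + r_2),
\]

a property satisfied by, for example, KZG-style polynomial commitments (trusted setup) and inner-product-argument-based schemes (no trusted setup), consistent with the abstract's statement that the construction also supports commitments without trusted setup.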
