
Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework (2212.08349v1)

Published 16 Dec 2022 in cs.LG, cs.AI, cs.CL, and cs.CR

Abstract: Knowledge distillation (KD) has been widely used for model compression and knowledge transfer. Typically, a large teacher model trained on sufficient data transfers knowledge to a small student model. However, despite the success of KD, little effort has been made to study whether KD leaks the training data of the teacher model. In this paper, we experimentally reveal that KD suffers from the risk of privacy leakage. To alleviate this issue, we propose a novel knowledge distillation method, swing distillation, which can effectively prevent private information in the teacher model from flowing to the student model. In our framework, the temperature coefficient is dynamically and adaptively adjusted according to the degree of private information contained in the data, rather than being a predefined constant hyperparameter. It assigns a different temperature to each token according to the likelihood that the token at that position contains private information. In addition, we inject noise into the soft targets provided to the student model, in order to avoid unshielded knowledge transfer. Experiments on multiple datasets and tasks demonstrate that the proposed swing distillation significantly reduces the risk of privacy leakage (by over 80% in terms of canary exposure) compared to standard KD, while achieving competitive or better performance. Furthermore, swing distillation is robust to increases in the privacy budget.
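
The abstract describes two mechanisms: a per-token distillation temperature driven by an estimated privacy score, and noise injected into the teacher's soft targets. Below is a minimal PyTorch sketch of that idea. All names (swing_distillation_loss, privacy_scores, base_temp, max_temp, noise_std), the linear temperature schedule, and the Gaussian noise model are illustrative assumptions, not the authors' implementation; see the paper for the actual method.

```python
# Hypothetical sketch of a swing-distillation-style loss, assuming a
# precomputed per-token privacy score in [0, 1]. Not the authors' code.
import torch
import torch.nn.functional as F

def swing_distillation_loss(
    student_logits,   # (batch, seq_len, vocab)
    teacher_logits,   # (batch, seq_len, vocab)
    privacy_scores,   # (batch, seq_len), likelihood each token is private
    base_temp=2.0,    # temperature for non-private tokens (assumed)
    max_temp=8.0,     # temperature ceiling for highly private tokens (assumed)
    noise_std=0.1,    # std of noise injected into soft targets (assumed)
):
    # Per-token temperature: tokens more likely to be private get a higher
    # temperature, flattening the teacher distribution at those positions
    # ("dynamically and adaptively adjusted", per the abstract).
    temp = base_temp + (max_temp - base_temp) * privacy_scores
    temp = temp.unsqueeze(-1)  # broadcast over the vocabulary dimension

    # Teacher soft targets at the per-token temperature.
    soft_targets = F.softmax(teacher_logits / temp, dim=-1)

    # Noise injection into the soft targets to avoid unshielded knowledge
    # transfer; clamp and renormalize so each row stays a distribution.
    noisy = soft_targets + noise_std * torch.randn_like(soft_targets)
    noisy = noisy.clamp_min(1e-8)
    noisy = noisy / noisy.sum(dim=-1, keepdim=True)

    # Standard KD cross-entropy between the noisy teacher targets and the
    # student's tempered log-probabilities.
    log_probs = F.log_softmax(student_logits / temp, dim=-1)
    return -(noisy * log_probs).sum(dim=-1).mean()
```

Under this reading, a token with privacy_scores near 1 receives a temperature near max_temp, so its teacher distribution is nearly uniform and carries little memorized detail, while ordinary tokens are distilled at base_temp as in vanilla KD.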

Authors (6)
  1. Junzhuo Li (10 papers)
  2. Xinwei Wu (10 papers)
  3. Weilong Dong (9 papers)
  4. Shuangzhi Wu (29 papers)
  5. Chao Bian (21 papers)
  6. Deyi Xiong (104 papers)
Citations (3)
