RefBERT: Compressing BERT by Referencing to Pre-computed Representations (2106.08898v1)

Published 11 Jun 2021 in cs.CL and cs.LG

Abstract: Recently developed large pre-trained language models, e.g., BERT, have achieved remarkable performance in many downstream natural language processing applications. These pre-trained models often contain hundreds of millions of parameters and suffer from high computation and latency in real-world applications. It is desirable to reduce the computation overhead of the models for fast training and inference while keeping the model performance in downstream applications. Several lines of work utilize knowledge distillation to compress the teacher model into a smaller student model. However, they usually discard the teacher's knowledge at inference time. In contrast, in this paper, we propose RefBERT to leverage the knowledge learned from the teacher, i.e., using the pre-computed BERT representations of reference samples while compressing BERT into a smaller student model. To justify our proposal, we provide theoretical analysis of the loss function and the usage of reference samples. Notably, the theoretical result shows that including the pre-computed teacher's representations on the reference samples indeed increases the mutual information in learning the student model. Finally, we conduct an empirical evaluation and show that RefBERT beats vanilla TinyBERT by over 8.1% and achieves more than 94% of the performance of BERT$_{\rm BASE}$ on the GLUE benchmark. Meanwhile, RefBERT is 7.4x smaller and 9.5x faster at inference than BERT$_{\rm BASE}$.
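
The core idea, distilling BERT into a small student that can still consult the teacher's pre-computed representations of a reference sample at inference time, can be sketched roughly as follows. This is a minimal, hypothetical PyTorch sketch and not the paper's implementation: the module names (StudentEncoder, ref_proj), the layer sizes, and the single MSE distillation term are illustrative assumptions.

```python
# Hypothetical sketch of distillation with pre-computed teacher representations
# on a reference sample. Names, shapes, and the loss design are assumptions,
# not the RefBERT implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentEncoder(nn.Module):
    def __init__(self, vocab_size=30522, hidden=312, teacher_hidden=768,
                 layers=4, heads=12):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        enc_layer = nn.TransformerEncoderLayer(
            hidden, heads, dim_feedforward=4 * hidden, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        # Project the teacher's reference representations into the student space.
        self.ref_proj = nn.Linear(teacher_hidden, hidden)
        # Project student states up to teacher size for the distillation MSE.
        self.distill_proj = nn.Linear(hidden, teacher_hidden)

    def forward(self, input_ids, ref_teacher_repr):
        # ref_teacher_repr: pre-computed BERT states of the reference sample,
        # shape [batch, ref_len, teacher_hidden], produced offline by the teacher.
        x = self.embed(input_ids)
        ref = self.ref_proj(ref_teacher_repr)
        # Let the student attend over [reference ; input] so the teacher's
        # knowledge on the reference sample is available even at inference.
        h = self.encoder(torch.cat([ref, x], dim=1))
        return h[:, ref.size(1):]  # student states for the input tokens only

def distill_loss(student_h, teacher_h, distill_proj, alpha=1.0):
    # MSE between projected student states and teacher states on the same input
    # (teacher states are needed only during training).
    return alpha * F.mse_loss(distill_proj(student_h), teacher_h)

# Toy usage with random tensors (shapes only; no pretrained weights).
student = StudentEncoder()
input_ids = torch.randint(0, 30522, (2, 16))
ref_teacher_repr = torch.randn(2, 8, 768)   # pre-computed offline by the teacher
teacher_h = torch.randn(2, 16, 768)         # teacher states on the input (training only)
student_h = student(input_ids, ref_teacher_repr)
loss = distill_loss(student_h, teacher_h, student.distill_proj)
loss.backward()
```

In this sketch the reference representations are computed once offline by the teacher, so the expensive BERT forward pass is avoided at inference while the teacher's knowledge still conditions the student, in the spirit of the paper's use of reference samples.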

Authors (5)
  1. Xinyi Wang (152 papers)
  2. Haiqin Yang (32 papers)
  3. Liang Zhao (353 papers)
  4. Yang Mo (11 papers)
  5. Jianping Shen (13 papers)
Citations (3)