DE$^3$-BERT: Distance-Enhanced Early Exiting for BERT based on Prototypical Networks (2402.05948v1)

Published 3 Feb 2024 in cs.LG and cs.CL

Abstract: Early exiting has demonstrated its effectiveness in accelerating the inference of pre-trained language models like BERT by dynamically adjusting the number of layers executed. However, most existing early exiting methods only consider local information from an individual test sample to determine their exiting indicators, failing to leverage the global information offered by the sample population. This leads to suboptimal estimation of prediction correctness, resulting in erroneous exiting decisions. To bridge the gap, we explore the necessity of effectively combining both local and global information to ensure reliable early exiting during inference. Purposefully, we leverage prototypical networks to learn class prototypes and devise a distance metric between samples and class prototypes. This enables us to utilize global information for estimating the correctness of early predictions. On this basis, we propose a novel Distance-Enhanced Early Exiting framework for BERT (DE$^3$-BERT). DE$^3$-BERT implements a hybrid exiting strategy that supplements classic entropy-based local information with distance-based global information to enhance the estimation of prediction correctness for more reliable early exiting decisions. Extensive experiments on the GLUE benchmark demonstrate that DE$^3$-BERT consistently outperforms state-of-the-art models under different speed-up ratios with minimal storage or computational overhead, yielding a better trade-off between model performance and inference efficiency. Additionally, an in-depth analysis further validates the generality and interpretability of our method.
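
The hybrid exiting strategy described in the abstract fuses an entropy-based local signal with a distance-to-prototype global signal. The sketch below is only a rough illustration of how such a fusion could look in PyTorch; the function `hybrid_exit_score`, the normalization choices, and the mixing weight `alpha` are assumptions made for exposition, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def hybrid_exit_score(logits: torch.Tensor,
                      hidden: torch.Tensor,
                      prototypes: torch.Tensor,
                      alpha: float = 0.5) -> torch.Tensor:
    """Combine entropy (local) and prototype distance (global) into one exit score.

    logits:     [batch, num_classes] early-classifier outputs at the current layer
    hidden:     [batch, dim] sample representations at the current layer
    prototypes: [num_classes, dim] learned class prototypes
    alpha:      mixing weight (an assumption; the paper's fusion may differ)
    """
    num_classes = logits.size(-1)

    # Local information: normalized entropy of the early prediction, in [0, 1].
    probs = F.softmax(logits, dim=-1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    entropy = entropy / torch.log(torch.tensor(float(num_classes)))

    # Global information: distance from each sample to every class prototype.
    # A relatively small distance to the nearest prototype suggests the early
    # prediction is likely correct.
    dists = torch.cdist(hidden, prototypes)          # [batch, num_classes]
    min_dist = dists.min(dim=-1).values
    dist_score = min_dist / (dists.sum(dim=-1) + 1e-12)

    # Lower score -> more confidence in the early prediction -> exit earlier.
    return alpha * entropy + (1.0 - alpha) * dist_score

# Usage sketch: at each intermediate classifier, exit when a sample's score
# falls below a chosen threshold; otherwise continue to the next layer.
```

In this illustrative form, both terms are scaled to comparable ranges so a single threshold can trade off accuracy against speed-up, mirroring the entropy-only criterion of classic early-exiting baselines while adding the prototype-distance term as the global correction.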

Authors (7)
  1. Jianing He (3 papers)
  2. Qi Zhang (784 papers)
  3. Weiping Ding (53 papers)
  4. Duoqian Miao (25 papers)
  5. Jun Zhao (469 papers)
  6. Liang Hu (64 papers)
  7. Longbing Cao (85 papers)